CN109925053B - Method, device and system for determining surgical path and readable storage medium - Google Patents

Method, device and system for determining surgical path and readable storage medium

Info

Publication number
CN109925053B
CN109925053B CN201910161266.7A CN201910161266A
Authority
CN
China
Prior art keywords
dimensional
image
simulated
projection
adjusting
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201910161266.7A
Other languages
Chinese (zh)
Other versions
CN109925053A (en)
Inventor
何滨
李嗣生
李伟栩
沈丽萍
陈枭
徐琦
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hangzhou Santan Medical Technology Co Ltd
Original Assignee
Hangzhou Santan Medical Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hangzhou Santan Medical Technology Co Ltd filed Critical Hangzhou Santan Medical Technology Co Ltd
Priority to CN201910161266.7A priority Critical patent/CN109925053B/en
Publication of CN109925053A publication Critical patent/CN109925053A/en
Priority to US17/431,683 priority patent/US20220133409A1/en
Priority to PCT/CN2020/077846 priority patent/WO2020177725A1/en
Application granted granted Critical
Publication of CN109925053B publication Critical patent/CN109925053B/en
Legal status: Active

Abstract

The application provides a method, a device and a system for determining a surgical path, and a readable storage medium. The method comprises: obtaining a three-dimensional local image and a virtual path located in the three-dimensional local image; matching a simulated two-dimensional image, obtained by projecting the three-dimensional local image, with a two-dimensional projection image obtained from the affected part; and, when the simulated two-dimensional image matches the two-dimensional projection image, determining the surgical path corresponding to the virtual path on the two-dimensional projection image according to the position information of the virtual path on the simulated two-dimensional image.

Description

Method, device and system for determining surgical path and readable storage medium
Technical Field
The present application relates to the field of medical technology, and in particular, to a method, an apparatus, a system, and a readable storage medium for determining a surgical path.
Background
At present, when an operation is performed on an affected part with a surgical instrument, the instrument usually needs to be inserted into the affected part, and the path from the body surface to the target point lies inside the body. Medical personnel can only rely on experience, judging from captured X-ray films, to determine the needle insertion, which places high demands on their experience; moreover, X-ray films may need to be taken many times to confirm the current needle insertion state, which increases radiation exposure and is not conducive to reducing the patient's suffering.
Disclosure of Invention
The application provides a method, a device and a system for determining a surgical path and a readable storage medium, which are used for solving the defects in the related art.
According to a first aspect of embodiments of the present application, there is provided a method for determining a surgical path, including:
acquiring a three-dimensional local image and a virtual path positioned in the three-dimensional local image;
matching a simulated two-dimensional image obtained based on the three-dimensional local image projection with a two-dimensional projection image obtained based on the affected part;
and when the simulated two-dimensional image is matched with the two-dimensional projection image, determining an operation path corresponding to the virtual path on the two-dimensional projection image according to the position information of the virtual path on the simulated two-dimensional image.
Optionally, the matching of the simulated two-dimensional image obtained by projecting the three-dimensional local image with the two-dimensional projection image obtained based on the affected part comprises:
carrying out perspective projection on the affected part to obtain a two-dimensional projection image;
projecting the three-dimensional local image to obtain the simulated two-dimensional image;
and acquiring the degree of coincidence between the simulated two-dimensional image and the two-dimensional projection image, and determining that the simulated two-dimensional image matches the two-dimensional projection image when the degree of coincidence is not less than a preset threshold value.
Optionally, the acquiring of the degree of coincidence between the simulated two-dimensional image and the two-dimensional projection image comprises:
extracting a first projection area of the simulated two-dimensional image and a second projection area in the two-dimensional projection image;
and calculating the degree of coincidence according to the edge-contour matching degree between the first projection area and the second projection area.
Optionally, the acquiring of the degree of coincidence between the simulated two-dimensional image and the two-dimensional projection image comprises:
dividing the simulated two-dimensional image and the two-dimensional projection image along a preset direction according to a preset proportion;
and matching each segmented region of the simulated two-dimensional image with the corresponding segmented region of the two-dimensional projection image to obtain the degree of coincidence.
Optionally, the method further includes:
adjusting the spatial pose of the three-dimensional local image; and/or
adjusting the projection parameters for the three-dimensional local image.
Optionally, the adjusting of the spatial pose of the three-dimensional local image comprises at least one of:
adjusting the rotation angle of the three-dimensional local image for at least one coordinate axis;
adjusting the displacement of the three-dimensional local image for at least one coordinate axis.
Optionally, the adjusting of the projection parameters for the three-dimensional local image comprises at least one of:
adjusting the focal length;
adjusting the position of the virtual light source;
adjusting an imaging resolution, the imaging resolution being related to a size of a projected imaging plane of the three-dimensional partial image and an image size of the simulated two-dimensional image.
According to a second aspect of embodiments of the present application, there is provided a surgical path determination apparatus, including:
an acquisition module, configured to acquire a three-dimensional local image and a virtual path located in the three-dimensional local image;
the matching module is used for matching a simulated two-dimensional image obtained based on the three-dimensional local image projection and a two-dimensional projection image obtained based on an affected part;
and the determining module is used for determining a surgical path corresponding to the virtual path on the two-dimensional projection image according to the position information of the virtual path on the simulated two-dimensional image when the simulated two-dimensional image is matched with the two-dimensional projection image.
According to a third aspect of embodiments herein, there is provided a computer readable storage medium having stored thereon computer instructions which, when executed by a processor, implement the steps of the method according to any of the embodiments described above.
According to a fourth aspect of the embodiments of the present application, there is provided a surgical navigation system, including a photographing device for photographing the affected part to obtain a two-dimensional projection image, and a computer communicatively connected to the photographing device;
wherein the computer comprises:
a display for showing the three-dimensional partial image, the two-dimensional projection image and the simulated two-dimensional image;
a processor;
a memory for storing processor-executable instructions;
wherein the instructions, when executed by the processor, implement the steps of the method according to any of the embodiments described above.
The technical scheme provided by the embodiment of the application can have the following beneficial effects:
according to the embodiment, the simulated two-dimensional image and the two-dimensional projection image are matched, and the path information in the two-dimensional projection image is determined according to the projection of the virtual path in the simulated two-dimensional image under the condition that the simulated two-dimensional image is matched with the two-dimensional projection image, so that medical workers can visually know the path information to assist the medical workers in determining the current needle inserting point and the needle inserting angle, the accuracy of the operation is improved, and the pain of patients is reduced.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the application.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present application and together with the description, serve to explain the principles of the application.
Fig. 1 is a flow chart illustrating a surgical path determination method according to an exemplary embodiment.
FIG. 2 is a flow chart illustrating another surgical path determination method according to an exemplary embodiment.
Fig. 3 is a flow chart illustrating yet another surgical path determination method according to an exemplary embodiment.
FIG. 4 is a schematic diagram illustrating another simulated two-dimensional image according to an exemplary embodiment.
FIG. 5 is a schematic diagram illustrating another two-dimensional projection image in accordance with an exemplary embodiment.
Fig. 6 is a schematic structural diagram illustrating a surgical navigation system according to an exemplary embodiment.
FIG. 7 is a schematic block diagram of an apparatus provided in accordance with an exemplary embodiment.
Fig. 8 is a block diagram illustrating a surgical path determination device in accordance with an exemplary embodiment.
Detailed Description
Reference will now be made in detail to the exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, like numbers in different drawings represent the same or similar elements unless otherwise indicated. The embodiments described in the following exemplary embodiments do not represent all embodiments consistent with the present application. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the present application, as detailed in the appended claims.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the application. As used in this application and the appended claims, the singular forms "a", "an", and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It should also be understood that the term "and/or" as used herein refers to and encompasses any and all possible combinations of one or more of the associated listed items.
It is to be understood that although the terms first, second, third, etc. may be used herein to describe various information, such information should not be limited to these terms. These terms are only used to distinguish one category of information from another. For example, first information may also be referred to as second information, and similarly, second information may also be referred to as first information, without departing from the scope of the present application. The word "if" as used herein may, depending on the context, be interpreted as "upon", "when", or "in response to determining".
FIG. 1 is a flow chart illustrating a method for determining a surgical path according to an exemplary embodiment. As shown in fig. 1, the determining method may include:
in step 101, a three-dimensional partial image and a virtual path within the three-dimensional partial image are acquired.
In this embodiment, the affected part may be scanned by CT or MR and a reconstruction performed from the scan information to obtain the three-dimensional local image. The affected part may include, but is not limited to, a limb, a vertebra, the waist, the chest, or the head. The virtual path may be a linear path, and its length, width, and angle may be adjusted in the three-dimensional local image, specifically according to the specification of the instrument to be implanted, which may include a steel nail and the like.
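For illustration only, a straight virtual path of this kind can be represented by its entry point, target point, and an implant-derived width. The following Python sketch is a hypothetical data structure; all names are assumptions, not part of the application:

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class VirtualPath:
    """A straight virtual path inside the three-dimensional local image.

    Hypothetical illustration: entry and target are voxel coordinates in
    the reconstructed volume; width_mm follows the implant specification
    (e.g. the diameter of a steel nail)."""
    entry: np.ndarray   # (x, y, z) of the planned entry point
    target: np.ndarray  # (x, y, z) of the target point
    width_mm: float     # diameter taken from the implant specification

    @property
    def length(self) -> float:
        return float(np.linalg.norm(self.target - self.entry))

    @property
    def direction(self) -> np.ndarray:
        d = self.target - self.entry
        return d / np.linalg.norm(d)
```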
In step 102, a simulated two-dimensional image projected based on the three-dimensional partial image and a two-dimensional projection image projected based on the affected part are matched.
In this embodiment, the affected part may be photographed by a C-arm machine or other X-ray imaging device to obtain the two-dimensional projection image. The simulated two-dimensional image, and the position information of the virtual path within it, can be obtained from the three-dimensional local image through a DRR (digitally reconstructed radiograph) algorithm. Alternatively, in other embodiments, the simulated two-dimensional image may be obtained with the Siddon algorithm, the Shear-Warp algorithm, or the like, which is not limited in this application. Further, the degree of coincidence between the two-dimensional projection image and the simulated two-dimensional image can be determined, and the two images can be considered matched when the degree of coincidence is not less than a preset threshold value.
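As a rough illustration of the projection step, the sketch below collapses a CT volume into a simulated radiograph with a simplified parallel projection. A real DRR, Siddon, or Shear-Warp implementation would cast perspective rays from the virtual light source, so this is only a minimal stand-in under that assumption:

```python
import numpy as np

def parallel_drr(volume: np.ndarray, axis: int = 0) -> np.ndarray:
    """Crude DRR stand-in: integrate attenuation along one volume axis.

    Assumes a non-empty volume of attenuation values; a full DRR would
    trace perspective rays from the virtual light source through the
    volume instead of summing along a fixed axis."""
    line_integrals = volume.astype(np.float64).sum(axis=axis)
    # Beer-Lambert style mapping of line integrals to image intensities.
    return 1.0 - np.exp(-line_integrals / line_integrals.max())
```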
Further, when the degree of coincidence between the two-dimensional projection image and the simulated two-dimensional image is smaller than the preset threshold value, the difference between the two is considered large. An adjusted simulated two-dimensional image can then be obtained by adjusting the spatial pose of the three-dimensional local image or the projection parameters for the three-dimensional local image, and this adjusted simulated two-dimensional image is matched against the two-dimensional projection image. Of course, in other embodiments, the spatial pose and the projection parameters may be adjusted simultaneously, which is not limited in this application.
Specifically, in an embodiment, the difference between the simulated two-dimensional image and the two-dimensional projection image may be fed back through their degree of coincidence, so that the spatial pose or the projection parameters of the three-dimensional local image are readjusted according to the degree of coincidence. Of course, in other embodiments, the spatial pose or the projection parameters may also be adjusted sequentially at regular intervals, which is not limited in the present application.
Adjusting the spatial pose of the three-dimensional partial image may be adjusting a rotation angle of the three-dimensional partial image with respect to at least one coordinate axis, or may be adjusting a displacement of the three-dimensional partial image with respect to at least one coordinate axis. Of course, in other embodiments, the rotation angle and the displacement of any coordinate axis may be adjusted. Adjusting the projection parameters for the three-dimensional partial image may be adjusting one or more of a position of the virtual light source, a focal length of the virtual light source, and an imaging resolution. Wherein the imaging resolution is used to characterize the relationship between pixels and geometry, which is related to the size of the projected imaging plane of the three-dimensional partial image and the image size of the simulated two-dimensional image.
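To make the pose adjustment concrete, here is a minimal Python sketch, assuming a volume on a regular voxel grid, that rotates the volume about one coordinate axis around its center and translates it along the axes. Note that scipy.ndimage.affine_transform maps output coordinates to input coordinates, hence the inverse matrix; a full version would compose rotations about all three axes:

```python
import numpy as np
from scipy.ndimage import affine_transform

def adjust_pose(volume: np.ndarray, angle_deg: float, shift: np.ndarray) -> np.ndarray:
    """Rotate the volume about the z voxel axis around its center,
    then translate it by `shift` voxels (a sketch of adjusting the
    rotation angle and displacement described in the text)."""
    a = np.deg2rad(angle_deg)
    rot = np.array([[np.cos(a), -np.sin(a), 0.0],
                    [np.sin(a),  np.cos(a), 0.0],
                    [0.0,        0.0,       1.0]])
    center = (np.array(volume.shape) - 1) / 2.0
    # affine_transform applies: input_coord = inv @ output_coord + offset,
    # so pass the inverse rotation and the matching offset.
    inv = rot.T
    offset = center - inv @ (center + shift)
    return affine_transform(volume, inv, offset=offset, order=1)
```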
In the above embodiments, the degree of coincidence between the simulated two-dimensional image and the two-dimensional projection image may be implemented as follows:
in one embodiment, the user person may match the two-dimensional projection image and the simulated two-dimensional image to obtain the degree of coincidence.
In another embodiment, a first projection region in the simulated two-dimensional image and a second projection region in the two-dimensional projection image may be extracted, and the degree of coincidence may be obtained according to the degree of edge contour matching between the first projection region and the second projection region.
In another embodiment, the simulated two-dimensional image and the two-dimensional projection image may be segmented along a preset direction according to a preset proportion, and each segmented region of the simulated two-dimensional image matched with the corresponding segmented region of the two-dimensional projection image, so as to obtain the degree of coincidence.
In step 103, when the simulated two-dimensional image and the two-dimensional projection image are matched, a surgical path corresponding to the virtual path on the two-dimensional projection image is determined according to the position information of the virtual path on the simulated two-dimensional image.
In this embodiment, when the simulated two-dimensional image matches the two-dimensional projection image, the pose information of the three-dimensional local image may be taken as the perspective position information of the affected part in the perspective coordinate system of the photographing device. The position information of the virtual path in the simulated two-dimensional image at this moment can therefore be taken as the position information of the surgical path in the two-dimensional projection image, yielding the surgical path in the two-dimensional projection image and making the needle insertion point and needle insertion angle of the surgical instrument easy to read off.
According to the above embodiments, the simulated two-dimensional image is matched against the two-dimensional projection image, and, once the two match, the path information in the two-dimensional projection image is determined from the projection of the virtual path in the simulated two-dimensional image. Medical workers can thus see the path information intuitively, which assists them in determining the current needle insertion point and needle insertion angle, improves the accuracy of the operation, and reduces the patient's suffering.
With respect to the above technical solutions, the following detailed description will be based on a specific embodiment. As shown in fig. 2, the procedure of the method for determining the surgical path may include the following steps:
in step 201, a three-dimensional partial image is reconstructed from the scan information.
In step 202, a virtual path located within the three-dimensional partial image is acquired.
In this embodiment, three-dimensional scan information can be obtained through CT scanning or MR imaging, from which the three-dimensional local image is reconstructed. Further, a virtual path may be established for the affected part according to its current state as shown by the three-dimensional local image.
In step 203, the spatial pose and projection parameters of the three-dimensional partial image are adjusted.
In the present embodiment, the displayed three-dimensional local image may be rotated or translated to adjust its spatial pose. The spatial pose may include the angles between the three-dimensional local image and each coordinate axis, its position on each coordinate axis, and so on. The projection parameters may include the focal length of the virtual light source, i.e., the distance between the virtual light source and the virtual projection imaging plane, and may further include the position of the virtual light source with respect to each coordinate axis and the imaging resolution. The imaging resolution is related to the size of the projection imaging plane of the three-dimensional local image and the image size of the simulated two-dimensional image, which may be the same as the image size of the two-dimensional projection image. For example, if the projection imaging plane measures 200 mm × 200 mm and the simulated two-dimensional image measures 1000 pixels × 1000 pixels, the imaging resolution is 0.2 mm/pixel. For another example, when the projection imaging plane measures 200 mm × 300 mm and the simulated two-dimensional image measures 1000 pixels × 1000 pixels, the imaging resolution is 0.2 mm/pixel in one direction and 0.3 mm/pixel in the other.
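The imaging-resolution arithmetic above reduces to a one-line helper; this sketch simply reproduces the worked figures from this paragraph:

```python
def imaging_resolution(plane_mm, image_px):
    """Imaging resolution per direction: projection-plane size divided
    by image size, as in the worked example in the text."""
    return tuple(mm / px for mm, px in zip(plane_mm, image_px))

print(imaging_resolution((200, 200), (1000, 1000)))  # (0.2, 0.2) mm/pixel
print(imaging_resolution((200, 300), (1000, 1000)))  # (0.2, 0.3) mm/pixel
```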
In step 204, the three-dimensional partial image is projected to obtain a simulated two-dimensional image.
In this embodiment, the position information of the simulated two-dimensional image and the virtual path in the simulated two-dimensional image may be obtained by a DRR algorithm based on the three-dimensional local image. Alternatively, in other embodiments, the simulated two-dimensional image may be obtained by using a Siddon algorithm, a Shear-Warp algorithm, or the like, which is not limited in this application.
In step 205, a first projection region is extracted based on the simulated two-dimensional image.
In step 206, the affected part is photographed to obtain a two-dimensional projection image.
In step 207, a second projection region is extracted based on the two-dimensional projection image.
In this embodiment, depending on the size of the projection plane and the area covered by the virtual light source, the simulated two-dimensional image may contain regions other than the one corresponding to the three-dimensional local image. Similarly, the two-dimensional projection image may contain regions other than the one corresponding to the affected part. Therefore, the present application extracts the first projection region in the simulated two-dimensional image and the second projection region in the two-dimensional projection image by image processing, which may consist of extracting the affected-part region in each image based on gray values.
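A minimal sketch of such gray-value extraction follows; the fixed threshold and the largest-connected-component heuristic are assumptions for illustration, not prescribed by the application:

```python
import numpy as np
from scipy.ndimage import label

def extract_projection_region(image: np.ndarray, threshold: float) -> np.ndarray:
    """Gray-value extraction of a projection region: threshold the image,
    then keep the largest connected component as the affected-part region."""
    mask = image > threshold
    labels, n = label(mask)
    if n == 0:
        return mask
    sizes = np.bincount(labels.ravel())
    sizes[0] = 0  # ignore the background label
    return labels == sizes.argmax()
```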
In step 208, the edge contours of the first projection region and the second projection region are matched to obtain the degree of coincidence.
In this embodiment, the edge contours of the first projection region and the second projection region may be matched to obtain the degree of coincidence between the two-dimensional projection image and the simulated two-dimensional image. For example, the relative positions of the first and second projection regions may first be adjusted, based on a marker point in the first projection region and its corresponding position in the second projection region, until the two substantially coincide, after which edge-contour matching is performed. The marker point may be a specific area of the affected part, such as a nerve landmark or a bone landmark, which is not limited in this application.
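The sketch below illustrates one possible reading of this step, assuming binary projection-region masks and one marker point per region: shift one region so the marker points coincide, then score the edge contours by their mean separation. The scoring formula is a hypothetical choice; the application does not fix one:

```python
import numpy as np
from scipy.ndimage import binary_erosion, distance_transform_edt

def edge_coincidence(region_a: np.ndarray, region_b: np.ndarray,
                     marker_a: np.ndarray, marker_b: np.ndarray) -> float:
    """Align region A to region B via marker points, then compare edge
    contours; returns a score in (0, 1], higher = better coincidence."""
    shift = np.round(marker_b - marker_a).astype(int)
    a = np.roll(region_a, shift, axis=(0, 1))
    # Edge contour = mask minus its erosion.
    edge_a = a & ~binary_erosion(a)
    edge_b = region_b & ~binary_erosion(region_b)
    # Distance from every pixel to the nearest edge-B pixel.
    dist_to_b = distance_transform_edt(~edge_b)
    mean_dist = dist_to_b[edge_a].mean()
    return 1.0 / (1.0 + mean_dist)
```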
In step 209, it is determined whether the degree of coincidence is greater than a preset threshold.
In the present embodiment, step 210 is performed when the degree of coincidence between the two-dimensional projection image and the simulated two-dimensional image is greater than or equal to the preset threshold, and step 203 is performed when it is less than the preset threshold. The adjustment amount and adjustment direction for the spatial pose and projection parameters of the three-dimensional local image can be determined according to the degree of coincidence, thereby improving the matching efficiency.
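One way to use the degree of coincidence as feedback is sketched below as a hypothetical coordinate-descent loop over the pose and projection parameters; the application does not prescribe a particular search strategy:

```python
def refine_parameters(params, score_fn, step=1.0, max_iter=50):
    """Perturb each pose/projection parameter in both directions, keep any
    change that raises the degree of coincidence (score_fn), and shrink
    the step when no change helps. `params` is a list of floats."""
    best = score_fn(params)
    for _ in range(max_iter):
        improved = False
        for i in range(len(params)):
            for delta in (step, -step):
                trial = list(params)
                trial[i] += delta
                s = score_fn(trial)
                if s > best:
                    params, best, improved = trial, s, True
        if not improved:
            step /= 2.0  # refine the search once no single move helps
    return params, best
```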
In step 210, position information of the virtual path in the simulated two-dimensional image is obtained.
In step 211, the position information of the surgical path in the two-dimensional projection image is determined based on the position information of the virtual path in the simulated two-dimensional image.
In the present embodiment, the degree of coincidence between the simulated two-dimensional image and the two-dimensional projection image is greater than or equal to the preset threshold. Therefore, based on the projection information of the virtual path in the simulated two-dimensional image, the position information of the surgical path on the two-dimensional projection image can be determined, giving the needle insertion point and needle insertion direction. For example, the distance between the needle insertion point on the simulated two-dimensional image and the image edge can be calculated from the imaging resolution of the simulated two-dimensional image, and the position of the needle insertion point on the two-dimensional projection image then obtained from this distance. The position information of the other points on the surgical path can be obtained similarly and is not described in detail here.
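A sketch of this edge-distance mapping, assuming the needle insertion point is given as a pixel offset from the image edge and that per-direction imaging resolutions are known for both images:

```python
import numpy as np

def map_needle_point(point_px, sim_res_mm_per_px, proj_res_mm_per_px):
    """Map a needle insertion point from the simulated two-dimensional
    image to the two-dimensional projection image: convert its pixel
    offset from the edge to millimetres with the simulated image's
    resolution, then back to pixels with the projection image's
    resolution (the two are equal when the image sizes match)."""
    offset_mm = np.asarray(point_px, dtype=float) * np.asarray(sim_res_mm_per_px)
    return offset_mm / np.asarray(proj_res_mm_per_px)

# Example: 0.2 mm/pixel in both images leaves the pixel position unchanged.
print(map_needle_point((350, 412), (0.2, 0.2), (0.2, 0.2)))  # [350. 412.]
```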
Fig. 3 is another embodiment according to the present disclosure. As shown in fig. 3, the surgical path determination method may include the steps of:
in step 501, a three-dimensional partial image is reconstructed from the scan information.
In step 502, a virtual path within the three-dimensional partial image is acquired.
In step 503, the spatial pose and projection parameters of the three-dimensional partial image are adjusted.
In step 504, the three-dimensional partial image is projected to obtain a simulated two-dimensional image.
In the present embodiment, steps 501-504 may refer to steps 201-204 in the embodiment shown in fig. 2 and are not described here again.
In step 505, the simulated two-dimensional image is segmented along a preset direction according to a preset proportion.
In step 506, the affected part is photographed to obtain a two-dimensional projection image.
In step 507, the two-dimensional projection image is divided along a preset direction according to a preset proportion.
In the present embodiment, as shown in fig. 4 and fig. 5, the simulated two-dimensional image of fig. 4 and the two-dimensional projection image of fig. 5 may each be divided, along the direction indicated by arrow A and the direction indicated by arrow B, into N segmented regions of size D × D.
In step 508, each segmented image of the simulated two-dimensional image is matched with a segmented image of a corresponding region on the two-dimensional projection image.
In step 509, a degree of coincidence between the simulated two-dimensional image and the two-dimensional projection image is calculated based on the matching result.
In the present embodiment, as shown in fig. 4 and fig. 5, each segmented region in fig. 4 has a corresponding region in fig. 5; for example, the segmented region D1 in the third column and second row of fig. 4 corresponds to the segmented region D1 in the third column and second row of fig. 5, and these two regions may be matched to obtain a matching value. Similarly, the other segmented regions in fig. 4 can be matched against the corresponding regions in fig. 5, finally yielding the degree of coincidence. For example, each segmented region may be assigned a weight coefficient, and the degree of coincidence is then the sum, over all segmented regions, of the product of each region's matching degree and its weight coefficient.
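As a sketch of this segmented, weighted matching — with normalized cross-correlation as an assumed per-region similarity measure, since the application leaves the measure open:

```python
import numpy as np

def tiled_coincidence(sim: np.ndarray, proj: np.ndarray, d: int,
                      weights=None) -> float:
    """Split both images into d x d tiles, match each tile of the
    simulated image against the corresponding tile of the projection
    image, and combine the per-tile matching values with weights."""
    rows, cols = sim.shape[0] // d, sim.shape[1] // d
    scores = np.zeros((rows, cols))
    for r in range(rows):
        for c in range(cols):
            ta = sim[r*d:(r+1)*d, c*d:(c+1)*d].ravel().astype(float)
            tb = proj[r*d:(r+1)*d, c*d:(c+1)*d].ravel().astype(float)
            ta -= ta.mean()
            tb -= tb.mean()
            denom = np.linalg.norm(ta) * np.linalg.norm(tb)
            # Normalized cross-correlation; flat tiles count as matching.
            scores[r, c] = (ta @ tb) / denom if denom > 0 else 1.0
    if weights is None:
        weights = np.full((rows, cols), 1.0 / (rows * cols))
    return float((scores * weights).sum())
```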
In step 510, it is determined whether the degree of coincidence is greater than a preset threshold.
In the present embodiment, step 511 is performed when the degree of coincidence between the two-dimensional projection image and the simulated two-dimensional image is greater than or equal to the preset threshold, and step 503 is performed when it is less than the preset threshold.
In step 511, position information of the virtual path in the simulated two-dimensional image is acquired.
In step 512, a surgical path pointing to the target point in the two-dimensional projection image is determined according to the position information of the virtual path in the simulated two-dimensional image.
In this embodiment, steps 511 and 512 may refer to steps 210 and 211 in the embodiment shown in fig. 2 and are not described here again.
Based on the technical solution of the present application, a surgical navigation system is further provided. As shown in fig. 6, the navigation system may include a photographing device 701 and a computer 702. The photographing device 701 may be used to photograph the affected part to obtain the two-dimensional projection image; for example, it may be a C-arm machine or other X-ray imaging equipment. The computer 702 and the photographing device 701 may be connected by wired or wireless communication. The computer 702 may further include a display 7021, which may be used to show related images such as the three-dimensional local image, the two-dimensional projection image, and the simulated two-dimensional image. The computer may further comprise a processor (not shown in the figures) and a memory for storing processor-executable instructions, the processor being configured to implement the steps of the method according to any of the embodiments shown in figs. 1-3.
Corresponding to the foregoing embodiments of the surgical path determining method, the present application also provides embodiments of a surgical path determining apparatus.
Fig. 7 is a schematic block diagram of an apparatus provided in an exemplary embodiment. Referring to fig. 7, at the hardware level, the apparatus includes a processor 802, an internal bus 804, a network interface 806, a memory 808, and a non-volatile memory 810, but may also include hardware required for other services. The processor 802 reads the corresponding computer program from the non-volatile memory 810 into the memory 808 and runs it to form the determination apparatus 800 of the surgical path on a logical level. Of course, besides software implementation, the one or more embodiments in this specification do not exclude other implementations, such as logic devices or combinations of software and hardware, and so on, that is, the execution subject of the following processing flow is not limited to each logic unit, and may also be hardware or logic devices.
Referring to fig. 8, in a software implementation, the device 800 for determining a surgical path may include an obtaining module 901, a matching module 902, and a determining module 903; wherein:
an obtaining module 901, configured to obtain a three-dimensional local image and a virtual path located in the three-dimensional local image;
a matching module 902, configured to match a simulated two-dimensional image obtained by projecting the three-dimensional local image with a two-dimensional projection image obtained based on the affected part;
a determining module 903, configured to determine, when the simulated two-dimensional image matches the two-dimensional projection image, a surgical path corresponding to the virtual path on the two-dimensional projection image according to the position information of the virtual path on the simulated two-dimensional image.
The matching module 902 includes a first projection unit, a second projection unit, and an acquisition unit, wherein:
and the first projection unit is used for carrying out perspective projection on the affected part to obtain the two-dimensional projection image.
And the second projection unit is used for projecting the three-dimensional local image to obtain the simulated two-dimensional image.
And the acquisition unit is used for acquiring the coincidence degree of the simulated two-dimensional image and the two-dimensional projection image, and determining that the simulated two-dimensional image is matched with the two-dimensional projection image when the coincidence degree is not less than a preset threshold value.
The acquisition unit may comprise an extraction subunit and a first calculating subunit, wherein:
the extraction subunit extracts a first projection region of the simulated two-dimensional image and a second projection region in the two-dimensional projection image;
and the first calculating subunit calculates the degree of coincidence according to the edge-contour matching degree between the first projection region and the second projection region.
Alternatively, the acquisition unit may comprise a segmentation subunit and a second calculating subunit, wherein:
the segmentation subunit segments the simulated two-dimensional image and the two-dimensional projection image along a preset direction according to a preset proportion;
and the second calculating subunit matches each segmented region of the simulated two-dimensional image with the corresponding segmented region of the two-dimensional projection image to obtain the degree of coincidence.
The device 800 for determining a surgical path further includes:
the first adjusting module is used for adjusting the spatial posture of the three-dimensional local image; and/or
And the second adjusting module is used for adjusting the projection parameters aiming at the three-dimensional local image.
The first adjusting module is used for:
adjusting the rotation angle of the three-dimensional local image aiming at least one coordinate axis;
adjusting the displacement of the three-dimensional local image for at least one coordinate axis;
the second adjusting module is used for:
adjusting the focal length;
adjusting the position of the virtual light source;
adjusting an imaging resolution, the imaging resolution being related to a size of a projected imaging plane of the three-dimensional local image and the image size of the simulated two-dimensional image.
With regard to the apparatus in the above-described embodiment, the specific manner in which each module performs the operation has been described in detail in the embodiment related to the method, and will not be elaborated here.
For the device embodiments, since they substantially correspond to the method embodiments, reference may be made to the partial description of the method embodiments for relevant points. The above-described embodiments of the apparatus are merely illustrative, and the units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the modules can be selected according to actual needs to achieve the purpose of the scheme of the application. One of ordinary skill in the art can understand and implement it without inventive effort.
In an exemplary embodiment, a non-transitory computer-readable storage medium comprising instructions, such as the memory 810 comprising instructions, executable by the processor 802 of the electronic device to perform the above-described method is also provided. For example, the non-transitory computer readable storage medium may be a ROM, a Random Access Memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, and the like.
Other embodiments of the present application will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This application is intended to cover any variations, uses, or adaptations of the invention following, in general, the principles of the application and including such departures from the present disclosure as come within known or customary practice within the art to which the invention pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the application being indicated by the following claims.
It will be understood that the present application is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the application is limited only by the appended claims.

Claims (8)

1. A computer-readable storage medium having stored thereon computer instructions which, when executed by a processor, implement the steps of the method of:
acquiring a three-dimensional local image and a virtual path positioned in the three-dimensional local image;
adjusting the projection parameters and the spatial pose for the three-dimensional local image;
matching a simulated two-dimensional image obtained by projecting the three-dimensional local image with a two-dimensional projection image obtained based on an affected part;
when the simulated two-dimensional image matches the two-dimensional projection image, the pose information of the three-dimensional local image is taken as the perspective position information of the affected part in the perspective coordinate system of a photographing device, and the surgical path corresponding to the virtual path on the two-dimensional projection image is determined according to the position information of the virtual path on the simulated two-dimensional image.
2. The computer-readable storage medium of claim 1, wherein matching the simulated two-dimensional image obtained by projecting the three-dimensional local image with the two-dimensional projection image obtained based on the affected part comprises:
carrying out perspective projection on the affected part to obtain a two-dimensional projection image;
and acquiring the degree of coincidence between the simulated two-dimensional image and the two-dimensional projection image, and determining that the simulated two-dimensional image matches the two-dimensional projection image when the degree of coincidence is not less than a preset threshold value.
3. The computer-readable storage medium of claim 2, wherein said obtaining a degree of coincidence of the simulated two-dimensional image and the two-dimensional projection image comprises:
extracting a first projection area in the simulated two-dimensional image and a second projection area in the two-dimensional projection image;
and calculating the degree of coincidence according to the edge-contour matching degree between the first projection area and the second projection area.
4. The computer-readable storage medium of claim 2, wherein said obtaining a degree of coincidence of the simulated two-dimensional image and the two-dimensional projection image comprises:
dividing the simulated two-dimensional image and the two-dimensional projection image along a preset direction according to a preset proportion;
and matching each segmented region of the simulated two-dimensional image with the corresponding segmented region of the two-dimensional projection image to obtain the degree of coincidence.
5. The computer-readable storage medium of claim 1, wherein the adjusting the spatial pose of the three-dimensional partial image comprises at least one of:
adjusting the rotation angle of the three-dimensional local image aiming at least one coordinate axis;
and adjusting the displacement of the three-dimensional local image relative to at least one coordinate axis.
6. The computer-readable storage medium of claim 1, wherein the adjusting the projection parameters for the three-dimensional partial image comprises at least one of:
adjusting the focal length;
adjusting the position of the virtual light source;
adjusting an imaging resolution, the imaging resolution being related to a size of a projected imaging plane of the three-dimensional partial image and an image size of the simulated two-dimensional image.
7. An apparatus for determining a surgical path, comprising:
an acquisition module, configured to acquire a three-dimensional local image and a virtual path located in the three-dimensional local image;
a matching module, configured to adjust the projection parameters and the spatial pose of the three-dimensional local image, and to match a simulated two-dimensional image obtained by projecting the three-dimensional local image with a two-dimensional projection image obtained based on the affected part;
and a determining module, configured to determine the surgical path corresponding to the virtual path on the two-dimensional projection image according to the position information of the virtual path on the simulated two-dimensional image, wherein, when the simulated two-dimensional image matches the two-dimensional projection image, the pose information of the three-dimensional local image is taken as the perspective position information of the affected part in the perspective coordinate system of a photographing device.
8. A surgical navigation system, characterized by comprising a photographing device and a computer, wherein the photographing device is used for photographing the affected part to obtain a two-dimensional projection image, and the computer is communicatively connected to the photographing device;
wherein the computer comprises:
a display for showing the three-dimensional partial image, the two-dimensional projection image and the simulated two-dimensional image;
a processor;
a memory for storing processor-executable instructions;
wherein the processor is configured for implementing the steps of the method according to any one of claims 1-6.
CN201910161266.7A 2019-03-04 2019-03-04 Method, device and system for determining surgical path and readable storage medium Active CN109925053B (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
CN201910161266.7A CN109925053B (en) 2019-03-04 2019-03-04 Method, device and system for determining surgical path and readable storage medium
US17/431,683 US20220133409A1 (en) 2019-03-04 2020-03-04 Method for Determining Target Spot Path
PCT/CN2020/077846 WO2020177725A1 (en) 2019-03-04 2020-03-04 Target path determining method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910161266.7A CN109925053B (en) 2019-03-04 2019-03-04 Method, device and system for determining surgical path and readable storage medium

Publications (2)

Publication Number Publication Date
CN109925053A CN109925053A (en) 2019-06-25
CN109925053B (en) 2021-06-22

Family

ID=66986416

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910161266.7A Active CN109925053B (en) 2019-03-04 2019-03-04 Method, device and system for determining surgical path and readable storage medium

Country Status (1)

Country Link
CN (1) CN109925053B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20220133409A1 (en) * 2019-03-04 2022-05-05 Hangzhou Santan Medical Technology Co., Ltd Method for Determining Target Spot Path
CN113952030B (en) * 2021-10-28 2023-12-15 北京深睿博联科技有限责任公司 Planning method and device for needle insertion path and ablation position of radio frequency electrode

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1608386A (en) * 2001-10-24 2005-04-20 纽鲁克公司 Projection of three-dimensional images
CN106794044A (en) * 2015-06-05 2017-05-31 陈阶晓 Method for tracing in art

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102005028746B4 (en) * 2005-06-21 2018-02-22 Siemens Healthcare Gmbh Method for determining the position and orientation of an object, in particular a catheter, from two-dimensional x-ray images
FR2897182A1 (en) * 2006-02-09 2007-08-10 Gen Electric METHOD FOR PROCESSING TOMOSYNTHESIS PROJECTION IMAGES FOR DETECTION OF RADIOLOGICAL SIGNS
WO2008065581A2 (en) * 2006-11-28 2008-06-05 Koninklijke Philips Electronics N.V. Apparatus for determining a position of a first object within a second object
RU2568635C2 (en) * 2007-12-18 2015-11-20 Конинклейке Филипс Электроникс, Н.В. Feature-based recording of two-/three-dimensional images
US10610172B2 (en) * 2012-07-17 2020-04-07 Koninklijke Philips N.V. Imaging system and method for enabling instrument guidance
CN103892861B (en) * 2012-12-28 2016-05-11 北京思创贯宇科技开发有限公司 A kind of analogue navigation system and method merging based on CT-XA image multi-dimensional
US9861295B2 (en) * 2013-05-21 2018-01-09 Autonomic Technologies, Inc. System and method for surgical planning and navigation to facilitate placement of a medical device within a target region of a patient
EP3280344A2 (en) * 2015-04-07 2018-02-14 King Abdullah University Of Science And Technology Method, apparatus, and system for utilizing augmented reality to improve surgery
US10716631B2 (en) * 2016-03-13 2020-07-21 Vuze Medical Ltd. Apparatus and methods for use with skeletal procedures
CN107679574A (en) * 2017-09-29 2018-02-09 深圳开立生物医疗科技股份有限公司 Ultrasonoscopy processing method and system


Also Published As

Publication number Publication date
CN109925053A (en) 2019-06-25

Similar Documents

Publication Publication Date Title
US8597211B2 (en) Determination of indicator body parts and pre-indicator trajectories
US9240046B2 (en) Method and system to assist 2D-3D image registration
CN109925054B (en) Auxiliary method, device and system for determining target point path and readable storage medium
CN109993792B (en) Projection method, device and system and readable storage medium
US20220058797A1 (en) Artificial-intelligence-based determination of relative positions of objects in medical images
CN109925052B (en) Target point path determination method, device and system and readable storage medium
WO2013171441A2 (en) Virtual fiducial markers
US10078906B2 (en) Device and method for image registration, and non-transitory recording medium
CN109925053B (en) Method, device and system for determining surgical path and readable storage medium
CN108430376B (en) Providing a projection data set
KR102204309B1 (en) X-ray Image Display Method Based On Augmented Reality
US11478207B2 (en) Method for visualizing a bone
US9420984B2 (en) Method and device for assisting in the treatment of bone fractures
US11954887B2 (en) Artificial-intelligence based reduction support
CN109155068B (en) Motion compensation in combined X-ray/camera interventions
Zhang et al. An automatic ICP-based 2D-3D registration method for a high-speed biplanar videoradiography imaging system
KR102469141B1 (en) Medical image processing apparatus, medical image processing method, and program
US20220044440A1 (en) Artificial-intelligence-assisted surgery
CN114340727A (en) Medical image processing device, medical image processing program, medical device, and treatment system
TWI836493B (en) Method and navigation system for registering two-dimensional image data set with three-dimensional image data set of body of interest
EP3931799B1 (en) Interventional device tracking
EP3968215A1 (en) Determining target object type and position
TW202333631A (en) Method and navigation system for registering two-dimensional image data set with three-dimensional image data set of body of interest

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant