CN106974730A - Surgical simulation method, device and equipment based on virtual reality and medical image - Google Patents
Surgical simulation method, device and equipment based on virtual reality and medical image
- Publication number
- CN106974730A (application CN201710214282.9A)
- Authority
- CN
- China
- Prior art keywords
- file
- modeling
- virtual environment
- interactive
- simulation
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Landscapes
- Processing Or Creating Images (AREA)
Abstract
The invention provides a surgical simulation method, device and equipment based on virtual reality and medical images. The surgical simulation method includes: building a virtual environment for simulating a surgical procedure; generating, based on medical images, a modeling file that can be displayed and interacted with in the virtual environment; establishing a simulation modeling interaction file according to clinical demand; importing the modeling file and the simulation modeling interaction file into the virtual environment; and, after the modeling file and the simulation modeling interaction file have been imported into the virtual environment, simulating the surgical procedure based on the virtual environment. The technical scheme of the invention can provide clinicians with accurate and intuitive pathological-anatomy information and visual feedback, thereby helping physicians achieve fast and accurate diagnosis and treatment.
Description
Technical field
The present invention relates to the technical field of medical information, and in particular to a surgical simulation method, device and electronic equipment based on virtual reality and medical images.
Background technology
At present, virtual reality (VR) technology is gradually penetrating into various scientific fields, and especially into the medical field, mainly because virtual reality makes it possible to provide personalized diagnosis and treatment according to the characteristics of the patient and clinical needs; virtual reality has therefore become a potentially powerful means of assisting precision medicine.
However, how to provide clinicians with accurate and intuitive pathological-anatomy information and visual feedback, and thereby help them achieve fast and accurate diagnosis and treatment, has remained an urgent technical problem in the industry.
It should be noted that the information disclosed in the background section above is only intended to enhance understanding of the background of the present invention, and may therefore include information that does not constitute prior art known to a person of ordinary skill in the art.
The content of the invention
An object of the present invention is to provide a surgical simulation method, device and electronic equipment based on virtual reality and medical images, so as to overcome, at least to some extent, one or more problems caused by the limitations and defects of the related art.
Other features and advantages of the present invention will become apparent from the following detailed description, or may be learned in part through practice of the invention.
According to a first aspect of the present invention, there is provided a surgical simulation method based on virtual reality and medical images, including: building a virtual environment for simulating a surgical procedure; generating, based on medical images, a modeling file that can be displayed and interacted with in the virtual environment; establishing a simulation modeling interaction file according to clinical demand; importing the modeling file and the simulation modeling interaction file into the virtual environment; and, after the modeling file and the simulation modeling interaction file have been imported into the virtual environment, simulating the surgical procedure based on the virtual environment.
In some embodiments of the invention, based on the foregoing scheme, the surgical simulation method further includes: configuring scene parameters of the virtual environment, and an input manager and/or output manager corresponding to the virtual environment.
In some embodiments of the invention, based on the foregoing scheme, the scene parameters include one or a combination of the following: observation field, field-of-view angle, lens direction, illumination direction, coordinate system.
In some embodiments of the invention, based on the foregoing scheme, the surgical simulation method further includes: configuring a deformation model describing the deformation produced on a tissue or organ in the virtual environment by an interactive operation of a simulated intervention point.
In some embodiments of the invention, based on the foregoing scheme, the surgical simulation method further includes: obtaining stress deformation data and/or substantive-state deformation data produced on a real tissue or organ when a real intervention object intervenes in the real tissue or organ; and configuring the deformation model according to the stress deformation data and/or the substantive-state deformation data.
In some embodiments of the invention, based on the foregoing scheme, the surgical simulation method further includes: configuring a safe distance between a simulated intervention point and a tissue or organ in the virtual environment.
In some embodiments of the invention, based on the foregoing scheme, the step of generating, based on medical images, a modeling file that can be displayed and interacted with in the virtual environment includes: acquiring, with a first imaging device, a first image file containing the anatomical and functional information of a lesion and/or organ, according to clinical demand and the information on the lesion and/or organ to be presented in the virtual environment; performing three-dimensional reconstruction based on the first image file to generate a reconstruction file; and converting the format of the reconstruction file into a file format supported by the virtual environment, so as to obtain the modeling file.
In some embodiments of the invention, based on the foregoing scheme, the step of performing three-dimensional reconstruction based on the first image file to generate a reconstruction file includes: dividing the first image file into regions of interest according to the image characteristics of the first image file; and performing image segmentation and three-dimensional reconstruction on the first image file based on the divided regions of interest, so as to obtain a reconstruction file for each region of interest.
In some embodiments of the invention, based on the foregoing scheme, the surgical simulation method further includes: acquiring, with the first imaging device, multiple first image files of the lesion and/or organ in different phases, and/or acquiring, with a second imaging device, a second image file containing the anatomical and functional information of the lesion and/or organ; and performing image registration based on anatomical structure and phase on the reconstruction files generated from the multiple first image files, and/or performing image registration based on anatomical feature points on the reconstruction file of the second image file and the reconstruction file of the first image file.
In some embodiments of the invention, based on the foregoing scheme, the step of establishing the simulation modeling interaction file includes: determining the instrument to be used according to clinical demand; and generating the simulation modeling interaction file based on the surface, physical function and intervention angle of the instrument.
In some embodiments of the invention, based on the foregoing scheme, the step of simulating the surgical procedure based on the virtual environment includes: determining a tracking point or observation point in the virtual environment; and calculating and displaying the relative coordinate position and distance between the tracking point or observation point and the lesion in the modeling file, and/or the relative coordinate position and distance between the tracking point or observation point and the surrounding tissue.
In some embodiments of the invention, based on the foregoing scheme, the tracking point or observation point includes: the first contact point between the simulation modeling interaction file and the modeling file.
In some embodiments of the invention, based on the foregoing scheme, the step of simulating the surgical procedure based on the virtual environment includes: calculating and displaying, during the intervention of the simulation modeling interaction file into the modeling file, the interaction relationship and related parameters between the simulation modeling interaction file and the modeling file.
In some embodiments of the invention, based on the foregoing scheme, the interaction relationship and related parameters include: the relative coordinate position and distance between each boundary of the simulation modeling interaction file and the surrounding tissue, and/or the relative coordinate position and distance between the simulation modeling interaction file and the lesion in the modeling file.
In some embodiments of the invention, based on the foregoing scheme, the step of simulating the surgical procedure based on the virtual environment includes: calculating and displaying, during the intervention of the simulation modeling interaction file into the modeling file, the safe range between the simulation modeling interaction file and the sensitive tissues in the modeling file, and/or the deformation data caused to the tissues in the modeling file by the intervention of the simulation modeling interaction file.
In some embodiments of the invention, based on the foregoing scheme, the step of simulating the surgical procedure based on the virtual environment includes: calculating possible paths and/or an optimal path for the simulation modeling interaction file to intervene in the modeling file; and displaying the possible paths and/or the optimal path.
In some embodiments of the invention, based on the foregoing scheme, the parameters on which the possible paths and/or the optimal path are calculated include: the intervention point and intervention angle at which the simulation modeling interaction file intervenes in the modeling file, the structure of the simulation modeling interaction file, the minimum injury of the simulation modeling interaction file to the sensitive tissues in the modeling file, and the shortest distance for the simulation modeling interaction file to reach the lesion in the modeling file.
In some embodiments of the invention, based on the foregoing scheme, the step of simulating the surgical procedure based on the virtual environment includes: issuing an early warning when the interaction file comes into contact with sensitive tissues in the image modeling file.
In some embodiments of the invention, based on the foregoing scheme, the surgical simulation method further includes: determining, according to the complexity of the modeling file, the virtual environment and the simulation calculation, the type of device on which the virtual environment is to run; and generating, based on the device type, an application program that runs the virtual environment.
According to a second aspect of the present invention, there is provided a surgical simulation device based on virtual reality and medical images, including: a construction unit, which builds a virtual environment for simulating a surgical procedure; a generation unit, which generates, based on medical images, a modeling file that can be displayed and interacted with in the virtual environment; an establishing unit, which establishes a simulation modeling interaction file according to clinical demand; an import unit, which imports the modeling file and the simulation modeling interaction file into the virtual environment; and a processing unit, which simulates the surgical procedure based on the virtual environment after the import unit has imported the modeling file and the simulation modeling interaction file into the virtual environment.
According to a third aspect of the present invention, there is provided an electronic device, including a processor and a memory, the memory storing executable instructions, and the processor being configured to call the executable instructions stored in the memory to perform the method described in the first aspect above.
In the technical schemes provided by some embodiments of the present invention, the surgical procedure is simulated by combining medical images with virtual reality technology, and processes such as ROI (Region of Interest) division, image segmentation, modeling-file generation and registration optimization are applied to the medical images, so that standardized and accurate anatomical models containing pathological information can be established, and the modeling files of the anatomical structures can be displayed one by one on the virtual reality platform. Accurate and intuitive pathological-anatomy information and visual feedback can thus be provided to clinicians, helping them achieve fast and accurate diagnosis and treatment. The technical scheme of the present invention is therefore of important practical significance and social benefit for promoting precision medicine and realizing personalized treatment of patients.
It should be understood that the above general description and the following detailed description are only exemplary and explanatory, and do not limit the present invention.
Brief description of the drawings
The accompanying drawings herein are incorporated into and constitute a part of this specification; they show embodiments consistent with the present invention and, together with the specification, serve to explain the principles of the invention. Obviously, the drawings in the following description are only some embodiments of the present invention; for a person of ordinary skill in the art, other drawings can be obtained from these drawings without creative effort. In the drawings:
Fig. 1 schematically shows a flow chart of a surgical simulation method based on virtual reality and medical images according to a first embodiment of the present invention;
Fig. 2 schematically shows a flow chart of generating a modeling file based on medical images according to an embodiment of the present invention;
Fig. 3 schematically shows a flow chart of a surgical simulation method based on virtual reality and medical images according to a second embodiment of the present invention;
Fig. 4 schematically shows a flow chart of three-dimensional modeling based on medical images according to a first embodiment of the present invention;
Fig. 5 schematically shows a flow chart of three-dimensional modeling based on medical images according to a second embodiment of the present invention;
Fig. 6 schematically shows a flow chart of building a VR simulation environment according to an embodiment of the present invention;
Fig. 7 schematically shows an interface for setting up the virtual environment and its basic scene parameters according to an embodiment of the present invention;
Fig. 8 schematically shows a tracking point/observation point inside a tissue or organ according to an embodiment of the present invention;
Fig. 9 schematically shows the virtual interactive environment running as a mobile phone APP according to an embodiment of the present invention;
Fig. 10 schematically shows the virtual interactive environment displayed through virtual glasses, a helmet or the like according to an embodiment of the present invention;
Fig. 11 schematically shows a block diagram of a surgical simulation device based on virtual reality and medical images according to an embodiment of the present invention.
Embodiment
Example embodiments will now be described more fully with reference to the accompanying drawings. However, the example embodiments can be implemented in a variety of forms and should not be construed as being limited to the examples set forth herein; rather, these embodiments are provided so that the present invention will be more thorough and complete, and so that the concepts of the example embodiments will be fully conveyed to those skilled in the art.
In addition, the described features, structures or characteristics may be combined in any suitable manner in one or more embodiments. Many specific details are provided in the following description in order to give a full understanding of the embodiments of the invention. However, those skilled in the art will appreciate that the technical scheme of the invention may be practiced without one or more of the specific details, or that other methods, components, devices, steps and the like may be adopted. In other cases, well-known methods, devices, implementations or operations are not shown or described in detail so as not to obscure aspects of the present invention.
The block diagrams shown in the drawings are merely functional entities and do not necessarily correspond to physically separate entities. That is, these functional entities may be implemented in software, or in one or more hardware modules or integrated circuits, or in different networks and/or processor devices and/or microcontroller devices.
The flow charts shown in the drawings are merely illustrative; they do not necessarily include all contents and operations/steps, nor must they be performed in the order described. For example, some operations/steps may be decomposed, and some operations/steps may be merged or partially merged, so the actual execution order may change according to the actual situation.
Fig. 1 schematically shows a flow chart of a surgical simulation method based on virtual reality and medical images according to the first embodiment of the present invention.
Referring to Fig. 1, the surgical simulation method based on virtual reality and medical images according to the first embodiment of the present invention comprises the following steps:
Step S10: build a virtual environment for simulating a surgical procedure.
In embodiments of the present invention, a virtual environment is an interactive simulated world generated by a computer, which acts on the user through vision, hearing, touch and the like so as to produce an immersive sensation. When building the virtual environment for simulating a surgical procedure, a virtual engine may be used; for example, the virtual environment may be built based on the Unity engine.
After the virtual environment has been built, it is also necessary to configure the scene parameters of the virtual environment, and the input manager and/or output manager corresponding to the virtual environment.
In embodiments of the present invention, the scene parameters of the virtual environment include but are not limited to: observation field, field-of-view angle, lens direction, illumination direction, coordinate system. Configuring the input manager mainly means setting the instructions and interaction information fed into the virtual environment; configuring the output manager mainly means setting the instructions and interaction information output by the virtual environment.
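Purely as an illustrative sketch, and not as part of the original disclosure, the following Unity C# script shows one way such basic scene parameters could be configured; the component name, object references and numeric values are assumptions made only for this example.

```csharp
using UnityEngine;

// Minimal sketch: configuring basic scene parameters of the virtual environment.
// All object names and values are illustrative assumptions.
public class SceneSetup : MonoBehaviour
{
    public Camera viewCamera;     // observation field / lens
    public Light surgicalLight;   // illumination

    void Start()
    {
        viewCamera.fieldOfView = 60f;                                // field-of-view angle
        viewCamera.transform.position = new Vector3(0f, 1.6f, -0.5f);
        viewCamera.transform.LookAt(Vector3.zero);                   // lens direction toward the scene origin

        surgicalLight.type = LightType.Directional;
        surgicalLight.transform.rotation = Quaternion.Euler(50f, -30f, 0f);  // illumination direction

        // The Unity world origin serves as the common coordinate system into which
        // the modeling files and interaction files are later imported.
    }
}
```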
In embodiments of the present invention, the tissues and organs of a normal human body deform correspondingly when an external object enters them. In order to simulate this deformation in the virtual environment, it is necessary, after the virtual environment has been built, to configure a deformation model describing the deformation produced on a tissue or organ in the virtual environment by an interactive operation of a simulated intervention point.
Specifically, the stress deformation data and/or substantive-state deformation data produced on a real tissue or organ when a real intervention object intervenes in it may first be obtained; the deformation model is then configured according to the stress deformation data and/or the substantive-state deformation data.
It should be noted that the stress deformation data mainly refer to the mechanical deformation produced on a real tissue or organ, when a real intervention object intervenes in it, by the structure, shape, size and so on of the real intervention object; the substantive-state deformation data mainly refer to the change of the substantive state of the real tissue or organ produced by the physical function of the real intervention object (for example, burning).
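As a rough, hedged sketch only (the patent does not specify the mathematical form of the deformation model), the measured deformation data could for instance be turned into a simple falloff model that displaces organ-mesh vertices near the simulated intervention point; the class, parameters and linear falloff below are assumptions for illustration.

```csharp
using UnityEngine;

// Illustrative deformation model (an assumption, not the patent's model): vertices of
// the organ mesh near the simulated intervention point are displaced along the push
// direction, with magnitude and radius calibrated from the measured deformation data.
public class SimpleDeformationModel : MonoBehaviour
{
    public float influenceRadius = 0.03f;    // metres; calibrated from real measurements
    public float maxDisplacement = 0.005f;   // metres; from the stress deformation data

    public void Deform(Mesh organMesh, Vector3 interventionPoint, Vector3 pushDirection)
    {
        Vector3[] vertices = organMesh.vertices;
        for (int i = 0; i < vertices.Length; i++)
        {
            Vector3 worldVertex = transform.TransformPoint(vertices[i]);
            float d = Vector3.Distance(worldVertex, interventionPoint);
            if (d < influenceRadius)
            {
                float weight = 1f - d / influenceRadius;   // linear falloff: closer vertices deform more
                Vector3 offset = pushDirection.normalized * maxDisplacement * weight;
                vertices[i] += transform.InverseTransformVector(offset);
            }
        }
        organMesh.vertices = vertices;
        organMesh.RecalculateNormals();
    }
}
```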
In addition, in some embodiments of the invention, after the virtual environment has been built, a safe distance between the simulated intervention point and a tissue or organ in the virtual environment can also be configured. When the surgical procedure is simulated in the virtual environment, a safe-range warning can then be issued according to the configured safe distance.
Step S12: generate, based on medical images, a modeling file that can be displayed and interacted with in the virtual environment.
According to an exemplary embodiment of the present invention, as shown in Fig. 2, step S12 specifically includes:
Step S121: acquire, with a first imaging device, a first image file containing the anatomical and functional information of the lesion and/or organ, according to clinical demand and the information on the lesion and/or organ that needs to be presented in the virtual environment.
In an embodiment of the present invention, the first imaging device can be a CT (Computed Tomography) device.
Step S122: perform three-dimensional reconstruction based on the first image file to generate a reconstruction file.
According to an exemplary embodiment of the present invention, step S122 specifically includes: dividing the first image file into regions of interest according to the image characteristics of the first image file; and performing image segmentation and three-dimensional reconstruction on the first image file based on the divided regions of interest, so as to obtain a reconstruction file for each region of interest.
It should be noted that, because the first image file can be divided into regions of interest, each organ, functional block, lesion and the like can be assigned its own region of interest, so that each organ, functional block, lesion and the like forms an individual reconstruction file.
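Purely to illustrate the idea of dividing a CT volume into regions of interest according to image characteristics such as the HU value, a deliberately simplified, hedged threshold-labeling sketch is given below; the thresholds and labels are assumptions, and real segmentation pipelines are considerably more sophisticated.

```csharp
// Simplified sketch: label each voxel of a CT volume by a Hounsfield-unit range so
// that, e.g., bone-like and soft-tissue voxels end up in separate regions of interest.
// Thresholds and label codes are illustrative assumptions only.
public static class RoiLabeling
{
    public static int[] LabelByHu(short[] huVolume)
    {
        var labels = new int[huVolume.Length];
        for (int i = 0; i < huVolume.Length; i++)
        {
            short hu = huVolume[i];
            if (hu > 300) labels[i] = 1;        // bone-like region of interest
            else if (hu > -100) labels[i] = 2;  // soft-tissue region of interest
            else labels[i] = 0;                 // background / air
        }
        return labels;
    }
}
```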
Step S123: convert the format of the reconstruction file into a file format supported by the virtual environment, so as to obtain the modeling file.
In an embodiment of the present invention, the file format supported by the virtual environment can be, for example, the ".OBJ" format. During the format conversion it must be ensured that the anatomical information is neither distorted nor lost, and the material, color, refraction, transparency and gloss of the modeling file can be defined and set in the virtual environment.
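Since ".OBJ" is a simple text format, the result of such a conversion can be illustrated with a short export sketch; this is a hedged example rather than the converter actually used by the invention, and it only writes geometry, with material, color and transparency defined separately in the virtual environment as described above.

```csharp
using System.IO;
using System.Text;
using UnityEngine;

// Minimal sketch of exporting a reconstructed surface mesh as Wavefront .OBJ text.
public static class ObjExporter
{
    public static void Write(Mesh mesh, string path)
    {
        var sb = new StringBuilder();
        foreach (Vector3 v in mesh.vertices)
            sb.AppendLine($"v {v.x} {v.y} {v.z}");   // one vertex per line

        int[] tris = mesh.triangles;
        for (int i = 0; i < tris.Length; i += 3)
            // OBJ face indices are 1-based.
            sb.AppendLine($"f {tris[i] + 1} {tris[i + 1] + 1} {tris[i + 2] + 1}");

        File.WriteAllText(path, sb.ToString());
    }
}
```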
In an embodiment of the present invention, in order to provide clinicians with clearer anatomical and functional information, the above-mentioned reconstruction files for each organ, functional block, lesion and the like can be generated, through a series of data- and image-processing algorithms, either from the raw data or DICOM image files acquired by different imaging devices, or from the raw data or DICOM image files acquired by the same imaging device in different phases. Image registration can also be applied to the multiple generated reconstruction files, so that the anatomical structures and organ function information are reproduced clearly and accurately. Specifically, the embodiments of the invention provide the following three image registration schemes:
Image registration scheme one:
Multiple first image files of the lesion and/or organ in different phases are acquired with the first imaging device; image registration based on anatomical structure and phase is then performed on the reconstruction files generated from the multiple first image files.
Image registration scheme two:
A second image file containing the anatomical and functional information of the lesion and/or organ is acquired with a second imaging device; image registration based on anatomical feature points is then performed on the reconstruction file of the second image file and the reconstruction file of the first image file acquired by the first imaging device.
Image registration scheme three:
This scheme combines image registration schemes one and two: multiple first image files of the lesion and/or organ in different phases are acquired with the first imaging device, and a second image file containing the anatomical and functional information of the lesion and/or organ is acquired with the second imaging device; image registration is then performed on the reconstruction file of the second image file and the reconstruction files of the multiple first image files.
In one possible implementation, image registration based on anatomical structure and phase can first be performed on the reconstruction files of the multiple first image files acquired by the first imaging device, and image registration based on anatomical feature points can then be performed on the result of registering the reconstruction files of the multiple first image files and the reconstruction file of the second image file acquired by the second imaging device. Of course, embodiments of the present invention are not limited to this.
It should be noted that the above-mentioned second imaging device can be an MR (Magnetic Resonance) device.
Step S14: establish a simulation modeling interaction file according to clinical demand.
According to an exemplary embodiment of the present invention, step S14 includes: determining the instrument to be used according to clinical demand; and generating the simulation modeling interaction file based on the surface, physical function and intervention angle of the instrument.
Step S16: import the modeling file and the simulation modeling interaction file into the virtual environment.
Step S18: after the modeling file and the simulation modeling interaction file have been imported into the virtual environment, simulate the surgical procedure based on the virtual environment.
For step S18, several ways of simulating the surgical procedure are listed below as examples; embodiments of the present invention are not limited to these:
Embodiment one:
Step S18 includes: determining a tracking point or observation point in the virtual environment; and calculating and displaying the relative coordinate position and distance between the tracking point or observation point and the lesion in the modeling file, and/or the relative coordinate position and distance between the tracking point or observation point and the surrounding tissue.
According to an exemplary embodiment of the present invention, the first contact point between the simulation modeling interaction file and the modeling file can be used as the above-mentioned tracking point or observation point, but embodiments of the invention are not limited to this.
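In a Unity-based environment, the relative position and distance between such a tracking point and the lesion model could be computed and displayed with a few lines of C#; the sketch below is illustrative only, and the scene objects it references are assumptions.

```csharp
using UnityEngine;

// Illustrative sketch: track the offset and distance between an observation point
// (e.g. the tip of the simulated instrument) and the lesion model, and output it.
public class TrackingDistanceDisplay : MonoBehaviour
{
    public Transform trackingPoint;   // e.g. the first contact point / instrument tip
    public Transform lesion;          // lesion model imported from the modeling file

    void Update()
    {
        Vector3 relative = lesion.position - trackingPoint.position;   // relative coordinate position
        float distance = relative.magnitude;                            // distance to the lesion
        Debug.Log($"Offset to lesion: {relative}, distance: {distance * 1000f:F1} mm");
    }
}
```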
Embodiment two:
Step S18 includes: calculating and displaying, during the intervention of the simulation modeling interaction file into the modeling file, the interaction relationship and related parameters between the simulation modeling interaction file and the modeling file.
According to an exemplary embodiment of the present invention, the interaction relationship and related parameters include: the relative coordinate position and distance between each boundary of the simulation modeling interaction file and the surrounding tissue, and/or the relative coordinate position and distance between the simulation modeling interaction file and the lesion in the modeling file.
Embodiment three:
Step S18 includes: calculating and displaying, during the intervention of the simulation modeling interaction file into the modeling file, the safe range between the simulation modeling interaction file and the sensitive tissues in the modeling file, and/or the deformation data caused to the tissues in the modeling file by the intervention of the simulation modeling interaction file.
Embodiment four:
Step S18 includes: calculating possible paths and/or an optimal path for the simulation modeling interaction file to intervene in the modeling file; and displaying the possible paths and/or the optimal path.
According to an exemplary embodiment of the present invention, the parameters on which the possible paths and/or the optimal path are calculated include but are not limited to: the intervention point and intervention angle at which the simulation modeling interaction file intervenes in the modeling file, the structure of the simulation modeling interaction file, the minimum injury of the simulation modeling interaction file to the sensitive tissues in the modeling file, and the shortest distance for the simulation modeling interaction file to reach the lesion in the modeling file.
Embodiment five:
Step S18 includes: issuing an early warning when the interaction file comes into contact with sensitive tissues in the image modeling file.
In an embodiment of the present invention, the warning modes include but are not limited to: a sound prompt, a light prompt, a prompt shown on the display screen, a vibration prompt from an external device (such as a handle or gloves), and a warning screen prompt inside the virtual environment.
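Assuming that the sensitive tissues and the instrument model carry colliders in a Unity scene, a contact warning of this kind could be triggered roughly as sketched below; the tag name, audio source and color change are assumptions made for illustration rather than the patent's implementation.

```csharp
using UnityEngine;

// Hedged sketch: raise an early warning when the simulated instrument's collider
// touches a collider tagged as sensitive tissue.
public class ContactWarning : MonoBehaviour
{
    public AudioSource warningSound;      // sound prompt
    public Renderer instrumentRenderer;   // used for a simple visual prompt

    void OnTriggerEnter(Collider other)
    {
        if (other.CompareTag("SensitiveTissue"))
        {
            if (warningSound != null) warningSound.Play();               // audible warning
            if (instrumentRenderer != null)
                instrumentRenderer.material.color = Color.red;           // visual warning
            Debug.LogWarning($"Contact with sensitive tissue: {other.name}");
        }
    }
}
```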
In some embodiments of the invention, based on the foregoing scheme, the above surgical simulation method can further include: determining, according to the complexity of the modeling file, the virtual environment and the simulation calculation, the type of device on which the virtual environment is to run; and generating, based on the device type, an application program that runs the virtual environment.
Specifically, if the modeling file, the virtual environment and the simulation calculation are relatively complex, a device with relatively strong computing power (such as a computer) can be chosen to run them; conversely, if the modeling file, the virtual environment and the simulation calculation are relatively simple, they can be run on a mobile device (such as a smart phone or tablet computer).
Fig. 3 schematically shows a flow chart of a surgical simulation method based on virtual reality and medical images according to the second embodiment of the present invention.
Referring to Fig. 3, the surgical simulation method based on virtual reality and medical images according to the second embodiment of the present invention comprises the following steps:
Step S301: establish the clinical demand.
Clinically there are all kinds of complicated cases and lesions, and the clinician needs to design a feasible treatment scheme for each individual case and choose the best scheme on the basis of feasibility. Combining virtual reality technology with clinical images makes it possible to reproduce the lesion and the surrounding sensitive organs and to simulate the open-surgery or minimally invasive intervention environment, allowing the doctor to formulate, perform, optimize and repeatedly rehearse the surgical or interventional procedure in a simulated virtual environment that fully reproduces the real anatomical and pathological structures, thereby increasing the treatment feasibility and surgical success rate for difficult and complicated diseases.
Step S302: perform three-dimensional modeling of the lesion/organ and the surrounding anatomical structures based on the medical images to obtain the modeling file.
For step S302, the present invention proposes the following two specific implementations:
Embodiment one:
Referring to Fig. 4, in an exemplary embodiment of the present invention, step S302 includes:
Step S3021: image and data acquisition.
Specifically, the anatomical and functional information of the corresponding organ/lesion is acquired according to the clinical demand and the lesion and tissue/organ information, confirmed by the clinician and the imaging expert, that needs to be presented in the virtual environment. For example, if the three-dimensional modeling is intended for formulating an optimal surgical plan, the modeling needs to include: the lesion information, the information on the tissues and organs around the lesion, the information on sensitive organs, the path information in the surgical/interventional procedure, and the information on important untouchable organs connected to the lesion (for example, the aorta and nerves).
Meanwhile, image files that contain the organ and lesion information and accurately reflect the human anatomical structures can be acquired by CT and MR; these files include DICOM (Digital Imaging and Communications in Medicine) files and raw-data files. In order to obtain three-dimensional reconstruction images of sufficiently high spatial resolution, the image files of a high-end CT can be used: the data acquisition slice thickness is generally less than 0.6 mm, specific algorithms bring the thickness to between 0.3 mm and 0.6 mm, the temporal resolution needs to be better than half a cardiac cycle, and isotropy is maintained along the x, y and z axes.
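As an illustrative, hedged helper that is not described in the patent, the acquisition constraints stated above could be checked against the metadata of a series as follows; the record type and tolerance are assumptions, while the numeric limits mirror the text.

```csharp
// Hedged sketch: check whether an acquired CT series meets the constraints mentioned
// above (0.3-0.6 mm slice thickness, temporal resolution better than half a cardiac
// cycle, isotropic voxels along x, y and z).
public record CtSeriesInfo(double SliceThicknessMm, double VoxelXMm, double VoxelYMm,
                           double VoxelZMm, double TemporalResolutionMs, double CardiacCycleMs);

public static class AcquisitionCheck
{
    public static bool MeetsConstraints(CtSeriesInfo s)
    {
        bool thicknessOk = s.SliceThicknessMm >= 0.3 && s.SliceThicknessMm <= 0.6;
        bool temporalOk = s.TemporalResolutionMs < 0.5 * s.CardiacCycleMs;
        bool isotropicOk = System.Math.Abs(s.VoxelXMm - s.VoxelYMm) < 1e-3
                        && System.Math.Abs(s.VoxelYMm - s.VoxelZMm) < 1e-3;
        return thicknessOk && temporalOk && isotropicOk;
    }
}
```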
It should be noted that DICOM is the international standard (ISO 12052) for medical images and related information; it defines medical image formats, usable for data exchange, whose quality can meet clinical needs. DICOM is widely used in radiology, cardiovascular imaging and radiotherapy diagnostic devices (X-ray, CT, magnetic resonance, ultrasound, etc.), and is being applied more and more deeply and widely in other medical fields such as ophthalmology and dentistry. With tens of thousands of medical imaging devices in use, DICOM is one of the most widely deployed medical information standards.
The above-mentioned raw data is a technical term of medical imaging; it can refer, for example, to the attenuation data of the human body gathered directly from the detector in a CT system before any two-dimensional reconstruction image (DICOM) has been produced.
In addition, in order to obtain a more accurate three-dimensional reconstruction file, it is also necessary to use image files of different phases from the same imaging device. For example, in order to display arterial and venous vessel information more clearly in CT imaging, a certain amount of contrast agent can be injected into the patient, and the image information of the arterial phase and the venous phase can be acquired separately to provide data for the final three-dimensional reconstruction.
Step S3022: image preprocessing.
Step S3023: data preprocessing and two-dimensional image reconstruction.
Step S3024: three-dimensional image reconstruction and image post-processing.
Step S3025: image modeling based on the medical images.
In an exemplary embodiment of the present invention, the processing of the images can include image preprocessing, post-processing and 2D-to-3D image modeling. Based on the DICOM images or the raw data, region-of-interest division, image segmentation and 3D modeling are carried out according to image characteristics such as the CT HU value and low contrast, so that each organ, functional block and lesion finally forms an individual reconstruction file. Image registration is also applied; for example, images of the same organ/lesion under MR are registered with the acquired high-spatial-resolution CT images according to anatomical feature points, so as to obtain clearer functional information.
Step S3026: based on the medical images, generate the modeling file of the lesion/organ and the surrounding anatomical structures that can be displayed and interacted with in the VR environment.
Specifically, the 3D modeling file of each lesion, organ and surrounding tissue based on the medical images is converted into a file acceptable to the virtual reality environment, for example an ".OBJ" file (a standard 3D model file format). During the conversion it must be ensured that the anatomical information is neither distorted nor lost, and the material, color, refraction, transparency and gloss of each 3D modeling file are defined and set in the virtual reality environment.
Embodiment two:
Referring to Fig. 5, in the second exemplary embodiment of the present invention, the modeling file is generated mainly on the basis of the DICOM files and/or the raw data of the medical images.
Specifically, according to the lesion and tissue/organ information, confirmed by the clinician and the imaging expert/engineer, that needs to be presented in the virtual environment, clinical medical image data is acquired and subjected to a series of image-processing techniques such as image preprocessing, post-processing, reconstruction, format conversion and optimization, so as to form a file format that can be accepted by the virtual reality environment. This is explained below with reference to Fig. 5.
First, image and data acquisition is carried out.
In one embodiment of the invention, DICOM images can be obtained. In another embodiment of the invention, raw data can be obtained, and data preprocessing and image reconstruction are then applied to the raw data, for example: noise reduction, artifact removal, region-of-interest selection (ROI info), iterative reconstruction enhancement and so on.
Next, image processing is applied to the acquired images and data.
In the embodiment shown in Fig. 5, the image processing can include image preprocessing, image post-processing and image modeling.
The image preprocessing can for example include: image registration, region-of-interest enhancement, bone extraction, muscle extraction, image artifact elimination and so on. The image post-processing can for example include: image division, filtering, diffusion, quantification, transformation and so on. The image modeling can for example include: cutting, filtering, labeling and so on.
Here, the image modeling can use three-dimensional image reconstruction and display: a classical three-dimensional reconstruction algorithm is used to reconstruct the acquired sequence images in three dimensions on a computer; visualization techniques are used to display the image data of each slice in the phantom data, and cutting, zooming and other image-processing functions are included so that richer interactive processing of the images is realized.
Through iterations of the above image processing, and through confirmation by the imaging expert/engineer and the clinician, the organ and lesion information is accurately presented in the image/phantom, and the modeling file is generated.
Continuing to refer to Fig. 3, the surgical simulation method further includes:
Step S303: build the VR simulation environment.
According to an exemplary embodiment of the present invention, the virtual environment can be built with a virtual engine; for example, the building of the virtual environment and the setting of its input/output managers can be realized with the Unity engine and function editing in the C# language. The detailed process, as shown in Fig. 6, includes:
Step S3031: set up the basic scene parameters such as the observation field, field-of-view angle/lens orientation, illumination direction and coordinate system. Fig. 7 shows an interface for setting up the virtual environment and its basic scene parameters according to an embodiment of the present invention.
Step S3032: design and configure the input manager.
The input manager design and configuration mainly configure the instructions and interaction information fed into the VR scene, for example: position definitions and input instructions, guidance definitions and input instructions, sensitivity definitions and input instructions, action definitions and input instructions and so on, as well as the input settings of each external device such as the handle, gloves and helmet. According to an exemplary embodiment of the present invention, the input manager design and configuration can be carried out through a configuration interface.
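As a hedged illustration of what such input configuration might look like in a Unity/C# environment, the sketch below reads instructions from an external device through named axes and buttons; the axis and button names (Unity's default "Horizontal", "Vertical" and "Fire1") and the sensitivity value are assumptions for this example.

```csharp
using UnityEngine;

// Illustrative sketch: read position and action instructions declared in the input
// manager (e.g. from a handle) and move the simulated instrument accordingly.
public class InstrumentInput : MonoBehaviour
{
    public Transform instrument;        // scene object of the simulation modeling interaction file
    public float sensitivity = 0.05f;   // sensitivity definition

    void Update()
    {
        float dx = Input.GetAxis("Horizontal");   // position input from the handle
        float dz = Input.GetAxis("Vertical");
        instrument.Translate(new Vector3(dx, 0f, dz) * sensitivity * Time.deltaTime);

        if (Input.GetButtonDown("Fire1"))          // action definition, e.g. "advance the needle"
            instrument.Translate(Vector3.forward * 0.005f);
    }
}
```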
Step S3033: design and configure the output manager.
The output manager design and configuration mainly configure the instructions and interaction information output by the VR scene, for example: displaying the position coordinates of the interaction files in the scene, displaying safe spaces, setting the collision output of the modeling files and so on, as well as the output settings of each external device such as the handle and gloves.
Step S3034: set the deformation model and safe range produced on the surrounding tissues/organs by the simulated intervention point.
Because the tissues and organs of a normal human body deform along the path through which an external object enters, and this deformation includes both the different stress deformations produced by the structure, shape and size of the entering object (for example, a scalpel or an intervention needle) and the changes of substantive state produced by its different physical functions (for example, burning), this deformation needs to be modeled and set in the virtual environment in advance. Likewise, the safe distance between the external entering object and the surrounding sensitive organs also needs to be defined and set in the virtual environment in advance. The size and structure of the simulated intervention point need to be known when the configuration of step S3034 is carried out.
Step S3035: display the mutual position coordinates in the virtual environment and prompt with the contact warning data.
Continuing to refer to Fig. 3, the surgical simulation method further includes:
Step S304: establish the simulation modeling interaction files.
In an embodiment of the present invention, a series of simulation modeling interaction files can be established according to the demands of different clinical operations to represent, for example, therapeutic instruments such as scalpels and intervention needles. A modeling interaction file includes the shape, structure, size and physical function of the instrument, as well as its intervention entry angle. According to an exemplary embodiment of the present invention, the format of the simulation modeling interaction file can be the ".OBJ" format.
It should be noted that the execution order of the above steps S302, S303 and S304 can be interchanged.
Step S305: import the modeling files and the modeling interaction files into the simulation environment that has been built.
Specifically, in step S305, each image modeling file (lesion, organ, surrounding tissue, sensitive functional organ, etc.) and each modeling interaction file are imported into the preset VR platform at the same time. In the virtual environment, the resolution of the images/models is optimized to meet the requirements of clinical application while improving the running speed, and the material, color, refraction, transparency and gloss of each modeling file are defined and set in the virtual reality environment.
Step S306: track the intervention point; according to the size, structure and intervention mode of the intervention file, calculate and display the positional relationship, in the coordinate system, between the intervention point and the surrounding model tissues/organs, and provide contact warnings and safe-range warnings.
According to an exemplary embodiment of the present invention, the first contact point between the modeling interaction file (for example, an intervention needle) and the image modeling file (for example, the needle tip) can be set as the tracking point or observation point. Of course, embodiments of the present invention are not limited to this.
The tracking point or observation point can also be set to track hand movements or positions, with interactive tracking carried out through the parameters displayed on the screen via external inputs (such as gestures or a handle) into the virtual environment. Again, embodiments of the present invention are not limited to this; different modes can be provided according to the settings of the external devices and the input/output managers.
Step S307: track the intervention point, calculate and display the positional relationship, in the coordinate system, between the intervention point and the lesion, and recommend the optimal entry path. Fig. 8 shows a tracking point/observation point inside a tissue or organ according to an embodiment of the present invention.
According to an exemplary embodiment of the present invention, the possible entry paths and the optimal entry path of the interaction file can be calculated and displayed according to the entry angle of the interaction file (the angle between the interaction file and the image modeling file), the structure of the interaction file, the minimum injury of the interaction file to the sensitive tissues in the modeling file, the shortest distance for the interaction file to reach the lesion in the modeling file, and so on.
To further explain how the possible entry paths and the optimal entry path are calculated, the embodiments of the present invention give the following two examples: 1. if the intervention needle is 20 cm long and the required shortest straight-line distance exceeds 20 cm, the feasibility of that entry scheme is zero; 2. if the intervention needle would pass through a large arterial vessel during entry, the feasibility of that entry scheme is zero. It should be noted that the two examples above are merely illustrative, and embodiments of the present invention are not limited to them.
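The two feasibility rules above can be expressed directly in code; the hedged sketch below only illustrates the kind of test involved, and the method names and the physics-layer vessel check are assumptions rather than the invention's path-planning algorithm.

```csharp
using UnityEngine;

// Hedged sketch of the two feasibility rules given above: an entry path is rejected if
// it is longer than the intervention needle, or if the straight line from the entry
// point to the lesion crosses a large vessel (assumed to be placed on a physics layer).
public static class PathFeasibility
{
    public static bool IsFeasible(Vector3 entryPoint, Vector3 lesionPoint,
                                  float needleLengthMetres, LayerMask largeVesselLayer)
    {
        // Rule 1: the straight-line distance must not exceed the needle length (e.g. 0.20 m).
        if (Vector3.Distance(entryPoint, lesionPoint) > needleLengthMetres)
            return false;

        // Rule 2: the path must not cross a large arterial vessel.
        if (Physics.Linecast(entryPoint, lesionPoint, largeVesselLayer))
            return false;

        return true;
    }
}
```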
Step S308: judge whether the simulation modeling is complex; if so, perform step S311; otherwise, perform step S309.
Specifically, according to the complexity of the simulation modeling files, the virtual environment and the background simulation calculation, it is determined whether the generated interactive environment program is run on a computer or is generated as an APP for a smart device (such as a smart phone), downloaded to the smart device and run there.
Step S309: generate the interactive environment APP.
Step S310: import the APP into the smart device and link the input/output devices.
Specifically, a simple virtual environment with simple simulation modeling can be run on a smart device. For example, a mobile phone APP can be generated and downloaded to a smart phone, and the phone is then placed in virtual glasses, after which the user can start to interact with the virtual environment. Fig. 9 shows an example of the virtual interactive environment of an embodiment of the present invention running as a mobile phone APP.
Step S311: generate the interactive environment computer application program.
Step S312: run the application program on a computer and link the input/output devices.
Specifically, a virtual environment with complex background calculations and a large number of simulation modeling files places high demands on computation and can therefore be run on a computer, with the virtual environment displayed to the user on glasses or a helmet connected through external devices and transmission lines. Fig. 10 shows the virtual interactive environment of an embodiment of the present invention displayed through virtual glasses, a helmet or the like.
The above-mentioned input/output devices mainly refer to the virtual glasses/helmet, handle, gloves, external camera devices and the operating devices of the virtual environment program. The instruction definitions of the input/output managers and the different linkage encodings can be set according to the complexity of the virtual environment program and the habits of the user.
Step S313: use the tracked intervention point and the handle coordinate positioning and interaction behaviors to carry out the interaction and information feedback between the doctor and the virtual environment.
Fig. 11 schematically shows a block diagram of a surgical simulation device based on virtual reality and medical images according to an embodiment of the present invention.
Referring to Fig. 11, the surgical simulation device 1100 based on virtual reality and medical images according to an embodiment of the present invention includes: a construction unit 1101, a generation unit 1102, an establishing unit 1103, an import unit 1104 and a processing unit 1105.
Specifically, the construction unit 1101 builds the virtual environment for simulating a surgical procedure; the generation unit 1102 is used for generating, based on medical images, the modeling file that can be displayed and interacted with in the virtual environment; the establishing unit 1103 is used for establishing the simulation modeling interaction file according to clinical demand; the import unit 1104 is used for importing the modeling file and the simulation modeling interaction file into the virtual environment; and the processing unit 1105 is used for simulating the surgical procedure based on the virtual environment after the import unit has imported the modeling file and the simulation modeling interaction file into the virtual environment.
It should be noted that the details of each module/unit in the above surgical simulation device based on virtual reality and medical images have already been described in detail in the corresponding surgical simulation method based on virtual reality and medical images, and are therefore not repeated here.
In addition, embodiments of the present invention also provide an electronic device, which can include a processor and a memory, wherein the memory stores executable instructions, and the processor is used to call the executable instructions stored in the memory to perform the surgical simulation method based on virtual reality and medical images described in the above embodiments of the present invention.
In an exemplary embodiment, a non-transitory computer-readable storage medium containing instructions is also provided, for example a memory containing instructions; the above instructions can be executed by the processor of the electronic device to complete the above technical scheme of the exemplary embodiments. For example, the non-transitory computer-readable storage medium can be a ROM, a random access memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, or the like.
It should be noted that although several modules or units of the device for action execution are mentioned in the above detailed description, this division is not mandatory. In fact, according to the embodiments of the present invention, the features and functions of two or more of the modules or units described above may be embodied in one module or unit; conversely, the features and functions of one module or unit described above may be further divided and embodied by multiple modules or units.
Through the above description of the embodiments, those skilled in the art can readily understand that the example embodiments described herein can be realized by software, or by software combined with the necessary hardware. The technical scheme according to the embodiments of the present invention can therefore be embodied in the form of a software product, which can be stored in a non-volatile storage medium (which can be a CD-ROM, USB flash disk, mobile hard disk, etc.) or on a network, and which includes several instructions that cause a computing device (which can be a personal computer, a server, a touch terminal, a network device, etc.) to perform the method according to the embodiments of the present invention.
Other embodiments of the present invention will readily occur to those skilled in the art after considering the specification and practicing the invention disclosed herein. The present application is intended to cover any variations, uses or adaptations of the invention that follow the general principles of the invention and include common knowledge or conventional techniques in the technical field that are not disclosed in the invention. The specification and embodiments are to be regarded as exemplary only, and the true scope and spirit of the invention are pointed out by the following claims.
It should be understood that the present invention is not limited to the precise structures described above and shown in the drawings, and that various modifications and changes can be made without departing from its scope. The scope of the present invention is limited only by the appended claims.
Claims (21)
1. A surgical simulation method based on virtual reality and medical images, characterized by comprising:
building a virtual environment for simulating a surgical procedure;
generating, based on medical images, a modeling file that can be displayed and interacted with in the virtual environment;
establishing a simulation modeling interaction file according to clinical demand;
importing the modeling file and the simulation modeling interaction file into the virtual environment;
after the modeling file and the simulation modeling interaction file have been imported into the virtual environment, simulating the surgical procedure based on the virtual environment.
2. The surgical simulation method according to claim 1, characterized by further comprising:
configuring scene parameters of the virtual environment, and an input manager and/or output manager corresponding to the virtual environment.
3. The surgical simulation method according to claim 2, characterized in that the scene parameters comprise one or a combination of the following:
observation field, field-of-view angle, lens direction, illumination direction, coordinate system.
4. The surgical simulation method according to claim 1, characterized by further comprising:
configuring a deformation model describing the deformation produced on a tissue or organ in the virtual environment by an interactive operation of a simulated intervention point.
5. The surgical simulation method according to claim 4, characterized by further comprising:
obtaining stress deformation data and/or substantive-state deformation data produced on a real tissue or organ when a real intervention object intervenes in the real tissue or organ;
configuring the deformation model according to the stress deformation data and/or the substantive-state deformation data.
6. The surgical simulation method according to claim 1, characterized by further comprising:
configuring a safe distance between a simulated intervention point and a tissue or organ in the virtual environment.
7. surgical simulation method according to claim 1, it is characterised in that based on medical image generation in the virtual ring
Shown in border and interactive modeling file, including:
According to clinical demand and needing to be presented on the focus in virtual environment and/or the information of organ, pass through the first image documentation equipment
Dissection and first image file of function information of the collection comprising the focus and/or organ;
Three-dimensional reconstruction is carried out based on first image file, file is rebuild in generation;
The form of the reconstruction file is converted into the file format that the virtual environment is supported, to obtain the modeling file.
8. The surgical simulation method according to claim 7, wherein performing three-dimensional reconstruction based on the first image file to generate a reconstruction file comprises:
dividing the first image file into regions of interest according to its image characteristics;
performing image segmentation and three-dimensional reconstruction on the first image file based on the divided regions of interest, to obtain a reconstruction file for each region of interest.
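A minimal sketch of region-of-interest division followed by per-region segmentation, assuming the first image file is a CT volume in Hounsfield units; the intensity windows below are illustrative assumptions, not clinical prescriptions.

```python
import numpy as np
from scipy import ndimage  # requires SciPy

# Assume a CT volume in Hounsfield units; synthetic random data stands in here.
rng = np.random.default_rng(0)
ct_hu = rng.normal(loc=0, scale=400, size=(32, 32, 32))

# Divide the volume into regions of interest by image characteristics
# (the HU windows are assumed for illustration).
roi_windows = {"soft_tissue": (-100, 100), "bone": (300, 2000)}

reconstruction_inputs = {}
for name, (lo, hi) in roi_windows.items():
    mask = (ct_hu >= lo) & (ct_hu <= hi)
    labeled, num = ndimage.label(mask)   # split the mask into connected components
    reconstruction_inputs[name] = (labeled, num)
    print(f"{name}: {num} connected components")
# Each labeled component could then be reconstructed separately, as in claim 7.
```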
9. The surgical simulation method according to claim 7, characterized by further comprising:
acquiring, by the first imaging device, a plurality of first image files of the lesion and/or organ in different phases, and/or acquiring, by a second imaging device, a second image file containing anatomical and functional information of the lesion and/or organ;
performing image registration based on anatomical structure and phase on the reconstruction files generated from the plurality of first image files, and/or performing image registration based on anatomical feature points between the reconstruction file of the second image file and the reconstruction file of the first image file.
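The feature-point registration of claim 9 could, for example, be a least-squares rigid alignment of corresponding landmarks; the sketch below uses the standard Kabsch construction with synthetic points and is only one possible realization.

```python
import numpy as np

def rigid_register(moving_pts, fixed_pts):
    """Least-squares rigid alignment (Kabsch) of corresponding anatomical feature points."""
    mc, fc = moving_pts.mean(axis=0), fixed_pts.mean(axis=0)
    H = (moving_pts - mc).T @ (fixed_pts - fc)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:   # avoid reflections
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = fc - R @ mc
    return R, t

# Hypothetical corresponding landmarks picked on two reconstructions.
fixed = np.array([[10.0, 0.0, 0.0], [0.0, 12.0, 0.0], [0.0, 0.0, 8.0], [5.0, 5.0, 5.0]])
theta = np.deg2rad(15)
R_true = np.array([[np.cos(theta), -np.sin(theta), 0],
                   [np.sin(theta),  np.cos(theta), 0],
                   [0, 0, 1]])
moving = fixed @ R_true.T + np.array([2.0, -3.0, 1.5])

R, t = rigid_register(moving, fixed)
print("max residual after registration:", np.abs(moving @ R.T + t - fixed).max())
```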
10. The surgical simulation method according to claim 1, wherein establishing the simulation modeling interactive file comprises:
determining the instrument to be used according to clinical requirements;
generating the simulation modeling interactive file based on the surface, physical function and intervention angle of the instrument.
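As an illustrative sketch, the simulation modeling interactive file for an instrument might simply record its surface mesh, physical function and intervention angle in a serializable structure; the file layout and field names below are assumptions, not a format defined by this application.

```python
import json
from dataclasses import dataclass, asdict

@dataclass
class InstrumentModel:
    """Illustrative instrument description used to build the simulation modeling interactive file."""
    name: str
    surface_mesh: str          # path to the instrument surface mesh (hypothetical)
    physical_function: str     # e.g. "puncture", "ablation"
    intervention_angle_deg: float
    shaft_length_mm: float
    shaft_diameter_mm: float

needle = InstrumentModel(
    name="ablation needle",
    surface_mesh="assets/needle.obj",
    physical_function="ablation",
    intervention_angle_deg=30.0,
    shaft_length_mm=150.0,
    shaft_diameter_mm=1.5,
)

# Persist as a simple JSON "interactive file" that the virtual environment could import.
with open("instrument_interactive_file.json", "w") as f:
    json.dump(asdict(needle), f, indent=2)
```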
11. The surgical simulation method according to claim 1, wherein simulating the surgical procedure based on the virtual environment comprises:
determining a tracking point or observation point in the virtual environment;
calculating and displaying the relative coordinate position and distance between the tracking point or observation point and the lesion in the modeling file, and/or between the tracking point or observation point and its surrounding tissue.
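A minimal numeric sketch of the distance and relative-position computation of claim 11, assuming the tracking point, the lesion and the surrounding tissue are available as coordinates in a common coordinate system; all coordinates below are hypothetical.

```python
import numpy as np

# Hypothetical coordinates (mm) in the virtual environment's coordinate system.
tracking_point = np.array([12.0, -4.0, 30.0])
lesion_center = np.array([20.0, 0.0, 25.0])
surrounding_tissue_pts = np.array([[15.0, -2.0, 28.0],
                                   [10.0, -6.0, 33.0],
                                   [18.0, -1.0, 31.0]])

# Relative position and distance to the lesion.
offset = lesion_center - tracking_point
print("offset to lesion (mm):", offset, "distance:", np.linalg.norm(offset))

# Distance to the nearest surrounding tissue point.
d = np.linalg.norm(surrounding_tissue_pts - tracking_point, axis=1)
print("nearest tissue point:", surrounding_tissue_pts[d.argmin()], "distance:", d.min())
```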
12. The surgical simulation method according to claim 11, wherein the tracking point or observation point comprises: the first contact point between the simulation modeling interactive file and the modeling file.
13. The surgical simulation method according to claim 1, wherein simulating the surgical procedure based on the virtual environment comprises:
calculating and displaying the interaction relationship and related parameters between the simulation modeling interactive file and the modeling file while the simulation modeling interactive file intervenes in the modeling file.
14. The surgical simulation method according to claim 13, wherein the interaction relationship and the related parameters comprise:
the relative coordinate position and distance between each boundary of the simulation modeling interactive file and its surrounding tissue, and/or the relative coordinate position and distance between the simulation modeling interactive file and the lesion in the modeling file.
15. The surgical simulation method according to claim 1, wherein simulating the surgical procedure based on the virtual environment comprises:
calculating and displaying, while the simulation modeling interactive file intervenes in the modeling file, the safe range between the simulation modeling interactive file and sensitive tissue in the modeling file, and/or the deformation data caused to the tissue in the modeling file by the intervention of the simulation modeling interactive file.
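A minimal sketch of the safe-range check and a crude deformation estimate described in claim 15, assuming point sets for the instrument boundary and the sensitive tissue; the safety margin and the exponential displacement falloff are assumptions, not values from this application.

```python
import numpy as np

SAFE_RANGE_MM = 5.0   # assumed safety margin around sensitive tissue

instrument_boundary = np.array([[0.0, 0.0, 0.0], [0.0, 0.0, 10.0], [0.0, 0.0, 20.0]])
sensitive_tissue = np.array([[3.0, 0.0, 18.0], [15.0, 10.0, 5.0]])

# Minimum distance between instrument boundary points and sensitive tissue points.
dists = np.linalg.norm(instrument_boundary[:, None, :] - sensitive_tissue[None, :, :], axis=2)
min_dist = dists.min()
print(f"minimum clearance: {min_dist:.1f} mm, within safe range: {min_dist >= SAFE_RANGE_MM}")

# Very rough deformation estimate around the instrument tip: exponential falloff of
# displacement with distance (amplitude and falloff constant are assumptions).
tip = instrument_boundary[-1]
tissue_pts = np.array([[1.0, 0.0, 19.0], [4.0, 2.0, 16.0], [8.0, 5.0, 12.0]])
r = np.linalg.norm(tissue_pts - tip, axis=1)
displacement_mm = 2.0 * np.exp(-r / 3.0)
print("estimated displacements (mm):", np.round(displacement_mm, 2))
```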
16. The surgical simulation method according to claim 1, wherein simulating the surgical procedure based on the virtual environment comprises:
calculating possible paths and/or an optimal path along which the simulation modeling interactive file intervenes in the modeling file;
displaying the possible paths and/or the optimal path.
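One possible realization of the path computation in claim 16 is to enumerate straight candidate paths from entry points to the lesion, reject those that pass too close to sensitive tissue, and take the shortest admissible path; the entry points, the clearance margin and the choice of "optimal" below are illustrative assumptions rather than the method prescribed by this application.

```python
import numpy as np

lesion = np.array([50.0, 40.0, 60.0])
sensitive_tissue = np.array([[45.0, 38.0, 50.0], [30.0, 20.0, 40.0]])
entry_points = np.array([[0.0, 0.0, 0.0], [80.0, 0.0, 0.0], [40.0, 80.0, 0.0]])

def clearance(entry, target, obstacles, n_samples=50):
    """Smallest distance from a sampled straight entry-to-target path to any obstacle point."""
    ts = np.linspace(0.0, 1.0, n_samples)[:, None]
    pts = entry + ts * (target - entry)
    return np.linalg.norm(pts[:, None, :] - obstacles[None, :, :], axis=2).min()

candidates = []
for entry in entry_points:
    length = np.linalg.norm(lesion - entry)               # shortest-distance criterion
    margin = clearance(entry, lesion, sensitive_tissue)
    if margin > 5.0:                                      # reject risky paths (assumed margin)
        candidates.append((length, margin, entry))

# "Optimal" here means the shortest admissible path; weighting of other criteria is an assumption.
best = min(candidates, key=lambda c: c[0])
print(f"optimal entry point {best[2]}, length {best[0]:.1f} mm, clearance {best[1]:.1f} mm")
```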
17. The surgical simulation method according to claim 16, wherein the parameters on which the calculation of the possible paths and/or the optimal path is based comprise:
the intervention point and intervention angle at which the simulation modeling interactive file intervenes in the modeling file, the structure of the simulation modeling interactive file, the minimum injury caused by the simulation modeling interactive file to sensitive tissue in the modeling file, and the shortest distance for the simulation modeling interactive file to reach the lesion in the modeling file.
18. The surgical simulation method according to claim 1, wherein simulating the surgical procedure based on the virtual environment comprises:
issuing an early warning when the interactive file contacts sensitive tissue in the image modeling file.
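A minimal sketch of the early-warning check of claim 18, assuming point sets for the interactive file and the sensitive tissue; the warning threshold is an assumed value.

```python
import numpy as np

WARNING_DISTANCE_MM = 3.0   # assumed early-warning threshold

def check_contact_warning(interactive_pts, sensitive_pts, threshold=WARNING_DISTANCE_MM):
    """Emit a warning when any point of the interactive file comes within the threshold."""
    d = np.linalg.norm(interactive_pts[:, None, :] - sensitive_pts[None, :, :], axis=2)
    if d.min() <= threshold:
        print(f"EARLY WARNING: instrument within {d.min():.1f} mm of sensitive tissue")
        return True
    return False

check_contact_warning(np.array([[0.0, 0.0, 0.0], [0.0, 0.0, 9.0]]),
                      np.array([[0.0, 1.0, 10.0]]))
```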
19. The surgical simulation method according to any one of claims 1 to 18, characterized by further comprising:
determining the type of device on which the virtual environment is to run, according to the complexity of the modeling file, the virtual environment and the simulation calculations;
generating, based on the device type, an application program for running the virtual environment.
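As an illustrative heuristic only, the device-type decision of claim 19 might weigh mesh size, the number of interactive objects and whether physics simulation is enabled; the scoring formula, thresholds and device classes below are assumptions.

```python
def choose_device_type(triangle_count, interactive_objects, physics_enabled):
    """Illustrative heuristic mapping scene complexity to a target device class.
    The thresholds are assumptions, not values from the patent."""
    score = triangle_count / 1_000_000 + 0.5 * interactive_objects + (2 if physics_enabled else 0)
    if score < 2:
        return "standalone VR headset"
    if score < 6:
        return "desktop workstation with tethered headset"
    return "GPU server with streamed rendering"

print(choose_device_type(triangle_count=3_500_000, interactive_objects=4, physics_enabled=True))
```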
20. A surgical simulation device based on virtual reality and medical images, characterized by comprising:
a construction unit, configured to build a virtual environment for simulating a surgical procedure;
a generation unit, configured to generate, based on a medical image, a modeling file that can be displayed and interacted with in the virtual environment;
an establishing unit, configured to establish a simulation modeling interactive file according to clinical requirements;
an import unit, configured to import the modeling file and the simulation modeling interactive file into the virtual environment;
a processing unit, configured to simulate the surgical procedure based on the virtual environment after the import unit has imported the modeling file and the simulation modeling interactive file into the virtual environment.
21. An electronic device, characterized by comprising: a processor and a memory, wherein the memory stores executable instructions, and the processor is configured to call the executable instructions stored in the memory to perform the surgical simulation method according to any one of claims 1 to 19.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201710214282.9A CN106974730A (en) | 2017-04-01 | 2017-04-01 | Surgical simulation method, device and equipment based on virtual reality and medical image |
Publications (1)
Publication Number | Publication Date |
---|---|
CN106974730A (en) | 2017-07-25 |
Family
ID=59344434
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201710214282.9A Pending CN106974730A (en) | 2017-04-01 | 2017-04-01 | Surgical simulation method, device and equipment based on virtual reality and medical image |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN106974730A (en) |
Patent Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6256529B1 (en) * | 1995-07-26 | 2001-07-03 | Burdette Medical Systems, Inc. | Virtual reality 3D visualization for surgical procedures |
CN101393654A (en) * | 2008-10-07 | 2009-03-25 | 华南师范大学 | Computer assistant organ operation simulating system |
CN105264459A (en) * | 2012-09-27 | 2016-01-20 | 沉浸式触感有限公司 | Haptic augmented and virtual reality system for simulation of surgical procedures |
CN104463965A (en) * | 2014-12-17 | 2015-03-25 | 中国科学院自动化研究所 | Training scene simulation system and method for minimally invasive cardiovascular interventional operation |
CN104739519A (en) * | 2015-04-17 | 2015-07-01 | 中国科学院重庆绿色智能技术研究院 | Force feedback surgical robot control system based on augmented reality |
CN106200982A (en) * | 2016-07-26 | 2016-12-07 | 南京意斯伽生态科技有限公司 | A kind of medical surgery training system based on virtual reality technology |
Non-Patent Citations (1)
Title |
---|
王子罡, 唐泽圣, 王田苗, et al.: "Computer-assisted stereotactic neurosurgery system based on virtual reality" (基于虚拟现实的计算机辅助立体定向神经外科手术系统), 《计算机学报》 (Chinese Journal of Computers) *
Cited By (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107481254A (en) * | 2017-08-24 | 2017-12-15 | 上海术理智能科技有限公司 | Processing method, device, medium and the electronic equipment of medical image |
CN109419554A (en) * | 2017-09-04 | 2019-03-05 | 北京航空航天大学 | A kind of carpal bone system of virtual operation and method based on unity3d |
CN109171604A (en) * | 2018-08-05 | 2019-01-11 | 广州高通影像技术有限公司 | A kind of intelligent endoscope operating system having AR function |
CN109350242A (en) * | 2018-12-11 | 2019-02-19 | 艾瑞迈迪科技石家庄有限公司 | A kind of surgical navigational method for early warning, storage medium and terminal device based on distance |
CN111243746A (en) * | 2020-01-20 | 2020-06-05 | 上海奥朋医疗科技有限公司 | Operation simulation method and system of vascular intervention operation robot |
CN111489623A (en) * | 2020-04-28 | 2020-08-04 | 中南民族大学 | CT principle virtual simulation experiment teaching system based on Ulty |
CN113768619A (en) * | 2020-06-10 | 2021-12-10 | 长庚大学 | Path positioning method, information display device, storage medium and integrated circuit chip |
US11806088B2 (en) | 2020-06-10 | 2023-11-07 | Chang Gung University | Method, system, computer program product and application-specific integrated circuit for guiding surgical instrument |
TWI790447B (en) * | 2020-06-10 | 2023-01-21 | 長庚大學 | Surgical path positioning method, information display device, computer-readable recording medium, and application-specific integrated circuit chip |
CN112914730A (en) * | 2021-01-19 | 2021-06-08 | 上海市第十人民医院 | Remote interventional therapy system based on VR technology |
CN112914730B (en) * | 2021-01-19 | 2022-12-16 | 上海市第十人民医院 | Remote interventional therapy system based on VR technology |
CN113096253B (en) * | 2021-04-12 | 2021-10-26 | 广东视明科技发展有限公司 | Visual simulation system and method based on local visual field stimulation |
CN113096253A (en) * | 2021-04-12 | 2021-07-09 | 广东视明科技发展有限公司 | Visual simulation system and method based on local visual field stimulation |
CN113616336A (en) * | 2021-09-13 | 2021-11-09 | 上海微创医疗机器人(集团)股份有限公司 | Surgical robot simulation system, simulation method, and readable storage medium |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN106974730A (en) | Surgical simulation method, device and equipment based on virtual reality and medical image | |
CN106874700A (en) | Surgical simulation method, surgical simulation device and electronic equipment based on Web | |
JP7522269B2 (en) | Medical image processing method, medical image processing device, medical image processing system, and medical image processing program | |
Kalra | Developing fe human models from medical images | |
US11776123B2 (en) | Systems and methods for displaying augmented anatomical features | |
CN110021445A (en) | A kind of medical system based on VR model | |
Tan et al. | Short paper: Using BSN for tele-health application in upper limb rehabilitation | |
Li et al. | MedShapeNet--A large-scale dataset of 3D medical shapes for computer vision | |
US20220375621A1 (en) | Digital twin | |
CN115942899A (en) | Medical examination of the human body using tactile sensation | |
Greenleaf | Developing the tools for practical VR applications [Medicine] | |
CN107315915A (en) | A kind of simulated medical surgery method and system | |
Izard et al. | NextMed: How to enhance 3D radiological images with Augmented and Virtual Reality | |
Reddivari et al. | VRvisu: a tool for virtual reality based visualization of medical data | |
CN112215969A (en) | User data processing method and device based on virtual reality | |
US20230290085A1 (en) | Systems and Methods for Displaying Layered Augmented Anatomical Features | |
Berti et al. | Medical simulation services via the grid | |
Goral et al. | Development of a CapsNet and Fuzzy Logic decision support system for diagnosing the Scoliosis and planning treatments via Schroth method | |
Greenleaf et al. | Medical applications of virtual reality technology | |
Rudnicka et al. | Health Digital Twins Supported by Artificial Intelligence-based Algorithms and Extended Reality in Cardiology | |
US20110242096A1 (en) | Anatomy diagram generation method and apparatus, and medium storing program | |
Zhang et al. | [Retracted] The Application of 3D Virtual Technology in the Teaching of Clinical Medicine | |
Chen et al. | A system design for virtual reality visualization of medical image | |
Marsh et al. | VR in medicine: virtual colonoscopy | |
WO2024217461A1 (en) | Method and system for remote phsical examination |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
WD01 | Invention patent application deemed withdrawn after publication | Application publication date: 20170725 |