CN110739084A - Virtual reality interaction system and interaction method - Google Patents

Virtual reality interaction system and interaction method

Info

Publication number
CN110739084A
Authority
CN
China
Prior art keywords
data
virtual
ablation
nodule
liquid injection
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201911015563.7A
Other languages
Chinese (zh)
Inventor
李萍
唐为忠
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual
Priority to CN201911015563.7A
Priority to PCT/CN2019/121818 (published as WO2021077542A1)
Publication of CN110739084A
Legal status: Pending

Classifications

    • G - PHYSICS
    • G16 - INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H - HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H 50/00 - ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
    • G16H 50/50 - ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for simulation or modelling of medical disorders
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011 - Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/016 - Input arrangements with force or tactile feedback as computer generated output to the user
    • G - PHYSICS
    • G09 - EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B - EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B 23/00 - Models for scientific, medical, or mathematical purposes, e.g. full-sized devices for demonstration purposes
    • G09B 23/28 - Models for scientific, medical, or mathematical purposes, e.g. full-sized devices for demonstration purposes for medicine
    • G - PHYSICS
    • G09 - EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B - EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B 23/00 - Models for scientific, medical, or mathematical purposes, e.g. full-sized devices for demonstration purposes
    • G09B 23/28 - Models for scientific, medical, or mathematical purposes, e.g. full-sized devices for demonstration purposes for medicine
    • G09B 23/285 - Models for scientific, medical, or mathematical purposes, e.g. full-sized devices for demonstration purposes for medicine for injections, endoscopy, bronchoscopy, sigmoidscopy, insertion of contraceptive devices or enemas

Abstract

The invention provides a virtual reality interaction system applied to thyroid nodule ablation, comprising a processing module, an operation module and a display module. The processing module stores at least one kind of human thyroid data and at least one kind of thyroid nodule associated data, and different virtual three-dimensional human body models are established according to these data, so that the processing module gives different liquid isolation guidance and nodule ablation guidance according to isolation operation result data and ablation operation result data. The display module displays the virtual liquid isolation operation, the virtual nodule ablation operation, the liquid isolation guidance and the nodule ablation guidance on the virtual three-dimensional human body model, making it convenient to correct defects in the liquid isolation and nodule ablation operations and avoiding the problem of insufficient guidance of the user's liquid isolation and nodule ablation operations.

Description

Virtual reality interaction system and interaction method
Technical Field
The invention relates to the technical field of virtual reality interaction, and in particular to a virtual reality interaction system and interaction method applied to thyroid nodule ablation.
Background
Treatment of thyroid nodules uses thyroid tumor thermal ablation. This technique generally guides the puncture process in a neck cross section, so that the thyroid tumor to be ablated and important structures such as organs, the esophagus, blood vessels and muscles all lie in the same ultrasonic cross section. Needle insertion in the neck cross section helps display the anatomical relationship between the ablated thyroid tumor and the adjacent important structures, improving the safety of monitoring, and also helps eliminate the difficulty of adjusting the needle insertion angle caused by obstruction of the body when the needle is inserted along the longitudinal axis plane of the human body. The needle insertion angle and position must be chosen according to the different positions, sizes and numbers of thyroid tumors. In teaching, needle insertion angle and position are basic training items, and the angles and positions are characteristically difficult to select. The traditional teaching mode mainly combines classroom teaching with illustrations and models, or practice on the human body, which has certain learning limitations and hidden risks.
Virtual Reality (VR) is a method for people to visualize and interact with computers and their very complex data. VR technology combines achievements from multiple fields such as computer graphics, sensor technology, dynamics, optics, artificial intelligence and social psychology. With computer technology at its core, it uses multimedia and three-dimensional technologies to generate a realistic virtual environment within a specific range of vision, hearing and touch, and a user interacts with objects in the virtual environment in a natural way by means of the necessary equipment, producing feelings and experiences equivalent to being in a real environment.
Therefore, there is a need for a new virtual reality interaction system and method to solve the above problems in the prior art.
Disclosure of Invention
The invention aims to provide a virtual reality interaction system and interaction method that quickly and effectively guide a user's liquid isolation operation and nodule ablation operation, avoiding the problem of insufficient guidance of these operations in the prior art.
In order to achieve the above object, the virtual reality interaction system of the present invention is applied to thyroid nodule ablation and includes a processing module, an operation module and a display module. The processing module stores at least one kind of human thyroid data, at least one kind of thyroid nodule associated data, and virtual scene data; the human thyroid data includes surrounding tissue data and nodule data, and the thyroid nodule associated data includes liquid injection isolation data and nodule ablation data. The processing module is configured to perform data transmission with the display module and to establish a virtual three-dimensional human body model according to the human thyroid data and the thyroid nodule associated data. The display module is configured to display the virtual three-dimensional human body model and the virtual scene. The operation module is configured to perform a virtual liquid isolation operation and a virtual nodule ablation operation on the virtual three-dimensional human body model in the virtual scene to obtain isolation operation result data and ablation operation result data. The processing module can compare the isolation operation result data and the ablation operation result data with the thyroid nodule associated data, provide liquid isolation guidance and nodule ablation guidance according to the comparison result, and present the associated data in image form on the virtual three-dimensional human body model.
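The comparison of operation result data against the stored "known" data can be sketched in a few lines. This is a hypothetical illustration, not part of the patent: the field names, units and tolerance values below are all assumptions.

```python
# Hypothetical sketch of the processing module's comparison step: operation
# result data is checked against stored reference ("known") data, and each
# deviation yields one piece of guidance. Tolerances are illustrative.
def isolation_guidance(result, known, pos_tol=2.0, angle_tol=5.0):
    """Return guidance messages for a liquid isolation operation.

    result/known are dicts holding needle position (mm), needle angle (deg),
    injection pressure (kPa) and injection volume (mL).
    """
    guidance = []
    if abs(result["position_mm"] - known["position_mm"]) > pos_tol:
        guidance.append("adjust injection needle position")
    if abs(result["angle_deg"] - known["angle_deg"]) > angle_tol:
        guidance.append("adjust injection needle angle")
    if result["pressure_kpa"] > known["pressure_kpa"]:
        guidance.append("reduce injection pressure")
    if result["volume_ml"] < known["volume_ml"]:
        guidance.append("increase injection volume")
    return guidance
```

An ablation-side comparison would follow the same shape, checking needle position, insertion count and dwell time instead.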
The virtual reality interaction system has the following beneficial effects: the processing module stores at least one kind of human thyroid data and at least one kind of thyroid nodule associated data, and different virtual three-dimensional human body models can be established from these data, so the processing module can give the user different liquid isolation guidance and nodule ablation guidance according to the isolation operation result data and the ablation operation result data. Through the display module, the user's virtual liquid isolation operation and virtual nodule ablation operation, together with the liquid isolation guidance and nodule ablation guidance, are displayed clearly and intuitively on the virtual three-dimensional human body model. This makes it convenient for the user to correct defects in the liquid isolation and nodule ablation operations, improves the user's operation level, and solves the prior-art problem of insufficient guidance of these operations.
Preferably, a risk level selection unit is arranged in the display module. The risk level selection unit is used to select a risk level and, according to the risk level, to select the human thyroid data and thyroid nodule associated data used for establishing the virtual three-dimensional human body model. The beneficial effect is that different virtual three-dimensional human body models can be established according to the user's requirements, so as to give the user different liquid isolation guidance and nodule ablation guidance.
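The risk level selection described above might look like the following sketch. The level names follow the classification given later in the description; the mapping to stored data-set identifiers is an assumption for illustration only.

```python
# Hypothetical sketch of the risk level selection unit: a chosen risk level
# selects which stored thyroid data and nodule-associated data are used to
# build the virtual three-dimensional human body model.
RISK_LEVELS = ("low", "medium", "high", "very_high")

MODEL_LIBRARY = {
    # risk level -> (thyroid data id, nodule-associated data id); placeholder ids
    "low": ("thyroid_A", "nodule_A"),
    "medium": ("thyroid_B", "nodule_B"),
    "high": ("thyroid_C", "nodule_C"),
    "very_high": ("thyroid_D", "nodule_D"),
}

def select_model_data(risk_level):
    """Return the data-set ids used to build the model for this risk level."""
    if risk_level not in RISK_LEVELS:
        raise ValueError(f"unknown risk level: {risk_level}")
    return MODEL_LIBRARY[risk_level]
```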
Preferably, a gesture recognition unit is built into the operation module. Through the display module, the operation module displays to the user a virtual needle-holding hand that holds a virtual liquid injection needle and a virtual ablation needle. The gesture recognition unit captures the movement of the user's hand and presents it in the virtual scene through the virtual needle-holding hand, so that the user can perform the virtual liquid isolation operation and the virtual nodule ablation operation on the virtual three-dimensional human body model to obtain the isolation operation result data and the ablation operation result data. The isolation operation result data includes liquid injection needle position data, liquid injection pressure data and liquid injection amount data; the ablation operation result data includes ablation needle position data, ablation needle insertion count data and ablation needle insertion dwell time data. The beneficial effect is that the virtual needle-holding hand can be synchronized with the real hand movements of the user, enhancing the interaction effect.
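The two result-data records named above can be sketched as simple data structures. The field names and types are assumptions based on the description, not definitions from the patent.

```python
# Hypothetical data structures for the two operation results described in
# the text: isolation (injection) results and ablation results.
from dataclasses import dataclass

@dataclass
class IsolationResult:
    needle_position: tuple  # (x, y, z) on the virtual model, units assumed mm
    pressure_kpa: float     # liquid injection pressure
    volume_ml: float        # liquid injection amount

@dataclass
class AblationResult:
    needle_position: tuple  # (x, y, z) on the virtual model
    insertion_count: int    # number of needle insertions
    dwell_time_s: float     # needle insertion dwell time
```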
Further preferably, the operation module is also provided with a position sensing unit, which captures the positions of the virtual liquid injection needle and the virtual ablation needle on the virtual three-dimensional human body model, obtains the liquid injection needle position data and the ablation needle position data, and feeds them back to the processing module. The liquid injection needle position data comprises needle insertion position data and needle insertion angle data, and the ablation needle position data likewise comprises needle insertion position data and needle insertion angle data.
Further preferably, the operation module also comprises a liquid injection sensing unit, which captures the liquid injection pressure and liquid injection amount of the virtual liquid injection needle on the virtual three-dimensional human body model, obtains the liquid injection pressure data and liquid injection amount data, and feeds them back to the processing module.
Further preferably, the operation module also has a built-in statistics unit, which captures the insertion count and dwell time of the virtual ablation needle on the virtual three-dimensional human body model to obtain the ablation needle insertion count data and ablation needle insertion dwell time data.
Further preferably, the operation module is also provided with a force feedback unit, which outputs resistance feedback while the virtual liquid injection needle and the virtual ablation needle enter the virtual three-dimensional human body model; the resistance feedback simulates the resistance felt when a real liquid injection needle or ablation needle enters a real human body.
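A force feedback unit of the kind described could derive its resistance from the tissue layer the needle tip is currently in. The layer depths and relative resistance values below are purely illustrative assumptions, not values from the patent.

```python
# Hypothetical sketch of tissue-dependent resistance for the force feedback
# unit: the resistance output depends on which modeled tissue layer the
# needle tip has reached. Depths (mm) and resistance values are illustrative.
TISSUE_RESISTANCE = [
    # (layer name, depth at which the layer ends in mm, relative resistance)
    ("skin", 2.0, 0.8),
    ("superficial_fascia", 5.0, 0.4),
    ("muscle", 15.0, 0.6),
    ("thyroid", 25.0, 0.3),
]

def resistance_at(depth_mm):
    """Return the relative resistance felt at a given needle depth."""
    for _, end_depth, resistance in TISSUE_RESISTANCE:
        if depth_mm <= end_depth:
            return resistance
    return 0.0  # past all modeled layers
```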
Preferably, the processing module comprises a liquid isolation guidance unit, which invokes the liquid injection isolation data to give the user liquid isolation guidance. The liquid injection isolation data comprises known liquid injection needle insertion position data, known liquid injection needle insertion angle data, known liquid injection pressure data and known liquid injection amount data.
Further preferably, the liquid isolation guidance unit comprises a liquid injection needle insertion position guidance unit, a liquid injection needle insertion angle guidance unit, a liquid injection pressure guidance unit and a liquid injection amount guidance unit. These units respectively invoke the known needle insertion position data, known needle insertion angle data, known liquid injection pressure data and known liquid injection amount data to guide the user's liquid injection needle insertion position, needle insertion angle, liquid injection pressure and liquid injection amount.
Further preferably, the processing module also comprises a nodule ablation guidance unit, which invokes the nodule ablation data to guide the user's nodule ablation. The nodule ablation data comprises known ablation needle insertion position data, known ablation needle insertion angle data, known ablation needle insertion count data and known ablation needle insertion dwell time data.
Further preferably, the nodule ablation guidance unit comprises an ablation needle insertion position guidance unit, an ablation needle insertion angle guidance unit, an ablation needle insertion count guidance unit and an ablation needle insertion dwell time guidance unit. These units respectively invoke the corresponding known data to guide the user's ablation needle insertion position, insertion angle, insertion count and insertion dwell time.
Further preferably, the processing module also comprises a stress testing unit and a stress time detection unit. The stress testing unit simulates a human stress reaction on the virtual three-dimensional human body model, and the stress time detection unit detects the time from the occurrence of the stress reaction to the moment the virtual liquid injection needle and the virtual ablation needle stop on the virtual three-dimensional human body model.
Further preferably, the processing module also comprises a stress time judging unit, which compares the needle-stop time with a maximum stress time threshold to judge whether the needle-stop time is qualified.
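The stress time judgment reduces to a single threshold comparison. The threshold value below is an illustrative assumption, not taken from the patent.

```python
# Hypothetical sketch of the stress time judging unit: compare the user's
# needle-stop time with a maximum stress time threshold.
MAX_STRESS_TIME_S = 1.5  # illustrative threshold, not specified by the patent

def stop_time_qualified(stop_time_s, threshold_s=MAX_STRESS_TIME_S):
    """True if the user stopped the needle within the allowed reaction time."""
    return stop_time_s <= threshold_s
```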
The invention also provides an interaction method implemented with the virtual reality interaction system, comprising the following steps:
S1: provide a processing module, an operation module and a display module, where the processing module stores at least one kind of human thyroid data, at least one kind of thyroid nodule associated data and virtual scene data; the human thyroid data comprises surrounding tissue data and nodule data, and the thyroid nodule associated data comprises liquid injection isolation data and nodule ablation data; the user wears the display module on the head and holds the operation module;
S2: the processing module establishes a virtual three-dimensional human body model according to the human thyroid data and the thyroid nodule associated data, and outputs the virtual three-dimensional human body model and the virtual scene through the display module;
S3: the user performs a virtual liquid isolation operation and a virtual nodule ablation operation on the virtual three-dimensional human body model through the operation module to obtain isolation operation result data and ablation operation result data;
S4: the processing module compares the isolation operation result data and the ablation operation result data with the thyroid nodule associated data, provides liquid isolation guidance and nodule ablation guidance according to the comparison result, and presents the liquid injection isolation data and the nodule ablation data on the virtual three-dimensional human body model in image form.
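Steps S1 to S4 can be tied together as one training pass. The sketch below only mirrors the described flow; the module internals, dictionary keys and the equality-based comparison are stand-in assumptions.

```python
# Hypothetical end-to-end sketch of steps S1-S4: build the model from stored
# data (S1/S2), let the operation module produce results (S3), and compare
# them with the stored reference data to decide which guidance to give (S4).
def run_session(thyroid_data, nodule_assoc_data, operate):
    # S1/S2: build the virtual three-dimensional human body model
    model = {"thyroid": thyroid_data, "nodule": nodule_assoc_data}
    # S3: the operation module returns isolation and ablation results
    isolation_result, ablation_result = operate(model)
    # S4: compare results against stored reference data
    guidance = []
    if isolation_result != nodule_assoc_data.get("injection_isolation"):
        guidance.append("liquid isolation guidance")
    if ablation_result != nodule_assoc_data.get("nodule_ablation"):
        guidance.append("nodule ablation guidance")
    return guidance
```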
The interaction method provided by the invention has the following beneficial effects: the processing module stores at least one kind of human thyroid data and at least one kind of thyroid nodule associated data, and different virtual three-dimensional human body models can be established from these data, so the processing module can give the user different liquid isolation guidance and nodule ablation guidance according to the isolation operation result data and the ablation operation result data. Through the display module, the user's virtual liquid isolation operation and virtual nodule ablation operation, together with the liquid isolation guidance and nodule ablation guidance, are displayed clearly and intuitively on the virtual three-dimensional human body model, so that the user can correct defects in the liquid isolation and nodule ablation operations. This improves the user's operation level and solves the prior-art problem of insufficient guidance of these operations.
It is further preferable that the surrounding tissue data include thyroid gland size data, anterior thyroid tissue data, posterior thyroid tissue data and lateral thyroid tissue data. The anterior thyroid tissue data includes size data and mutual position relationship data of the skin, superficial fascia, superficial layer of the deep fascia, infrahyoid muscle group and anterior tracheal fascia; the posterior thyroid tissue data includes size data and mutual position relationship data of the larynx, trachea, pharynx, esophagus, recurrent laryngeal nerve and parathyroid gland; and the lateral thyroid tissue data includes size data and mutual position relationship data of the jugular vascular sheath, sympathetic trunk and vagus nerve.
Further preferably, the nodule data includes nodule position data and nodule size data.
Further preferably, the liquid injection isolation data includes known liquid injection needle insertion position data, known needle insertion angle data, known liquid injection pressure data and known liquid injection amount data.
Further preferably, the nodule ablation data includes known ablation needle insertion position data, known ablation needle insertion angle data, known ablation needle insertion count data and known ablation needle insertion dwell time data.
Drawings
FIG. 1 is a block diagram of a virtual reality interaction system according to the present invention;
FIG. 2 is a flow chart of an interaction method of the present invention;
FIG. 3 is a schematic view of infusion needle insertion position guidance for a very-high-risk-level nodule according to the present invention;
FIG. 4 is a schematic view of ablation needle insertion position guidance for a very-high-risk-level nodule according to the present invention;
FIG. 5 is a schematic diagram of an operating module and a display module of the present invention;
FIG. 6 is a schematic view of a VR handle of the present invention.
Detailed Description
For the purposes of making the objects, technical solutions and advantages of the present invention clearer, embodiments of the present invention are described below in conjunction with the accompanying drawings. It should be understood that the described embodiments are some, but not all, of the embodiments of the present invention.
In order to solve the problems in the prior art, an embodiment of the present invention provides a virtual reality interaction system for thyroid nodule ablation, which includes a processing module, an operation module and a display module.
In some embodiments of the present invention, the processing module stores at least one kind of human thyroid data, at least one kind of thyroid nodule associated data, and virtual scene data. The human thyroid data includes surrounding tissue data and nodule data, and the thyroid nodule associated data includes liquid injection isolation data and nodule ablation data.
According to some embodiments of the present invention, the surrounding tissue data includes thyroid gland size data, anterior thyroid tissue data and posterior thyroid tissue data. The anterior thyroid tissue data includes size data and mutual position relationship data of the skin, superficial fascia, superficial layer of the deep fascia, infrahyoid muscle group and anterior tracheal fascia; the posterior thyroid tissue data includes size data and mutual position relationship data of the larynx, trachea, pharynx, esophagus, recurrent laryngeal nerve and parathyroid gland.
In some embodiments of the present invention, the nodule data includes nodule position data and nodule size data.
In some embodiments of the invention, the liquid injection isolation data includes known needle insertion position data, known needle insertion angle data, known liquid injection pressure data and known liquid injection amount data.
In some embodiments of the invention, the nodule ablation data includes known ablation needle insertion position data, known ablation needle insertion angle data, known ablation needle insertion count data and known ablation needle dwell time data.
Referring to FIG. 1, the virtual reality interaction system 10 includes a processing module 11, an operation module 12 and a display module 13. The processing module 11 is configured to perform data transmission with the display module 13 and to establish a virtual three-dimensional human body model according to the human thyroid data and the thyroid nodule associated data. The display module 13 is configured to display the virtual three-dimensional human body model and the virtual scene. The operation module 12 is configured to perform a virtual liquid isolation operation and a virtual nodule ablation operation on the virtual three-dimensional human body model in the virtual scene to obtain isolation operation result data and ablation operation result data. The processing module 11 can compare the isolation operation result data and the ablation operation result data with the thyroid nodule associated data, provide liquid isolation guidance and nodule ablation guidance according to the comparison result, and present the thyroid nodule associated data in image form on the virtual three-dimensional human body model for the user.
FIG. 2 is a flow chart of an interaction method according to some embodiments of the present invention. Referring to FIG. 2, the interaction method includes the following steps:
S1: provide a processing module, an operation module and a display module, where the processing module stores at least one kind of human thyroid data, at least one kind of thyroid nodule associated data and virtual scene data; the human thyroid data comprises surrounding tissue data and nodule data, and the thyroid nodule associated data comprises liquid injection isolation data and nodule ablation data; the user wears the display module on the head and holds the operation module;
S2: the processing module establishes a virtual three-dimensional human body model according to the human thyroid data and the thyroid nodule associated data, and outputs the virtual three-dimensional human body model and the virtual scene through the display module;
S3: the user performs a virtual liquid isolation operation and a virtual nodule ablation operation on the virtual three-dimensional human body model through the operation module to obtain isolation operation result data and ablation operation result data;
S4: the processing module compares the isolation operation result data and the ablation operation result data with the thyroid nodule associated data, provides liquid isolation guidance and nodule ablation guidance according to the comparison result, and presents the liquid injection isolation data and the nodule ablation data on the virtual three-dimensional human body model in image form.
In some embodiments of the present invention, referring to FIG. 1, a risk level selection unit 131 is provided in the display module 13. The risk level selection unit 131 is configured to select a risk level and, according to the risk level, to select the human thyroid data and thyroid nodule associated data used for building the virtual three-dimensional human body model.
In some embodiments of the present invention, the risk level is related to the location of the nodule and is classified, based on the nodule location, into a low risk level, a medium risk level, a high risk level and a very high risk level.
FIG. 3 is a schematic view of infusion needle insertion position guidance for a very-high-risk-level nodule in some embodiments of the present invention. Referring to FIG. 3, the figure shows lateral muscle tissue 31, anterior muscle tissue 32, the thyroid gland 33, a thyroid nodule 34, the carotid artery 35, the recurrent laryngeal nerve 36, the trachea 37, the esophagus 38 and an infusion region 39. The virtual infusion needle injects 10 mL to 20 mL of liquid along direction a (at 60 degrees to the carotid artery), 5 mL to 20 mL along direction b, and 5 mL to 10 mL along direction c, thereby forming the infusion region 39. The positions of the lateral muscle tissue 31, anterior muscle tissue 32, thyroid gland 33, thyroid nodule 34, carotid artery 35, recurrent laryngeal nerve 36, trachea 37 and esophagus 38 are known and are not further described here.
In some embodiments of the present invention, referring to FIG. 1, a gesture recognition unit 121 is built into the operation module 12. The operation module 12 displays to the user, through the display module 13, a virtual needle-holding hand that holds a virtual infusion needle and a virtual ablation needle. The gesture recognition unit 121 is configured to capture the movement of the user's hand and present it in the virtual scene through the virtual needle-holding hand, allowing the user to perform the virtual liquid isolation operation and the virtual nodule ablation operation on the virtual three-dimensional human body model to obtain the isolation operation result data and the ablation operation result data. The isolation operation result data includes infusion needle position data, infusion pressure data and infusion amount data; the ablation operation result data includes ablation needle position data, ablation needle insertion count data and ablation needle insertion dwell time data.
In embodiments of the present invention, referring to fig. 1, the operation module 12 further includes a position sensing unit 122, where the position sensing unit 122 is configured to capture positions of the virtual infusion needle and the virtual ablation needle on the virtual three-dimensional human body model, so as to obtain and feed back to the processing module 11 the infusion needle position data and the ablation needle position data, where the infusion needle position data includes infusion needle insertion position data and infusion needle insertion angle data, and the ablation needle position data includes ablation needle insertion position data and ablation needle insertion angle data.
In embodiments of the present invention, referring to fig. 1, the operation module 12 further includes a liquid injection sensing unit 123, and the liquid injection sensing unit 123 is configured to capture a liquid injection pressure and a liquid injection amount of the virtual liquid injection needle on the virtual three-dimensional human body model, so as to obtain the liquid injection pressure data and the liquid injection amount data and feed back the liquid injection pressure data and the liquid injection amount data to the processing module.
In embodiments of the present invention, referring to fig. 1, the operation module 12 further includes a statistic unit 124, and the statistic unit 124 is configured to capture the needle insertion times and the residence time of the virtual ablation needle on the virtual three-dimensional human body model to obtain the ablation needle insertion time data and the ablation needle insertion residence time data.
In embodiments of the present invention, referring to fig. 1, the operation module 12 further includes a force feedback unit 125, and the force feedback unit 125 is configured to output a resistance feedback during the process of entering the virtual three-dimensional human body model by the virtual liquid injection needle and the virtual ablation needle, and the resistance feedback can simulate the resistance sensed during the process of entering the real human body by the real liquid injection needle and the real virtual ablation needle.
In some examples of the present invention, referring to fig. 1, the operation module 12 further includes an operation switching unit 126, and the operation switching unit 126 is configured to switch between the virtual liquid isolation operation and the virtual nodule ablation operation.
In some embodiments of the present invention, referring to fig. 1, the processing module 11 includes a liquid isolation guidance unit 111, and the liquid isolation guidance unit 111 is configured to invoke the liquid injection isolation data to provide liquid isolation guidance to the user, where the liquid injection isolation data includes known liquid injection needle insertion position data, known liquid injection needle insertion angle data, known liquid injection pressure data, and known liquid injection amount data.
In some embodiments of the present invention, the liquid isolation guidance unit 111 includes a liquid injection needle insertion position guidance unit, a liquid injection needle insertion angle guidance unit, a liquid injection pressure guidance unit, and a liquid injection amount guidance unit; the liquid injection needle insertion position guidance unit is configured to invoke the known liquid injection needle insertion position data to guide the user's liquid injection needle insertion position, the liquid injection needle insertion angle guidance unit is configured to invoke the known liquid injection needle insertion angle data to guide the user's liquid injection needle insertion angle, the liquid injection pressure guidance unit is configured to invoke the known liquid injection pressure data to guide the user's liquid injection pressure, and the liquid injection amount guidance unit is configured to invoke the known liquid injection amount data to guide the user's liquid injection amount.
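The guidance units above all follow the same pattern: compare a value measured from the user's operation against the corresponding known reference value. A minimal sketch of that comparison (the function name and the tolerance value are hypothetical assumptions, not from the patent):

```python
def within_guidance(measured: float, known: float, tolerance: float) -> bool:
    """Return True when the user's measured value matches the known reference
    value to within the given tolerance."""
    return abs(measured - known) <= tolerance

# Example: checking a liquid injection needle insertion angle against a known
# reference angle, assuming a 5-degree tolerance.
print(within_guidance(measured=43.0, known=45.0, tolerance=5.0))  # True
print(within_guidance(measured=55.0, known=45.0, tolerance=5.0))  # False
```

The same helper would apply unchanged to insertion position (per coordinate), injection pressure, and injection amount, each with its own tolerance.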
In some examples of the present invention, referring to fig. 1, the processing module 11 further includes a nodule ablation guidance unit 112; the nodule ablation guidance unit 112 is configured to invoke the nodule ablation data to guide the user in nodule ablation, and the nodule ablation data includes known ablation needle insertion position data, known ablation needle insertion angle data, known ablation needle insertion times data, and known ablation needle insertion dwell time data.
In some embodiments of the present invention, the nodule ablation guidance unit 112 includes an ablation needle insertion position guidance unit, an ablation needle insertion angle guidance unit, an ablation needle insertion frequency guidance unit, and an ablation needle insertion dwell time guidance unit; the ablation needle insertion position guidance unit is configured to invoke the known ablation needle insertion position data to guide the user's ablation needle insertion position, the ablation needle insertion angle guidance unit is configured to invoke the known ablation needle insertion angle data to guide the user's ablation needle insertion angle, the ablation needle insertion frequency guidance unit is configured to invoke the known ablation needle insertion frequency data to guide the user's ablation needle insertion frequency, and the ablation needle insertion dwell time guidance unit is configured to invoke the known ablation needle insertion dwell time data to guide the user's ablation needle insertion dwell time.
Fig. 4 is a guidance diagram of needle insertion positions for ablation of a high risk grade nodule according to an embodiment of the present invention. Referring to fig. 4, the virtual ablation needle is perpendicular to the plane of maximum diameter of the thyroid nodule 34, and moving ablation is performed along the bottom of the nodule from one needle insertion position to needle insertion position 302.
In some embodiments of the present invention, referring to fig. 1, the processing module 11 further includes a stress testing unit 113 and a stress time detecting unit 114; the stress testing unit 113 is configured to simulate a human stress response through the virtual three-dimensional human body model, and the stress time detecting unit 114 is configured to detect the time taken from the occurrence of the human stress response to the stopping of the virtual liquid injection needle and the virtual ablation needle on the virtual three-dimensional human body model.
In some embodiments of the present invention, referring to fig. 1, the processing module 11 further includes a stress time determining unit 115, where the stress time determining unit 115 is configured to compare the needle insertion stop time with a maximum stress time threshold to determine whether the needle insertion stop time is qualified.
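The qualification check described here reduces to a threshold comparison. A one-line sketch (the function name and the 0.5 s threshold are hypothetical assumptions, not values from the patent):

```python
MAX_STRESS_TIME_S = 0.5  # hypothetical maximum stress time threshold, in seconds

def needle_stop_qualified(stop_time_s: float, threshold_s: float = MAX_STRESS_TIME_S) -> bool:
    """Return True if the needle was stopped within the maximum stress time
    after the simulated stress response (e.g. swallowing) occurred."""
    return stop_time_s <= threshold_s
```

Under the assumed 0.5 s threshold, `needle_stop_qualified(0.4)` would pass while `needle_stop_qualified(0.8)` would fail.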
In some embodiments of the present invention, the human stress response simulates the swallowing activity of a human body.
Fig. 5 is a schematic diagram of an operation module and a display module according to an embodiment of the present invention. Referring to fig. 5, the operation module is a VR handle 61 and the display module is a VR headset 62; the user wears the VR headset 62 on the head and holds a VR handle 61 in each of the left and right hands.
Fig. 6 is a schematic structural view of a VR handle in some embodiments of the present invention. Referring to fig. 6, an operation control portion 613 and an operation switching control portion 614 are disposed in sequence from top to bottom on the front side of the VR handle 61, and a needle holding control portion 611 and a needle insertion control portion 612 are disposed in sequence from top to bottom on the right side of the VR handle 61.
In some embodiments of the present invention, referring to fig. 6, after the user holds the VR handle 61 and presses the needle holding control portion 611, the user can observe the virtual needle holder through the VR headset 62.
In a specific embodiment of the present invention, referring to fig. 6, the user holds the VR handle 61 with the right hand and presses the needle insertion control portion 612, and the virtual needle holder performs a virtual needle insertion operation on the virtual three-dimensional human body model.
Although the embodiments of the present invention have been described in detail hereinabove, it is apparent to those skilled in the art that various modifications and variations can be made to these embodiments. However, it is to be understood that such modifications and variations are within the scope and spirit of the present invention as set forth in the following claims. Moreover, the invention as described herein is capable of other embodiments and of being practiced and carried out in various ways.

Claims (18)

  1. A virtual reality interaction system, applied to thyroid nodule ablation technology, characterized in that the virtual reality interaction system comprises a processing module, an operation module and a display module, wherein the processing module stores at least human thyroid data, thyroid nodule associated data and virtual scene data, the human thyroid data comprises surrounding tissue data and nodule data, and the thyroid nodule associated data comprises injection isolation data and nodule ablation data; the processing module is configured to perform data transmission with the display module and to establish a virtual three-dimensional human body model according to the human thyroid data and the thyroid nodule associated data; the display module is configured to display the virtual three-dimensional human body model and the virtual scene; the operation module is configured to perform a virtual liquid isolation operation and a virtual nodule ablation operation on the virtual three-dimensional human body model in the virtual scene to obtain isolation operation result data and ablation operation result data; and the processing module is capable of comparing the isolation operation result data and the ablation operation result data with the thyroid nodule associated data, providing liquid isolation guidance and nodule ablation guidance according to the comparison result, and presenting the injection isolation data and the nodule ablation data on the virtual three-dimensional human body model in an image form.
  2. The virtual reality interaction system of claim 1, wherein a risk level selection unit is built in the display module, and the risk level selection unit is configured to select a risk level and to select, according to the risk level, the human thyroid data and the thyroid nodule associated data used for building the virtual three-dimensional human body model.
  3. The virtual reality interaction system of claim 1, wherein a gesture recognition unit is built in the operation module, the operation module displays to the user, through the display module, a virtual needle-holding hand holding a virtual infusion needle and a virtual ablation needle, and the gesture recognition unit is configured to capture the moving action of the user's hand and present it in the virtual scene through the virtual needle-holding hand, so that the user can perform the virtual liquid isolation operation and the virtual nodule ablation operation on the virtual three-dimensional human body model to obtain the isolation operation result data and the ablation operation result data, wherein the isolation operation result data comprises injection needle position data, injection pressure data and injection amount data, and the ablation operation result data comprises ablation needle position data, ablation needle insertion frequency data and ablation needle insertion residence time data.
  4. The virtual reality interaction system of claim 3, wherein a position sensing unit is further built in the operation module, and the position sensing unit is configured to capture the positions of the virtual infusion needle and the virtual ablation needle on the virtual three-dimensional human body model so as to obtain the infusion needle position data and the ablation needle position data and feed them back to the processing module, wherein the infusion needle position data comprises infusion needle insertion position data and infusion needle insertion angle data, and the ablation needle position data comprises ablation needle insertion position data and ablation needle insertion angle data.
  5. The virtual reality interaction system of claim 4, wherein a liquid injection sensing unit is further built in the operation module, and the liquid injection sensing unit is configured to capture the liquid injection pressure and the liquid injection amount of the virtual liquid injection needle on the virtual three-dimensional human body model, so as to obtain the liquid injection pressure data and the liquid injection amount data and feed them back to the processing module.
  6. The virtual reality interaction system of claim 5, wherein the operation module further comprises a statistics unit, and the statistics unit is configured to capture the needle insertion times and the residence time of the virtual ablation needle on the virtual three-dimensional human body model to obtain the ablation needle insertion times data and the ablation needle insertion residence time data.
  7. The virtual reality interaction system of claim 6, wherein a force feedback unit is further built in the operation module, and the force feedback unit is configured to output resistance feedback while the virtual liquid injection needle and the virtual ablation needle enter the virtual three-dimensional human body model; the resistance feedback can simulate the resistance felt when a real liquid injection needle and a real ablation needle enter a real human body.
  8. The virtual reality interaction system of claim 1, wherein the processing module comprises a liquid isolation guidance unit, the liquid isolation guidance unit is configured to invoke the liquid injection isolation data to provide liquid isolation guidance to the user, and the liquid injection isolation data comprises known liquid injection needle insertion position data, known liquid injection needle insertion angle data, known liquid injection pressure data, and known liquid injection amount data.
  9. The virtual reality interaction system of claim 8, wherein the liquid isolation guidance unit comprises a liquid injection needle insertion position guidance unit, a liquid injection needle insertion angle guidance unit, a liquid injection pressure guidance unit and a liquid injection amount guidance unit; the liquid injection needle insertion position guidance unit is configured to invoke the known liquid injection needle insertion position data to guide the user's liquid injection needle insertion position, the liquid injection needle insertion angle guidance unit is configured to invoke the known liquid injection needle insertion angle data to guide the user's liquid injection needle insertion angle, the liquid injection pressure guidance unit is configured to invoke the known liquid injection pressure data to guide the user's liquid injection pressure, and the liquid injection amount guidance unit is configured to invoke the known liquid injection amount data to guide the user's liquid injection amount.
  10. The virtual reality interaction system of claim 8, wherein the processing module further comprises a nodule ablation guidance unit, the nodule ablation guidance unit is configured to invoke the nodule ablation data to guide the user in nodule ablation, and the nodule ablation data comprises known ablation needle insertion position data, known ablation needle insertion angle data, known ablation needle insertion frequency data, and known ablation needle insertion dwell time data.
  11. The virtual reality interaction system of claim 10, wherein the nodule ablation guidance unit comprises an ablation needle insertion position guidance unit, an ablation needle insertion angle guidance unit, an ablation needle insertion frequency guidance unit and an ablation needle insertion dwell time guidance unit; the ablation needle insertion position guidance unit is configured to invoke the known ablation needle insertion position data to guide the user's ablation needle insertion position, the ablation needle insertion angle guidance unit is configured to invoke the known ablation needle insertion angle data to guide the user's ablation needle insertion angle, the ablation needle insertion frequency guidance unit is configured to invoke the known ablation needle insertion frequency data to guide the user's ablation needle insertion frequency, and the ablation needle insertion dwell time guidance unit is configured to invoke the known ablation needle insertion dwell time data to guide the user's ablation needle insertion dwell time.
  12. The virtual reality interaction system of claim 3 or 10, wherein the processing module further comprises a stress testing unit and a stress time detecting unit; the stress testing unit is configured to simulate a human stress response through the virtual three-dimensional human body model, and the stress time detecting unit is configured to detect the time from the occurrence of the human stress response to the stopping of the virtual liquid injection needle and the virtual ablation needle on the virtual three-dimensional human body model.
  13. The virtual reality interaction system of claim 12, wherein the processing module further comprises a stress time determining unit, and the stress time determining unit is configured to compare the needle insertion stop time with a maximum stress time threshold to determine whether the needle insertion stop time is qualified.
  14. An interaction method for the virtual reality interaction system of any one of claims 1-13, comprising the steps of:
    S1: providing a processing module, an operation module and a display module, wherein the processing module stores at least human thyroid data, thyroid nodule associated data and virtual scene data, the human thyroid data comprises surrounding tissue data and nodule data, and the thyroid nodule associated data comprises injection isolation data and nodule ablation data; the user wears the display module on the head and holds the operation module;
    s2: the processing module establishes a virtual three-dimensional human body model according to the human thyroid data and the thyroid nodule associated data, and outputs the virtual three-dimensional human body model and the virtual scene through the display module;
    s3: performing virtual liquid isolation operation and virtual nodule ablation operation on the virtual three-dimensional human body model through the operation module to obtain isolation operation result data and ablation operation result data;
    s4: the processing module compares the isolation operation result data and the ablation operation result data with the thyroid nodule associated data, provides liquid isolation guidance and nodule ablation guidance according to the comparison result, and presents the injection isolation data and the nodule ablation data on the virtual three-dimensional human body model in an image form.
  15. The interaction method of claim 14, wherein the surrounding tissue data comprises thyroid gland size data and relative position data of anterior thyroid tissue, medial thyroid lobe tissue and lateral thyroid lobe tissue; the anterior thyroid tissue data comprises data on the sizes and relative positions of the skin, superficial fascia, superficial layer of the deep cervical fascia, infrahyoid muscle group and pretracheal fascia; the medial thyroid lobe tissue data comprises data on the sizes and relative positions of the larynx, trachea, pharynx, esophagus, recurrent laryngeal nerve and parathyroid gland; and the lateral thyroid lobe tissue data comprises data on the sizes and relative positions of the carotid sheath, cervical sympathetic trunk and vagus nerve.
  16. The interaction method of claim 14, wherein the nodule data comprises nodule position data and nodule size data.
  17. The interaction method of claim 14, wherein the injection isolation data comprises known injection needle insertion position data, known injection needle insertion angle data, known injection pressure data, and known injection amount data.
  18. The interaction method of claim 14, wherein the nodule ablation data comprises known ablation needle insertion position data, known ablation needle insertion angle data, known ablation needle insertion times data, and known ablation needle insertion dwell time data.
CN201911015563.7A 2019-10-24 2019-10-24 Virtual reality interaction system and interaction method Pending CN110739084A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201911015563.7A CN110739084A (en) 2019-10-24 2019-10-24 Virtual reality interaction system and interaction method
PCT/CN2019/121818 WO2021077542A1 (en) 2019-10-24 2019-11-29 Virtual reality interactive system and interaction method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911015563.7A CN110739084A (en) 2019-10-24 2019-10-24 Virtual reality interaction system and interaction method

Publications (1)

Publication Number Publication Date
CN110739084A true CN110739084A (en) 2020-01-31

Family

ID=69271186

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911015563.7A Pending CN110739084A (en) 2019-10-24 2019-10-24 Virtual reality interaction system and interaction method

Country Status (2)

Country Link
CN (1) CN110739084A (en)
WO (1) WO2021077542A1 (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104517016A (en) * 2013-09-28 2015-04-15 南京专创知识产权服务有限公司 Surgery simulation system using motion sensing technology and virtual reality technology
CN109065147A (en) * 2018-07-30 2018-12-21 广州狄卡视觉科技有限公司 Medical Digital 3D model human body surgical simulation human-computer interaction system and method
CN109077804A (en) * 2018-08-19 2018-12-25 天津大学 A kind of Microwave Coagulation Therapy method of planning based on ct images

Family Cites Families (4)

Publication number Priority date Publication date Assignee Title
EP1809628B1 (en) * 2004-10-13 2011-06-08 Merck Patent GmbH Phenylurea derivatives used as inhibitors of tyrosinkinase for treating tumors
CN105310760A (en) * 2015-05-26 2016-02-10 梅亮亮 Superconducting pinhole minimally-invasive ablation
CN106264721B (en) * 2016-08-29 2019-01-11 安隽医疗科技(南京)有限公司 A kind of SAPMAC method ablation needle system
CN107978195A (en) * 2017-12-29 2018-05-01 福州大学 A kind of lateral cerebral ventricle puncture operative training system based on Virtual Reality Platform


Also Published As

Publication number Publication date
WO2021077542A1 (en) 2021-04-29

Similar Documents

Publication Publication Date Title
CN107067856B (en) Medical simulation training system and method
US11361516B2 (en) Interactive mixed reality system and uses thereof
US10849688B2 (en) Sensory enhanced environments for injection aid and social training
Coles et al. Integrating haptics with augmented reality in a femoral palpation and needle insertion training simulation
US20090042695A1 (en) Interactive rehabilitation method and system for movement of upper and lower extremities
US10360814B2 (en) Motion learning support apparatus
CN111063416A (en) Alzheimer disease rehabilitation training and capability assessment system based on virtual reality
WO2020179128A1 (en) Learning assist system, learning assist device, and program
CN105096670B (en) A kind of intelligent immersion tutoring system and device for nose catheter operation real training
Heng et al. Intelligent inferencing and haptic simulation for Chinese acupuncture learning and training
WO2012106706A2 (en) Hybrid physical-virtual reality simulation for clinical training capable of providing feedback to a physical anatomic model
CN104485047A (en) Acupuncture simulation system and method
CN109885156A (en) A kind of virtual reality interaction systems and interactive approach
CN112071149A (en) Wearable medical simulation puncture skill training system and method
CN105719526A (en) Sunk cord eyebrow lifting plastic surgery simulation system based on force feedback
Echeverria et al. KUMITRON: Artificial intelligence system to monitor karate fights that synchronize aerial images with physiological and inertial signals
Santos Psychomotor learning in martial arts: An opportunity for user modeling, adaptation and personalization
Kim et al. A convergence research for development of VR education contents for core fundamental nursing skills
CN110739084A (en) Virtual reality interaction system and interaction method
Tai et al. Tissue and force modelling on multi-layered needle puncture for percutaneous surgery training
Coles et al. The effectiveness of commercial haptic devices for use in virtual needle insertion training simulations
Coles Investigating augmented reality visio-haptic techniques for medical training
TWI691345B (en) Mannequin injection practice system
Sung et al. Intelligent haptic virtual simulation for suture surgery
CN107292952A (en) A kind of virtual emulation nursing detection method and system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20200131