CN115132013B - Medical ultrasonic simulation teaching method and system - Google Patents

Medical ultrasonic simulation teaching method and system

Info

Publication number
CN115132013B
Authority
CN
China
Prior art keywords
ultrasonic
medical
scene
teaching
tracker
Prior art date
Legal status
Active
Application number
CN202210885276.7A
Other languages
Chinese (zh)
Other versions
CN115132013A (en)
Inventor
石宇
陈芸
Current Assignee
Shenzhen Wangyue Medical Technology Co ltd
Peking University Shenzhen Hospital
Original Assignee
Shenzhen Wangyue Medical Technology Co ltd
Peking University Shenzhen Hospital
Priority date
Filing date
Publication date
Application filed by Shenzhen Wangyue Medical Technology Co ltd and Peking University Shenzhen Hospital
Priority to CN202210885276.7A
Publication of CN115132013A
Application granted
Publication of CN115132013B
Legal status: Active
Anticipated expiration

Classifications

    • G: PHYSICS
    • G09: EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B: EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B9/00: Simulators for teaching or training purposes
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011: Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G: PHYSICS
    • G09: EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B: EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B23/00: Models for scientific, medical, or mathematical purposes, e.g. full-sized devices for demonstration purposes
    • G09B23/28: Models for scientific, medical, or mathematical purposes, e.g. full-sized devices for demonstration purposes for medicine
    • G09B23/286: Models for scientific, medical, or mathematical purposes, e.g. full-sized devices for demonstration purposes for medicine for scanning or photography techniques, e.g. X-rays, ultrasonics

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Health & Medical Sciences (AREA)
  • General Engineering & Computer Science (AREA)
  • Business, Economics & Management (AREA)
  • Educational Administration (AREA)
  • Educational Technology (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Algebra (AREA)
  • Radiology & Medical Imaging (AREA)
  • Chemical & Material Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Medicinal Chemistry (AREA)
  • Human Computer Interaction (AREA)
  • Computational Mathematics (AREA)
  • Mathematical Analysis (AREA)
  • Mathematical Optimization (AREA)
  • Mathematical Physics (AREA)
  • Pure & Applied Mathematics (AREA)
  • Ultrasonic Diagnosis Equipment (AREA)

Abstract

The invention relates to the technical field of intelligent teaching, in particular to a medical ultrasonic simulation teaching method and system. The method comprises the following steps: S1, building a virtual scene corresponding to the real teaching scene through a 3D engine according to the real teaching operation scene; S2, scanning and tracking, by the positioning device, the spatial position of each physical teaching component in the real scene, feeding the spatial positions back to the virtual scene, and establishing the position association between the virtual scene and the corresponding teaching components in the real teaching scene; S3, selecting an ultrasound item to start an examination, moving the ultrasonic detection handle close to the medical dummy model in the real teaching scene, and acquiring, in the virtual scene, the image volume data generated on the medical dummy model; and calling the picture data file of the corresponding part on the medical dummy model and transmitting it to the virtual scene for imaging. The positioning device locates each teaching component, so that the imaging of each teaching component in the virtual scene corresponds to its position relation in the real scene, improving the positioning accuracy.

Description

Medical ultrasonic simulation teaching method and system
Technical Field
The invention relates to the technical field of intelligent teaching, in particular to a medical ultrasonic simulation teaching method and system.
Background
Ultrasonic diagnosis training is an important part of medical teaching. In traditional teaching, a teacher demonstrates or practises the operation with the students, and medical students interning in hospitals are led to practise on patients. However, medical students need many practical opportunities, while the number of suitable cases is limited, and every practice session performed on a patient causes the patient inconvenience. The number of practice opportunities available to students is therefore limited and the intended teaching effect is hard to achieve, so improving the efficiency of ultrasonic diagnosis training has become a pain point in medical teaching.
With the popularization of virtual reality technology, simulation teaching has become a supplementary scheme in medical teaching. Existing virtual-scene teaching uses a highly integrated simulation all-in-one machine in which functions such as ultrasonic signal acquisition and image display are completed by a single device. Such highly integrated machines are expensive to buy and to maintain, so they cannot be widely adopted in medical teaching. Moreover, the all-in-one machine has no spatial positioning equipment, so deviations easily occur when capturing and locating the positions of the ultrasonic detection handle and the dummy model; the imaging result then fails to match the actual operation, which affects the teaching effect.
Disclosure of Invention
The invention provides a medical ultrasonic simulation teaching method and system, aiming to solve the problems that existing simulation teaching equipment is expensive and its positioning is prone to deviation.
The invention provides a medical ultrasonic simulation teaching method, which comprises the following steps:
S1, scene construction: according to the real teaching operation scene, building a virtual scene corresponding to the real teaching scene through a 3D engine;
S2, capturing and positioning: the positioning device scans and tracks the spatial position of each physical teaching component in the real scene, feeds the spatial positions back to the virtual scene, and establishes the position association between the virtual scene and the corresponding teaching components in the real teaching scene;
S3, simulating operation and generating an image: selecting an ultrasound item to start the examination, moving the ultrasonic detection handle close to the medical dummy model in the real teaching scene, and acquiring, in the virtual scene, the image volume data generated on the medical dummy model; and calling the picture data file of the corresponding part on the medical dummy model and transmitting it to the virtual scene for imaging.
As a further improvement of the present invention, the step S1 specifically includes:
in combination with the real teaching operation scene, the PC host completes the modeling of the virtual scene and of the operation functions in the scene by using the 3D engine, and transmits the visual information of the virtual scene to the desktop client and/or the VR end.
As a further improvement of the present invention, the step S2 specifically includes:
S21, the positioning device scans and tracks the spatial positions of the VR end in the real scene and of the handle tracker arranged in the ultrasonic detection handle;
S22, the positioning device scans the built-in tracker of the medical dummy model and confirms the relative position of the medical dummy model and of its ultrasonic detection area;
S23, the positioning device feeds the acquired real-time position information back to the PC host, and the PC host converts the position information into visual information and transmits it to the desktop client and/or the VR end for imaging.
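Purely by way of illustration (this sketch is not part of the claimed invention; every class name, tracker name and coordinate value in it is hypothetical), the position feedback of steps S21 to S23 could be organised in Python roughly as follows:

    from dataclasses import dataclass

    @dataclass
    class Pose:
        """Position in metres and orientation in degrees, as a positioning
        device might report it for one tracker."""
        x: float
        y: float
        z: float
        yaw: float = 0.0
        pitch: float = 0.0
        roll: float = 0.0

    class VirtualScene:
        """Holds the virtual counterparts of the real teaching components."""
        def __init__(self):
            self.objects = {}  # tracker name -> Pose of the virtual object

        def update(self, name, pose):
            # Step S23: position information from the positioning device is
            # converted into the pose of the matching virtual object.
            self.objects[name] = pose

    def on_tracker_frame(scene, frame):
        """frame maps tracker names to the Pose reported in one scan."""
        for name, pose in frame.items():
            scene.update(name, pose)
        # The PC host would now push the refreshed scene to the desktop
        # client and/or the VR end for imaging (rendering omitted here).

    # One hypothetical scan of the three tracked components (steps S21, S22).
    scene = VirtualScene()
    on_tracker_frame(scene, {
        "vr_headset": Pose(0.20, 1.60, 0.50),
        "handle_tracker": Pose(0.10, 1.00, 0.30),
        "model_tracker": Pose(0.00, 0.90, 0.00),
    })
    print(scene.objects["handle_tracker"])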
As a further improvement of the present invention, the step S3 further includes:
S31, the medical dummy model detects in real time, through its built-in model tracker, the distance to the handle tracker arranged in the ultrasonic detection handle; after the model tracker detects that the handle tracker has reached the ultrasonic detection area of the medical dummy model, the model tracker feeds the relative spatial position relation back to the PC host;
S32, the PC host receives the data returned by the model tracker, selects the ultrasonic picture of the corresponding position according to the different spatial position relations, and sends the ultrasonic picture to the desktop client and/or the VR end for imaging.
As a further improvement of the present invention, the process in which the PC host calls the ultrasonic picture for imaging in step S32 further includes:
the PC host stores the ultrasonic pictures that need to be imaged in the ultrasonic simulation teaching and establishes the association between each ultrasonic picture and different spatial position data in the ultrasonic detection area of the medical dummy model; when the model tracker senses the handle tracker of the ultrasonic detection handle at a certain position of the ultrasonic detection area, the PC host calls the ultrasonic picture corresponding to that spatial position, sends it out, and displays it in the ultrasonic picture area of the desktop client and/or on the display screen of the VR end.
As a further improvement of the invention, the medical ultrasonic simulation teaching method further comprises the following step:
S4, image adjustment: calling the control panel built into the virtual scene to execute image processing operations including image freeze-frame and image brightness adjustment.
The invention also provides a medical ultrasonic simulation teaching system, which comprises:
a PC host, which builds a virtual scene corresponding to the real teaching scene through a 3D engine according to the real teaching operation scene, receives feedback information from the built-in tracker of the medical dummy model and from the positioning device, and outputs imaging data;
a medical dummy model, which is internally provided with a model tracker used for determining the relative position of the medical dummy model in the virtual scene, determining the ultrasonic detection area on the medical dummy model, and sensing the distance to the ultrasonic detection handle;
an ultrasonic detection handle, which is used for operating the ultrasonic detection simulation experiment on the medical dummy model and is provided with a handle tracker that interacts with the built-in model tracker of the medical dummy model;
a positioning device, which is arranged in the experimental space, is in communication connection with the PC host, and is used for locating the virtual scene position of the experimental space and for scanning and tracking the spatial positions of the VR device, the ultrasonic detection handle and the medical dummy model;
and a VR device, which displays the virtual scene imaging and the ultrasonic pictures transmitted by the PC host.
As a further improvement of the invention, a desktop client is built into the PC host and is used for displaying the virtual scene imaging and the ultrasonic pictures transmitted by the PC host; the desktop client and the VR device are both provided with control panels used for calling the control panel built into the virtual scene to execute image processing operations including image freeze-frame and image brightness adjustment.
As a further improvement of the present invention, the PC host includes a picture storage and calling module which: stores the ultrasonic pictures to be imaged in the ultrasonic simulation teaching; establishes the association between each ultrasonic picture and different spatial position data in the ultrasonic detection area of the medical dummy model; and, when the model tracker senses the handle tracker of the ultrasonic detection handle at a certain position of the ultrasonic detection area, calls and sends the ultrasonic picture corresponding to that spatial position so that it is displayed in the ultrasonic picture area of the desktop client and/or on the display screen of the VR end.
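As an illustrative sketch only (not part of the claimed invention; the positions, file names and sensing radius are hypothetical), such an association can be reduced to a small table keyed by positions in the detection area with a nearest-position lookup:

    import math

    # Hypothetical association table: positions (x, y, z) inside the dummy
    # model's ultrasonic detection area mapped to pre-stored picture files.
    PICTURE_TABLE = [
        ((0.00, 0.95, 0.10), "liver_standard_plane.png"),
        ((0.05, 0.95, 0.05), "gallbladder_long_axis.png"),
        ((-0.04, 0.93, 0.12), "right_kidney_coronal.png"),
    ]

    SENSE_RADIUS = 0.03  # metres; illustrative sensing threshold

    def pick_picture(handle_pos, table=PICTURE_TABLE, radius=SENSE_RADIUS):
        """Return the picture associated with the stored position closest to
        the handle tracker, or None if no stored position is sensed."""
        best_file, best_dist = None, radius
        for pos, filename in table:
            dist = math.dist(handle_pos, pos)
            if dist <= best_dist:
                best_file, best_dist = filename, dist
        return best_file

    # When the model tracker senses the handle tracker at this position, the
    # PC host would send the returned file to the desktop client / VR end.
    print(pick_picture((0.01, 0.94, 0.09)))  # -> liver_standard_plane.png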
As a further improvement of the present invention, the positioning device includes a first positioning device and a second positioning device, the first positioning device is disposed above the medical dummy model and is used for determining a central position of the virtual scene; the second positioning equipment and the first positioning equipment are arranged oppositely and used for scanning the space positions of the VR equipment and the ultrasonic detection handle.
The invention has the following beneficial effects: by arranging positioning devices in the experimental space and capturing in real time the spatial position relations of the ultrasonic detection handle, the medical dummy model and the VR device, the positioning accuracy is improved and the imaging of each teaching component in the virtual scene is kept consistent with its position relation in the real scene. Through the interaction and relative-position sensing between the tracker built into the medical dummy model and the tracker built into the ultrasonic detection handle, the simulation picture of the corresponding position is called from the feedback information, so that the ultrasonic simulation imaging corresponds one-to-one to the simulated detection position, which guarantees the imaging accuracy and improves the teaching effect.
Drawings
Fig. 1 is a schematic structural diagram of a medical ultrasonic simulation teaching system in the invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention is described in further detail below with reference to the accompanying drawings and embodiments.
As shown in fig. 1, the medical ultrasound simulation teaching system of the present invention includes:
The PC host 1 builds a virtual scene corresponding to the real teaching scene through a 3D engine according to the real teaching operation scene, receives feedback information from the built-in tracker of the medical dummy model 2 and from the positioning device 4, and outputs imaging data. Preferably, the PC host 1 performs the scene modeling and function coding with the Unity3D engine.
The medical dummy model 2 is internally provided with a model tracker, preferably a Tracker (a positioning tracker), which is used for determining the relative position of the medical dummy model 2 in the virtual scene, determining the ultrasonic detection area on the medical dummy model 2, and sensing the distance to the ultrasonic detection handle 3.
The ultrasonic detection handle 3 is used for operating the ultrasonic detection simulation experiment on the medical dummy model 2; it is provided with a handle tracker which interacts with the built-in model tracker of the medical dummy model 2. The handle tracker is preferably a Tracker, so that the position of the ultrasonic detection handle 3 can be tracked.
The positioning device 4 is arranged in the experimental space, is in communication connection with the PC host 1, and is used for locating the virtual scene position of the experimental space and for scanning and tracking the spatial positions of the VR device 5, the ultrasonic detection handle 3 and the medical dummy model 2. The positioning device 4 is preferably a laser positioner, which scans the position of each Tracker at high frequency and feeds it back to the PC host 1 for processing.
The VR device 5 displays the virtual scene imaging and the ultrasonic pictures transmitted by the PC host 1. The VR device 5 is preferably a VR headset, presenting the view of the virtual scene and of the virtual handle.
Through the cooperation between the positioning device 4, the ultrasonic detection handle 3 with its built-in handle Tracker, the medical dummy model 2 with its built-in model Tracker, and the PC host 1, this system meets the needs of ultrasonic simulation teaching. Compared with the previous highly integrated simulation all-in-one machines, whose high cost is a major drawback, the purchase cost of each component of this system is low and so is the later maintenance cost. In addition, the combined positioning of the positioning device 4 and the Trackers fully guarantees the positioning accuracy, so that the teaching process in the virtual scene stays consistent with the simulated operation process in the real scene, deviations are unlikely to occur, and the quality of teaching is guaranteed.
After the Tracker on the medical dummy model 2 detects that the Tracker on the ultrasonic detection handle 3 has come within the detection area of the model, the PC host 1 displays the ultrasonic picture of the corresponding position according to the data returned by the Tracker and the different spatial positions, and sends it to the desktop client and the VR device 5.
A desktop client is built into the PC host 1 and is used for displaying the virtual scene imaging and the ultrasonic pictures transmitted by the PC host 1; the desktop client and the VR device 5 are both provided with control panels used for calling the control panel built into the virtual scene to execute image processing operations including image freeze-frame and image brightness adjustment.
The displayed picture can be adjusted through the control panel in the lower right corner of the left display frame of the desktop client; when a slider in the control panel is dragged, the PC host 1 processes the image and presents the result.
The PC host 1 can be connected to a display 6 and, by running the desktop client, synchronously show on the display 6 the virtual imaging inside the VR device 5. During teaching, a student practises while wearing the VR device 5, and the teacher observes the student's operation process in real time through the display 6 and thus clearly understands the student's learning situation. Through the control panel of the desktop client, the teacher can also freeze a picture at any time, adjust its brightness, enlarge it, and so on, giving real-time guidance on a particular step of the operation and thereby improving teaching efficiency.
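For illustration only (the class name, value ranges and pixel data below are hypothetical, and this is not part of the claimed invention), the freeze-frame and brightness operations amount to holding the current picture and applying a gain before display:

    class UltrasoundDisplay:
        """Toy model of the control-panel operations: freeze-frame and
        brightness adjustment applied to the picture currently shown."""

        def __init__(self):
            self.current_frame = None  # e.g. a 2-D list of grey levels 0-255
            self.frozen = False
            self.brightness = 1.0      # multiplicative gain set by the slider

        def show(self, frame):
            # New frames from the simulation are ignored while frozen.
            if not self.frozen:
                self.current_frame = frame

        def toggle_freeze(self):
            self.frozen = not self.frozen

        def set_brightness(self, gain):
            # Slider position mapped to a gain, clamped to a sensible range.
            self.brightness = max(0.1, min(3.0, gain))

        def rendered(self):
            """Frame as it would be drawn, with the brightness gain applied."""
            if self.current_frame is None:
                return None
            return [[min(255, int(px * self.brightness)) for px in row]
                    for row in self.current_frame]

    display = UltrasoundDisplay()
    display.show([[10, 20], [30, 40]])
    display.set_brightness(1.5)
    display.toggle_freeze()    # the teacher freezes the picture for pointing
    print(display.rendered())  # [[15, 30], [45, 60]]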
The PC host 1 comprises a picture storage and calling module which: stores the ultrasonic pictures that need to be imaged in the ultrasonic simulation teaching; establishes the association between each ultrasonic picture and different spatial position data in the ultrasonic detection area of the medical dummy model 2; and, when the model tracker senses the handle tracker of the ultrasonic detection handle 3 at a certain position of the ultrasonic detection area, calls and sends the ultrasonic picture corresponding to that spatial position so that it is displayed in the ultrasonic picture area of the desktop client and/or on the display screen of the VR end.
The PC host 1 needs to store the corresponding pictures in advance; when the ultrasonic detection handle 3 approaches a detection area and the Tracker on the medical dummy model 2 senses it, this is fed back to the PC host 1, which calls the picture and displays it in the ultrasonic picture area of the desktop client and on the display screen of the virtual ultrasound machine at the VR end.
Before teaching, the required ultrasonic pictures can be loaded into the PC host 1 in a targeted way according to the case history to be taught and associated with specific positions in the ultrasonic detection area of the medical dummy model 2. When a student practises and moves the ultrasonic detection handle 3 to the corresponding position in the ultrasonic detection area, the PC host 1 calls the corresponding ultrasonic picture and displays it on the VR device 5 and the desktop client, completing the simulation teaching of ultrasonic detection for that case. Different ultrasonic pictures can be swapped in for different cases during teaching, which meets the learning requirements of different cases and removes the previous need to operate on real patients when learning different cases.
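As a hypothetical example of this per-case preparation (not part of the claimed invention; case names, positions and file names are invented for illustration), the pictures loaded before class could be organised as a small library keyed by case:

    # Hypothetical per-case library loaded into the PC host before class:
    # each case maps positions in the dummy model's detection area to the
    # ultrasound pictures recorded for that case history.
    CASE_LIBRARY = {
        "fatty_liver_case": {
            (0.00, 0.95, 0.10): "fatty_liver_plane1.png",
            (0.03, 0.96, 0.07): "fatty_liver_plane2.png",
        },
        "gallstone_case": {
            (0.05, 0.95, 0.05): "gallstone_long_axis.png",
        },
    }

    def load_case(case_name):
        """Return the position-to-picture table for one teaching case, so a
        different case can be swapped in without touching the hardware."""
        return CASE_LIBRARY[case_name]

    active_table = load_case("fatty_liver_case")
    print(len(active_table), "pictures loaded for this case")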
At least two laser positioning devices 4 are needed in the whole experimental space for scanning and tracking the spatial positions (x, y, z) of the VR helmet, of the ultrasonic detection handle 3 with its Tracker, and of the medical dummy model 2 with its Tracker. The positioning device 4 comprises a first positioning device 41 and a second positioning device 42: the first positioning device 41 is arranged above the medical dummy model 2 and is used for determining the central position of the virtual scene; the second positioning device 42 is arranged opposite the first positioning device 41 and is used for scanning the spatial positions of the VR device 5 and of the ultrasonic detection handle 3. With the positioning devices 4, the experimental space can be reproduced and the position of each component, including the steering angle of the head when wearing the VR device 5, can be located accurately, so that the scene in the virtual scene matches the real experimental space.
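As a rough sketch of how the scene centre defined by the first positioning device 41 might be used (not part of the claimed invention; the device position and readings are hypothetical, and rotation calibration is omitted), every tracked point can be re-expressed in a frame anchored at that device:

    def to_scene_frame(point, scene_origin):
        """Express a tracked point in the virtual-scene frame whose origin is
        set by the first positioning device above the medical dummy model
        (translation only; rotation calibration is omitted in this sketch)."""
        return tuple(p - o for p, o in zip(point, scene_origin))

    # Hypothetical layout: the first device above the dummy model fixes the
    # scene centre; readings from either device are re-expressed in it.
    scene_origin = (1.20, 2.50, 0.80)   # position of the first device
    headset_raw = (1.45, 1.10, 0.60)    # VR headset as seen by the devices
    handle_raw = (1.18, 1.75, 0.85)     # handle tracker as seen by the devices

    print(to_scene_frame(headset_raw, scene_origin))
    print(to_scene_frame(handle_raw, scene_origin))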
Based on the medical ultrasonic simulation teaching system, the medical ultrasonic simulation teaching method comprises the following steps:
s1, scene construction: and according to the real teaching operation scene, building a virtual scene corresponding to the real teaching scene through a 3D engine.
The method specifically comprises the following: in combination with the real teaching operation scene, the PC host 1 completes the modeling of the virtual scene and of the operation functions in the scene by using the Unity3D engine, and transmits the visual information of the virtual scene to the desktop client and/or the VR end.
S2, capturing and positioning: the positioning device 4 scans and tracks the spatial position of each physical teaching component in the real scene, feeds the spatial positions back to the virtual scene, and establishes the position association between the virtual scene and the corresponding teaching components in the real teaching scene.
The step S2 specifically includes:
S21, the positioning device 4 scans and tracks the spatial positions of the VR end in the real scene and of the handle tracker arranged in the ultrasonic detection handle 3;
S22, the positioning device 4 scans the built-in tracker of the medical dummy model 2 and confirms the relative position of the medical dummy model 2 and of its ultrasonic detection area;
S23, the positioning device 4 feeds the acquired real-time position information back to the PC host 1, and the PC host 1 converts the position information into visual information and transmits it to the desktop client and/or the VR end for imaging.
S3, simulating operation and generating an image: an ultrasound item is selected to start the examination, the ultrasonic detection handle 3 is moved close to the medical dummy model 2 in the real teaching scene, and the semitransparent blue spherical image volume data generated on the medical dummy model 2 is acquired in the virtual scene; when the ultrasonic detection handle 3 with its Tracker approaches the periphery of this data region on the medical dummy model 2, the PC host 1 calls the picture data file of the corresponding body part of the medical dummy model 2 and transmits it to the desktop client and the VR device 5 for imaging display.
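As a simple illustration of this proximity trigger (not part of the claimed invention; the volume centre, radius and margin are hypothetical), approaching the spherical image volume can be checked with a distance test against its centre:

    import math

    def near_image_volume(handle_pos, volume_center, volume_radius, margin=0.02):
        """True when the handle tracker comes within `margin` of the spherical
        image-volume region shown on the dummy model in the virtual scene."""
        return math.dist(handle_pos, volume_center) <= volume_radius + margin

    # Hypothetical volume placed over the dummy model's abdomen.
    volume_center = (0.00, 0.95, 0.10)
    volume_radius = 0.08

    if near_image_volume((0.02, 0.96, 0.12), volume_center, volume_radius):
        # The PC host would call the picture data file for this body part and
        # send it to the desktop client and VR device for display.
        print("probe is inside the detection volume; display the picture")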
Step S3 further includes:
S31, the medical dummy model 2 detects in real time, through its built-in model tracker, the distance to the handle tracker built into the ultrasonic detection handle 3; when the model tracker detects that the handle tracker has reached the ultrasonic detection area of the medical dummy model 2, the model tracker feeds the relative spatial position relation back to the PC host 1;
S32, the PC host 1 receives the data returned by the model tracker, selects the ultrasonic picture of the corresponding position according to the different spatial position relations, and sends the ultrasonic picture to the desktop client and/or the VR end for imaging.
The process in which the PC host 1 calls the ultrasonic picture for imaging further comprises the following:
the PC host 1 stores the ultrasonic pictures that need to be imaged in the ultrasonic simulation teaching and establishes the association between each ultrasonic picture and different spatial position data in the ultrasonic detection area of the medical dummy model 2; when the model tracker senses the handle tracker of the ultrasonic detection handle 3 at a certain position of the ultrasonic detection area, the PC host 1 calls the ultrasonic picture corresponding to that spatial position, sends it out, and displays it in the ultrasonic picture area of the desktop client and/or on the display screen of the VR end.
S4, image adjustment: the control panel built into the virtual scene is called to execute image processing operations including image freeze-frame and image brightness adjustment. The images displayed in the desktop client and the VR device 5 can be frozen and their brightness changed through software, completing the virtual ultrasonic detection.
The foregoing is a more detailed description of the invention in connection with specific preferred embodiments and it is not intended that the invention be limited to these specific details. For those skilled in the art to which the invention pertains, several simple deductions or substitutions can be made without departing from the spirit of the invention, and all shall be considered as belonging to the protection scope of the invention.

Claims (9)

1. A medical ultrasonic simulation teaching method is characterized by comprising the following steps:
S1, scene construction: according to the real teaching operation scene, building a virtual scene corresponding to the real teaching scene through a 3D engine;
S2, capturing and positioning: the positioning equipment scans and tracks the spatial position of each physical teaching component in the real scene, feeds the spatial positions back to the virtual scene, and establishes the position association between the virtual scene and the corresponding teaching components in the real teaching scene;
S3, simulating operation and generating an image: selecting an ultrasound item to start the examination, moving the ultrasonic detection handle close to the medical dummy model in the real teaching scene, and acquiring, in the virtual scene, the image volume data generated on the medical dummy model; and calling the picture data file of the corresponding part on the medical dummy model and transmitting it to the virtual scene for imaging;
the step S2 specifically includes:
S21, the positioning equipment scans and tracks the spatial positions of the VR end in the real scene and of the handle tracker arranged in the ultrasonic detection handle;
S22, the positioning equipment scans the built-in tracker of the medical dummy model and confirms the relative position of the medical dummy model and of its ultrasonic detection area;
S23, the positioning equipment feeds the acquired real-time position information back to the PC host, and the PC host converts the position information into visual information and transmits it to the desktop client and/or the VR end for imaging;
the positioning equipment comprises first positioning equipment and second positioning equipment, and the first positioning equipment is arranged above the medical dummy model and used for determining the central position of the virtual scene; the second positioning equipment and the first positioning equipment are arranged oppositely and used for scanning the space positions of the VR equipment and the ultrasonic detection handle.
2. The medical ultrasound simulation teaching method according to claim 1, wherein the step S1 specifically includes:
in combination with the real teaching operation scene, the PC host completes the modeling of the virtual scene and of the operation functions in the scene by using the 3D engine, and transmits the visual information of the virtual scene to the desktop client and/or the VR end.
3. The medical ultrasound simulation teaching method according to claim 1, wherein the step S3 further comprises:
S31, the medical dummy model detects in real time, through its built-in model tracker, the distance to the handle tracker arranged in the ultrasonic detection handle; when the model tracker detects that the handle tracker has reached the ultrasonic detection area of the medical dummy model, the model tracker feeds the relative spatial position relation back to the PC host;
S32, the PC host receives the data returned by the model tracker, selects the ultrasonic picture of the corresponding position according to the different spatial position relations, and sends the ultrasonic picture to the desktop client and/or the VR end for imaging display.
4. The medical ultrasound simulation teaching method according to claim 3, wherein the process of calling the ultrasonic picture for imaging by the PC host in the step S32 further comprises:
the PC host stores the ultrasonic pictures that need to be imaged in the ultrasonic simulation teaching and establishes the association between each ultrasonic picture and different spatial position data in the ultrasonic detection area of the medical dummy model; when the model tracker senses the handle tracker of the ultrasonic detection handle at a certain position of the ultrasonic detection area, the PC host calls the ultrasonic picture corresponding to that spatial position, sends it out, and displays it in the ultrasonic picture area of the desktop client and/or on the display screen of the VR end.
5. The medical ultrasound simulation teaching method according to claim 1, further comprising the steps of:
s4, image adjusting: and calling a control panel built in the virtual scene to execute image processing operations including image freezing and image brightness adjustment.
6. A medical ultrasonic simulation teaching system is characterized by comprising:
a PC host, which builds a virtual scene corresponding to the real teaching scene through a 3D engine according to the real teaching operation scene, receives feedback information from the built-in tracker of the medical dummy model and from the positioning device, and outputs imaging data;
a medical dummy model, which is internally provided with a model tracker used for determining the relative position of the medical dummy model in the virtual scene, determining the ultrasonic detection area on the medical dummy model, and sensing the distance between the medical dummy model and the ultrasonic detection handle;
an ultrasonic detection handle, which is used for operating the ultrasonic detection simulation experiment on the medical dummy model and is provided with a handle tracker that interacts with the built-in model tracker of the medical dummy model;
the positioning device is arranged in the experimental space and is in communication connection with the PC host, and is used for positioning the virtual scene position of the experimental space and scanning and tracking the spatial positions of the VR device, the ultrasonic detection handle and the medical dummy model;
and the VR equipment displays the virtual scene imaging and the ultrasonic picture transmitted by the PC host.
7. The medical ultrasonic simulation teaching system according to claim 6, wherein a desktop client is built in the PC host for displaying the virtual scene imaging and the ultrasonic picture transmitted by the PC host; the desktop client and the VR equipment are both provided with control panels, and the control panels are used for calling control panels built in the virtual scene to execute image processing operations including image freezing and image brightness adjustment.
8. The medical ultrasound simulation teaching system according to claim 6, wherein the PC host comprises a picture storage and calling module which: stores the ultrasonic pictures that need to be imaged in the ultrasonic simulation teaching; establishes the association between each ultrasonic picture and different spatial position data in the ultrasonic detection area of the medical dummy model; and, when the model tracker senses the handle tracker of the ultrasonic detection handle at a certain position of the ultrasonic detection area, calls and sends the ultrasonic picture corresponding to that spatial position so that it is displayed in the ultrasonic picture area of the desktop client and/or on the display screen of the VR end.
9. The medical ultrasonic simulation teaching system according to claim 6, wherein the positioning device comprises a first positioning device and a second positioning device, the first positioning device is disposed above the medical dummy model for determining the center position of the virtual scene; the second positioning equipment and the first positioning equipment are arranged oppositely and used for scanning the space positions of the VR equipment and the ultrasonic detection handle.
CN202210885276.7A 2022-07-26 2022-07-26 Medical ultrasonic simulation teaching method and system Active CN115132013B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210885276.7A CN115132013B (en) 2022-07-26 2022-07-26 Medical ultrasonic simulation teaching method and system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210885276.7A CN115132013B (en) 2022-07-26 2022-07-26 Medical ultrasonic simulation teaching method and system

Publications (2)

Publication Number Publication Date
CN115132013A (en) 2022-09-30
CN115132013B (en) 2023-03-14

Family

ID=83385385

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210885276.7A Active CN115132013B (en) 2022-07-26 2022-07-26 Medical ultrasonic simulation teaching method and system

Country Status (1)

Country Link
CN (1) CN115132013B (en)

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103735312A (en) * 2013-12-11 2014-04-23 中国科学院深圳先进技术研究院 Multimode image navigation system for ultrasonic guidance operation
CN107369349A (en) * 2017-08-25 2017-11-21 段军 Medical supersonic simulation system and method for information display
CN107578662A (en) * 2017-09-01 2018-01-12 北京大学第医院 A kind of virtual obstetric Ultrasound training method and system
CN107862912A (en) * 2017-12-20 2018-03-30 四川纵横睿影医疗技术有限公司 Medical educational system based on VR technologies
CN108538095A (en) * 2018-04-25 2018-09-14 惠州卫生职业技术学院 Medical teaching system and method based on virtual reality technology
CN109840947A (en) * 2017-11-28 2019-06-04 广州腾讯科技有限公司 Implementation method, device, equipment and the storage medium of augmented reality scene
CN110033683A (en) * 2019-04-15 2019-07-19 四川大学华西医院 A kind of ultrasound training system
CN111951651A (en) * 2020-07-30 2020-11-17 中南民族大学 Medical ultrasonic equipment experiment teaching system based on VR
CN112991854A (en) * 2021-02-05 2021-06-18 四川大学华西医院 Ultrasonic teaching method, device and system and electronic equipment
CN114038259A (en) * 2021-10-20 2022-02-11 俞正义 5G virtual reality medical ultrasonic training system and method thereof

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160328998A1 (en) * 2008-03-17 2016-11-10 Worcester Polytechnic Institute Virtual interactive system for ultrasound training


Also Published As

Publication number Publication date
CN115132013A (en) 2022-09-30

Similar Documents

Publication Publication Date Title
US20200279498A1 (en) Augmented and virtual reality simulator for professional and educational training
US20210266518A1 (en) Systems and methods for determining three dimensional measurements in telemedicine application
KR101816172B1 (en) The simulation system for training and the method thereof
CN107527542B (en) Percussion training system based on motion capture
CN113706960A (en) Nursing operation exercise platform based on VR technology and use method
CN109961520B (en) VR/MR classroom based on third view angle technology and construction method thereof
CN112348942A (en) Body-building interaction method and system
GB2622351A (en) Automated scan of common ailments so that a consistent image can be given to a doctor for analysis
CN112102667A (en) Video teaching system and method based on VR interaction
CN114373351B (en) Photoelectric theodolite panoramic simulation training system
CN107945607A (en) Ultrasonic demo system and device
CN108831233A (en) A kind of MRI virtual simulated training system and method
EP3690606A1 (en) Virtual-real combination-based human anatomical structure display and interaction method
CN113256724B (en) Handle inside-out vision 6-degree-of-freedom positioning method and system
CN115132013B (en) Medical ultrasonic simulation teaching method and system
CN114387679A (en) System and method for realizing sight line estimation and attention analysis based on recursive convolutional neural network
TWI687904B (en) Interactive training and testing apparatus
CN117111724A (en) Data processing method and system for XR
Shabir et al. Development and Evaluation of a Mixed-Reality Tele-ultrasound System
CN109461351B (en) Three-screen interactive augmented reality game training system
Sun Research on Dance Motion Capture Technology for Visualization Requirements
CN113283402B (en) Differential two-dimensional fixation point detection method and device
CN113010936B (en) Auxiliary construction method, equipment and readable storage medium
Hua et al. Calibration of an HMPD-based augmented reality system
CN109407840A (en) A kind of visual angle effect method of motion capture technology

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant