CN114694442A - Ultrasonic training method and device based on virtual reality, storage medium and ultrasonic equipment


Info

Publication number
CN114694442A
CN114694442A (Application CN202011639679.0A)
Authority
CN
China
Prior art keywords: virtual, ultrasonic, dimensional, probe, model
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202011639679.0A
Other languages
Chinese (zh)
Inventor
戴晓
贾廷秀
龚栋梁
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Wuxi Chudian Technology Co ltd
Original Assignee
Wuxi Chudian Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Wuxi Chudian Technology Co ltd
Priority to CN202011639679.0A
Publication of CN114694442A
Legal status: Pending (current)


Classifications

    • G: PHYSICS
    • G09: EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B: EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B9/00: Simulators for teaching or training purposes

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Business, Economics & Management (AREA)
  • Physics & Mathematics (AREA)
  • Educational Administration (AREA)
  • Educational Technology (AREA)
  • General Physics & Mathematics (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The invention discloses a virtual reality-based ultrasound training method and device, a storage medium, and an ultrasound apparatus. The method comprises the following steps: randomly loading, from a model library and according to an examination part selected by a user, a virtual three-dimensional training model containing at least the examination part together with a virtual ultrasonic probe, wherein the virtual three-dimensional training model contains a target ultrasonic image; displaying a guide path on the surface of the virtual three-dimensional training model according to real-time pose information of the virtual ultrasonic probe and target pose information of the virtual ultrasonic probe corresponding to the target ultrasonic image; and controlling, through the simulation probe, the virtual ultrasonic probe to move along the guide path and examine the virtual three-dimensional training model so as to obtain the target ultrasonic image. Implementing the method achieves a training effect nearly identical to hands-on operation, with low cost, strong realism, and good teaching results.

Description

Ultrasonic training method and device based on virtual reality, storage medium and ultrasonic equipment
Technical Field
The invention relates to the technical field of medical imaging, in particular to an ultrasonic training method and device based on virtual reality, a storage medium and ultrasonic equipment.
Background
Ultrasonic diagnosis is a diagnostic method that applies ultrasound detection technology to the human body: by measuring specific parameters, it characterizes the state of human physiology and tissue structure, detects disease, and provides diagnostic cues. Ultrasonic diagnosis is highly operator-dependent; an operator must have professional scanning technique and knowledge of ultrasound images to obtain accurate examination results. Excellent training in ultrasound operation is therefore the basis for the clinical application of ultrasonic diagnostic techniques.
Current ultrasound training courses are divided into two parts: classroom theory and clinical teaching. On the one hand, the gap between theoretical explanation and actual operation is often large, so trainees cannot intuitively grasp the key points of the scanning technique. On the other hand, clinical teaching is limited by the availability of patients and operating environments: it cannot be carried out at scale, and trainees cannot perform ultrasound examinations on patients directly, so the ultrasound presentation of typical diseases is difficult to observe. These shortcomings make medical ultrasound training less than ideal, and trainees consequently fail to master clinical ultrasound skills well.
Disclosure of Invention
In view of this, embodiments of the present invention provide a virtual reality-based ultrasound training method and device, a storage medium, and an ultrasound apparatus, so as to solve the technical problem that trainees cannot adequately master clinical ultrasound skills under existing ultrasound training.
The technical solution provided by the invention is as follows:
the embodiment of the invention provides an ultrasonic training method based on virtual reality in a first aspect, which comprises the following steps:
randomly loading a virtual three-dimensional training model at least containing an examination part and a virtual ultrasonic probe from a model library according to the examination part selected by a user, wherein the virtual three-dimensional training model contains a target ultrasonic image;
displaying a guide path on the surface of the virtual three-dimensional training model according to the real-time pose information of the virtual ultrasonic probe and the target pose information of the virtual ultrasonic probe corresponding to the target ultrasonic image;
and controlling, through the simulation probe, the virtual ultrasonic probe to move along the guide path and examine the virtual three-dimensional training model so as to obtain the target ultrasonic image.
Further, the method further comprises:
displaying a virtual ultrasound image obtained by the virtual ultrasound probe in front of one eye of the user;
displaying the target ultrasound image in front of the other eye of the user.
Further, the method further comprises the following steps:
hiding the guide path in response to a control instruction of the user;
calculating the examination time taken by the user to obtain the target ultrasonic image in a single attempt while controlling the virtual ultrasonic probe through the simulation probe;
and when the examination time is less than a preset time, determining that the user passes the training.
Further, the virtual three-dimensional training model is obtained by the following method:
controlling an ultrasonic probe to perform ultrasonic scanning on a real examination part along a preset direction, and acquiring an ultrasonic image corresponding to each section of the real examination part;
acquiring pose information of an ultrasonic probe corresponding to each ultrasonic image;
and inputting the ultrasonic image corresponding to each section and the pose information of the ultrasonic probe corresponding to each ultrasonic image into a trained three-dimensional reconstruction network model to obtain the virtual three-dimensional training model.
Further, the three-dimensional reconstruction network model is obtained by training according to the following method:
inputting the ultrasonic images of a plurality of real examination objects into a first convolutional neural network for feature extraction to obtain image features;
inputting pose information of the ultrasonic probes corresponding to the ultrasonic images into a second convolutional neural network for feature extraction to obtain pose features;
fusing the image features corresponding to the ultrasonic images of the plurality of real examination objects with the pose features of the ultrasonic probes corresponding to the plurality of ultrasonic images to obtain fused features;
and generating the three-dimensional reconstruction network model according to the ultrasonic images of the plurality of real examination objects and the corresponding fused features.
Further, the virtual three-dimensional training model is obtained by the following method:
capturing real scene information at least containing a real examination part to generate a real three-dimensional model;
acquiring a virtual three-dimensional ultrasonic model of a real examination part;
and fusing the virtual three-dimensional ultrasonic model and the real three-dimensional model to obtain a virtual three-dimensional training model.
Further, controlling, through the simulation probe, the virtual ultrasonic probe to move along the guide path and examine the virtual three-dimensional training model so as to obtain the target ultrasonic image specifically comprises:
acquiring pose information of the simulation probe through one or more of a magnetic sensor, an inertial measurement unit (IMU), or a camera;
generating a pose adjustment instruction according to the pose information of the simulation probe;
and synchronously controlling the virtual ultrasonic probe in response to the pose adjustment instruction to obtain the target ultrasonic image.
A second aspect of the embodiments of the present invention provides a virtual reality-based ultrasound training system, comprising: a VR device, which randomly loads, from a model library and according to an examination part selected by a user, a virtual three-dimensional training model containing at least the examination part together with a virtual ultrasonic probe, wherein the virtual three-dimensional training model contains a target ultrasonic image; a path guidance module, which displays a guide path on the surface of the virtual three-dimensional training model according to real-time pose information of the virtual ultrasonic probe and target pose information of the virtual ultrasonic probe corresponding to the target ultrasonic image; and a control module, which controls, through the simulation probe, the virtual ultrasonic probe to move along the guide path and examine the virtual three-dimensional training model so as to obtain the target ultrasonic image.
A third aspect of the embodiments of the present invention provides a computer-readable storage medium storing computer instructions for causing a computer to perform the virtual reality-based ultrasound training method according to the first aspect or any implementation of the first aspect of the embodiments of the present invention.
A fourth aspect of the embodiments of the present invention provides an ultrasound apparatus, comprising: a memory and a processor communicatively coupled to each other, the memory storing computer instructions, and the processor executing the computer instructions to perform the virtual reality-based ultrasound training method according to the first aspect or any implementation of the first aspect of the embodiments of the present invention.
The technical solution provided by the invention has the following effects:
According to the virtual reality-based ultrasound training method and device, storage medium, and ultrasound apparatus provided by the invention, an ultrasound examination case from a real scene is turned into a virtual three-dimensional training model through virtual reality technology, and an inexperienced user can train repeatedly under the guidance of the guide path until the target ultrasonic image is obtained. Implementing the invention achieves a training effect nearly identical to hands-on operation, with low cost, strong realism, and good teaching results.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings required for describing the embodiments or the prior art are briefly introduced below. The drawings described below show only some embodiments of the present invention; those skilled in the art can obtain other drawings from them without creative effort.
Fig. 1 is a flowchart of a virtual reality-based ultrasound training method according to an embodiment of the invention;
Fig. 2 is a block diagram of a virtual reality-based ultrasound training system according to an embodiment of the invention;
Fig. 3 is a schematic structural diagram of a computer-readable storage medium according to an embodiment of the invention;
Fig. 4 is a schematic structural diagram of an ultrasound apparatus according to an embodiment of the invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, but not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Current ultrasound training courses are divided into two parts: classroom theory and clinical teaching. On the one hand, the gap between theoretical explanation and actual operation is often large, so trainees cannot intuitively grasp the key points of the scanning technique. On the other hand, clinical teaching is limited by the availability of patients and operating environments: it cannot be carried out at scale, and trainees cannot perform ultrasound examinations on patients directly, so the ultrasound presentation of typical diseases is difficult to observe. These shortcomings make medical ultrasound training less than ideal, and trainees consequently fail to master clinical ultrasound skills well.
A first aspect of the embodiments of the present invention provides a virtual reality-based ultrasound training method, which comprises the following steps:
S100, randomly loading, from a model library and according to an examination part selected by a user, a virtual three-dimensional training model containing at least the examination part together with a virtual ultrasonic probe, wherein the virtual three-dimensional training model contains a target ultrasonic image.
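As a concrete illustration of step S100, the following is a minimal Python sketch of random case selection from a model library keyed by examination part. The `ModelLibrary` and `TrainingCase` structures and their fields are hypothetical, since the patent does not specify how the library is stored.

```python
import random
from dataclasses import dataclass

# Hypothetical structures: the patent does not specify the model library format.
@dataclass
class TrainingCase:
    part: str                # examination part, e.g. "liver"
    model_path: str          # asset for the virtual three-dimensional training model
    target_image_path: str   # target ultrasonic image bundled with the model
    target_pose: tuple       # target probe pose (x, y, z, rx, ry, rz)

class ModelLibrary:
    def __init__(self, cases):
        self._by_part = {}
        for case in cases:
            self._by_part.setdefault(case.part, []).append(case)

    def load_random(self, part):
        """Step S100: randomly pick one training case for the user-selected part."""
        candidates = self._by_part.get(part, [])
        if not candidates:
            raise KeyError(f"no training model for examination part: {part}")
        return random.choice(candidates)
```

Random selection across all cases for a part means a trainee cannot simply memorize a single phantom, which fits the training goal described above.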
in one embodiment, the virtual three-dimensional training model is obtained by:
controlling an ultrasonic probe to perform ultrasonic scanning on a real examination part along a preset direction, and acquiring an ultrasonic image corresponding to each section of the real examination part;
acquiring pose information of an ultrasonic probe corresponding to each ultrasonic image;
and inputting the ultrasonic images corresponding to the sections and the pose information of the ultrasonic probes corresponding to the ultrasonic images into a trained three-dimensional reconstruction network model to obtain the virtual three-dimensional training model.
Further, the three-dimensional reconstruction network model is obtained by training through the following method:
inputting ultrasonic images of a plurality of real examination objects into a first convolutional neural network for feature extraction to obtain image features;
inputting pose information of the ultrasonic probes corresponding to the ultrasonic images into a second convolutional neural network for feature extraction to obtain pose features;
fusing the image features corresponding to the ultrasonic images of the plurality of real examination objects with the pose features of the ultrasonic probes corresponding to the plurality of ultrasonic images to obtain fused features;
and generating the three-dimensional reconstruction network model according to the ultrasonic images of the plurality of real examination objects and the corresponding fused features.
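The following PyTorch sketch illustrates one plausible shape for the two-branch design described above: a first convolutional neural network extracts image features, a second network embeds probe pose, and the two feature vectors are fused. The layer sizes, the concatenation-based fusion, and the voxel-volume output head are all assumptions; the patent does not specify the architecture.

```python
import torch
import torch.nn as nn

class TwoBranchReconstructionNet(nn.Module):
    """Sketch of the described two-branch design. Channel counts, the fusion
    by concatenation, and the 32^3 voxel output are assumptions."""

    def __init__(self, pose_dim=6, feat_dim=128):
        super().__init__()
        # First convolutional neural network: ultrasound section image -> image features
        self.image_branch = nn.Sequential(
            nn.Conv2d(1, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(32, feat_dim),
        )
        # Second branch: probe pose (position + orientation) -> pose features
        self.pose_branch = nn.Sequential(
            nn.Linear(pose_dim, 64), nn.ReLU(),
            nn.Linear(64, feat_dim),
        )
        # Fusion plus a decoder head producing a voxel occupancy volume
        self.decoder = nn.Sequential(
            nn.Linear(2 * feat_dim, 256), nn.ReLU(),
            nn.Linear(256, 32 * 32 * 32),
        )

    def forward(self, image, pose):
        # Fused features: concatenation of image features and pose features
        fused = torch.cat([self.image_branch(image), self.pose_branch(pose)], dim=1)
        return self.decoder(fused).view(-1, 32, 32, 32)
```

Training such a network would pair scanned sections and probe poses with a known reference volume; the patent leaves the loss and supervision signal unspecified.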
In another embodiment, the virtual three-dimensional training model is obtained by:
capturing real scene information at least containing a real examination part to generate a real three-dimensional model;
acquiring a virtual three-dimensional ultrasonic model of a real examination part;
and fusing the virtual three-dimensional ultrasonic model and the real three-dimensional model to obtain a virtual three-dimensional training model.
S120, displaying a guide path on the surface of the virtual three-dimensional training model according to the real-time pose information of the virtual ultrasonic probe and the target pose information of the virtual ultrasonic probe corresponding to the target ultrasonic image;
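A minimal sketch of how the guide path of step S120 could be computed is given below, assuming linear interpolation between the probe's real-time position and the target position, projected back onto the model surface. The patent only states that a guide path is displayed on the surface, so both the interpolation strategy and the `surface_project` helper are assumptions.

```python
import numpy as np

def guide_path(current_pos, target_pos, surface_project, n_points=50):
    """Build a polyline on the model surface from the probe's real-time
    position to the target position (step S120, assumed strategy).

    surface_project: callable mapping a 3D point to the nearest point on the
    surface of the virtual three-dimensional training model (assumed to exist).
    """
    current_pos = np.asarray(current_pos, dtype=float)
    target_pos = np.asarray(target_pos, dtype=float)
    ts = np.linspace(0.0, 1.0, n_points)
    # Straight-line interpolation in 3D, then projection onto the body surface
    straight = current_pos[None, :] * (1 - ts[:, None]) + target_pos[None, :] * ts[:, None]
    return np.array([surface_project(p) for p in straight])  # polyline to render
```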
S130, controlling, through the simulation probe, the virtual ultrasonic probe to move along the guide path and examine the virtual three-dimensional training model so as to obtain the target ultrasonic image. This step specifically comprises the following sub-steps:
s131, acquiring pose information of the simulation probe through one or more of a magnetic sensor, an IMU (inertial measurement Unit) or a camera; in an embodiment, a camera may be provided outside the simulation probe for collecting pose information of the simulation probe, and the camera may be a three-dimensional camera.
In another embodiment, an inertial measurement unit (IMU) is disposed in the simulation probe and can obtain real-time pose information of the simulation probe, for example, the probe's real-time coordinates along the X, Y, and Z axes. The pose information of the simulation probe can also be acquired through a magnetic sensor.
S132, generating a pose adjustment instruction according to the pose information of the simulation probe;
and S133, synchronously controlling the virtual ultrasonic probe in response to the pose adjustment instruction to obtain the target ultrasonic image.
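As a sketch of sub-steps S131 to S133, the snippet below fuses pose readings from several sources by a weighted average and turns the pose delta into an adjustment instruction for the virtual probe. The averaging rule and the instruction format are assumptions; a production system would more likely use a Kalman-style filter and a device-specific command protocol.

```python
import numpy as np

def fuse_pose(readings, weights=None):
    """S131: combine pose readings (x, y, z, rx, ry, rz) from one or more
    sources such as a magnetic sensor, an IMU, or a camera. A weighted
    average is an assumed fusion rule."""
    poses = np.asarray(readings, dtype=float)        # shape (n_sensors, 6)
    w = np.ones(len(poses)) if weights is None else np.asarray(weights, dtype=float)
    return (w[:, None] * poses).sum(axis=0) / w.sum()

def make_adjustment_instruction(sim_pose, virtual_pose):
    """S132-S133: the pose delta between the simulation probe and the virtual
    probe becomes the adjustment instruction that drives the virtual probe
    in sync (instruction format is an assumption)."""
    delta = np.asarray(sim_pose, dtype=float) - np.asarray(virtual_pose, dtype=float)
    return {"delta": delta.tolist()}
```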
In one embodiment, the method further comprises: displaying a virtual ultrasound image obtained by the virtual ultrasound probe in front of one eye of the user; displaying the target ultrasound image in front of the other eye of the user.
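A sketch of this dual-eye display is shown below; it lets the trainee compare the live image and the target image directly. The `vr_display` object and its per-eye `submit` method are hypothetical stand-ins for whatever per-eye layer submission the VR runtime provides.

```python
def present_stereo(vr_display, live_image, target_image):
    """Show the live virtual ultrasound image to one eye and the target
    ultrasound image to the other, so the user can visually match them.
    vr_display is a hypothetical wrapper over a per-eye VR submission API."""
    vr_display.submit(eye="left", image=live_image)     # live virtual ultrasound image
    vr_display.submit(eye="right", image=target_image)  # target reference image
```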
In order to assess whether the user has met the requirements after a period of training, the method further comprises:
S140, hiding the guide path in response to a control instruction of the user;
S150, calculating the examination time taken by the user to obtain the target ultrasonic image in a single attempt while controlling the virtual ultrasonic probe through the simulation probe;
and S160, when the examination time is less than a preset time, determining that the user passes the training.
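The assessment logic of steps S140 to S160 reduces to hiding the guidance, timing one attempt, and comparing against a threshold, as in the sketch below. The 120-second default and the `scene.hide_guide_path()` call are assumptions, since the patent does not fix the preset time or the scene API.

```python
import time

class TrainingAssessor:
    """Sketch of steps S140-S160: with the guide path hidden, time a single
    attempt at acquiring the target ultrasonic image and pass the user if the
    examination time is below a preset threshold."""

    def __init__(self, preset_seconds=120.0):  # threshold value is an assumption
        self.preset_seconds = preset_seconds
        self._start = None

    def start_attempt(self, scene):
        scene.hide_guide_path()                 # S140: hypothetical scene API
        self._start = time.monotonic()

    def finish_attempt(self):
        elapsed = time.monotonic() - self._start        # S150: examination time
        return elapsed, elapsed < self.preset_seconds   # S160: pass/fail
```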
According to the virtual reality-based ultrasound training method provided by the embodiment of the invention, an ultrasound examination case from a real scene is turned into a virtual three-dimensional training model through virtual reality technology, and an inexperienced user can train repeatedly under the guidance of the guide path until the target ultrasonic image is obtained. Implementing the invention achieves a training effect nearly identical to hands-on operation, with low cost, strong realism, and good teaching results.
As shown in Fig. 2, a second aspect of the embodiments of the present invention provides a virtual reality-based ultrasound training system, comprising: a VR device, which randomly loads, from a model library and according to an examination part selected by a user, a virtual three-dimensional training model containing at least the examination part together with a virtual ultrasonic probe, wherein the virtual three-dimensional training model contains a target ultrasonic image; a path guidance module, which displays a guide path on the surface of the virtual three-dimensional training model according to real-time pose information of the virtual ultrasonic probe and target pose information of the virtual ultrasonic probe corresponding to the target ultrasonic image; and a control module, which controls, through the simulation probe, the virtual ultrasonic probe to move along the guide path and examine the virtual three-dimensional training model so as to obtain the target ultrasonic image.
According to the virtual reality-based ultrasound training system provided by the embodiment of the invention, an ultrasound examination case from a real scene is turned into a virtual three-dimensional training model through virtual reality technology, and an inexperienced user can train repeatedly under the guidance of the guide path until the target ultrasonic image is obtained. Implementing the invention achieves a training effect nearly identical to hands-on operation, with low cost, strong realism, and good teaching results.
A third aspect of the embodiments of the present invention provides a computer-readable storage medium storing computer instructions for causing a computer to perform the virtual reality-based ultrasound training method according to the first aspect or any implementation of the first aspect of the embodiments of the present invention.
As shown in Fig. 3, the storage medium stores a computer program 601 which, when executed by a processor, implements the steps of the virtual reality-based ultrasound training method in the above embodiments. The storage medium may also store audio and video stream data, feature frame data, interaction request signaling, encrypted data, preset data sizes, and the like. The storage medium may be a magnetic disk, an optical disk, a read-only memory (ROM), a random access memory (RAM), a flash memory, a hard disk drive (HDD), or a solid-state drive (SSD); the storage medium may also comprise a combination of the above types of memory.
It will be understood by those skilled in the art that all or part of the processes of the above method embodiments can be implemented by a computer program; the program can be stored in a computer-readable storage medium and, when executed, can include the processes of the above method embodiments. The storage medium may be a magnetic disk, an optical disk, a read-only memory (ROM), a random access memory (RAM), a flash memory, a hard disk drive (HDD), or a solid-state drive (SSD); the storage medium may also comprise a combination of the above types of memory.
A fourth aspect of the embodiments of the present invention provides an ultrasound apparatus, comprising: a memory and a processor communicatively connected to each other, the memory storing computer instructions, and the processor executing the computer instructions to perform the virtual reality-based ultrasound training method according to the first aspect or any implementation of the first aspect of the embodiments of the present invention.
As shown in Fig. 4, the ultrasound device may include a processor 51 and a memory 52, which may be connected by a bus or in another manner; in Fig. 4, connection by a bus is taken as an example.
The processor 51 may be a central processing unit (CPU). The processor 51 may also be another general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or a combination thereof.
The memory 52, as a non-transitory computer-readable storage medium, may be used to store non-transitory software programs, non-transitory computer-executable programs, and modules, such as the program instructions/modules corresponding to the method in the embodiments of the present invention. The processor 51 runs the non-transitory software programs, instructions, and modules stored in the memory 52 so as to execute various functional applications and data processing, that is, to implement the virtual reality-based ultrasound training method in the above method embodiment.
The memory 52 may include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application program required for at least one function; the storage data area may store data created by the processor 51, and the like. Further, the memory 52 may include high speed random access memory, and may also include non-transitory memory, such as at least one magnetic disk storage device, flash memory device, or other non-transitory solid state storage device. In some embodiments, the memory 52 may optionally include memory located remotely from the processor 51, and these remote memories may be connected to the processor 51 via a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
Although the embodiments of the present invention have been described in conjunction with the accompanying drawings, those skilled in the art may make various modifications and variations without departing from the spirit and scope of the invention, and such modifications and variations fall within the scope defined by the appended claims.

Claims (10)

1. A virtual reality-based ultrasound training method is characterized by comprising the following steps:
randomly loading, from a model library and according to an examination part selected by a user, a virtual three-dimensional training model containing at least the examination part together with a virtual ultrasonic probe, wherein the virtual three-dimensional training model contains a target ultrasonic image;
displaying a guide path on the surface of the virtual three-dimensional training model according to the real-time pose information of the virtual ultrasonic probe and the target pose information of the virtual ultrasonic probe corresponding to the target ultrasonic image;
and controlling, through the simulation probe, the virtual ultrasonic probe to move along the guide path and examine the virtual three-dimensional training model so as to obtain the target ultrasonic image.
2. The virtual reality-based ultrasound training method according to claim 1, further comprising:
displaying a virtual ultrasound image obtained by the virtual ultrasound probe in front of one eye of the user;
displaying the target ultrasound image in front of the other eye of the user.
3. The virtual reality-based ultrasound training method according to claim 1, further comprising:
hiding the guide path in response to a control instruction of the user;
calculating the examination time taken by the user to obtain the target ultrasonic image in a single attempt while controlling the virtual ultrasonic probe through the simulation probe;
and when the examination time is less than a preset time, determining that the user passes the training.
4. The virtual reality based ultrasound training method of any of claims 1-3, wherein the virtual three-dimensional training model is obtained by:
controlling an ultrasonic probe to perform ultrasonic scanning on a real examination part along a preset direction, and acquiring an ultrasonic image corresponding to each section of the real examination part;
acquiring pose information of an ultrasonic probe corresponding to each ultrasonic image;
and inputting the ultrasonic image corresponding to each section and the pose information of the ultrasonic probe corresponding to each ultrasonic image into a trained three-dimensional reconstruction network model to obtain the virtual three-dimensional training model.
5. The virtual reality-based ultrasound training method according to any one of claims 1 to 3, wherein the three-dimensional reconstruction network model is obtained by training as follows:
inputting the ultrasonic images of a plurality of real examination objects into a first convolutional neural network for feature extraction to obtain image features;
inputting pose information of the ultrasonic probes corresponding to the ultrasonic images into a second convolutional neural network for feature extraction to obtain pose features;
fusing the image features corresponding to the ultrasonic images of the plurality of real examination objects with the pose features of the ultrasonic probes corresponding to the plurality of ultrasonic images to obtain fused features;
and generating the three-dimensional reconstruction network model according to the ultrasonic images of the plurality of real examination objects and the corresponding fused features.
6. The virtual reality based ultrasound training method according to any of claims 1-3, wherein the virtual three-dimensional training model is obtained by:
capturing real scene information at least containing a real examination part to generate a real three-dimensional model;
acquiring a virtual three-dimensional ultrasonic model of a real examination part;
and fusing the virtual three-dimensional ultrasonic model and the real three-dimensional model to obtain a virtual three-dimensional training model.
7. The virtual reality-based ultrasound training method according to any one of claims 1 to 3, wherein controlling, through the simulation probe, the virtual ultrasonic probe to move along the guide path and examine the virtual three-dimensional training model so as to obtain the target ultrasonic image specifically comprises:
acquiring pose information of the simulation probe through one or more of a magnetic sensor, an IMU, or a camera;
generating a pose adjustment instruction according to the pose information of the simulation probe;
synchronously controlling the virtual ultrasonic probe in response to the pose adjustment instruction to obtain the target ultrasonic image.
8. A virtual reality based ultrasound training system, comprising:
a VR device, which randomly loads, from a model library and according to an examination part selected by a user, a virtual three-dimensional training model containing at least the examination part together with a virtual ultrasonic probe, wherein the virtual three-dimensional training model contains a target ultrasonic image;
a path guidance module, which displays a guide path on the surface of the virtual three-dimensional training model according to real-time pose information of the virtual ultrasonic probe and target pose information of the virtual ultrasonic probe corresponding to the target ultrasonic image;
and a control module, which controls, through the simulation probe, the virtual ultrasonic probe to move along the guide path and examine the virtual three-dimensional training model so as to obtain the target ultrasonic image.
9. A computer-readable storage medium storing computer instructions for causing a computer to perform the virtual reality based ultrasound training method of any one of claims 1 to 7.
10. An ultrasound device, comprising: a memory and a processor, the memory and the processor communicatively coupled to each other, the memory storing computer instructions, the processor executing the computer instructions to perform the method for virtual reality based ultrasound training as recited in any of claims 1-7.
CN202011639679.0A 2020-12-31 2020-12-31 Ultrasonic training method and device based on virtual reality, storage medium and ultrasonic equipment Pending CN114694442A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011639679.0A CN114694442A (en) 2020-12-31 2020-12-31 Ultrasonic training method and device based on virtual reality, storage medium and ultrasonic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011639679.0A CN114694442A (en) 2020-12-31 2020-12-31 Ultrasonic training method and device based on virtual reality, storage medium and ultrasonic equipment

Publications (1)

Publication Number Publication Date
CN114694442A true CN114694442A (en) 2022-07-01

Family

ID=82135766

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011639679.0A Pending CN114694442A (en) 2020-12-31 2020-12-31 Ultrasonic training method and device based on virtual reality, storage medium and ultrasonic equipment

Country Status (1)

Country Link
CN (1) CN114694442A (en)

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002102221A (en) * 2000-10-02 2002-04-09 Aloka Co Ltd Ultrasonic probe and ultrasonic diagnostic equipment
US20140343571A1 (en) * 2011-12-03 2014-11-20 Koninklijke Philips N.V. Robotic guidance of ultrasound probe in endoscopic surgery
US20180360488A1 (en) * 2016-04-15 2018-12-20 Socionext Inc. Ultrasonic probe control method and computer-readable storage medium holding program
CN109310396A (en) * 2016-06-20 2019-02-05 蝴蝶网络有限公司 For assisting the automated graphics of user's operation Vltrasonic device to obtain
CN110689792A (en) * 2019-11-19 2020-01-14 南方医科大学深圳医院 Ultrasonic examination virtual diagnosis training system and method
CN111415564A (en) * 2020-03-02 2020-07-14 武汉大学 Pancreatic ultrasonic endoscopy navigation method and system based on artificial intelligence

Similar Documents

Publication Publication Date Title
Pfeiffer Measuring and visualizing attention in space with 3D attention volumes
US10350434B2 (en) Patient-specific radiation dose assessment in medical therapy
EP3735695A1 (en) System and method for patient engagement
JP2011511652A (en) System and method for automatic calibration of tracked ultrasound
CN112346572A (en) Method, system and electronic device for realizing virtual-real fusion
Müller et al. Virtual reality in surgical arthroscopic training
US10672125B2 (en) Method and system for supporting medical personnel
US11925418B2 (en) Methods for multi-modal bioimaging data integration and visualization
WO2018119676A1 (en) Display data processing method and apparatus
JP2022527007A (en) Auxiliary imaging device, control method and device for analysis of movement disorder disease
KR101791927B1 (en) Method and apparatus for estimating roughness of skin surface for haptic feedback apparatus based on perceptual analysis
CN111973273A (en) Operation navigation system, method, device and medium based on AR technology
CN115804652A (en) Surgical operating system and method
Huang et al. On mimicking human’s manipulation for robot-assisted spine ultrasound imaging
CN113614781A (en) System and method for identifying objects in an image
CN113574610A (en) System and method for imaging
DE112016001224T5 (en) Image processing apparatus, method and program
CN110796064B (en) Human muscle image establishing method and device, storage medium and electronic equipment
US20150160474A1 (en) Corrective lens prescription adaptation system for personalized optometry
US11766234B2 (en) System and method for identifying and navigating anatomical objects using deep learning networks
CN114694442A (en) Ultrasonic training method and device based on virtual reality, storage medium and ultrasonic equipment
Wagner et al. Intraocular surgery on a virtual eye
Li et al. Haptics-equiped interactive PCI simulation for patient-specific surgery training and rehearsing.
Tercero et al. Catheter insertion reference trajectory construction method using photoelastic stress analysis for quantification of respect for tissue during endovascular surgery simulation
Egorova et al. Determination of workspace for motion capture using Kinect

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination