CN212181696U - Panoramic VR interactive medical ultrasonic digital phantom system - Google Patents

Panoramic VR interactive medical ultrasonic digital phantom system

Info

Publication number
CN212181696U
Authority
CN
China
Prior art keywords
data
equipment
local server
panoramic
personal
Prior art date
Legal status
Active
Application number
CN202021005258.8U
Other languages
Chinese (zh)
Inventor
赵连蒙
姜晓龙
Current Assignee
Individual
Original Assignee
Individual
Priority date
Filing date
Publication date
Application filed by Individual
Priority: CN202021005258.8U
Application granted
Publication of CN212181696U
Legal status: Active

Landscapes

  • Ultrasonic Diagnosis Equipment (AREA)

Abstract

The utility model relates to a panoramic VR interactive medical ultrasound digital phantom system with two modules: a digital modeling module and a VR image module generated during actual operation. In terms of hardware, the system is divided into a professional base server and a personal client; both contain a handheld device, image digital processing equipment, and wearable panoramic VR equipment. With the popularization and development of 5G technology, the system connects to the network through its servers and performs big-data cloud computing, greatly improving the VR interaction computing speed and transmission efficiency.

Description

Panoramic VR interactive medical ultrasonic digital phantom system
Technical Field
The utility model relates to a panoramic VR interactive medical ultrasound digital phantom system. As 5G technology matures and cloud computing becomes practical, the system, which comprises a digital modeling module and a VR image module generated during actual operation, realizes panoramic VR interaction in the medical field through a professional local server and a personal client.
Background
At present, medical ultrasound learning relies mainly on theoretical study of textbooks and related tutorials; because medical resources are limited, ultrasound trainees cannot get enough hands-on scanning time to reach practical competence. Each hospital also has its own specialty focus, the types of cases are limited, and typical cases are not always available; in addition, with increasingly strained doctor-patient relationships, most patients are reluctant to be examined by ordinary trainees.
In ultrasound simulation teaching at home and abroad, tissue-mimicking ultrasound phantoms are the main tool. First, such phantoms involve a complex manufacturing process and are expensive, costing hundreds of thousands of yuan, with imported phantoms costing even more, up to millions; most hospitals and individuals cannot afford them, and mass production is difficult. Second, the simulated image quality is mediocre, and most phantoms cannot provide functional imaging such as cardiac examination. Third, the internal structure of a tissue-mimicking phantom is fixed: it can simulate only normal tissue or a single disease, case data cannot be imported later, human-computer interaction is impossible, and it cannot be used for simulated practice and examinations.
SUMMARY OF THE UTILITY MODEL
A panoramic VR interactive medical ultrasound digital phantom system comprises a local handheld data acquisition device, a local server, VR equipment, a cloud storage center, a cloud computing center, a personal client, and a personal handheld data acquisition device. The local handheld data acquisition device and the personal handheld data acquisition device are connected with the local server and the personal client, respectively, and both perform original image acquisition: the local handheld device transmits its acquired information to the local server, and the personal handheld device transmits its acquired information to the personal client. The local server and the personal client transmit the acquired original image information to the cloud computing center for computation, analysis, and processing, and upload the processed data to the cloud storage center for storage. The local server and the personal client are each electrically connected with VR equipment, and the cloud computing center transmits its analyzed and processed data back through the local server and the personal client, respectively, to the VR equipment to generate a VR display.
Preferably, the local handheld data acquisition device and the personal handheld data acquisition device each comprise an ultrasonic transducer, a 10-axis MEMS acceleration gyroscope, a Bluetooth 5.0/5G communication module, a microprocessor, and a power supply module; the ultrasonic transducer, the 10-axis MEMS acceleration gyroscope, and the Bluetooth 5.0/5G communication module are each electrically connected with the microprocessor, and the power supply module supplies power to the microprocessor. The ultrasonic transducer emits ultrasonic waves to probe human tissue; the reflected ultrasonic waves generate electrical energy through the transducer, which is transmitted to the microprocessor.
Preferably, the 10-axis MEMS acceleration gyroscope provides high-precision position-change information of the probed human tissue, including longitude, latitude, and altitude positioning data. The raw data can be encrypted at acquisition time to keep them traceable; during actual operation, images are generated together with the positioning data and transmitted to the microprocessor, so that the operator can be traced.
The microprocessor converts the electrical energy generated by the ultrasonic transducer into data, sends it together with the data of the 10-axis MEMS acceleration gyroscope to the Bluetooth 5.0/5G communication module, and wirelessly transmits it through that module to the local server or personal client for digital image processing.
Preferably: the VR equipment comprises panoramic VR equipment and a virtual touch glove;
the panoramic VR equipment comprises glasses type VR equipment and helmet type VR equipment;
an earphone and a voice module are installed in the helmet-type VR device and are used together for interaction.
Preferably: the personal client is a mobile phone, and the local server is a computer.
Preferably: and a positioning module is integrated on the personal client, so that real-time positioning can be realized.
The teaching method implemented with the panoramic VR interactive medical ultrasound digital phantom system is as follows.
Step one: during original image acquisition, a virtual human body model is created in software, the sites of each standard section are marked on the surface of each organ, and a database containing two-dimensional image data and angle data is generated;
Step two: the two-dimensional image data and angle data obtained in step one are sent to the communication module, converted into wireless data, and transmitted to the local server and then to the cloud server;
Step three: the data received by the local server in step two undergoes subsequent software processing and is stored on the local server and the cloud server;
Step four: through the system software, the section database of the corresponding body part is called up as needed for a simulated demonstration of actual operation; the resulting real-time operation data and generated image data are finally backed up via data transmission to the local server and the cloud server.
Preferably: the specific implementation method of the first step is as follows:
step a: digital modeling, namely making a virtual human body model in software, marking the sites of each standard section on the surface of each visceral organ body, and generating a database;
step b, adopting the inspection section corresponding to the database to collect data: an ultrasonic transducer of the handheld device generates ultrasonic waves to detect human tissues, the reflected ultrasonic waves generate electric energy through the transducer, and the electric energy is converted into two-dimensional image data through a multi-core microprocessor on the integrated circuit; meanwhile, the 10-axis acceleration gyroscope of the handheld device generates accurate angle data along with position change, and the gyroscope itself is used as a coordinate origin.
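For illustration only (this formula is not stated in the utility model): if the gyroscope at the coordinate origin reports roll, pitch, and yaw angles $(\varphi, \theta, \psi)$, a pixel at column $u$ and row $v$ of the two-dimensional scan plane can be mapped to a point in the virtual three-dimensional space by

\[
\mathbf{p} \;=\; R_z(\psi)\,R_y(\theta)\,R_x(\varphi)
\begin{pmatrix} u\,\Delta x \\ v\,\Delta y \\ 0 \end{pmatrix},
\]

where $\Delta x$ and $\Delta y$ are the physical pixel spacings of the two-dimensional image; the rotation order and the placement of the scan plane at $z = 0$ are assumptions made only for this sketch.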
Preferably, step two is implemented as follows:
The two-dimensional image data and angle data obtained in step one are sent by the microprocessor to the Bluetooth 5.0/5G communication module, where they are converted into wireless data and transmitted to the local server.
Preferably, step three is implemented as follows:
After the local server receives the data, the subsequent software processing flow is as follows (a minimal illustrative sketch of this flow is given after the list):
(1) correct the coordinate-origin error caused by the relative positions of the transducer and the gyroscope inside the handheld device;
(2) combine the two-dimensional image data with the angle data generated at the same time point to form coordinate-coded image data, in which each pixel of the image corresponds to a coordinate point in a virtual three-dimensional space;
(3) decompose the coordinate-coded two-dimensional image into small pixels carrying three-dimensional coordinate codes and store them independently as original data; when switching between different organs and sections, the software automatically recalibrates the gyroscope angle to zero;
(4) call up the database during actual operation to generate VR images; the resulting data are stored on the local server and uploaded to the cloud server to realize resource sharing.
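A minimal sketch of items (1) to (3) above, intended only to make the data flow concrete; the probe-to-gyroscope offset, the rotation convention (the same one assumed after step b), and the record layout are all assumptions:

```python
# Illustrative only: convert one 2-D frame plus its simultaneous angle data into
# "small pixels with three-dimensional coordinate codes". The probe/gyroscope
# offset, rotation order, and record format are assumptions.
import math
import numpy as np

PROBE_OFFSET = np.array([0.0, 0.0, 0.02])   # assumed transducer-to-gyroscope offset (m), item (1)
PIXEL_SPACING = (0.0005, 0.0005)            # assumed physical size of one pixel (m)

def rotation_matrix(roll: float, pitch: float, yaw: float) -> np.ndarray:
    """Rz(yaw) @ Ry(pitch) @ Rx(roll); angles in radians."""
    cr, sr = math.cos(roll), math.sin(roll)
    cp, sp = math.cos(pitch), math.sin(pitch)
    cy, sy = math.cos(yaw), math.sin(yaw)
    rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
    ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
    return rz @ ry @ rx

def frame_to_coded_pixels(frame: np.ndarray, angles_deg) -> np.ndarray:
    """Items (2)+(3): give every pixel of a 2-D frame a three-dimensional coordinate code."""
    rot = rotation_matrix(*(math.radians(a) for a in angles_deg))
    h, w = frame.shape
    v, u = np.mgrid[0:h, 0:w]
    # Pixel positions in the scan plane (z = 0), rotated into space and shifted
    # by the fixed offset to correct the coordinate origin, item (1).
    plane = np.stack([u * PIXEL_SPACING[0], v * PIXEL_SPACING[1],
                      np.zeros((h, w))], axis=-1)
    coords = plane @ rot.T + PROBE_OFFSET
    # Each stored record: (x, y, z, grey value).
    return np.concatenate([coords.reshape(-1, 3),
                           frame.reshape(-1, 1).astype(float)], axis=1)

coded = frame_to_coded_pixels(np.zeros((256, 256), dtype=np.uint8), angles_deg=(0.0, 15.0, 90.0))
```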
Preferably, step four is implemented as follows:
A virtual figure is called up by the software. When the handheld device reaches the corresponding section of the corresponding organ on the VR virtual human body, the software automatically recognizes it and calls up the section database for that body part. The 10-axis MEMS acceleration gyroscope of the handheld device generates data as its position changes; the data are sent to the Bluetooth 5.0/5G communication module, converted into wireless data, and transmitted to the local server or personal client. The local server uses the three-dimensional coordinate data to find the corresponding pixels carrying three-dimensional coordinate codes and arranges them at their corresponding positions to produce a visible picture composed of hundreds of small image units. Frames are then generated at 12 fps, that is, a video showing 12 pictures per second, which matches the video format conventionally stored by existing ultrasound instruments. The video is transmitted to the wearable panoramic VR equipment and a screen. Finally, the real-time operation data and generated image data are all backed up via data transmission to the local server and the cloud server.
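For illustration only, and again under assumptions not stated in the utility model (nearest-plane lookup, tolerance value, 256x256 frame), the real-time stage can be pictured as a loop that, twelve times per second, gathers the stored coordinate-coded pixels lying on the current scan plane and assembles them into a frame:

```python
# Illustrative only: rebuild a 2-D picture from stored coordinate-coded pixels and
# show it at 12 frames per second. The lookup rule and tolerance are assumptions.
import time
import numpy as np

FPS = 12
FRAME_SHAPE = (256, 256)
PIXEL_SPACING = 0.0005     # assumed physical pixel size (m)
TOLERANCE = 0.0005         # how close a stored pixel must lie to the current scan plane (m)

def render_frame(coded: np.ndarray, rot: np.ndarray, offset: np.ndarray) -> np.ndarray:
    """coded: (N, 4) array of (x, y, z, grey). Select the records lying on the
    current scan plane and place them at their corresponding picture positions."""
    frame = np.zeros(FRAME_SHAPE, dtype=np.uint8)
    # Express the stored world coordinates in the current probe frame.
    local = (coded[:, :3] - offset) @ rot          # right-multiplying by rot applies its transpose
    near = np.abs(local[:, 2]) < TOLERANCE
    u = (local[near, 0] / PIXEL_SPACING).round().astype(int)
    v = (local[near, 1] / PIXEL_SPACING).round().astype(int)
    ok = (u >= 0) & (u < FRAME_SHAPE[1]) & (v >= 0) & (v < FRAME_SHAPE[0])
    frame[v[ok], u[ok]] = coded[near][ok, 3].astype(np.uint8)
    return frame

def run_realtime(get_pose, get_coded_pixels, display, seconds: int = 1) -> None:
    """Poll the gyroscope pose, render, and display 12 pictures per second."""
    for _ in range(FPS * seconds):
        rot, offset = get_pose()                   # supplied by the handheld device stream
        display(render_frame(get_coded_pixels(), rot, offset))
        time.sleep(1.0 / FPS)
```

On a real system the fixed 1/FPS sleep would be replaced by the device's own timing, and the generated video would then be forwarded to the VR equipment or screen as described.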
The utility model has the following beneficial effects:
Existing electronic components can already meet the theoretical requirements of the system. For example, the 10-axis MEMS acceleration gyroscope offers extremely high precision (3-axis acceleration, 3-axis angular velocity, 3-axis angle, 3-axis magnetic field, 1-axis air pressure, and GPS; measuring range: acceleration ±16 g, angular velocity ±2000°/s, angle ±180°; resolution: acceleration 6.1e-5 g, angular velocity 7.6e-3°/s; stability: acceleration 0.01 g, angular velocity 0.05°/s; measurement error 0.01°). The current 5G communication peak rate reaches 1 Gbps and the components are commercially available, so the data-transmission barrier is removed, and current computing speeds fully satisfy the image-processing requirement.
The significance of choosing cloud storage and cloud computing in the utility model lies in decentralization: each local server can operate independently as well as in a networked mode, and even if a local server fails, no data is lost.
The system provided by the utility model is low in cost, simple to operate, and easy to promote and popularize. The learning process is interactive in real time and can make use of big data and cloud computing: local cases can be uploaded and downloaded everywhere, and every server and client can share resources, so that case coverage is broader, trainees master more knowledge, and multidisciplinary exchange is promoted.
Drawings
FIG. 1 is a schematic diagram of the panoramic VR interactive medical ultrasound digital phantom system;
FIG. 2 is a schematic diagram of a local version of a handheld device;
in fig. 2, 1 is an ultrasonic transducer, 2 is a gyroscope, 3 is a microprocessor, 4 is an integrated circuit, 5 is a bluetooth 5.0/5G module, 6 is a power supply and 7 is a switch;
FIG. 3 is a schematic diagram of the personal-client-version handheld device;
in FIG. 3, 2 is a gyroscope, 3 is a microprocessor, 4 is an integrated circuit, 5 is a Bluetooth 5.0/5G module, 6 is a power supply and 7 is a switch;
FIG. 4 is a schematic view of an original big data acquisition flow;
FIG. 5 is a schematic view of an image synthesis process with angular coordinate data;
FIG. 6 is a schematic diagram of a real-time imaging process for actual operation;
FIG. 7 is a structural diagram of the link between the local server and the personal client.
Detailed Description
First embodiment: described with reference to FIGS. 1 to 7 of the specification, the teaching method implemented with the panoramic VR interactive medical ultrasound digital phantom system of this embodiment comprises the following steps:
Step one: during original image acquisition, a virtual human body model is created in software, the sites of each standard section are marked on the surface of each organ, and a database containing two-dimensional image data and angle data is generated;
Step two: the two-dimensional image data and angle data obtained in step one are sent to the communication module, converted into wireless data, and transmitted to the local server and the cloud server;
Step three: the data received by the local server in step two undergoes subsequent software processing, is stored on the local server, and is transmitted to the cloud server;
Step four: through the system software, the section database of the corresponding body part is called up as needed for a simulated demonstration of actual operation; the resulting real-time operation data and generated image data are finally backed up via data transmission to the local server and the cloud server.
Second embodiment: described with reference to FIGS. 1 to 7 of the specification, in the panoramic VR interactive medical ultrasound digital phantom system of this embodiment:
Step one is implemented as follows:
Step a (digital modeling): a virtual human body model is created in software, the sites of each standard section are marked on the surface of each organ, and a database is generated;
Step b (data acquisition along the examination sections corresponding to the database): the ultrasonic transducer of the handheld device emits ultrasonic waves to probe human tissue; the reflected waves generate electrical energy in the transducer, which the multi-core microprocessor on the integrated circuit converts into two-dimensional image data; meanwhile, the 10-axis acceleration gyroscope of the handheld device generates precise angle data as the position changes, with the gyroscope itself serving as the coordinate origin.
Third embodiment: described with reference to FIGS. 1 to 7 of the specification, in the panoramic VR interactive medical ultrasound digital phantom system of this embodiment:
Step two is implemented as follows:
The two-dimensional image data and angle data obtained in step one are sent by the microprocessor to the Bluetooth 5.0/5G communication module, where they are converted into wireless data and transmitted to the local server.
Fourth embodiment: described with reference to FIGS. 1 to 7 of the specification, in the panoramic VR interactive medical ultrasound digital phantom system of this embodiment:
Step three is implemented as follows:
After the local server receives the data, the subsequent software processing flow is as follows:
(1) correct the coordinate-origin error caused by the relative positions of the transducer and the gyroscope inside the handheld device;
(2) combine the two-dimensional image data with the angle data generated at the same time point to form coordinate-coded image data, in which each pixel of the image corresponds to a coordinate point in a virtual three-dimensional space;
(3) decompose the coordinate-coded two-dimensional image into small pixels carrying three-dimensional coordinate codes and store them independently as original data; when switching between different organs and sections, the software automatically recalibrates the gyroscope angle to zero;
(4) call up the database during actual operation, and store the data obtained from generating the VR image on the local server.
Fifth embodiment: described with reference to FIGS. 1 to 7 of the specification, in the panoramic VR interactive medical ultrasound digital phantom system of this embodiment:
Step four is implemented as follows:
A virtual figure is called up by the software. When the handheld device reaches the corresponding section of the corresponding organ on the VR virtual human body, the software automatically recognizes it and calls up the section database for that body part. The 10-axis MEMS acceleration gyroscope of the handheld device generates data as its position changes; the data are sent to the Bluetooth 5.0/5G communication module, converted into wireless data, and transmitted to the local server or personal client. The local server uses the three-dimensional coordinate data to find the corresponding pixels carrying three-dimensional coordinate codes and arranges them at their corresponding positions to produce a visible picture composed of hundreds of small image units. Frames are then generated at 12 fps, that is, a video showing 12 pictures per second, which matches the video format conventionally stored by existing ultrasound instruments. The video is transmitted to the wearable panoramic VR equipment and a screen. Finally, the real-time operation data and generated image data are all backed up via data transmission to the local server and the cloud server.
Sixth embodiment: described with reference to FIGS. 1 to 7 of the specification, the panoramic VR interactive medical ultrasound digital phantom system of this embodiment comprises a local handheld data acquisition device, a local server, VR equipment, a cloud storage center, a cloud computing center, a personal client, and a personal handheld data acquisition device. The local handheld data acquisition device and the personal handheld data acquisition device are connected with the local server and the personal client, respectively, and both perform original image acquisition: the local handheld device transmits its acquired information to the local server, and the personal handheld device transmits its acquired information to the personal client. The local server and the personal client transmit the acquired original image information to the cloud computing center for computation, analysis, and processing, and upload the processed data to the cloud storage center for storage. The local server and the personal client are each electrically connected with VR equipment, and the cloud computing center transmits its analyzed and processed data back through the local server and the personal client to the VR equipment to generate a VR display.
Seventh embodiment: described with reference to FIGS. 1 to 7 of the specification, the panoramic VR interactive medical ultrasound digital phantom system of this embodiment comprises a local handheld device and a personal-client handheld device:
1) The local handheld device (FIG. 2) is a palm-sized ultrasound instrument with a positioning function, comprising a transducer, a Bluetooth 5.0/5G module, a 10-axis MEMS acceleration gyroscope, a highly integrated circuit board (containing a multi-core microprocessor), a power supply, and a switch. Working principle: A. Original data acquisition: the transducer converts electrical energy into ultrasonic waves and converts received ultrasonic waves back into electrical energy; the gyroscope generates data (including time, acceleration, angle, angular velocity, longitude and latitude, magnetic field, air pressure, altitude, and so on) as the probe position changes; the microprocessor converts the electrical energy generated by the transducer into data, sends it together with the gyroscope data to the Bluetooth 5.0/5G communication module, and wirelessly transmits it through that module to the image digital processing equipment. The handheld device differs essentially from a traditional ultrasound probe. First, it is equivalent to a traditional ultrasound diagnostic apparatus without a display: a traditional probe is wired and contains only a transducer, so the electrical energy converted from the ultrasound is sent to a host for computation into data, whereas this handheld device is wireless and completes the conversion from electrical energy into data on the device itself. Second, a 10-axis MEMS acceleration gyroscope is added, which a traditional probe lacks: the gyroscope provides high-precision position-change information and generates three-dimensional spatial position data to be matched with the images, while the GPS provides positioning so that unique, address-tagged data are produced. With blockchain technology applied to storage, the data cannot be tampered with and remain traceable, which protects the intellectual property of case uploaders, facilitates supervision of trainees' study, and reduces the possibility of cheating in examinations. B. Simulation practice or examination stage: the transducer does not work and only the gyroscope is needed, which reduces transducer wear and prolongs the service life of the handheld device;
2) The personal-client handheld device (FIG. 3) is a virtual palm-sized ultrasound device comprising a Bluetooth 5.0/5G module, a 10-axis acceleration gyroscope, a highly integrated circuit board (containing a multi-core microprocessor), a power supply, and a switch. Its core component is the 10-axis acceleration gyroscope, which generates data (including time, acceleration, angle, angular velocity, longitude and latitude, magnetic field, air pressure, altitude, and so on) as the probe position changes; the microprocessor sends the gyroscope data to the Bluetooth 5.0/5G module, which wirelessly transmits it to the mobile phone. Because of its positioning function, it can also be used for student attendance.
The 10-axis MEMS acceleration gyroscope provides high-precision position-change information of the probed human tissue, generates three-dimensional spatial position data, and transmits it to the microprocessor;
the microprocessor converts the electrical energy generated by the ultrasonic transducer into data, sends it together with the data of the 10-axis MEMS acceleration gyroscope to the Bluetooth 5.0/5G communication module, and wirelessly transmits it through that module to the local server and the personal client, which serve as the image digital processing equipment:
a) Local-server image digital processing equipment: this is implemented mainly with computer post-processing software and differs greatly from the host of a traditional ultrasound instrument; a well-configured personal computer is sufficient, and big-data exchange is realized mainly by installing image processing software and connecting the cloud storage function. Working principle: A. Original data acquisition stage: the image processing software matches the image data with the three-dimensional spatial position data to generate new data, stores basic data (normal anatomical image data and trainees' personal archive data) locally and uploads it to the cloud storage space, and uploads case data and examination data directly to the cloud storage space to reduce unnecessary occupation of local storage, so that the operating speed of the local server is not affected by an excessive data volume and the risk of data loss from a local server failure is reduced. B. Simulation practice or examination stage: a VR digital phantom is built into the local server software, and the trainee operates on this digital phantom; the server collects the position (section) data transmitted by the handheld device, compares it through post-processing, and passes the image data generated with the position changes to the VR software, which generates images and sends them to the VR equipment. When VR equipment is unavailable, or for users who cannot tolerate it, the data can instead be sent to a local display for learning and examination. Images and videos generated during learning and examinations are stored automatically to facilitate later feedback; an illustrative sketch of the section-matching step is given after item b).
b) Personal-client image digital processing equipment: this is mainly an APP on the mobile phone. The phone provides an interactive VR function and can also display directly on its own screen; the software's algorithms and functions are consistent with those of the local server software. It supports phone-to-handheld-device interaction, trainee-expert interaction, and expert consultation on difficult cases, improving trainees' overall knowledge structure and practical ability.
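The following sketch illustrates only the section-matching idea from item a) above (comparing the handheld device's pose with the marked standard-section sites); the SectionSite structure, the tolerance values, and the scoring rule are assumptions, not the software described in the utility model:

```python
# Illustrative only: match the handheld device's pose against the marked standard-
# section sites and return the section whose database should be called up.
# The SectionSite fields, tolerances, and scoring rule are assumptions.
import math
from dataclasses import dataclass

@dataclass
class SectionSite:
    organ: str
    section: str
    position: tuple   # (x, y, z) of the marked site on the virtual body surface (m)
    angles: tuple     # reference probe angles for this standard section (deg)

def match_section(probe_pos, probe_angles, sites, pos_tol=0.02, ang_tol=15.0):
    """Return the closest matching standard section, or None if nothing is in tolerance."""
    best, best_score = None, float("inf")
    for site in sites:
        d = math.dist(probe_pos, site.position)
        a = max(abs(p - r) for p, r in zip(probe_angles, site.angles))
        score = d + a / 1000.0        # weight position distance far above angle error
        if d < pos_tol and a < ang_tol and score < best_score:
            best, best_score = site, score
    return best

# Example: two marked sites; the probe pose is near the first one.
sites = [SectionSite("liver", "right intercostal oblique section", (0.10, 0.05, 0.00), (0.0, 30.0, 90.0)),
         SectionSite("heart", "parasternal long-axis section", (0.02, 0.12, 0.00), (10.0, 20.0, 80.0))]
hit = match_section((0.105, 0.048, 0.001), (2.0, 28.0, 92.0), sites)
# hit.organ == "liver"; the server would then call up that section's image database.
```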
Eighth embodiment: described with reference to FIGS. 1 to 7 of the specification, in the panoramic VR interactive medical ultrasound digital phantom system of this embodiment the wearable VR equipment comprises panoramic VR equipment and virtual touch gloves, the panoramic VR equipment comprising glasses-type VR equipment and helmet-type VR equipment; an earphone and a voice module are installed in the helmet-type VR device and are used together for interaction.
In specific use, it can be divided into a VR head-mounted display and VR auxiliary equipment: the VR head-mounted display is the same as the local-version wearable panoramic VR equipment, while the VR auxiliary equipment is a VR phone box plus a wireless handle (Bluetooth- or infrared-based). The former offers more realistic interaction and better immersion; the latter is inexpensive enough for ordinary students to afford, but the phone screen then shows only a flat display.
Ninth embodiment: described with reference to FIGS. 1 to 7 of the specification, in the panoramic VR interactive medical ultrasound digital phantom system of this embodiment the personal client is a mobile phone and the local server is a computer.
Tenth embodiment: described with reference to FIGS. 1 to 7 of the specification, in the panoramic VR interactive medical ultrasound digital phantom system of this embodiment a positioning module is integrated in the personal client, enabling real-time positioning.

Claims (5)

1. A panoramic VR interactive medical ultrasound digital phantom system, characterized in that: the system comprises a local handheld data acquisition device, a local server, VR equipment, a cloud storage center, a cloud computing center, a personal client, and a personal handheld data acquisition device; the local handheld data acquisition device and the personal handheld data acquisition device are connected with the local server and the personal client, respectively, and both perform original image acquisition; the local handheld data acquisition device transmits its acquired information to the local server, and the personal handheld data acquisition device transmits its acquired information to the personal client; the local server and the personal client transmit the acquired original image information to the cloud computing center for computation, analysis, and processing, and upload the processed data to the cloud storage center for storage; the local server and the personal client are each electrically connected with the VR equipment, and the cloud computing center transmits its analyzed and processed data through the local server and the personal client to the VR equipment to generate a VR display.
2. The panoramic VR interactive medical ultrasound digital phantom system of claim 1, wherein: the local handheld data acquisition device and the personal handheld data acquisition device each comprise an ultrasonic transducer, a 10-axis MEMS acceleration gyroscope, a Bluetooth 5.0/5G communication module, a microprocessor, and a power supply module; the ultrasonic transducer, the 10-axis MEMS acceleration gyroscope, and the Bluetooth 5.0/5G communication module are each electrically connected with the microprocessor, and the power supply module supplies power to the microprocessor; the ultrasonic transducer emits ultrasonic waves to probe human tissue, and the reflected ultrasonic waves generate electrical energy through the transducer, which is transmitted to the microprocessor;
the 10-axis MEMS acceleration gyroscope provides high-precision position-change information of the probed human tissue, generates three-dimensional spatial position data, and transmits it to the microprocessor;
the microprocessor converts the electrical energy generated by the ultrasonic transducer into data, sends it together with the data of the 10-axis MEMS acceleration gyroscope to the Bluetooth 5.0/5G communication module, and wirelessly transmits it through that module to the local server and the personal client for digital image processing.
3. The panoramic VR interactive medical ultrasound digital phantom system of claim 1, wherein: the VR equipment comprises panoramic VR equipment and virtual touch gloves;
the panoramic VR equipment comprises glasses-type VR equipment and helmet-type VR equipment;
an earphone and a voice module are installed in the helmet-type VR device and are used together for interaction.
4. The panoramic VR interactive medical ultrasound digital phantom system of claim 1, wherein: the personal client is a mobile phone, and the local server is a computer.
5. The panoramic VR interactive medical ultrasound digital phantom system of claim 4, wherein: a positioning module is integrated in the personal client, enabling real-time positioning.
CN202021005258.8U 2020-06-04 2020-06-04 Panoramic VR interactive medical ultrasonic digital phantom system Active CN212181696U (en)

Priority Applications (1)

Application number: CN202021005258.8U; publication: CN212181696U (en); priority date: 2020-06-04; filing date: 2020-06-04; title: Panoramic VR interactive medical ultrasonic digital phantom system


Publications (1)

Publication number: CN212181696U; publication date: 2020-12-18

Family

ID=73789569

Family Applications (1)

Application number: CN202021005258.8U (Active); publication: CN212181696U (en); priority date: 2020-06-04; filing date: 2020-06-04; title: Panoramic VR interactive medical ultrasonic digital phantom system

Country Status (1)

Country Link
CN (1) CN212181696U (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111489608A (en) * 2020-06-04 2020-08-04 赵连蒙 Panoramic VR interactive medical ultrasonic digital phantom system teaching method



Legal Events

Date Code Title Description
GR01 Patent grant