CN113888723A - Ultrahigh-definition diagnosis-level medical data MR panoramic display system and method - Google Patents

Ultrahigh-definition diagnosis-level medical data MR panoramic display system and method

Info

Publication number
CN113888723A
Authority
CN
China
Prior art keywords
unit
consultation
medical data
display
space
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111163320.5A
Other languages
Chinese (zh)
Inventor
张吕峥
张翔
王鑫鑫
石成刚
杨丽娜
周海晓
陈郑
丁鑫
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Pioneering Huikang Technology Co ltd
Original Assignee
Pioneering Huikang Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Pioneering Huikang Technology Co., Ltd.
Priority to CN202111163320.5A
Publication of CN113888723A
Legal status: Pending

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00Manipulating 3D models or images for computer graphics
    • G06T19/006Mixed reality
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/012Head tracking input arrangements
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/014Hand-worn input/output arrangements, e.g. data gloves
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/003D [Three Dimensional] image rendering
    • G06T15/005General purpose rendering architectures
    • GPHYSICS
    • G16INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16HHEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H10/00ICT specially adapted for the handling or processing of patient-related medical or healthcare data
    • G16H10/60ICT specially adapted for the handling or processing of patient-related medical or healthcare data for patient-specific data, e.g. for electronic patient records
    • GPHYSICS
    • G16INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16HHEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H80/00ICT specially adapted for facilitating communication between medical practitioners or patients, e.g. for collaborative diagnosis, therapy or health monitoring

Abstract

The invention discloses an ultra-high-definition, diagnosis-level medical data MR panoramic display system and method. The MR panoramic display system comprises a consultation workstation, an MR head-mounted display, control gloves and space tracking locators. The consultation workstation is an ordinary medical computer with a built-in remote consultation system; the MR head-mounted display connects to the consultation workstation wirelessly via USB, and the control gloves and space tracking locators connect to the MR head-mounted display. During remote consultation, the system and method let a doctor enter an efficient, convenient and immersive collaborative diagnosis scene through mixed reality technology. This mode can effectively help doctors make accurate remote diagnoses, reduce the misdiagnosis rate for difficult and critical cases, and improve a hospital's diagnosis and teaching level for such cases.

Description

Ultrahigh-definition diagnosis-level medical data MR panoramic display system and method
Technical Field
The invention relates to the technical fields of remote consultation and Mixed Reality (MR), and in particular to an ultra-high-definition diagnosis-level medical data MR panoramic display system and method.
Background
Mixed Reality (MR) combines the advantages of Virtual Reality (VR) and Augmented Reality (AR): it blends the real world and the virtual world into a new visual environment, allowing AR content to be presented more effectively.
Therefore, for complex, difficult and critical cases, mixed reality technology can improve the accuracy, convenience and operability of medical data review during remote consultation, letting doctors make collaborative judgments as if present at the scene and addressing the poor diagnostic experience of current remote consultations.
In addition, the fifth-generation mobile communication technology (5G) provides the network guarantee for remote collaborative diagnosis and treatment of critical illness. Its breakthroughs in network rate, capacity, latency and security support multi-scene medical applications and multi-party, multi-source data sharing, so that transfers of large files such as imaging studies, dynamic ultrasound and digital pathology are no longer limited by network conditions.
Disclosure of Invention
The invention provides an ultra-high-definition diagnosis-level medical data MR panoramic display system and method that realize the remote consultation scene described below.
An ultra-high-definition diagnosis-level medical data MR panoramic display system comprises a consultation workstation, an MR head-mounted display, a control glove and a space tracking positioner.
The consultation workstation is an ordinary medical computer with a built-in remote consultation system. It can acquire in-hospital medical data through interfaces, runs the usual remote consultation workflow, and adapts the display to the MR environment. An existing remote consultation system is used.
The MR head-mounted display connects to the consultation workstation wirelessly via USB; the control glove and the space tracking locators connect to the MR head-mounted display.
The MR head-mounted display comprises a mixed reality display unit, a medical data processing unit, an audio-video interaction unit, an AR rendering unit, a VR processing unit, an interaction control receiving unit and a display space positioning unit.
The mixed reality display unit presents the processed dynamic or static images on the head-mounted display screen through MR display technology;
the medical data processing unit handles lossless medical data. The lossless medical data comprise dynamic imaging studies, other static medical images, and structured medical data. The medical data processing unit passes structured medical data directly to the mixed reality display unit, and passes dynamic or static images to the AR rendering unit and the VR processing unit;
the audio and video interaction unit supports communication between the participating doctors during remote consultation;
the AR rendering unit and the VR processing unit post-process and superpose the received dynamic imaging studies and other static medical images, which are finally presented on the screen of the mixed reality image display unit in mixed reality form;
the interactive control receiving unit, based on Vive Lighthouse (lighthouse laser positioning technology), receives and processes the head movement digital signals from the display space positioning unit and the hand movement and operation digital signals from the control glove, and computes position and posture information;
the display space positioning unit contains an inertial sensor and several photosensitive sensors. The inertial sensor measures rotation about the X, Y and Z axes; the photosensitive sensors, working with the space tracking locators, measure movement along the X, Y and Z axes. Spatial coordinates are formed by a spatial positioning algorithm, so head movement can be tracked to assist manipulation.
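As an illustrative aside (not part of the claimed system), the positioning unit's two measurement sources can be sketched in code: the inertial sensor supplies three-axis rotation, the photosensitive sensors supply three-axis movement, and a spatial positioning algorithm combines them into one spatial coordinate. All names here (`Pose`, `fuse_pose`) are hypothetical; the patent does not specify the algorithm.

```python
from dataclasses import dataclass

@dataclass
class Pose:
    """6-DOF pose: XYZ rotation (degrees) from the inertial sensor,
    XYZ translation (metres) from the photosensitive sensors."""
    rx: float
    ry: float
    rz: float
    tx: float
    ty: float
    tz: float

def fuse_pose(imu_rotation, optical_translation):
    """Combine the two measurement sources into one spatial coordinate:
    the inertial sensor contributes the three-axis rotation values, and the
    photosensitive sensors (working with the space tracking locators)
    contribute the three-axis movement values."""
    rx, ry, rz = imu_rotation
    tx, ty, tz = optical_translation
    return Pose(rx, ry, rz, tx, ty, tz)

# Head pose at one sampling instant (the values are made up):
head = fuse_pose((5.0, -2.0, 0.5), (1.2, 1.6, 0.9))
```

In practice the two sources would be filtered and time-aligned rather than simply concatenated; this sketch only shows the division of labour between the sensors.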
The control glove comprises a glove space positioning unit, a control unit and an interactive control output unit. The glove space positioning unit contains an inertial sensor and several photosensitive sensors; the inertial sensor measures rotation about the X, Y and Z axes, and the photosensitive sensors, working with the space tracking locators, measure movement along the X, Y and Z axes. Spatial coordinates are formed by a spatial positioning algorithm, so hand movement can be tracked to assist manipulation;
the control unit drives the glove's fingertip buttons and, together with the glove space positioning unit, implements manipulation, interaction, annotation and mapping of the mixed reality virtual scene;
the interactive control output unit converts the control unit's operation actions into digital signals and outputs them to the interactive control receiving unit of the head-mounted display.
The space tracking locator contains a built-in infrared LED lamp and two mutually perpendicular rotating infrared laser emitters. The two emitters operate on a 20-millisecond cycle: 10 milliseconds for the X-axis sweep and 10 milliseconds for the Y-axis sweep. At the start of each cycle the infrared LED flashes, synchronizing every photosensitive sensor in the consultation room; the two emitters then sweep the whole room in turn. The photosensitive sensors in the display space positioning unit and the glove space positioning unit measure the times at which the X-axis and Y-axis laser sweeps arrive, and the head and hand motion trajectories are computed from the positional differences between the sensors.
Furthermore, two space tracking locators are arranged at diagonal corners of the consultation room and together form a lighthouse laser space.
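The 20-millisecond sweep cycle admits a simple timing model: after the synchronizing flash, a sensor hit t milliseconds into a 10-millisecond sweep sees the laser plane at an angle proportional to t. The sketch below assumes each sweep covers a 180-degree arc, which the text does not state; the function and constant names are hypothetical.

```python
CYCLE_MS = 20.0      # one full cycle: X-axis sweep then Y-axis sweep
SWEEP_MS = 10.0      # each axis sweep lasts 10 milliseconds
SWEEP_DEG = 180.0    # assumed arc covered by one sweep (not stated in the text)

def sweep_angle_deg(arrival_ms, axis):
    """Angle of the rotating laser plane when it reached a photosensitive
    sensor, measured from the start of that axis's sweep.
    arrival_ms: time since the synchronizing LED flash;
    axis: 'x' (0-10 ms window) or 'y' (10-20 ms window)."""
    offset = 0.0 if axis == 'x' else SWEEP_MS
    t = arrival_ms - offset
    if not 0.0 <= t <= SWEEP_MS:
        raise ValueError("arrival time outside this axis's sweep window")
    return SWEEP_DEG * t / SWEEP_MS

# A sensor hit 2.5 ms after the flash (X sweep) and 14 ms after it (Y sweep):
ax = sweep_angle_deg(2.5, 'x')   # 45.0 degrees
ay = sweep_angle_deg(14.0, 'y')  # 72.0 degrees
```

Two such angle pairs from emitters at known diagonal positions are what let the system triangulate each sensor's position in the room.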
The invention has the following beneficial effects:
With this system, a doctor can enter an efficient, convenient and immersive collaborative diagnosis scene through mixed reality technology during remote consultation. This mode can effectively help doctors make accurate remote diagnoses, reduce the misdiagnosis rate for difficult and critical cases, and raise a hospital's diagnosis, teaching and research level for such cases.
Drawings
Fig. 1 is a schematic structural diagram of an apparatus according to an embodiment of the present invention.
Fig. 2 is a schematic view of a panoramic medical data display method according to an embodiment of the present invention.
In the drawings: 1-a consultation workstation, 11-a remote consultation system, 2-an MR head-mounted display, 21-a medical data processing unit, 22-an audio and video interaction unit, 23-an AR rendering unit, 24-a VR processing unit, 25-an interaction control receiving unit, 26-a mixed reality image display unit, 27-a display space positioning unit, 3-a control glove, 31-a glove space positioning unit, 32-a control unit, 33-an interaction control output unit and 4-a space tracking positioner.
Detailed Description
To make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments are described below clearly and completely. The described embodiments are some, but not all, embodiments of the invention. All other embodiments obtained by a person skilled in the art from these embodiments without inventive effort fall within the scope of the invention.
As shown in fig. 1-2, an MR panoramic display system for ultra-high definition diagnostic medical data includes a consultation workstation 1, an MR head-mounted display 2, a control glove 3, and a spatial tracking locator 4.
The consultation workstation 1 is an ordinary medical computer with a built-in remote consultation system 11. It can acquire in-hospital medical data through interfaces, runs the usual remote consultation workflow, and adapts the display to the MR environment. The remote consultation system 11 is an existing remote consultation system.
The MR head mounted display 2 includes a mixed reality display unit, a medical data processing unit 21, an audio-video interaction unit 22, an AR rendering unit 23, a VR processing unit 24, an interaction control receiving unit 25, and a display spatial localization unit 27.
The mixed reality display unit presents the processed dynamic or static images on the head-mounted display screen through MR display technology;
the medical data processing unit 21 handles lossless medical data, which comprise dynamic imaging studies, other static medical images, and structured medical data. The medical data processing unit 21 passes structured medical data directly to the mixed reality display unit, and passes dynamic or static images to the AR rendering unit 23 and the VR processing unit 24;
the audio and video interaction unit 22 supports communication between the participating doctors during remote consultation;
the AR rendering unit 23 and the VR processing unit 24 post-process and superpose the received dynamic imaging studies and other static medical images, which are finally presented on the screen of the mixed reality image display unit 26 in mixed reality form;
the interactive control receiving unit 25, based on Vive Lighthouse (lighthouse laser positioning technology), receives and processes the head movement digital signals from the display space positioning unit 27 and the hand movement and operation digital signals from the control glove 3, and computes position and posture information;
the display space positioning unit 27 contains an inertial sensor and several photosensitive sensors. The inertial sensor measures rotation about the X, Y and Z axes; the photosensitive sensors, working with the space tracking locator 4, measure movement along the X, Y and Z axes. Spatial coordinates are formed by a spatial positioning algorithm, so head movement can be tracked to assist manipulation.
The control glove 3 comprises a glove space positioning unit 31, a control unit 32 and an interactive control output unit 33. The glove space positioning unit 31 contains an inertial sensor and several photosensitive sensors; the inertial sensor measures rotation about the X, Y and Z axes, and the photosensitive sensors, working with the space tracking locator 4, measure movement along the X, Y and Z axes. Spatial coordinates are formed by a spatial positioning algorithm, so hand movement can be tracked to assist manipulation. The control unit 32 drives the fingertip buttons of the control glove 3 and, together with the glove space positioning unit 31, implements manipulation, interaction, annotation and mapping of the mixed reality virtual scene. The interactive control output unit 33 converts the operation actions of the control unit 32 into digital signals and outputs them to the interactive control receiving unit 25 of the head-mounted display.
The space tracking locator 4 contains a built-in infrared LED lamp and two mutually perpendicular rotating infrared laser emitters. The two emitters operate on a 20-millisecond cycle: 10 milliseconds for the X-axis sweep and 10 milliseconds for the Y-axis sweep. At the start of each cycle the infrared LED flashes, synchronizing every photosensitive sensor in the consultation room; the two emitters then sweep the whole room in turn. The photosensitive sensors in the display space positioning unit 27 and the glove space positioning unit 31 measure the times at which the X-axis and Y-axis laser sweeps arrive, and the head and hand motion trajectories are computed from the positional differences between the sensors.
The two space tracking locators 4 are arranged at diagonal corners of the consultation room and together form a lighthouse laser space.
Unlike traditional remote consultation display methods, this high-definition diagnosis-level medical data panoramic display method based on MR technology achieves high-speed transmission, extremely low latency, real-time synchronization and an immersive presentation of the medical data, while keeping the data lossless at high definition.
An MR panoramic display method for ultra-high definition diagnostic medical data comprises the following steps:
step (1): the system obtains the consultation patient case from the remote consultation system 11 based on the Web environment, processes the consultation patient case through the medical data processing unit 21, and displays the lossless dynamic and static medical image and the structured medical data in the consultation case in a classified mode.
Step (2): the case function is shared or mapped by the remote consultation system 11, the case is presented at both parties of the consultation, and the case is presented on the head-mounted display screen through the mixed reality image display unit 26.
And (3): the doctors of both parties use the voice or the control unit 32 of the control glove 3 to switch and browse the case files through the audio-video interaction unit 22.
And (4): when a marking command is required to be carried out on the focus part, the operation glove 3 is used for carrying out operation and marking.
And (5): the steering action is based on the annotation commands formed by the spatial tracking localizer 4 deployed in the consultation room and the spatial localization unit inside the MR head-mounted display 2, the steering glove 3. The marking command is obtained according to the rigid body posture and the motion track numerical value of the infrared laser positioning system.
And (6): the annotation command is converted into an AR annotation parameter by the AR rendering unit 23.
And (7): the AR rendering unit 23 and the VR processing unit 24 are used for synchronously superposing and correcting the AR labeling parameters and the VR images to form mixed reality images capable of being labeled in real time.
And (8): the mixed reality image is finally presented on the MR head mounted display 2 screen by the mixed reality image display unit 26.
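Steps (4)-(8) above amount to a small pipeline: a glove gesture becomes an annotation command, the command becomes AR annotation parameters, and the parameters are composited onto the VR image. The following is a minimal, hypothetical sketch of that flow; the patent does not define these data structures, and the projection used here is a placeholder.

```python
from dataclasses import dataclass

@dataclass
class AnnotationCommand:
    """Step (5): derived from rigid-body pose and motion-trajectory values."""
    position: tuple   # (x, y, z) in consultation-room space
    label: str

@dataclass
class ARAnnotation:
    """Step (6): AR annotation parameters handed to the rendering unit."""
    screen_xy: tuple
    text: str

def to_ar_params(cmd):
    # Hypothetical projection: drop the depth axis to get screen coordinates.
    x, y, _z = cmd.position
    return ARAnnotation(screen_xy=(x, y), text=cmd.label)

def composite(vr_frame, ann):
    # Step (7): superpose the AR annotation onto the VR image; a frame is
    # modelled here as a plain list of drawable items.
    return vr_frame + [("annotation", ann.screen_xy, ann.text)]

# Steps (4)-(8) end to end for one annotation:
frame = composite(
    [("ct_slice", 12)],
    to_ar_params(AnnotationCommand(position=(0.4, 0.7, 1.1), label="lesion")),
)
```

A real implementation would project through the head pose and camera intrinsics and render into the mixed reality image display unit; the sketch only fixes the order of the steps.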

Claims (6)

1. An ultra-high-definition diagnosis-level medical data MR panoramic display system, characterized by comprising a consultation workstation, an MR head-mounted display, a control glove and a space tracking locator;
the consultation workstation is an ordinary medical computer with a built-in remote consultation system; it can acquire in-hospital medical data through interfaces, runs the usual remote consultation workflow, and adapts the display to the MR environment; the remote consultation system is an existing remote consultation system;
the MR head-mounted display connects to the consultation workstation wirelessly via USB, and the control glove and the space tracking locator connect to the MR head-mounted display.
2. The system of claim 1, wherein the MR head mounted display comprises a mixed reality display unit, a medical data processing unit, an audio-video interaction unit, an AR rendering unit, a VR processing unit, an interaction control receiving unit, and a display spatial positioning unit;
the mixed reality display unit presents the processed dynamic or static image on a head-mounted display screen through an MR display technology;
the medical data processing unit handles lossless medical data; the lossless medical data comprise dynamic imaging studies, other static medical images, and structured medical data; the medical data processing unit passes structured medical data directly to the mixed reality display unit and passes dynamic or static images to the AR rendering unit and the VR processing unit;
the audio and video interaction unit supports communication between the participating doctors during remote consultation;
the AR rendering unit and the VR processing unit post-process and superpose the received dynamic imaging studies and other static medical images, which are finally presented on the screen of the mixed reality image display unit in mixed reality form;
the interactive control receiving unit, based on lighthouse laser positioning technology, receives and processes the head movement digital signals from the display space positioning unit and the hand movement and operation digital signals from the control glove, and computes position and posture information;
the display space positioning unit contains an inertial sensor and several photosensitive sensors; the inertial sensor measures rotation about the X, Y and Z axes; the photosensitive sensors, working with the space tracking locator, measure movement along the X, Y and Z axes; spatial coordinates are formed by a spatial positioning algorithm, so head movement can be tracked to assist manipulation.
3. The ultra-high-definition diagnosis-level medical data MR panoramic display system according to claim 2, wherein the control glove comprises a glove space positioning unit, a control unit and an interactive control output unit; the glove space positioning unit contains an inertial sensor and several photosensitive sensors; the inertial sensor measures rotation about the X, Y and Z axes; the photosensitive sensors, working with the space tracking locator, measure movement along the X, Y and Z axes; spatial coordinates are formed by a spatial positioning algorithm, so hand movement can be tracked to assist manipulation;
the control unit drives the glove's fingertip buttons and, together with the glove space positioning unit, implements manipulation, interaction, annotation and mapping of the mixed reality virtual scene;
the interactive control output unit converts the control unit's operation actions into digital signals and outputs them to the interactive control receiving unit of the head-mounted display.
4. The ultra-high-definition diagnosis-level medical data MR panoramic display system according to claim 3, wherein the space tracking locator comprises a built-in infrared LED lamp and two mutually perpendicular rotating infrared laser emitters; the two emitters operate on a 20-millisecond cycle, 10 milliseconds for the X-axis sweep and 10 milliseconds for the Y-axis sweep; at the start of each cycle the infrared LED flashes, synchronizing every photosensitive sensor in the consultation room; the two emitters then sweep the whole room in turn; the photosensitive sensors in the display space positioning unit and the glove space positioning unit measure the times at which the X-axis and Y-axis laser sweeps arrive, and the head and hand motion trajectories are computed from the positional differences between the sensors.
5. The ultra-high-definition diagnosis-level medical data MR panoramic display system according to claim 4, wherein there are two space tracking locators, arranged at diagonal corners of the consultation room, which together form a lighthouse laser space.
6. An ultra-high-definition diagnosis-level medical data MR panoramic display method, characterized by comprising the following steps:
step (1): the system obtains the consultation case from a Web-based remote consultation system, processes it through a medical data processing unit, and presents the lossless dynamic and static medical images and the structured medical data in the case in categorized form;
step (2): through the sharing or screen-mapping function of the remote consultation system, the case is presented to both parties of the consultation and shown on the MR head-mounted display screen through a mixed reality image display unit;
step (3): the doctors on both sides use voice or the control unit of the control glove, via the audio and video interaction unit, to switch between and read the case files;
step (4): when a lesion area needs to be annotated, the control glove is used to manipulate and annotate it;
step (5): the manipulation actions and annotation commands are formed by the space tracking locators deployed in the consultation room together with the spatial positioning units inside the MR head-mounted display and the control glove; the annotation command is derived from the rigid-body pose and motion-trajectory values of the infrared laser positioning system;
step (6): the annotation command is converted into AR annotation parameters by the AR rendering unit;
step (7): the AR rendering unit and the VR processing unit synchronously superpose and correct the AR annotation parameters and the VR image, forming a mixed reality image that can be annotated in real time;
step (8): the mixed reality image is finally presented on the MR head-mounted display screen through the mixed reality image display unit.
CN202111163320.5A 2021-09-30 2021-09-30 Ultrahigh-definition diagnosis-level medical data MR panoramic display system and method Pending CN113888723A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111163320.5A CN113888723A (en) 2021-09-30 2021-09-30 Ultrahigh-definition diagnosis-level medical data MR panoramic display system and method

Publications (1)

Publication Number Publication Date
CN113888723A true CN113888723A (en) 2022-01-04

Family

ID=79005076

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111163320.5A Pending CN113888723A (en) 2021-09-30 2021-09-30 Ultrahigh-definition diagnosis-level medical data MR panoramic display system and method

Country Status (1)

Country Link
CN (1) CN113888723A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116476100A (en) * 2023-06-19 2023-07-25 兰州空间技术物理研究所 Remote operation system of multi-branch space robot
CN116994720A (en) * 2023-08-09 2023-11-03 上海涞秋医疗科技有限责任公司 Image remote communication collaboration system based on XR technology


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination