NL2034443B1 - Ultrasonic display system and method based on augmented reality - Google Patents

Ultrasonic display system and method based on augmented reality

Info

Publication number: NL2034443B1
Authority: NL (Netherlands)
Prior art keywords: ultrasonic, image, real, reality scene, reality
Application number: NL2034443A
Other languages: Dutch (nl)
Other versions: NL2034443A (en)
Inventors: Chen Yi, Lai Jiangming, Xie Guojin
Original Assignee: Binhaiwan Central Hospital Of Dongguan
Application filed by Binhaiwan Central Hospital Of Dongguan
Publication of NL2034443A
Application granted
Publication of NL2034443B1


Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/08 Detecting organic movements or changes, e.g. tumours, cysts, swellings
    • A61B 8/0833 Detecting organic movements or changes involving detecting or locating foreign bodies or organic structures
    • A61B 8/0841 Detecting or locating foreign bodies or organic structures for locating instruments
    • A61B 8/0891 Detecting organic movements or changes for diagnosis of blood vessels
    • A61B 8/46 Ultrasonic, sonic or infrasonic diagnostic devices with special arrangements for interfacing with the operator or the patient
    • A61B 8/461 Displaying means of special interest
    • A61B 8/462 Displaying means of special interest characterised by constructional features of the display
    • A61B 8/52 Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/5292 Devices using additional data, e.g. patient information, image labeling, acquisition parameters
    • A61B 17/00 Surgical instruments, devices or methods, e.g. tourniquets
    • A61B 17/34 Trocars; Puncturing needles
    • A61B 17/3403 Needle locating or guiding means
    • A61B 2017/3413 Needle locating or guiding means guided by ultrasound
    • A61B 90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B 90/36 Image-producing devices or illumination devices not otherwise provided for
    • A61B 90/361 Image-producing devices, e.g. surgical cameras
    • A61B 90/37 Surgical systems with images on a monitor during operation
    • A61B 2090/364 Correlation of different images or relation of image positions in respect to the body
    • A61B 2090/365 Correlation of different images: augmented reality, i.e. correlating a live optical image with another image
    • A61B 2090/378 Surgical systems with images on a monitor during operation using ultrasound
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F 3/014 Hand-worn input/output arrangements, e.g. data gloves

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Surgery (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Molecular Biology (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Public Health (AREA)
  • Pathology (AREA)
  • Veterinary Medicine (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Physics & Mathematics (AREA)
  • Radiology & Medical Imaging (AREA)
  • Biophysics (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Gynecology & Obstetrics (AREA)
  • Ultrasonic Diagnosis Equipment (AREA)

Abstract

An ultrasonic display system and method based on augmented reality are provided. The system includes an ultrasonic detection device and an augmented reality device. The ultrasonic detection device includes an ultrasonic probe for real-time ultrasonic scanning and an ultrasonic imaging module for generating and outputting an ultrasonic image. The augmented reality device is configured for acquiring a reality scene image in a visual space and receiving the ultrasonic image in real time, displaying the ultrasonic image in the reality scene image, and presenting the reality scene image with the ultrasonic image to a user. Accordingly, a user wearing the AR device may simultaneously observe the real scene and the real-time ultrasonic image, making the puncture procedure visualizable without frequent shifts of the visual field. This helps the user judge the puncture site and thus effectively improves the accuracy and efficiency of the puncture.

Description

ULTRASONIC DISPLAY SYSTEM AND METHOD BASED ON
AUGMENTED REALITY
FIELD OF THE INVENTION
[0001] The application relates to the technical field of medical equipment, in particular to an ultrasonic display system and a method based on augmented reality.
BACKGROUND OF THE INVENTION
[0002] At present, in the clinical diagnosis and treatment processes of dialysis, extracorporeal circulation support, rapid fluid replenishment, and hemodynamic monitoring, it is often necessary to puncture arteriovenous vessels (the internal jugular artery and vein, the femoral artery and vein, etc.) in various parts of the patient.
Commonly, an ultrasonic image acquired by an ultrasonic display device is used to determine the approximate position and depth of the blood vessels, and the echo image of the puncture needle is followed to guide the puncture.
However, such a manner has the following disadvantages.
[0003] The ultrasonic image is displayed on the screen of the ultrasonic display device as a two-dimensional picture, and there is no image for spatial positioning at the puncture site. It is therefore difficult for the operator to accurately control the puncture angle and depth. The operator must frequently shift the visual field between the puncture site and the ultrasonic display screen to repeatedly observe the real-time relative position of the puncture needle and the puncture site, and so determine the accurate position, depth and direction of the puncture, which leads to low work efficiency.
SUMMARY OF THE INVENTION
[0004] An object of the present application is to provide an ultrasonic display system and a method based on augmented reality, which effectively improves the accuracy and efficiency of the puncture, without shifting the user’s visual field between the puncture site and the display screen.
[0005] To achieve the above object, the present application provides an ultrasonic display system based on augmented reality including: an ultrasonic detection device, including an ultrasonic probe and an ultrasonic imaging module connected with the ultrasonic probe, the ultrasonic probe being configured for real-time ultrasonic scanning of a target object, the ultrasonic imaging module being configured for generating and outputting an ultrasonic image according to ultrasonic data collected by the ultrasonic probe; and an augmented reality device connected to the ultrasonic imaging module and configured for acquiring a reality scene image in a visual space in real time and receiving the ultrasonic image in real time, displaying the ultrasonic image in the reality scene image, and presenting the reality scene image with the ultrasonic image to a user.
[0006] In one embodiment, the augmented reality device is configured for acquiring a real-time position of the ultrasonic probe in the reality scene image and displaying the ultrasonic image received in real time on a side of the ultrasonic probe in the reality scene image, and the ultrasonic image in the reality scene image is configured to move in real time to follow the ultrasonic probe.
[0007] In one embodiment, the augmented reality device includes: an acquisition module, connected to the ultrasonic imaging module and configured for acquiring the ultrasonic image in real time;
a positioning module, configured for acquiring a position of the ultrasonic probe in the reality scene image; and a displaying module, connected with the acquisition module and the positioning module, and configured for displaying the ultrasonic image on a side of the ultrasonic probe, so that the ultrasonic image in the reality scene image moves in real time to follow the ultrasonic probe.
[0008] In one embodiment, the augmented reality device further includes an adjusting module configured for adjusting a size and transparency of the ultrasonic image displayed in the reality scene image.
[0009] In one embodiment, a puncture auxiliary device is further included and configured for assisting puncture positioning by adding at least one of leads, points and angles in the reality scene image.
[0010] In one embodiment, the puncture auxiliary device includes: an identifying module, configured for identifying a lesion of the target object according to the ultrasonic image received in real time; a processing module, configured for obtaining a position of a puncture point in the reality scene image according to positions of the lesion and the ultrasonic probe; and a marking module, configured for marking the puncture point in the reality scene image according to the position of the puncture point.
[0011] To achieve the above object, the present application further provides an ultrasonic display method based on augmented reality, executed by an augmented reality device which is connected to an ultrasonic detection device, and the ultrasonic display method including: acquiring a reality scene image in a visual space in real time from the augmented reality device;
generating an ultrasonic image according to ultrasonic data collected by an ultrasonic probe of the ultrasonic detection device, and receiving the ultrasonic image output by the ultrasonic detection device in real time; displaying the ultrasonic image in the reality scene image; and presenting the reality scene image with the ultrasonic image to a user.
[0012] In one embodiment, the method further includes: acquiring a position of the ultrasonic probe in the reality scene image; and displaying the ultrasonic image received in real time on a side of the ultrasonic probe in the reality scene image, so that the ultrasonic image in the reality scene image moves in real time to follow the ultrasonic probe.
[0013] In comparison with the prior art, in the ultrasonic display system and method based on augmented reality according to the present embodiment, the ultrasonic image received in real time by the augmented reality (AR) device is displayed in the reality scene image, and the reality scene image together with the ultrasonic image is then presented to the user. In this way, the user wearing the AR device may simultaneously observe the real scene and the real-time ultrasonic image, making the puncture procedure visualizable without frequent shifts of the visual field. This helps the user judge the puncture site according to the real-time ultrasonic image, thus effectively improving the accuracy and efficiency of the puncture.
BRIEF DESCRIPTION OF THE DRAWINGS
[0014] The accompanying drawings facilitate an understanding of the various embodiments of this invention. In such drawings:
[0015] FIG. 1 is a module diagram of an ultrasonic display system based on augmented reality according to one embodiment of the invention;
[0016] FIG. 2 is a schematic diagram of the reality scene image according to the embodiment of the invention;
[0017] FIG. 3 is a module diagram of the AR device according to the embodiment of the invention; and
[0018] FIG. 4 is a module diagram of the puncture auxiliary device according to the embodiment of the invention.
DETAILED DESCRIPTION OF ILLUSTRATED EMBODIMENTS
[0019] In order to make the purpose, technical solutions and advantages of the present application more clearly understood, the present application will be described in further detail below with reference to the accompanying drawings and embodiments.
[0020] Referring to FIGS. 1-4, the ultrasonic display system based on augmented reality according to the present invention is provided. The system includes an ultrasonic detection device 10 and an augmented reality (AR) device 20.
[0021] The ultrasonic detection device 10 includes an ultrasonic probe 11 and an ultrasonic imaging module 12 connected with the ultrasonic probe 11. Specifically, the ultrasonic probe 11 is configured for real-time ultrasonic scanning of a target object, the ultrasonic imaging module 12 is configured for generating and outputting an ultrasonic image according to ultrasonic data collected by the ultrasonic probe 11.
[0022] The AR device 20 is connected with the ultrasonic imaging module 12, and is configured for acquiring a reality scene image in a visual space in real time and receiving the ultrasonic image in real time, displaying the ultrasonic image in the reality scene image, and presenting the reality scene image with the ultrasonic image to a user.
[0023] In the ultrasonic display system based on augmented reality according to the present embodiment, the ultrasonic image received in real time by the AR device 20 is displayed in the reality scene image, and the reality scene image together with the ultrasonic image is then presented to the user. In this way, the user wearing the AR device 20 may simultaneously observe the real scene and the real-time ultrasonic image, making the puncture procedure visualizable without frequent shifts of the visual field. This helps the user judge the puncture site according to the real-time ultrasonic image, thus effectively improving the accuracy and efficiency of the puncture.
[0024] It should be noted that "visual space" refers to the space within the field of vision that the user can see through the AR device 20. The visual space changes as the user moves or rotates. "Reality scene image" refers to the image captured by the AR device 20 in its visual space, which may cover the entire visual space or only part of it. It should be understood that the ultrasonic probe is always captured in the reality scene image by the AR device 20.
[0025] It is understandable that the AR device 20 may include head-mounted devices (such as AR glasses), which may be worn by doctors and/or nurses who perform puncture operations. Such head-mounted devices are used to obtain reality scene images in their visual space in real time, and to display the reality scene images with ultrasonic images in the line of sight of the staff wearing them.
[0026] It is understandable that the ultrasonic image received by the AR device 20 may be processed through computer programs and/or software to display the ultrasonic image in the reality scene image.
[0027] For example, as illustrated in FIG. 2, when the user or other personnel holds the ultrasonic probe 11 to carry out ultrasonic scanning on the wrist, the AR device 20 may capture a reality scene image and, in real time, the ultrasonic image scanned by the ultrasonic probe 11; the ultrasonic image may then be displayed in the reality scene image, which is presented to the user, so that the user may observe both at the same time. The ultrasonic image is not limited to that in FIG. 2, but depends on the scanning area of the ultrasonic probe 11.
[0028] In some embodiments, the AR device 20 is configured to acquire the real-time position of the ultrasonic probe 11 in the reality scene image and then display the ultrasonic image received in real time on a side of the ultrasonic probe 11 in the reality scene image (as shown in FIG. 2), so that the ultrasonic image moves in real time along with the position of the ultrasonic probe 11 in the reality scene image, which helps the user judge the puncture site. There is no restriction on the display position of the ultrasonic image in the reality scene image, as long as the user can observe the real scene and the ultrasonic image at the same time.
[0029] Specifically, as shown in FIG. 2 and FIG. 3, the AR device 20 includes an acquisition module 21, a positioning module 22 and a displaying module 23. The acquisition module 21 is connected with the ultrasonic imaging module 12 and configured for acquiring the ultrasonic image in real time. The positioning module 22 is configured for acquiring a position of the ultrasonic probe in the reality scene image. The displaying module 23 is connected with the acquisition module 21 and the positioning module 22, and configured for displaying the ultrasonic image on a side of the ultrasonic probe 11, so that the ultrasonic image in the reality scene image moves in real time to follow the ultrasonic probe 11. The ultrasonic probe 11 is positioned by the positioning module 22, so that the ultrasonic image can be projected on a side of the ultrasonic probe 11 by the displaying module 23. Optionally, the ultrasonic image may be displayed on the front, rear, left, right or upper side of the ultrasonic probe 11, as long as it can be completely displayed in the reality scene image for the user to view.
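The probe-following placement performed by the positioning and displaying modules can be sketched in code. This is an illustrative sketch, not the patent's implementation: the function name, the pixel-coordinate convention (origin at the top-left of the reality scene frame) and the clamping behaviour are assumptions.

```python
def place_beside_probe(probe_xy, probe_size, image_size, frame_size,
                       side="right", margin=10):
    """Return the top-left (x, y) at which to draw the ultrasonic image so
    that it sits on one side of the probe and stays fully inside the
    reality-scene frame. All values are pixels."""
    px, py = probe_xy
    pw, ph = probe_size
    iw, ih = image_size
    fw, fh = frame_size
    if side == "right":
        x, y = px + pw + margin, py
    elif side == "left":
        x, y = px - margin - iw, py
    else:  # "above"
        x, y = px, py - margin - ih
    # Clamp so the overlay is completely visible in the frame, as required
    # by the description ("completely displayed in the reality scene image").
    x = max(0, min(x, fw - iw))
    y = max(0, min(y, fh - ih))
    return x, y
```

Calling this once per frame with the probe position reported by the positioning module makes the overlay move in real time to follow the probe.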
[0030] As shown in FIG. 3, for the convenience of the user, the AR device 20 may further include an adjusting module 24, which is configured for adjusting the size and the transparency of the ultrasonic image displayed in the reality scene image, so that the user may adjust them as needed. The ultrasonic image may be given a certain transparency to prevent it from blocking part of the reality scene image, which further helps the user judge the puncture site. The adjusting module 24 may include a built-in computer program, and may adjust the size and the transparency of the displayed ultrasonic image upon receiving instructions sent from a button or a touch screen.
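The transparency adjustment described above amounts to alpha compositing the ultrasonic image over the scene. The following is a minimal pure-Python sketch, assuming both images are lists of rows of RGB tuples; a real AR device would composite on the GPU, and all names here are illustrative rather than taken from the patent.

```python
def blend_pixel(scene_rgb, ultra_rgb, alpha):
    """Alpha-composite one ultrasonic pixel over one scene pixel.
    alpha = 0.0 leaves the scene untouched; alpha = 1.0 fully covers it."""
    return tuple(round(alpha * u + (1.0 - alpha) * s)
                 for s, u in zip(scene_rgb, ultra_rgb))

def blend_region(scene, ultra, top_left, alpha):
    """Composite the ultrasonic image `ultra` onto `scene` at `top_left`,
    modifying `scene` in place and returning it. The caller is assumed to
    have already resized `ultra` and clamped `top_left` into the frame."""
    x0, y0 = top_left
    for dy, row in enumerate(ultra):
        for dx, upx in enumerate(row):
            scene[y0 + dy][x0 + dx] = blend_pixel(scene[y0 + dy][x0 + dx],
                                                  upx, alpha)
    return scene
```

Lowering `alpha` lets the reality scene show through the overlay, which is the behaviour the adjusting module 24 exposes to the user.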
[0031] In some embodiments, the ultrasonic display system further includes a puncture auxiliary device 30 configured for assisting puncture positioning by adding at least one of leads, points and angles in the reality scene image. The user may then perform the puncture operation on the target object according to the added leads, points or angles, so that the accuracy and efficiency of the puncture are further improved.
[0032] As shown in FIG. 4, the puncture auxiliary device 30 includes an identifying module 31, a processing module 32 and a marking module 33. The identifying module 31 is configured for identifying a lesion of the target object according to the ultrasonic image received in real time. The processing module 32 is configured for obtaining a position of a puncture point in the reality scene image according to the positions of the lesion and the ultrasonic probe 11. The marking module 33 is configured for marking the puncture point in the reality scene image according to the position of the puncture point.
[0033] Specifically, the ultrasonic probe 11 carries out real-time scanning on the target object, and the identifying module 31 performs image recognition on the ultrasonic image received in real time. Once an ultrasonic image is identified as containing a lesion, the lesion may also be observed in the reality scene image; at this time, the ultrasonic probe 11 stops moving but keeps scanning the lesion area. The processing module 32 then obtains the positions of the lesion and the ultrasonic probe 11 and processes them to calculate the position of a suitable puncture point. Multiple puncture points may be calculated and then marked in the reality scene image by the marking module 33, so that the user may perform the puncture operation on the lesion according to the marked puncture points.
Optionally, the marking module 33 may also assist the puncture by adding leads or angles (such as needle insertion angles) in the reality scene image.
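The patent does not specify how the processing module computes the puncture point or the needle insertion angle; the following is a hypothetical geometric sketch under assumed conventions: 2D scan-plane coordinates in millimetres, x along the skin surface, y as depth, with the entry point placed on the skin at a fixed offset from the probe contact point.

```python
import math

def puncture_plan(probe_xy, lesion_xy, entry_offset=15.0):
    """Suggest an entry point `entry_offset` mm to the side of the probe
    contact point, and the needle insertion angle (degrees from the skin
    surface) that reaches the lesion from that entry point. Purely
    illustrative; a real system would work in 3D with calibrated tracking."""
    entry = (probe_xy[0] + entry_offset, 0.0)  # on the skin surface
    dx = lesion_xy[0] - entry[0]               # horizontal run to the lesion
    depth = lesion_xy[1]                       # vertical depth of the lesion
    angle = math.degrees(math.atan2(depth, abs(dx)))
    return entry, angle
```

The marking module would then draw the returned entry point and angle (as a point and a lead line) into the reality scene image.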
[0034] The present invention further provides an ultrasonic display method based on augmented reality, executed by the AR device 20, which is connected to an ultrasonic detection device. The method includes: acquiring a reality scene image in a visual space in real time from the AR device 20; generating an ultrasonic image according to ultrasonic data collected by an ultrasonic probe 11 of the ultrasonic detection device, and receiving the ultrasonic image output by the ultrasonic detection device in real time; displaying the ultrasonic image in the reality scene image; and presenting the reality scene image with the ultrasonic image to a user.
[0035] In the ultrasonic display method based on augmented reality according to the embodiment, the ultrasonic image received in real time by the AR device 20 is displayed in the reality scene image, and the reality scene image together with the ultrasonic image is then presented to the user. In this way, the user may simultaneously observe the real scene and the real-time ultrasonic image through the AR device 20, making the puncture procedure visualizable without frequent shifts of the visual field. This helps the user judge the puncture site according to the real-time ultrasonic image, thus effectively improving the accuracy and efficiency of the puncture.
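The claimed method steps can be sketched as one iteration of a display loop. The callables passed in are hypothetical stand-ins for the device interfaces described in the text (camera capture, ultrasonic image reception, probe positioning, compositing and presentation); none of these names come from the patent.

```python
def ar_display_step(capture_scene, receive_ultrasound, locate_probe,
                    compose, present):
    """One iteration of the claimed method: grab the reality-scene frame and
    the latest ultrasonic image, find the probe in the scene, composite the
    image beside the probe, and present the result to the user."""
    scene = capture_scene()           # acquire the reality scene image
    ultra = receive_ultrasound()      # receive the real-time ultrasonic image
    probe_pos = locate_probe(scene)   # position of the probe in the scene
    composite = compose(scene, ultra, probe_pos)
    present(composite)                # show to the wearer of the AR device
    return composite
```

Running this repeatedly keeps both the scene and the overlay updated in real time; as noted in the following paragraph, the order in which the two images are obtained within an iteration is not essential.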
[0036] It is understandable that, the sequence of the ultrasonic display method in the present embodiment of the invention is not limited to the above description. The reality scene image in a visual space of the AR device may be obtained prior to the ultrasonic image; or, the ultrasonic image may be obtained prior to the reality scene image. Further, the reality scene image and the ultrasonic image may be obtained alternately, since the reality scene image and the ultrasonic image are updated in real time.
[0037] Furthermore, the step of displaying the ultrasonic image in the reality scene image includes: acquiring a position of the ultrasonic probe 11 in the reality scene image; and displaying the ultrasonic image received in real time on a side of the ultrasonic probe 11 in the reality scene image, so that the ultrasonic image in the reality scene image moves in real time to follow the ultrasonic probe 11.
[0038] The above-mentioned embodiments only represent several embodiments of the present application, and the descriptions thereof are relatively specific and detailed, but should not be construed as limiting the scope of the patent application. It should be pointed out that for those skilled in the art, several modifications and improvements can be made without departing from the concept of the present application, which all belong to the protection scope of the present application.
Therefore, the scope of protection of the patent of the present application shall be subject to the appended claims.

Claims (8)

CONCLUSIESCONCLUSIONS 1. Ultrasoon weergavesysteem op basis van ‘augmented reality’, met: een ultrasone detectie inrichting, bevattende een ultrasone sonde en een ultrasone beeldvormingsmodule verbonden met de ultrasone sonde, waarbij de ultrasone sonde geconfigureerd is voor realtime ultrasoon scannen van een doelobject, de ultrasone beeldvormingsmodule geconfigureerd is om een ultrasoon beeld te genereren en af te geven volgens ultrasone gegevens verzameld door de ultrasone sonde; en een ‘augmented reality'-inrichting die is gekoppeld aan de ultrasone beeldvormingsmodule en geconfigureerd is om een beeld met een werkelijkheidsscene in een visuele ruimte in realtime over te nemen en het ultrasoon beeld in realtime te ontvangen, waarbij het ultrasoon beeld weer te geven is in het beeld met een werkelijkheidssccéne en het beeld met een werkelijkheidsscène met het ultrasoon beeld aan een drager getoond is.1. An augmented reality ultrasonic display system comprising: an ultrasonic detection device comprising an ultrasonic probe and an ultrasonic imaging module coupled to the ultrasonic probe, the ultrasonic probe configured for real-time ultrasonic scanning of a target object, the ultrasonic imaging module configured to generate and output an ultrasonic image according to ultrasonic data collected by the ultrasonic probe; and an augmented reality device coupled to the ultrasonic imaging module and configured to acquire an image with a reality scene in a visual space in real time and receive the ultrasonic image in real time, the ultrasonic image being displayable in the image with a reality scene and the image with a reality scene with the ultrasonic image being displayed to a wearer. 2. 
2. The ultrasonic display system according to claim 1, wherein the augmented reality device is configured to obtain a real-time position of the ultrasonic probe in the reality scene image and to display the ultrasonic image received in real time on a side of the ultrasonic probe in the reality scene image, and the ultrasonic image in the reality scene image is configured to move in real time to track the ultrasonic probe.

3. The ultrasonic display system according to claim 1 or 2, wherein the augmented reality device comprises: an acquisition module coupled to the ultrasonic imaging module and configured to acquire the ultrasonic image in real time; a positioning module configured to obtain a position of the ultrasonic probe in the reality scene image; and a display module coupled to the acquisition module and the positioning module and configured to display the ultrasonic image on a side of the ultrasonic probe, so that the ultrasonic image in the reality scene image moves in real time to track the ultrasonic probe.

4. The ultrasonic display system according to claim 3, wherein the augmented reality device further comprises an adjustment module configured to adjust a size and a transparency of the ultrasonic image displayed in the reality scene image.

5. The ultrasonic display system according to claim 1, further comprising a puncture assist device configured to assist in placing a puncture by adding at least one of traces, points and angles into the reality scene image.

6. The ultrasonic display system according to claim 5, wherein the puncture assist device comprises: an identification module configured to identify a lesion of the target object according to the ultrasonic image received in real time; a processing module configured to obtain a position of a puncture point in the reality scene image according to the positions of the lesion and the ultrasonic probe; and a marking module configured to mark the puncture point in the reality scene image according to the position of the puncture point.

7. An ultrasonic display method based on augmented reality, performed by an augmented reality device coupled to an ultrasonic detection device, the ultrasonic display method comprising: capturing a reality scene image into a visual space of the augmented reality device in real time; generating an ultrasonic image according to ultrasonic data collected by an ultrasonic probe of the ultrasonic detection device, and receiving in real time the ultrasonic image output by the ultrasonic detection device; displaying the ultrasonic image in the reality scene image; and showing the reality scene image with the ultrasonic image to a wearer.

8. The ultrasonic display method according to claim 7, wherein displaying the ultrasonic image in the reality scene image comprises: obtaining a position of the ultrasonic probe in the reality scene image; and displaying the ultrasonic image received in real time on a side of the ultrasonic probe in the reality scene image, wherein the ultrasonic image in the reality scene image moves in real time to track the ultrasonic probe.
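The module pipeline the claims describe — a display module anchoring the ultrasound frame beside the probe so it tracks the probe per frame, an adjustment module for size and transparency, and a puncture assist mapping a lesion found in the ultrasound frame to a marked point in the scene — can be sketched as follows. This is a minimal illustration under stated assumptions: every class and function name is hypothetical, the probe-relative anchoring rule (image placed to the right of the probe) and the pixel-to-millimetre scale factors are invented for the example, and nothing here is the patent's actual implementation.

```python
# Illustrative sketch of the claimed AR-ultrasound pipeline.
# All names and geometric conventions are assumptions, not the patent's.

from dataclasses import dataclass


@dataclass
class Overlay:
    """An ultrasound frame anchored in the reality scene image."""
    x: int          # top-left corner in scene pixels
    y: int
    scale: float    # adjustable size (cf. the adjustment module)
    alpha: float    # adjustable transparency (cf. the adjustment module)


def place_overlay(probe_xy, frame_wh, scale=1.0, alpha=0.7, margin=20):
    """Display module: put the ultrasound image on one side of the
    probe, so re-calling this each frame makes it track the probe."""
    px, py = probe_xy
    w, h = frame_wh
    # Assumed convention: anchor to the right of the probe, centred vertically.
    return Overlay(x=px + margin,
                   y=py - int(h * scale / 2),
                   scale=scale,
                   alpha=alpha)


def puncture_point(lesion_uv, probe_xy, mm_per_px_us, px_per_mm_scene):
    """Puncture assist: map a lesion at ultrasound-image coordinates
    (u, v) to a mark in the scene image, assuming the probe face sits
    at the top edge of the ultrasound frame and the scan plane hangs
    vertically below the probe."""
    u_mm = lesion_uv[0] * mm_per_px_us   # lateral offset from probe centre
    v_mm = lesion_uv[1] * mm_per_px_us   # depth below the skin surface
    return (probe_xy[0] + int(u_mm * px_per_mm_scene),
            probe_xy[1] + int(v_mm * px_per_mm_scene))


# Per-frame loop: the overlay and the puncture mark follow the probe.
probe = (320, 240)                                   # probe position in scene pixels
ov = place_overlay(probe, frame_wh=(256, 192))
mark = puncture_point((10, 50), probe,
                      mm_per_px_us=0.2, px_per_mm_scene=4.0)
```

In a real system the probe position would come from the positioning module (e.g. a fiducial marker tracked in the headset's camera feed) rather than a hard-coded tuple, and the overlay would be alpha-blended into the scene image before display.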
NL2034443A 2022-03-28 2023-03-27 Ultrasonic display system and method based on augmented reality NL2034443B1 (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210316907.3A CN114886461A (en) 2022-03-28 2022-03-28 Ultrasonic display system and method based on augmented reality

Publications (2)

Publication Number Publication Date
NL2034443A NL2034443A (en) 2023-05-04
NL2034443B1 true NL2034443B1 (en) 2024-06-10

Family

ID=82715428

Family Applications (1)

Application Number Title Priority Date Filing Date
NL2034443A NL2034443B1 (en) 2022-03-28 2023-03-27 Ultrasonic display system and method based on augmented reality

Country Status (2)

Country Link
CN (1) CN114886461A (en)
NL (1) NL2034443B1 (en)

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5410629B1 (en) * 2013-05-14 2014-02-05 健司 三木 Ultrasonic diagnostic system, image processing apparatus, control method thereof, and control program
US20200187901A1 (en) * 2017-08-31 2020-06-18 The Regents Of The University Of California Enhanced ultrasound systems and methods
CN107854142B (en) * 2017-11-28 2020-10-23 无锡祥生医疗科技股份有限公司 Medical ultrasonic augmented reality imaging system
WO2019232451A1 (en) * 2018-05-31 2019-12-05 Matt Mcgrath Design & Co, Llc Method of medical imaging using multiple arrays
US20200352655A1 (en) * 2019-05-06 2020-11-12 ARUS Inc. Methods, devices, and systems for augmented reality guidance of medical devices into soft tissue
CN110090069B (en) * 2019-06-18 2021-04-09 无锡祥生医疗科技股份有限公司 Ultrasonic puncture guiding method, guiding device and storage medium
CN112932627A (en) * 2021-03-08 2021-06-11 河南省中医院(河南中医药大学第二附属医院) Puncture device and method based on ultrasonic guidance
CN113786228B (en) * 2021-09-15 2024-04-12 苏州朗润医疗系统有限公司 Auxiliary puncture navigation system based on AR augmented reality
CN113974830B (en) * 2021-11-02 2024-08-27 中国人民解放军总医院第一医学中心 Surgical navigation system for ultrasonic guided thyroid tumor thermal ablation
CN114129232A (en) * 2021-11-09 2022-03-04 上海长征医院 Ultrasonic puncture guiding system and method

Also Published As

Publication number Publication date
NL2034443A (en) 2023-05-04
CN114886461A (en) 2022-08-12
