NL2034443B1 - Ultrasonic display system and method based on augmented reality - Google Patents
Ultrasonic display system and method based on augmented reality
- Publication number
- NL2034443B1
- Authority
- NL
- Netherlands
- Prior art keywords
- ultrasonic
- image
- real
- reality scene
- reality
- Prior art date
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/08—Detecting organic movements or changes, e.g. tumours, cysts, swellings
- A61B8/0833—Detecting organic movements or changes, e.g. tumours, cysts, swellings involving detecting or locating foreign bodies or organic structures
- A61B8/0841—Detecting organic movements or changes, e.g. tumours, cysts, swellings involving detecting or locating foreign bodies or organic structures for locating instruments
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/46—Ultrasonic, sonic or infrasonic diagnostic devices with special arrangements for interfacing with the operator or the patient
- A61B8/461—Displaying means of special interest
- A61B8/462—Displaying means of special interest characterised by constructional features of the display
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/52—Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/5292—Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves using additional data, e.g. patient information, image labeling, acquisition parameters
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/36—Image-producing devices or illumination devices not otherwise provided for
- A61B90/361—Image-producing devices, e.g. surgical cameras
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/36—Image-producing devices or illumination devices not otherwise provided for
- A61B90/37—Surgical systems with images on a monitor during operation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/014—Hand-worn input/output arrangements, e.g. data gloves
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B17/00—Surgical instruments, devices or methods, e.g. tourniquets
- A61B17/34—Trocars; Puncturing needles
- A61B17/3403—Needle locating or guiding means
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B17/00—Surgical instruments, devices or methods, e.g. tourniquets
- A61B17/34—Trocars; Puncturing needles
- A61B17/3403—Needle locating or guiding means
- A61B2017/3413—Needle locating or guiding means guided by ultrasound
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/36—Image-producing devices or illumination devices not otherwise provided for
- A61B2090/364—Correlation of different images or relation of image positions in respect to the body
- A61B2090/365—Correlation of different images or relation of image positions in respect to the body augmented reality, i.e. correlating a live optical image with another image
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/36—Image-producing devices or illumination devices not otherwise provided for
- A61B90/37—Surgical systems with images on a monitor during operation
- A61B2090/378—Surgical systems with images on a monitor during operation using ultrasound
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/08—Detecting organic movements or changes, e.g. tumours, cysts, swellings
- A61B8/0891—Detecting organic movements or changes, e.g. tumours, cysts, swellings for diagnosis of blood vessels
Landscapes
- Health & Medical Sciences (AREA)
- Life Sciences & Earth Sciences (AREA)
- Engineering & Computer Science (AREA)
- Surgery (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Molecular Biology (AREA)
- Animal Behavior & Ethology (AREA)
- General Health & Medical Sciences (AREA)
- Medical Informatics (AREA)
- Public Health (AREA)
- Pathology (AREA)
- Veterinary Medicine (AREA)
- Biomedical Technology (AREA)
- Heart & Thoracic Surgery (AREA)
- Physics & Mathematics (AREA)
- Radiology & Medical Imaging (AREA)
- Biophysics (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Computer Vision & Pattern Recognition (AREA)
- General Physics & Mathematics (AREA)
- Human Computer Interaction (AREA)
- Oral & Maxillofacial Surgery (AREA)
- Gynecology & Obstetrics (AREA)
- Ultrasonic Diagnosis Equipment (AREA)
Abstract
An ultrasonic display system and method based on augmented reality are provided. The system includes an ultrasonic detection device and an augmented reality device. The ultrasonic detection device includes an ultrasonic probe for real-time ultrasonic scanning and an ultrasonic imaging module for generating and outputting an ultrasonic image. The augmented reality device is configured for acquiring a reality scene image in a visual space, receiving the ultrasonic image in real time, displaying the ultrasonic image in the reality scene image, and presenting the combined image to a user. A user wearing the AR device may therefore observe the real scene and the real-time ultrasonic image simultaneously, enabling a visualized puncture procedure without frequent shifts of the visual field. This helps the user judge the puncture site and effectively improves the accuracy and efficiency of puncture.
Description
ULTRASONIC DISPLAY SYSTEM AND METHOD BASED ON
AUGMENTED REALITY
[0001] The application relates to the technical field of medical equipment, in particular to an ultrasonic display system and a method based on augmented reality.
[0002] At present, in the clinical diagnosis and treatment process of dialysis, extracorporeal circulation support, rapid fluid replenishment and hemodynamic monitoring, it is often necessary to perform puncture operations on the arteriovenous vessels (internal jugular arteriovenous vein, femoral arteriovenous vein, etc.) in various parts of the patient.
Commonly, an ultrasonic image acquired by an ultrasonic display device is used to determine the approximate position and depth of the blood vessels, and the echo image of the puncture needle is followed to guide the puncture.
However, such a manner has the following disadvantages.
[0003] The ultrasonic image is displayed on the screen of the ultrasonic display device as a two-dimensional picture, and there is no image for spatial positioning of the puncture site. It is therefore difficult for the operator to accurately control the puncture angle and depth. The operator must frequently shift the visual field between the puncture site and the ultrasonic display screen to repeatedly observe the real-time relative position of the puncture needle and the puncture site, in order to determine the accurate position, depth and direction of the puncture, which lowers work efficiency.
[0004] An object of the present application is to provide an ultrasonic display system and method based on augmented reality that effectively improve the accuracy and efficiency of puncture, without requiring the user to shift the visual field between the puncture site and a display screen.
[0005] To achieve the above object, the present application provides an ultrasonic display system based on augmented reality including: an ultrasonic detection device, including an ultrasonic probe and an ultrasonic imaging module connected with the ultrasonic probe, the ultrasonic probe being configured for real-time ultrasonic scanning of a target object, the ultrasonic imaging module being configured for generating and outputting an ultrasonic image according to ultrasonic data collected by the ultrasonic probe; and an augmented reality device connected to the ultrasonic imaging module and configured for acquiring a reality scene image in a visual space in real time and receiving the ultrasonic image in real time, displaying the ultrasonic image in the reality scene image, and presenting the reality scene image with the ultrasonic image to a user.
[0006] In one embodiment, the augmented reality device is configured for acquiring a real-time position of the ultrasonic probe in the reality scene image and displaying the ultrasonic image received in real time on a side of the ultrasonic probe in the reality scene image, and the ultrasonic image in the reality scene image is configured to move in real time to follow the ultrasonic probe.
[0007] In one embodiment, the augmented reality device includes: an acquisition module, connected to the ultrasonic imaging module and configured for acquiring the ultrasonic image in real time;
a positioning module, configured for acquiring a position of the ultrasonic probe in the reality scene image; and a displaying module, connected with the acquisition module and the positioning module, and configured for displaying the ultrasonic image on a side of the ultrasonic probe, so that the ultrasonic image in the reality scene image moves in real time to follow the ultrasonic probe.
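As an illustration of how the positioning and displaying modules could cooperate, the sketch below computes a screen position for the ultrasound overlay beside the probe on each frame. The function name, the pixel offset, and the clamping policy are all assumptions for illustration, not part of the claimed system.

```python
import numpy as np

def place_overlay(probe_xy, overlay_size, frame_size, offset=(40, 0)):
    """Return the top-left corner for the ultrasound overlay so that it
    sits beside the probe and stays fully inside the reality scene image.

    probe_xy     -- (x, y) pixel position of the probe in the scene image
    overlay_size -- (width, height) of the ultrasound image in pixels
    frame_size   -- (width, height) of the reality scene image
    offset       -- gap between the probe and the overlay (illustrative)
    """
    x = probe_xy[0] + offset[0]
    y = probe_xy[1] + offset[1]
    # Clamp so the overlay is completely visible, as the description requires.
    x = int(np.clip(x, 0, frame_size[0] - overlay_size[0]))
    y = int(np.clip(y, 0, frame_size[1] - overlay_size[1]))
    return x, y
```

Calling this once per frame with the positioning module's latest probe estimate makes the overlay follow the probe in real time.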
[0008] In one embodiment, the augmented reality device further includes an adjusting module configured for adjusting a size and transparency of the ultrasonic image displayed in the reality scene image.
[0009] In one embodiment, a puncture auxiliary device is further included and configured for assisting puncture positioning by adding at least one of leads, points and angles in the reality scene image.
[0010] In one embodiment, the puncture auxiliary device includes: an identifying module, configured for identifying a lesion of the target object according to the ultrasonic image received in real time; a processing module, configured for obtaining a position of a puncture point in the reality scene image according to positions of the lesion and the ultrasonic probe; and a marking module, configured for marking the puncture point in the reality scene image according to the position of the puncture point.
[0011] To achieve the above object, the present application further provides an ultrasonic display method based on augmented reality, executed by an augmented reality device connected to an ultrasonic detection device, the method including: acquiring, by the augmented reality device, a reality scene image in a visual space in real time;
generating an ultrasonic image according to ultrasonic data collected by an ultrasonic probe of the ultrasonic detection device, and receiving the ultrasonic image output by the ultrasonic detection device in real time; displaying the ultrasonic image in the reality scene image; and presenting the reality scene image with the ultrasonic image to a user.
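The method steps above can be sketched as a per-frame loop; the injected callables and the bounded frame count are purely illustrative stand-ins for the real-time devices.

```python
def display_loop(capture_scene, latest_ultrasound, compose, present, frames):
    """Run the claimed method steps once per frame (illustrative stub)."""
    for _ in range(frames):
        scene = capture_scene()          # acquire reality scene image in real time
        us = latest_ultrasound()         # receive the ultrasound image in real time
        composite = compose(scene, us)   # display the ultrasound image in the scene
        present(composite)               # present the combined image to the user
```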
[0012] In one embodiment, the method further includes: acquiring a position of the ultrasonic probe in the reality scene image; and displaying the ultrasonic image received in real time on a side of the ultrasonic probe in the reality scene image, so that the ultrasonic image in the reality scene image moves in real time to follow the ultrasonic probe.
[0013] In comparison with the prior art, in the ultrasonic display system and method based on augmented reality according to the present embodiment, the ultrasonic image received in real time by the augmented reality (AR) device is displayed in the reality scene image, and the reality scene image together with the ultrasonic image is then presented to the user. In this way, a user wearing the AR device may simultaneously observe the real scene and the real-time ultrasonic image, realizing a visualized puncture procedure without frequent shifts of the visual field. This helps the user judge the puncture site according to the real-time ultrasonic image, thus effectively improving the accuracy and efficiency of puncture.
[0014] The accompanying drawings facilitate an understanding of the various embodiments of this invention. In such drawings:
[0015] FIG. 1 is a module diagram of an ultrasonic display system based on augmented reality according to one embodiment of the invention;
[0016] FIG. 2 is a schematic diagram of the reality scene image according to the embodiment of the invention;
[0017] FIG. 3 is a module diagram of the AR device according to the embodiment of the invention; and
[0018] FIG. 4 is a module diagram of the puncture auxiliary device according to the embodiment of the invention.
[0019] In order to make the purpose, technical solutions and advantages of the present application more clearly understood, the present application will be described in further detail below with reference to the accompanying drawings and embodiments.
[0020] Referring to FIGS. 1-4, an ultrasonic display system based on augmented reality according to the present invention is provided. The system includes an ultrasonic detection device 10 and an augmented reality (AR) device 20.
[0021] The ultrasonic detection device 10 includes an ultrasonic probe 11 and an ultrasonic imaging module 12 connected with the ultrasonic probe 11. Specifically, the ultrasonic probe 11 is configured for real-time ultrasonic scanning of a target object, the ultrasonic imaging module 12 is configured for generating and outputting an ultrasonic image according to ultrasonic data collected by the ultrasonic probe 11.
[0022] The AR device 20 is connected with the ultrasonic imaging module 12, and is configured for acquiring a reality scene image in a visual space in real time and receiving the ultrasonic image in real time, displaying the ultrasonic image in the reality scene image, and presenting the reality scene image with the ultrasonic image to a user.
[0023] In the ultrasonic display system based on augmented reality according to the present embodiment, the ultrasonic image received in real time by the AR device 20 is displayed in the reality scene image, and the reality scene image together with the ultrasonic image is then presented to the user. In this way, a user wearing the AR device 20 may simultaneously observe the real scene and the real-time ultrasonic image, realizing a visualized puncture procedure without frequent shifts of the visual field. This helps the user judge the puncture site according to the real-time ultrasonic image, thus effectively improving the accuracy and efficiency of puncture.
[0024] It should be noted that "visual space" refers to the space within the field of vision that the user may see through the AR device 20. The visual space changes as the user moves or rotates. "Reality scene image" refers to the image captured by the AR device 20 in its visual space, which may cover the entire visual space or only part of it. It is understandable that the ultrasonic probe is always captured in the "reality scene image" by the AR device 20.
[0025] It is understandable that the AR device 20 may include head-mounted devices (such as AR glasses), which may be worn by doctors and/or nurses who perform puncture operations. Such head-mounted devices are used to obtain reality scene images in their visual space in real time, and to display the reality scene images with ultrasonic images in the line of sight of the staff wearing them.
[0026] It is understandable that, the ultrasonic image received by the AR device 20 may be processed through computer programs and/or software to display the ultrasonic image in the reality scene image.
[0027] For example, as illustrated in FIG. 2, when the user or other personnel holds the ultrasonic probe 11 to carry out ultrasonic scanning on the wrist, the AR device 20 may capture a reality scene image and an ultrasonic image scanned by the ultrasonic probe 11 in real time, and the ultrasonic image may be displayed in the reality scene image which is then presented to the user, in such a way, the user may simultaneously observe both the reality scene image and the ultrasonic image. The ultrasonic image is not limited to that in FIG. 2, which is dependent on the scanning area of the ultrasonic probe 11.
[0028] In some embodiments, the AR device 20 is configured to acquire the real-time position of the ultrasonic probe 11 in the reality scene image and then display the ultrasonic image received in real time on a side of the ultrasonic probe 11 in the reality scene image (as shown in FIG. 2), so that the ultrasonic image moves in real time along with the position of the ultrasonic probe 11 in the reality scene image, which helps the user judge the puncture site. There is no restriction on the display position of the ultrasonic image in the reality scene image, as long as the user may observe the real scene and the ultrasonic image at the same time.
[0029] Specifically, as shown in FIG. 2 and FIG. 3, the AR device 20 includes an acquisition module 21, a positioning module 22 and a displaying module 23. The acquisition module 21 is connected with the ultrasonic imaging module 12 and configured for acquiring the ultrasonic image in real time. The positioning module 22 is configured for acquiring a position of the ultrasonic probe in the reality scene image. The displaying module 23 is connected with the acquisition module 21 and the positioning module 22, and configured for displaying the ultrasonic image onto a side of the ultrasonic probe 11, so that the ultrasonic image in the reality scene image moves in real time to follow the ultrasonic probe 11. The ultrasonic probe 11 is positioned by the positioning module 22, so that the ultrasonic image may be projected on a side of the ultrasonic probe 11 by the displaying module 23. Optionally, the ultrasonic image may be displayed on the front, rear, left, right or upper side of the ultrasonic probe 11, as long as it may be completely displayed in the reality scene image for the user to view.
[0030] As shown in FIG. 3, for the convenience of the user, the AR device 20 may further include an adjusting module 24 configured for adjusting the size and the transparency of the ultrasonic image displayed in the reality scene image, so that the user may adjust the size and/or transparency of the ultrasonic image as needed. The ultrasonic image may be given a certain transparency to prevent it from blocking part of the reality scene image, which further helps the user judge the puncture site. The adjusting module 24 may include a built-in computer program and may adjust the size and the transparency of the displayed ultrasonic image by receiving instructions sent from a button or a touch screen.
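One conventional way to realize the transparency adjustment described for the adjusting module is alpha blending. The sketch below is a minimal NumPy version, assuming the ultrasound image already fits inside the scene image at the given position; it is an illustration, not the patented implementation.

```python
import numpy as np

def blend_overlay(scene, us_image, top_left, alpha=0.6):
    """Alpha-blend an ultrasound image into the reality scene image.

    alpha < 1 leaves the reality scene partly visible behind the
    ultrasound image, so the overlay does not fully block the scene.
    Returns a new array; the input scene is left untouched.
    """
    x, y = top_left
    h, w = us_image.shape[:2]
    roi = scene[y:y + h, x:x + w].astype(np.float32)
    out = scene.copy()
    out[y:y + h, x:x + w] = (alpha * us_image + (1 - alpha) * roi).astype(scene.dtype)
    return out
```

The size adjustment would be a resize of `us_image` before blending; the transparency adjustment maps directly onto `alpha`.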
[0031] In some embodiments, the ultrasonic display system further includes a puncture auxiliary device 30 configured for assisting puncture positioning by adding at least one of leads, points and angles in the reality scene image. Therefore, the user may perform the puncture operation on the target object according to the added leads, points or angles, accordingly the accuracy and efficiency of puncture is further improved.
[0032] As shown in FIG. 4, the puncture auxiliary device 30 includes an identifying module 31, a processing module 32 and a marking module 33. The identifying module 31 is configured for identifying a lesion of the target object according to the ultrasonic image received in real time. The processing module 32 is configured for obtaining a position of a puncture point in the reality scene image according to positions of the lesion and the ultrasonic probe 11.
[0033] Specifically, the ultrasonic probe 11 carries out real-time scanning of the target object, and the identifying module 31 performs image recognition on the ultrasonic image received in real time. Once the ultrasonic image is identified as containing a lesion, the ultrasonic image with the lesion may also be observed in the reality scene image; at this time, the ultrasonic probe 11 stops moving while continuously scanning the lesion area. The processing module 32 then obtains the positions of the lesion and the ultrasonic probe 11 and calculates the position of a suitable puncture point. Multiple puncture points may be calculated and then marked in the reality scene image by the marking module 33, so that the user may perform the puncture operation on the lesion according to the marked puncture points.
Optionally, the marking module 33 may also assist the puncture by adding leads or angles (such as needle insertion angles) in the reality scene image.
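For intuition only, a skin-entry point for a given needle insertion angle can be derived from simple right-triangle geometry: the entry point is offset horizontally from the probe by depth / tan(angle). The function and all of its parameters are hypothetical and do not reproduce the processing module's actual calculation.

```python
import math

def puncture_point(probe_xy, lesion_depth_mm, entry_angle_deg, mm_per_px):
    """Estimate a skin-entry point for a straight needle path (illustrative).

    The needle enters the skin at entry_angle_deg to the surface and must
    reach a lesion lesion_depth_mm below the probe footprint, so the entry
    point is offset from the probe by depth / tan(angle), converted to
    pixels of the reality scene image via mm_per_px.
    """
    offset_mm = lesion_depth_mm / math.tan(math.radians(entry_angle_deg))
    offset_px = offset_mm / mm_per_px
    return probe_xy[0] + offset_px, probe_xy[1]
```

A marking module could draw this point, plus the lead line from entry point to lesion, into the reality scene image.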
[0034] The present invention further provides an ultrasonic display method based on augmented reality, executed by the AR device 20 which is connected to an ultrasonic detection device. The method includes: acquiring, by the AR device 20, a reality scene image in a visual space in real time; generating an ultrasonic image according to ultrasonic data collected by an ultrasonic probe 11 of the ultrasonic detection device, and receiving the ultrasonic image output by the ultrasonic detection device in real time; displaying the ultrasonic image in the reality scene image; and presenting the reality scene image with the ultrasonic image to a user.
[0035] In the ultrasonic display method based on augmented reality according to the embodiment, the ultrasonic image received in real time by the AR device 20 is displayed in the reality scene image, and the reality scene image together with the ultrasonic image is then presented to the user. In this way, the user may simultaneously observe the real scene and the real-time ultrasonic image through the AR device 20, realizing a visualized puncture procedure without frequent shifts of the visual field. This helps the user judge the puncture site according to the real-time ultrasonic image, thus effectively improving the accuracy and efficiency of puncture.
[0036] It is understandable that, the sequence of the ultrasonic display method in the present embodiment of the invention is not limited to the above description. The reality scene image in a visual space of the AR device may be obtained prior to the ultrasonic image; or, the ultrasonic image may be obtained prior to the reality scene image. Further, the reality scene image and the ultrasonic image may be obtained alternately, since the reality scene image and the ultrasonic image are updated in real time.
[0037] Furthermore, the step of displaying the ultrasonic image in the reality scene image includes: acquiring a position of the ultrasonic probe 11 in the reality scene image; and displaying the ultrasonic image received in real time on a side of the ultrasonic probe 11 in the reality scene image, so that the ultrasonic image in the reality scene image moves in real time to follow the ultrasonic probe 11.
[0038] The above-mentioned embodiments only represent several embodiments of the present application, and the descriptions thereof are relatively specific and detailed, but should not be construed as limiting the scope of the patent application. It should be pointed out that for those skilled in the art, several modifications and improvements can be made without departing from the concept of the present application, which all belong to the protection scope of the present application.
Therefore, the scope of protection of the patent of the present application shall be subject to the appended claims.
Claims (8)
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202210316907.3A CN114886461A (en) | 2022-03-28 | 2022-03-28 | Ultrasonic display system and method based on augmented reality |
Publications (2)
Publication Number | Publication Date |
---|---|
NL2034443A NL2034443A (en) | 2023-05-04 |
NL2034443B1 true NL2034443B1 (en) | 2024-06-10 |
Family
ID=82715428
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
NL2034443A NL2034443B1 (en) | 2022-03-28 | 2023-03-27 | Ultrasonic display system and method based on augmented reality |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN114886461A (en) |
NL (1) | NL2034443B1 (en) |
Family Cites Families (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP5410629B1 (en) * | 2013-05-14 | 2014-02-05 | 健司 三木 | Ultrasonic diagnostic system, image processing apparatus, control method thereof, and control program |
US20200187901A1 (en) * | 2017-08-31 | 2020-06-18 | The Regents Of The University Of California | Enhanced ultrasound systems and methods |
CN107854142B (en) * | 2017-11-28 | 2020-10-23 | 无锡祥生医疗科技股份有限公司 | Medical ultrasonic augmented reality imaging system |
WO2019232451A1 (en) * | 2018-05-31 | 2019-12-05 | Matt Mcgrath Design & Co, Llc | Method of medical imaging using multiple arrays |
US20200352655A1 (en) * | 2019-05-06 | 2020-11-12 | ARUS Inc. | Methods, devices, and systems for augmented reality guidance of medical devices into soft tissue |
CN110090069B (en) * | 2019-06-18 | 2021-04-09 | 无锡祥生医疗科技股份有限公司 | Ultrasonic puncture guiding method, guiding device and storage medium |
CN112932627A (en) * | 2021-03-08 | 2021-06-11 | 河南省中医院(河南中医药大学第二附属医院) | Puncture device and method based on ultrasonic guidance |
CN113786228B (en) * | 2021-09-15 | 2024-04-12 | 苏州朗润医疗系统有限公司 | Auxiliary puncture navigation system based on AR augmented reality |
CN113974830B (en) * | 2021-11-02 | 2024-08-27 | 中国人民解放军总医院第一医学中心 | Surgical navigation system for ultrasonic guided thyroid tumor thermal ablation |
CN114129232A (en) * | 2021-11-09 | 2022-03-04 | 上海长征医院 | Ultrasonic puncture guiding system and method |
- 2022-03-28: CN CN202210316907.3A patent/CN114886461A/en active Pending
- 2023-03-27: NL NL2034443A patent/NL2034443B1/en active
Also Published As
Publication number | Publication date |
---|---|
NL2034443A (en) | 2023-05-04 |
CN114886461A (en) | 2022-08-12 |