CN111528920A - Augmented reality observation device for ultrasound device - Google Patents
- Publication number
- CN111528920A (application number CN202010448490.7A)
- Authority
- CN
- China
- Prior art keywords
- module
- ultrasound
- ultrasonic
- real
- augmented reality
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/08—Detecting organic movements or changes, e.g. tumours, cysts, swellings
- A61B8/0833—Detecting organic movements or changes involving detecting or locating foreign bodies or organic structures
- A61B8/085—Detecting organic movements or changes for locating body or organic structures, e.g. tumours, calculi, blood vessels, nodules
- A61B8/0858—Detecting organic movements or changes involving measuring tissue layers, e.g. skin, interfaces
- A61B8/0866—Detecting organic movements or changes involving foetal diagnosis; pre-natal or peri-natal diagnosis of the baby
- A61B8/0891—Detecting organic movements or changes for diagnosis of blood vessels
- A61B8/44—Constructional features of the ultrasonic, sonic or infrasonic diagnostic device
- A61B8/4411—Device being modular
- A61B8/4427—Device being portable or laptop-like
- A61B8/46—Ultrasonic, sonic or infrasonic diagnostic devices with special arrangements for interfacing with the operator or the patient
- A61B8/461—Displaying means of special interest
- A61B8/462—Displaying means of special interest characterised by constructional features of the display
- A61B8/466—Displaying means of special interest adapted to display 3D data
Abstract
The invention relates to the technical field of medical instruments and discloses an augmented reality observation device for an ultrasound device, comprising a communication module, an AR module, an ultrasound module, and a wearing device. The communication module transmits the detection information of the ultrasound module to the AR module, and the AR module displays the real-time ultrasound model built from that information. The AR module is connected with the wearing device and the ultrasound module is connected with the communication module; after performing an ultrasound scan, the ultrasound module combines the detection information into a real-time ultrasound model, which is transmitted to the AR module through the communication module, and the ultrasound module also determines the positioning point information of the organ or fetus being examined.
Description
Technical Field
The invention relates to the technical field of medical instruments, and in particular to an augmented reality observation device for an ultrasound device.
Background
Medical ultrasound (also known as diagnostic ultrasound or sonography) is a diagnostic imaging technique based on the application of ultrasound. It is used to view internal body structures such as tendons, muscles, joints, blood vessels, and internal organs, often in order to find the source of a disease or to exclude pathology. The use of ultrasound to examine pregnant women, called obstetric ultrasound, is especially widespread. In the conventional ultrasound field, a two-dimensional picture or a virtual three-dimensional image is shown on a flat display, but it remains, in essence, a two-dimensional representation.
Augmented reality (AR) applies virtual information to the real world through computer graphics and positioning technology: a real scene and a virtual scene are superimposed so that both can be seen by the human eye at once, presenting a brand-new mode of human-machine interaction.
The prior art lacks an augmented reality observation device for an ultrasound device that is convenient to use.
Disclosure of Invention
The invention aims to provide an augmented reality observation device for an ultrasound device that is convenient to use.
The augmented reality observation device for an ultrasound device comprises a communication module, an AR module for display, an ultrasound module for ultrasonically scanning a user, and a wearing device connected with the AR module; the communication module is electrically connected with the ultrasound module, and the AR module is in signal connection with the communication module. The ultrasound module combines its detection information into a real-time ultrasound model; the communication module transmits the real-time ultrasound model and the positioning point information to the AR module, and the AR module displays an image that combines the real-time ultrasound model with the real environment according to the positioning point information.
Furthermore, the wearing device comprises two fixed legs and a fixed frame, the two fixed legs being respectively hinged to the two ends of the fixed frame; an elastic piece is connected between the rear ends of the two fixed legs, away from the fixed frame, and the elastic piece assists in fixing the wearing device on the head of the user.
Furthermore, the front ends of the two fixed legs are respectively hinged to the fixed frame, and the rear end of each fixed leg is bent toward the other fixed leg.
Further, the AR module comprises a communication block, a controller, a camera, and a display lens, the communication block being in short-range connection with the communication module. The camera, the display lens, and the controller are fixedly connected with the fixed frame, and the camera and the display lens are each electrically connected with the controller; the camera photographs and locates the display position of the real-time ultrasound model, and the display lens displays the real-time ultrasound model.
Further, the AR module comprises an acceleration sensor, an ambient light sensor, and a gyroscope sensor. The readings of the acceleration sensor are sent to the controller, which switches from a standby state to a started state or from the started state back to standby; the readings of the ambient light sensor are sent to the controller, which adjusts the display brightness of the display lens; and the readings of the gyroscope sensor are sent to the controller, which adjusts the display angle of the real-time ultrasound model.
Further, the ultrasound module comprises a detection piece and a calculation piece that are electrically connected; the detection piece detects the user by contact and transmits a signal to the calculation piece, and the calculation piece transmits the signal to the communication block through the communication module.
Further, the detection piece comprises an ultrasonic generator, an ultrasonic receiver, and a transmission line; the ultrasonic generator and the ultrasonic receiver are fixedly connected, and the ultrasonic receiver is wired to the calculation piece through the transmission line.
Further, the detection piece comprises an ultrasonic generator, an ultrasonic receiver, and a transmission block; the ultrasonic generator and the ultrasonic receiver are fixedly connected, and the ultrasonic receiver is connected to the calculation piece over a short-range link through the transmission block.
Further, the calculation piece comprises an ultrasound image processing module and a real-time ultrasound model storage module; the ultrasound image processing module processes the detection information into a real-time ultrasound model, and the real-time ultrasound model storage module sends the real-time ultrasound model to the communication module or uploads it to a server.
Further, the communication module is connected with the communication block through short range communication.
Compared with the prior art, the augmented reality observation device for an ultrasound device provided by the invention is provided with the AR module, the ultrasound module, and the wearing device. The AR module is connected with the wearing device, and the ultrasound module is connected with the communication module. After performing an ultrasound scan, the ultrasound module combines the detection information into a real-time ultrasound model, which is transmitted to the AR module through the communication module; the ultrasound module also determines the positioning point information of the organ or fetus, so that the AR module displays an image combining the real-time ultrasound model with the real environment. An observer can thus directly see the diagnostic condition inside the user's body according to the positioning point information, which is very convenient.
Drawings
FIG. 1 is a schematic block diagram of an augmented reality observation device for an ultrasound device according to the present invention;
FIG. 2 is a perspective view of the wearable device and the AR module provided in the present invention;
FIG. 3 is another perspective view of the wearable device and the AR module provided in the present invention;
fig. 4 is another perspective view of the wearable device and the AR module provided in the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
The following describes the implementation of the present invention in detail with reference to specific embodiments.
The same or similar reference numerals in the drawings of this embodiment denote the same or similar components. In the description of the invention, terms indicating orientation or position, such as "upper", "lower", "left", and "right", are based on the orientation shown in the drawings and are used only for convenience of description; they do not indicate or imply that the device or element referred to must have a specific orientation or be constructed and operated in a specific orientation. Such terms are therefore illustrative only and are not to be construed as limiting this patent; their specific meaning can be understood by those skilled in the art according to the circumstances.
Referring to fig. 1-3, preferred embodiments of the present invention are shown.
The augmented reality observation device for the ultrasonic device provided by the embodiment can be used for observing the fetal state in the body of a target person, and meanwhile, all parts and organs in the body of the target person can be observed, so that the traditional ultrasonic display screen is replaced, and the augmented reality observation device becomes a more clear and more convenient display terminal.
The augmented reality observation device for an ultrasound device comprises a communication module, an AR module 12 for display, an ultrasound module for ultrasonically scanning the user, and a wearing device 11 connected with the AR module 12; the communication module is electrically connected with the ultrasound module, and the AR module 12 is in signal connection with the communication module. The ultrasound module combines its detection information into a real-time ultrasound model; the communication module transmits the real-time ultrasound model and the positioning point information to the AR module 12, and the AR module 12 displays an image that combines the real-time ultrasound model with the real environment according to the positioning point information.
By providing the AR module 12, the ultrasound module, and the wearing device 11, where the AR module 12 is connected with the wearing device 11 and the ultrasound module is connected with the communication module, the ultrasound module combines the detection information into a real-time ultrasound model after scanning, and the model is transmitted to the AR module 12 through the communication module. The ultrasound module also determines the positioning point information of the organ or fetus, so that the AR module 12 displays an image combining the real-time ultrasound model with the real environment, and an observer can directly see the diagnostic condition inside the user's body according to the positioning point information, which is very convenient.
The wearing device 11 comprises two fixed legs 13 and a fixed frame 14, the two fixed legs 13 being respectively hinged to the two ends of the fixed frame 14. An elastic piece 17 is connected between the rear ends of the two fixed legs 13, away from the fixed frame 14, and assists in fixing the wearing device 11 on the user's head. With the wearing device 11 fixed on the head, the AR module 12 is held in front of the user's eyes, and the elastic piece 17 keeps the AR module 12 from shaking during display when the wearer walks or makes other movements.
In addition, the front ends of the two fixed legs 13 are respectively hinged to the fixed frame 14, and the rear end of each fixed leg 13 is bent toward the other fixed leg, so that the wearing device 11 is further secured on the user's head and its stability is enhanced.
Another embodiment is shown in fig. 3:
the wearing device 11 comprises a fixed leg 13 and a fixed frame 14, the fixed leg 13 being hingedly arranged with the fixed frame 14.
In addition, the front end of the fixed leg 13 is hinged to the fixed frame 14, and the rear end of the fixed leg 13 is bent downward, so that the wearing device 11 can be further secured on the user's ear and stability is enhanced.
As shown in fig. 4, the rear end of the fixed leg 13 is connected with a fixed headband 17, which is elastic and holds the wearing device 11 on the user's head.
In yet another embodiment, a stop block is attached to the rear end of the fixed leg 13; the lower surface of the stop block carries friction ridges that prevent the wearing device 11 from slipping.
The AR module 12 comprises a communication block, a controller, a camera 16, and a display lens 15; the communication block is connected to the communication module over a short-range link. The camera 16, the display lens 15, and the controller are fixedly connected with the fixed frame 14, and the camera 16 and the display lens 15 are each electrically connected with the controller. The camera 16 photographs and locates the display position of the real-time ultrasound model, and the display lens 15 displays the real-time ultrasound model. The display lens 15 comprises a transparent lens and an organic light-emitting display circuit; the organic light-emitting display circuit is transparent and is arranged inside the transparent lens.
Specifically, the transparent lens comprises a first layer and a second layer; the organic light-emitting display circuit is located between the first layer and the second layer, and the two layers are bonded by an adhesive.
The wearing device 11 further comprises a cover sheet movably connected to the fixed frame 14, which is used to cover the display lens 15. When the display lens 15 is covered by the cover sheet, the wearer's field of view is restricted, the organic light-emitting display circuit displays the image directly, and the camera 16 continues to shoot in order to recognize gestures.
Specifically, one end of the cover sheet is attached to the fixed frame by magnetic force, and the cover sheet is moved to cover the transparent lens.
In another embodiment of the cover sheet, one end of the cover sheet is hinged to the fixed frame, and the cover sheet is moved to cover the transparent lens.
Specifically, the camera 16 is fixedly connected with the fixing frame 14, and the display lens 15 is also fixedly connected with the fixing frame 14, so that the display lens 15 is further fixed.
The AR module 12 comprises an acceleration sensor, an ambient light sensor, and a gyroscope sensor. The readings of the acceleration sensor are sent to the controller, which switches from a standby state to a started state or from the started state back to standby; the readings of the ambient light sensor are sent to the controller, which adjusts the display brightness of the display lens 15; and the readings of the gyroscope sensor are sent to the controller, which adjusts the display angle of the real-time ultrasound model.
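The control logic described above (motion toggles the display between standby and started, ambient light sets brightness, head rotation sets the model's display angle) can be sketched as follows. This is an illustrative sketch only: the class name, thresholds, and units are assumptions, not part of the patent.

```python
class ARController:
    """Illustrative controller mapping sensor readings to display behavior."""

    WAKE_THRESHOLD = 1.5  # m/s^2 of residual motion; hypothetical threshold

    def __init__(self):
        self.started = False              # standby vs. started state
        self.brightness = 0.5             # normalized lens brightness
        self.view_angle = (0.0, 0.0, 0.0) # model display angles (rad)

    def on_acceleration(self, magnitude):
        # A sufficiently strong motion toggles standby <-> started.
        if magnitude > self.WAKE_THRESHOLD:
            self.started = not self.started

    def on_ambient_light(self, lux):
        # Brighter surroundings -> brighter lens, clamped to [0.1, 1.0].
        self.brightness = max(0.1, min(1.0, lux / 1000.0))

    def on_gyroscope(self, roll, pitch, yaw):
        # Head orientation drives the display angle of the ultrasound model.
        self.view_angle = (roll, pitch, yaw)
```

In use, the three sensor callbacks would be invoked from the AR module's sensor-polling loop; the patent does not specify the sampling scheme.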
In addition, the AR module 12 is further provided with a light-sensing circuit for sensing the pupil angle so as to locate the focus of the viewing angle. The light-sensing circuit transmits the pupil focus position to the controller, and the controller then controls the organic light-emitting display circuit to display a correspondingly blurred or clear image.
The ultrasound module comprises a detection piece and a calculation piece that are electrically connected; the detection piece detects the user by contact and transmits a signal to the calculation piece, and the calculation piece transmits the signal to the communication block through the communication module.
The detection piece comprises an ultrasonic generator, an ultrasonic receiver, and a transmission line; the ultrasonic generator and the ultrasonic receiver are fixedly connected, and the ultrasonic receiver is wired to the calculation piece through the transmission line.
In yet another embodiment, the detection piece comprises an ultrasonic generator, an ultrasonic receiver, and a transmission block; the ultrasonic generator and the ultrasonic receiver are fixedly connected, and the ultrasonic receiver is connected to the calculation piece over a short-range link through the transmission block. The ultrasonic generator emits ultrasonic waves, the ultrasonic receiver receives the returning waves and converts them into detection information, and the transmission block transmits the detection information to the ultrasound image processing module.
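As background for how the receiver's echo timing could become "detection information", a pulse-echo depth calculation is sketched below. The assumed speed of sound in soft tissue (about 1540 m/s) is a standard textbook value; the function name is hypothetical and not taken from the patent.

```python
SPEED_OF_SOUND_TISSUE = 1540.0  # m/s, conventional average for soft tissue

def echo_depth(round_trip_time_s):
    """Depth of a reflecting interface from the pulse-echo round-trip time.

    The pulse travels to the interface and back, so the one-way depth is
    half the round-trip distance.
    """
    return SPEED_OF_SOUND_TISSUE * round_trip_time_s / 2.0
```

For example, a round trip of about 65 microseconds corresponds to a reflector roughly 5 cm deep, which is the kind of per-sample depth an image processing module would assemble into a scan line.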
The calculation piece comprises an ultrasound image processing module and a real-time ultrasound model storage module; the ultrasound image processing module processes the detection information into a real-time ultrasound model, and the real-time ultrasound model storage module sends the real-time ultrasound model to the communication module or uploads it to the server.
Wherein the communication module is connected with the communication block through short-range communication.
The controller includes a user-interaction module. The camera 16 photographs the user's gestures, and the user-interaction module converts them into rotation, enlargement, and reduction operations on the model. When the user makes a gesture in front of the camera 16, the controller recognizes it through the following steps:
(1) the light-sensing circuit detects whether the user's pupil focus is in front of the lens, and when it is, the light-sensing circuit drives the camera 16 to start shooting;
(2) the controller judges from the images of the camera 16 whether the wearer is facing the detection part of the user and whether the view corresponds to the positioning point information; when the camera 16 has located the positioning point information, the controller drives the display lens 15 to display the real-time ultrasound model;
(3) the user makes a gesture within the shooting range of the camera 16; the area occluded by the hand is calculated and transmitted, so that the model displayed on the display lens 15 is hidden within that area, while the controller captures the gesture motion;
(4) the controller judges from the duration and amplitude of the gesture whether the user intends a rotation, enlargement, or reduction, and when such a gesture is determined, the controller drives the model on the display lens 15 to rotate, enlarge, or shrink accordingly.
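A minimal sketch of step (4) follows: classifying a tracked hand motion into rotate / enlarge / reduce from its duration and amplitude. The thresholds, and the use of an apparent-size change to distinguish zooming from rotation, are illustrative assumptions rather than the patent's stated method.

```python
def classify_gesture(duration_s, sweep_amplitude, scale_change):
    """Map a tracked hand motion to a model operation (hypothetical rules).

    duration_s      -- how long the gesture lasted, in seconds
    sweep_amplitude -- normalized lateral travel of the hand (0..1)
    scale_change    -- relative change of the hand's apparent size
                       (positive when moving toward the camera)
    """
    if duration_s < 0.2:
        return None                       # too brief: ignore as accidental
    if abs(scale_change) >= 0.1:
        # A noticeable size change reads as a zoom gesture.
        return "enlarge" if scale_change > 0 else "reduce"
    if sweep_amplitude >= 0.2:
        return "rotate"                   # sustained lateral sweep
    return None
```

The controller would call this once per completed gesture and forward the returned operation to the model renderer.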
The user-interaction module is also responsible for calculating and displaying annotations, so that the user can conveniently learn the current condition of the organ or fetus.
In addition, the AR module 12 is further provided with a memory for temporarily storing the real-time ultrasound model transmitted from the ultrasound module, after which the controller selects full high-definition or blurred display; when the controller is set to energy-saving mode, the definition with which the display lens 15 shows the real-time ultrasound model decreases.
The ultrasound-imaging mixed model obtained from the data processed by the virtual-real mixed processing module can be output directly to the display module for display, or output to the virtual-real mixed data storage module for storage.
In addition, in order to fuse the real-time ultrasound model with the three-dimensional model of the real scene, a coordinate-system transformation is required: the real-time ultrasound model coordinate system is converted into the real-scene coordinate system seen by the user's eyes.
Suppose the points of the three-dimensional real-time ultrasound model in the model coordinate system are P1(x, y, z), P2(x, y, z), P3(x, y, z), …, Pn(x, y, z), and the corresponding points in the real-scene coordinate system based on the AR module 12 are P1'(x, y, z), P2'(x, y, z), P3'(x, y, z), …, Pn'(x, y, z). A transformation matrix M is required to realize the conversion between the two coordinate systems, and the relative position of the organ or fetus is obtained through this transformation.
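The mapping Pi → Pi' = M·Pi can be sketched with a homogeneous 4×4 matrix. The patent only states that such a matrix M is required; the concrete matrix below (a pure translation) and the function name are illustrative.

```python
def apply_transform(M, points):
    """Map model-frame points (x, y, z) to real-scene points via Pi' = M * Pi.

    M is a 4x4 homogeneous transform given as a nested list; points is an
    iterable of (x, y, z) tuples. Row 3 of M is assumed to be [0, 0, 0, 1],
    as for rigid transforms, so no perspective divide is needed.
    """
    out = []
    for x, y, z in points:
        xp = M[0][0] * x + M[0][1] * y + M[0][2] * z + M[0][3]
        yp = M[1][0] * x + M[1][1] * y + M[1][2] * z + M[1][3]
        zp = M[2][0] * x + M[2][1] * y + M[2][2] * z + M[2][3]
        out.append((xp, yp, zp))
    return out

# Example: translate the model 10 cm along x into the viewer's frame.
M_translate = [
    [1.0, 0.0, 0.0, 0.10],
    [0.0, 1.0, 0.0, 0.00],
    [0.0, 0.0, 1.0, 0.00],
    [0.0, 0.0, 0.0, 1.00],
]
```

In practice, M would be estimated from the positioning point information located by the camera, combining the rotation and translation between the probe and the wearer's viewpoint.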
The positioning point information is obtained by the ultrasound module identifying the position of the organ or fetus on the human body; combining the positioning point information with the coordinate-system transformation yields the position of the three-dimensional model of the organ or fetus, and the corresponding real-time ultrasound model is finally displayed on the display lens 15 with the aid of the camera 16.
Specifically, when the wearer is not looking at the user, the display lens 15 does not display the corresponding real-time ultrasound model; as the camera 16 gradually brings the user into view, or the wearer turns toward the user, the real-time ultrasound model is gradually displayed on the display lens 15.
Optionally, the ultrasound module may be a palm ultrasound device or a desktop ultrasound device, which is not limited herein.
The ultrasound module may be used to detect, among other things, abdominal organs, blood vessels, skin, neck organs, fetuses, etc.
The above description is only for the purpose of illustrating the preferred embodiments of the present invention and is not to be construed as limiting the invention, and any modifications, equivalents and improvements made within the spirit and principle of the present invention are intended to be included within the scope of the present invention.
Claims (10)
1. An augmented reality observation device for an ultrasound device, characterized by comprising a communication module, an AR module for display, an ultrasound module for ultrasonically scanning a user, and a wearing device connected with the AR module, wherein the communication module is electrically connected with the ultrasound module and the AR module is in signal connection with the communication module; the ultrasound module combines its detection information into a real-time ultrasound model, the communication module transmits the real-time ultrasound model and the positioning point information to the AR module, and the AR module displays an image combining the real-time ultrasound model with the real environment according to the positioning point information.
2. The augmented reality observation device for an ultrasound device according to claim 1, wherein the wearing device comprises two fixed legs and a fixed frame, the two fixed legs being respectively hinged to the two ends of the fixed frame; an elastic piece is connected between the rear ends of the two fixed legs, away from the fixed frame, and the elastic piece assists in fixing the wearing device on the head of the user.
3. The augmented reality observation device for an ultrasound device according to claim 2, wherein the front ends of the two fixed legs are respectively hinged to the fixed frame, and the rear end of each fixed leg is bent toward the other fixed leg.
4. The augmented reality observation device for an ultrasound device according to any one of claims 1-3, wherein the AR module comprises a communication block, a controller, a camera, and a display lens, the communication block being in short-range connection with the communication module; the camera, the display lens, and the controller are fixedly connected with the fixed frame, the camera and the display lens are each electrically connected with the controller, the camera photographs the positioning point information for the controller to perform positioning, and the display lens displays the real-time ultrasound model.
5. The augmented reality observation device for an ultrasound device according to claim 4, wherein the AR module comprises an acceleration sensor, an ambient light sensor, and a gyroscope sensor; the readings of the acceleration sensor are sent to the controller, which accordingly switches from a standby state to an active state or from an active state to a standby state; the readings of the ambient light sensor are sent to the controller, which controls the display brightness of the display lens; and the readings of the gyroscope sensor are sent to the controller, which controls the display angle of the real-time ultrasound model.
6. The augmented reality observation device for an ultrasound device according to any one of claims 1 to 3, wherein the ultrasound module comprises a detecting member and a calculating member, the detecting member being electrically connected to the calculating member; the detecting member contacts the user, performs the detection, and transmits the resulting signal to the calculating member, which in turn transmits it through the communication module to the communication block.
7. The augmented reality observation device for an ultrasound device according to claim 6, wherein the detecting member comprises an ultrasound generator, an ultrasound receiver, and a transmission line; the ultrasound generator and the ultrasound receiver are fixedly connected, and the ultrasound receiver is wired to the calculating member through the transmission line.
8. The augmented reality observation device for an ultrasound device according to claim 7, wherein the detecting member comprises an ultrasound generator, an ultrasound receiver, and a transmission block; the ultrasound generator and the ultrasound receiver are fixedly connected, and the ultrasound receiver is connected to the calculating member through the transmission block by short-range communication.
9. The augmented reality observation device for an ultrasound device according to claim 8, wherein the calculating member comprises an ultrasound image processing module and a real-time ultrasound model storage module; the ultrasound image processing module processes the detection information into the real-time ultrasound model, and the real-time ultrasound model storage module sends the real-time ultrasound model to the communication module or uploads it to a server.
10. The augmented reality observation device for an ultrasound device according to claim 9, wherein the communication module is connected with the communication block by short-range communication.
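The sensor-driven control behavior recited in claim 5 can be sketched in code. This is a minimal, hypothetical illustration only: the class names, field names, motion threshold, and lux-to-brightness mapping below are assumptions for the sake of the sketch and are not specified by the patent.

```python
from dataclasses import dataclass


@dataclass
class SensorFrame:
    """One reading from the AR module's sensors (hypothetical units)."""
    acceleration: float  # head-motion magnitude from the acceleration sensor
    ambient_lux: float   # light level from the ambient light sensor
    yaw_deg: float       # head rotation derived from the gyroscope sensor


class ARController:
    """Hypothetical controller mirroring claim 5: acceleration toggles
    standby/active, ambient light sets display brightness, and the
    gyroscope sets the display angle of the real-time ultrasound model."""

    MOTION_THRESHOLD = 1.5  # assumed wake/sleep threshold

    def __init__(self) -> None:
        self.active = False
        self.brightness = 0.0
        self.display_angle = 0.0

    def update(self, frame: SensorFrame) -> None:
        # A motion spike toggles between standby and active states.
        if frame.acceleration > self.MOTION_THRESHOLD:
            self.active = not self.active
        if not self.active:
            return
        # Map ambient light (assumed 0..1000 lux range) to 0..1 brightness.
        self.brightness = min(frame.ambient_lux, 1000.0) / 1000.0
        # Rotate the overlaid ultrasound model with the wearer's head.
        self.display_angle = frame.yaw_deg


ctrl = ARController()
ctrl.update(SensorFrame(acceleration=2.0, ambient_lux=500.0, yaw_deg=30.0))
print(ctrl.active, ctrl.brightness, ctrl.display_angle)  # True 0.5 30.0
```

In use, such a loop would run once per sensor sample; the brightness and angle outputs would feed the display lens and the rendering of the real-time ultrasound model, respectively.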
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010448490.7A CN111528920A (en) | 2020-05-25 | 2020-05-25 | Augmented reality observation device for ultrasound device |
Publications (1)
Publication Number | Publication Date |
---|---|
CN111528920A (en) | 2020-08-14 |
Family
ID=71969630
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202010448490.7A Pending CN111528920A (en) | 2020-05-25 | 2020-05-25 | Augmented reality observation device for ultrasound device |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN111528920A (en) |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
TW201610472A (en) * | 2014-07-31 | 2016-03-16 | Samsung Electronics Co., Ltd. | Wearable glasses and method of providing content using the same
KR20170093422A (en) * | 2016-02-05 | 2017-08-16 | Giljaesoft Co., Ltd. | System of providing augmented reality image for ultrasonic image diagnosis
CN107854142A (en) * | 2017-11-28 | 2018-03-30 | 无锡祥生医疗科技股份有限公司 | Medical supersonic augmented reality imaging system |
CN108021241A (en) * | 2017-12-01 | 2018-05-11 | 西安枭龙科技有限公司 | A kind of method for realizing AR glasses virtual reality fusions |
CN209962019U (en) * | 2019-06-21 | 2020-01-17 | 上海工程技术大学 | Interactive augmented reality glasses integrating real-time positioning and voice recognition |
CN212853514U (en) * | 2020-05-25 | 2021-04-02 | 居天智慧(深圳)有限公司 | Augmented reality observation device for ultrasound device |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
RU2740259C2 (en) | Ultrasonic imaging sensor positioning | |
US8504136B1 (en) | See-through abdomen display for minimally invasive surgery | |
KR101407986B1 (en) | Medical robotic system providing three-dimensional telestration | |
JP4054585B2 (en) | Information processing apparatus and method | |
US6937268B2 (en) | Endoscope apparatus | |
US20200037998A1 (en) | Methods and apparatuses for guiding collection of ultrasound data using motion and/or orientation data | |
CN107854142B (en) | Medical ultrasonic augmented reality imaging system | |
CN111035458A (en) | Intelligent auxiliary system for operation comprehensive vision and image processing method | |
CN107945607B (en) | Ultrasonic demonstration system and device | |
JP5103682B2 (en) | Interactive signage system | |
US20240173018A1 (en) | System and apparatus for remote interaction with an object | |
CN212853514U (en) | Augmented reality observation device for ultrasound device | |
JP2023549093A (en) | Robust segmentation with high-level image understanding | |
TW202017368A (en) | A smart glasses, a smart glasses system, and a method for using the smart glasses | |
US20110169605A1 (en) | System and method for providing remote indication | |
CN211484971U (en) | Intelligent auxiliary system for comprehensive vision of operation | |
CN111528920A (en) | Augmented reality observation device for ultrasound device | |
US10854005B2 (en) | Visualization of ultrasound images in physical space | |
US11839516B2 (en) | Medical imaging equipment and medical imaging method | |
CN111193830B (en) | Portable augmented reality medical image observation auxiliary assembly based on smart phone | |
US20210128265A1 (en) | Real-Time Ultrasound Imaging Overlay Using Augmented Reality | |
JP7267160B2 (en) | echo guide system for acupuncture | |
CN111857342A (en) | Eye movement tracking system and method based on medical endoscope | |
JPH09147142A (en) | Displaying method and device therefor | |
US20230218270A1 (en) | System and apparatus for remote interaction with an object |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||