IL301424B2 - System and methods for performing remote ultrasound examinations - Google Patents
System and methods for performing remote ultrasound examinations
- Publication number
- IL301424B2
- Authority
- IL
- Israel
- Prior art keywords
- ultrasound
- probe
- user
- instructions
- computer
- Prior art date
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/46—Ultrasonic, sonic or infrasonic diagnostic devices with special arrangements for interfacing with the operator or the patient
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/42—Details of probe positioning or probe attachment to the patient
- A61B8/4245—Details of probe positioning or probe attachment to the patient involving determining the position of the probe, e.g. with respect to an external reference frame or to the patient
- A61B8/4254—Details of probe positioning or probe attachment to the patient involving determining the position of the probe, e.g. with respect to an external reference frame or to the patient using sensors mounted on the probe
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/39—Markers, e.g. radio-opaque or breast lesions markers
Description
SYSTEM AND METHODS FOR PERFORMING REMOTE ULTRASOUND EXAMINATIONS

FIELD AND BACKGROUND OF THE INVENTION

The present invention, in some embodiments thereof, relates to systems and methods for performing remote ultrasound examinations and, more particularly, but not exclusively, to systems and methods for performing remote ultrasound examinations by untrained users. Background art includes International Patent Application Publication No. WO2022/118305, which discloses systems and methods for simulating an ultrasound, comprising a camera, a physical tridimensional reference element attached to an ultrasound transducer of an ultrasound device, and a processing module comprising instructions to perform the following while performing an ultrasound examination or after performing the ultrasound examination and spatial location of the documented information: digitally imaging a physical tridimensional reference element, determining from said imaging a second spatial data of said physical tridimensional reference element relative to a virtual location-identifying orientation; displaying said ultrasound data when said second spatial data is the same as said first spatial data and said orientation of a patient is the same as said virtual location-identifying orientation. Additional background art includes U.S. Patent Application No. US20140004488A1, which discloses a system for training practitioners in use of an ultrasound system including a unit for managing workflow of an ultrasound training session, a user interface for providing ultrasound training session instructions to a practitioner operating an ultrasound machine and for receiving input from a trainee, a unit for communication with the ultrasound machine, for collecting one or more ultrasound images produced during the training session from the ultrasound machine, a unit for image processing the ultrasound images, and a unit for assessing quality of the ultrasound images. 
A method for monitoring practitioner proficiency in use of an ultrasound system including providing the practitioner with an ultrasound task definition, collecting one or more ultrasound images produced by the practitioner during performance of the ultrasound task from an ultrasound machine, image processing the
ultrasound images, and assessing quality of the ultrasound images. Related apparatus and methods are also described. U.S. Patent No. 7782319 discloses a method, apparatus, and article of manufacture that provide the ability to control a three-dimensional scene view. A three-dimensional (3D) scene having one or more three-dimensional objects is displayed. A 3D representation of a coordinate system of the scene is displayed. The 3D representation contains a current viewpoint, one or more faces, one or more edges, and one or more corners with each face, edge, and corner representing a corresponding viewpoint of the scene. The 3D representation is manipulated. A new current viewpoint of the 3D representation is displayed based on the manipulation. The scene is then reoriented corresponding to the new current viewpoint based on the manipulation of the 3D representation. U.S. Patent Application No. US20040106869A1 discloses an apparatus for precision location of a tool such as a surgical tool within an obscured region such as an internal space of the human or animal body, the apparatus comprising: a planar scanning unit for scanning planes within said obscured region using an imaging scan, and a locator, associated with said tool and with said scanning unit, for determining a location of said tool, and for selecting a plane including said tool location. The apparatus allows the planar scan to follow the tool automatically and saves skill and effort on the part of the surgeon. U.S. Patent Application No. 
US20130137988A1 discloses an augmented ultrasound examination system that comprises: a) an ultrasound system suitable to generate images of a body portion; b) a first position sensor coupled to the ultrasound transducer of said ultrasound system; c) a second position sensor suitable to be coupled to a finger; and d) data processing apparatus suitable to receive position information from said first and second position sensors and to generate therefrom information correlating on a screen the position of said second position sensor with the image generated by said ultrasound system. U.S. Patent Application No. US20150056591A1 discloses methods and devices for simulating ultrasound procedures and for training ultrasound users. Additionally disclosed are methods and devices for simulating needle insertion procedures, such as
amniocentesis procedures, and for training physicians to perform such needle insertion procedures.

SUMMARY OF THE INVENTION

Following is a non-exclusive list including some examples of embodiments of the invention. The invention also includes embodiments which include fewer than all the features in an example and embodiments using features from multiple examples, even if not expressly listed below. Example 1. A system for performing remote ultrasound examinations, comprising: a. an ultrasound device comprising a trackable probe; b. a computer in communication with said ultrasound device and comprising instructions for: i. providing instructions to a user for performing an ultrasound examination; ii. receiving movement information about said trackable probe; iii. providing feedback to said user regarding said performing of said ultrasound examination based on said received movement information of said trackable probe. Example 2. The system according to example 1, wherein said trackable probe comprises a first hardware configured to allow active tracking of said trackable probe. Example 3. The system according to example 1 or example 2, wherein said trackable probe comprises a second hardware configured to allow passive tracking of said trackable probe, said passive tracking being performed by said computer. Example 4. The system according to example 3, wherein said second hardware is a reference element mounted and/or incorporated in said trackable probe. Example 5. The system according to example 4, wherein said computer comprises a camera and said computer is configured to track said reference element of said trackable probe by means of said camera. Example 6. The system according to any one of examples 1-5, wherein said providing instructions comprises showing a video to said user.
Example 7. The system according to any one of examples 1-6, wherein said providing instructions comprises connecting said computer with an external source that provides said instructions for performing said ultrasound examination. Example 8. The system according to example 7, wherein said external source is a physician and/or a medical sonographer. Example 9. The system according to any one of examples 1-8, wherein said providing feedback comprises comparing said received movement information with at least one reference movement information. Example 10. The system according to example 9, wherein said at least one reference movement information comprises one or more of reference movement information from a database and reference movement information in real-time from a physician and/or a medical sonographer. Example 11. The system according to any one of examples 1-10, wherein said computer further comprises instructions for evaluating said ultrasound examination according to at least one ultrasound image received from said ultrasound device. Example 12. The system according to example 11, wherein said evaluating comprises detecting “dead spots” in a 3D scanned volume. Example 13. The system according to any one of examples 1-12, wherein said computer comprises dedicated software configured to allow 3D geometric labelling of points of interest on a 2D ultrasound image. Example 14. The system according to example 13, wherein said points of interest comprise anomalies found in said 2D image of the ultrasound. Example 15. The system according to any one of examples 1-14, wherein said computer comprises instructions for generating a 4D volume reconstruction based on a plurality of 2D images. Example 16. The system according to example 15, wherein said reconstruction is based on 2D images in time. Example 17. The system according to example 15, wherein said reconstruction is based on 2D images with at least one geometric embedding. Example 18. 
The system according to any one of examples 1-17, wherein said computer further comprises instructions for generating a 4D Doppler approximation based on a 2D Doppler in time.
Example 19. The system according to example 18, wherein said 4D Doppler approximation is generated based on the same voxels from different angles and different time samples. Example 20. The system according to any one of examples 1-19, wherein said computer further comprises instructions for associating a received ultrasound image with said received movement information of said trackable probe. Example 21. The system according to example 20, wherein data generated by said associating is used as data collection for Machine Learning. Example 22. The system according to any one of examples 1-21, wherein said computer further comprises instructions for embedding one or more frames of an ultrasound stream within a geometric space. Example 23. The system according to example 22, further comprising generating a geometric dataset of ultrasound images based on said embedding. Example 24. A system for performing remote ultrasound examinations, comprising: a. a user unit, comprising: i. an ultrasound device comprising a trackable probe; and ii. a first computer in communication with said ultrasound device; b. a physician unit, comprising: iii. a trackable mock ultrasound probe; iv. a second computer with a camera. Example 25. The system according to example 24, wherein said user unit is as said system according to example 1. Example 26. The system according to example 25, wherein said external source is said physician unit, said physician unit being used by a physician and/or a medical sonographer. Example 27. The system according to any one of examples 24-26, wherein said trackable mock ultrasound probe comprises a reference element mounted and/or incorporated in said trackable mock ultrasound probe. Example 28. A method for performing a remote ultrasound examination, comprising:
a. providing to a user instructions comprising one or more movements of a first ultrasound probe; b. tracking movements performed by a second ultrasound probe used by said user during said remote ultrasound examination; c. comparing said performed movements with said provided instructions; d. providing feedback to said user in view of results of said comparing. Example 29. The method according to example 28, wherein said providing to a user instructions comprises providing instructions in real-time by an external source. Example 30. The method according to example 28 or example 29, wherein said providing to a user instructions comprises providing instructions by means of a video shown on an electronic device. Example 31. The method according to any one of examples 28-30, further comprises providing said second ultrasound probe with one or more of active and passive tracking hardware. Example 32. The method according to example 31, wherein said passive tracking hardware is a reference element mounted and/or incorporated in said second ultrasound probe. Example 33. The method according to example 32, wherein said tracking movements performed by a second ultrasound probe are performed by tracking movements performed by said reference element. Unless otherwise defined, all technical and/or scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which the invention pertains. Although methods and materials similar or equivalent to those described herein can be used in the practice or testing of embodiments of the invention, exemplary methods and/or materials are described below. In case of conflict, the patent specification, including definitions, will control. In addition, the materials, methods, and examples are illustrative only and are not intended to be necessarily limiting. 
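The geometric embedding of Examples 15-17, 22 and 23 can be illustrated with a minimal sketch. The patent does not prescribe an implementation; the names below are hypothetical, and the voxel-splatting scheme is only one possible assumption: given a per-frame 6-DOF probe pose supplied by the tracking hardware, each 2D frame is mapped into a shared 3D buffer.

```python
import numpy as np

def make_pose(rotation: np.ndarray, translation: np.ndarray) -> np.ndarray:
    """Build a 4x4 homogeneous transform from a 3x3 rotation and a translation."""
    pose = np.eye(4)
    pose[:3, :3] = rotation
    pose[:3, 3] = translation
    return pose

def embed_frame(volume, counts, frame, pose, pixel_size, voxel_size):
    """Splat one 2D ultrasound frame into a shared voxel buffer.

    Each pixel (u, v) is assumed to lie in the probe's imaging plane
    (z = 0 in probe coordinates); `pose` maps probe coordinates into
    the volume's coordinate frame.
    """
    h, w = frame.shape
    us, vs = np.meshgrid(np.arange(w), np.arange(h))
    # Pixel -> probe-space homogeneous 3D points.
    pts = np.stack([us.ravel() * pixel_size,
                    vs.ravel() * pixel_size,
                    np.zeros(us.size),
                    np.ones(us.size)])
    world = (pose @ pts)[:3]                       # probe -> volume coordinates
    idx = np.round(world / voxel_size).astype(int)
    # Keep only points that fall inside the buffer.
    ok = np.all((idx >= 0) & (idx < np.array(volume.shape)[:, None]), axis=0)
    i, j, k = idx[:, ok]
    np.add.at(volume, (i, j, k), frame.ravel()[ok])  # accumulate intensities
    np.add.at(counts, (i, j, k), 1)                  # samples per voxel
```

Averaging `volume / np.maximum(counts, 1)` would yield a reconstructed intensity volume; stacking such buffers over time corresponds to the 3D-plus-time ("4D") reconstruction of Example 16.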
As will be appreciated by one skilled in the art, some embodiments of the present invention may be embodied as a system, method or computer program product. Accordingly, some embodiments of the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment (including firmware, resident software, micro-code, etc.) or an embodiment combining software and
hardware aspects that may all generally be referred to herein as a “circuit,” “module” or “system.” Furthermore, some embodiments of the present invention may take the form of a computer program product embodied in one or more computer readable medium(s) having computer readable program code embodied thereon. Implementation of the method and/or system of some embodiments of the invention can involve performing and/or completing selected tasks manually, automatically, or a combination thereof. Moreover, according to actual instrumentation and equipment of some embodiments of the method and/or system of the invention, several selected tasks could be implemented by hardware, by software or by firmware and/or by a combination thereof, e.g., using an operating system. For example, hardware for performing selected tasks according to some embodiments of the invention could be implemented as a chip or a circuit. As software, selected tasks according to some embodiments of the invention could be implemented as a plurality of software instructions being executed by a computer using any suitable operating system. In an exemplary embodiment of the invention, one or more tasks according to some exemplary embodiments of method and/or system as described herein are performed by a data processor, such as a computing platform for executing a plurality of instructions. Optionally, the data processor includes a volatile memory for storing instructions and/or data and/or a non-volatile storage, for example, a magnetic hard-disk and/or removable media, for storing instructions and/or data. Optionally, a network connection is provided as well. A display and/or a user input device such as a keyboard or mouse are optionally provided as well. Any combination of one or more computer readable medium(s) may be utilized for some embodiments of the invention. The computer readable medium may be a computer readable signal medium or a computer readable storage medium. 
A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples (a non-exhaustive list) of the computer readable storage medium would include the following: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), an optical fiber, a
portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this document, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. A computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium and/or data used thereby may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing. Computer program code for carrying out operations for some embodiments of the present invention may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, Smalltalk, C++ or the like and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server. 
In the latter scenario, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet Service Provider). Some embodiments of the present invention may be described below with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the invention.
It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks. These computer program instructions may also be stored in a computer readable medium that can direct a computer, other programmable data processing apparatus, or other devices to function in a particular manner, such that the instructions stored in the computer readable medium produce an article of manufacture including instructions which implement the function/act specified in the flowchart and/or block diagram block or blocks. The computer program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other devices to cause a series of operational steps to be performed on the computer, other programmable apparatus or other devices to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide processes for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks. Some of the methods described herein are generally designed only for use by a computer, and may not be feasible or practical for performing purely manually, by a human expert. 
A human expert who wanted to manually perform similar tasks might be expected to use completely different methods, e.g., making use of expert knowledge and/or the pattern-recognition capabilities of the human brain, which would be vastly more efficient than manually going through the steps of the methods described herein.

BRIEF DESCRIPTION OF THE DRAWINGS

Some embodiments of the invention are herein described, by way of example only, with reference to the accompanying drawings. With specific reference now to the
drawings in detail, it is stressed that the particulars shown are by way of example and for purposes of illustrative discussion of embodiments of the invention. In this regard, the description taken with the drawings makes apparent to those skilled in the art how embodiments of the invention may be practiced. In the drawings:
Figure 1 is a schematic representation of a remote ultrasound examination system, according to some embodiments of the invention;
Figure 2 is a schematic representation of a user unit, according to some embodiments of the invention;
Figure 3 is a schematic representation of a physician unit, according to some embodiments of the invention;
Figures 4a-b are schematic representations of an exemplary reference element, according to some embodiments of the invention;
Figure 4c is a schematic representation of directions in a 3D space, according to some embodiments of the invention;
Figure 4d is a schematic representation of movements in a 3D space, according to some embodiments of the invention;
Figure 4e is a schematic representation of an exemplary hexagonal reference element, according to some embodiments of the invention;
Figures 5a-b are schematic representations of exemplary volumes generated by a plurality of 2D images and exemplary analysis thereof, according to some embodiments of the invention;
Figure 6 is a flowchart of an exemplary method of performing remote ultrasound examinations, according to some embodiments of the invention;
Figure 7 is a flowchart of an exemplary method of performing a remote ultrasound examination from the side of the user, according to some embodiments of the invention;
Figure 8 is a flowchart of an exemplary method of performing a remote ultrasound examination from the side of the physician/medical sonographer, according to some embodiments of the invention;
Figure 9a is a schematic representation of a system to perform unsupervised ultrasound examinations, according to some embodiments of the invention;
Figure 9b is a schematic representation of exemplary guiding images/videos for performing an exemplary ultrasound protocol, according to some embodiments of the invention;
Figure 9c is a flowchart of an exemplary method of performing remote ultrasound examinations, according to some embodiments of the invention;
Figure 10 is a schematic representation of a slicing of the volumetric buffer, according to some embodiments of the invention; and
Figure 11 is a block diagram of exemplary technical issues involved in the field of ultrasound simulation and examples of how the system of the invention resolves these technical issues, according to some embodiments of the invention.

DESCRIPTION OF SPECIFIC EMBODIMENTS OF THE INVENTION

The present invention, in some embodiments thereof, relates to systems and methods for performing remote ultrasound examinations and, more particularly, but not exclusively, to systems and methods for performing remote ultrasound examinations by untrained users.

Overview

An aspect of some embodiments of the invention relates to performing remote ultrasound (US) examinations by untrained users who are guided by trained personnel (remote ultrasound, also referred to as rUS). In some embodiments, guiding of the ultrasound examination is done in real-time. In some embodiments, guiding of the ultrasound examination is performed using a recorded video that is screened/shown to the untrained user. In some embodiments, the ultrasound examination is monitored while being performed. In some embodiments, the monitoring is performed by one or more of a computer (for example, an AI system) and trained human personnel. In some embodiments, the ultrasound examination is evaluated in real-time while being performed. In some embodiments, the evaluation is performed by one or more of a computer (for example, an AI system) and trained human personnel. 
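One part of the computer-based evaluation, the detection of "dead spots" in a 3D scanned volume (Example 12), can be sketched minimally. The interpretation below is an assumption, not taken from the patent: a dead spot is a voxel within a region of interest that no 2D slice has sampled, given a per-voxel sample count accumulated during scanning.

```python
import numpy as np

def find_dead_spots(counts: np.ndarray, roi: np.ndarray, min_samples: int = 1):
    """Return indices of ROI voxels sampled fewer than `min_samples` times.

    `counts` holds the number of 2D-slice samples per voxel; `roi` is a
    boolean mask of the region that the protocol requires to be covered.
    """
    dead = roi & (counts < min_samples)
    return np.argwhere(dead)

def coverage(counts: np.ndarray, roi: np.ndarray, min_samples: int = 1) -> float:
    """Fraction of the region of interest that is adequately sampled."""
    dead = roi & (counts < min_samples)
    return 1.0 - dead.sum() / roi.sum()
```

A monitoring loop could request a re-scan (a "correction and/or request", in the language above) whenever `coverage` falls below a protocol-defined threshold.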
In some embodiments, corrections and/or requests for new ultrasound collections are performed during the collection session according to needs, optionally predetermined needs. In some embodiments, the corrections and/or requests are
generated by one or more of a computer (for example by an AI system) and trained human personnel. In some embodiments, the corrections and/or requests are generated off-line, meaning an untrained user performs a rUS, which is saved for example in a server, and later the rUS is reviewed and corrections and/or requests are sent back to the untrained user. In some embodiments, the trained personnel is a physician. In some embodiments, the trained personnel is an ultrasound medical sonographer. In some embodiments, the untrained users are the patients requiring an ultrasound examination. In some embodiments, the ultrasound examination is performed using a mobile 2D and/or 3D ultrasound device (referred to hereinafter simply as "US device"). In some embodiments, the ultrasound device is a 2D ultrasound device and the system comprises dedicated software comprising instructions to transform the 2D ultrasound images into 3D images, optionally to transform high quality 2D ultrasound images into high quality 3D ultrasound images. In some embodiments, the US device comprises a probe comprising a reference element connected and/or mounted to it. In some embodiments, the reference element is used for monitoring the performance of the rUS examination. In some embodiments, alternatively or additionally, the ultrasound probe comprises one or more sensors configured to allow monitoring and tracking of the six-dimensional spatial movements of the ultrasound probe. In some embodiments, monitoring the rUS examination comprises monitoring the spatial movements of the US probe in real-time and providing real-time feedback, guidance and/or corrections. In some embodiments, the real-time feedback, guidance and corrections are delivered using one or more of visual feedback, sound feedback, tactile feedback and any combination thereof. In some embodiments, the system is configured to allow the trained personnel to control commands at the untrained user device.
It is known that performing ultrasound examinations requires highly trained personnel, since it is very difficult for an untrained person to capture the required portions of the body requiring examination. For example, it is known that one of the difficulties is knowing how to orient the probe in order to capture the right orientation and/or location. Another example is that it is known to be difficult to identify the location from the ultrasound image. Therefore, in some embodiments, a potential advantage of the invention is that it allows untrained people to perform intricate ultrasound examinations without the need to visit a professional. In some embodiments, another potential advantage is that the system allows an untrained user to perform a self ultrasound examination without requiring the person performing the scan to understand what is being shown in the image or whether the collection is good. The system comprises instructions for monitoring the quality of the scan and the accuracy of the spatial collection, while the user is simply guided on how to move the ultrasound probe.

Before explaining at least one embodiment of the invention in detail, it is to be understood that the invention is not necessarily limited in its application to the details of construction and the arrangement of the components and/or methods set forth in the following description and/or illustrated in the drawings and/or the Examples. The invention is capable of other embodiments or of being practiced or carried out in various ways.

Referring now to Figure 1, showing a schematic presentation of a remote ultrasound examination system, according to some embodiments of the invention. In some embodiments, an exemplary remote ultrasound examination system 100 comprises a patient/user unit/the supported entity 102 (referred to hereinafter as user unit 102) and a physician/medical sonographer/expert unit 104 (referred to hereinafter as physician unit 104). In some embodiments, optionally, the system comprises a server 106 to which the user unit 102 and/or the physician unit 104 are connected and through which information is delivered. In some embodiments, the user unit 102 comprises an ultrasound device 108, optionally a mobile ultrasound device (Figure 1 shows a picture of an exemplary mobile ultrasound device from Mobisante (https://www(dot)medicalexpo(dot)com/prod/mobisante/product-81044-513034(dot)html), but any type of mobile or non-mobile ultrasound device can be used), optionally a 2D mobile or non-mobile ultrasound device, optionally a 2D/3D mobile or non-mobile ultrasound device.
In some embodiments, optionally, the ultrasound device is an FDA-approved ultrasound device. In some embodiments, the user unit 102 comprises a computer 110 comprising a webcam 112. In some embodiments, the user 114 can use his own computer with a webcam, and the user unit 102 then comprises only the ultrasound device 108, which is connected to the computer or at least is configured to deliver data from the ultrasound device 108 to the computer. In some embodiments, the physician unit 104 comprises a computer 116 comprising a webcam 118, through which a physician 120 and/or a medical sonographer 122 can communicate with the user 114. In some embodiments, all communications within the system 100 are run through the server 106, where the system may use dedicated software and/or may record the ultrasound sessions.

Referring now to Figure 2, showing a schematic representation of a user unit 102, according to some embodiments of the invention. The same reference numbers will be used for the same elements in all Figures. In some embodiments, as mentioned above, the user unit 102 comprises a computer 110 comprising a screen and a webcam 112. In some embodiments, the computer 110 can be a personal computer provided with the user unit 102 or can be the personal computer of the user 114. In some embodiments, the user unit 102 comprises an ultrasound device 108, optionally a mobile ultrasound device, comprising a probe 202. In some embodiments, attached and/or connected to the probe 202 there is a reference element 204 (see below further explanations regarding the reference element 204). In some embodiments, optionally, the ultrasound device 108 is connected to the computer 110. In some embodiments, the webcam 112 is configured to capture at least the user 114 and the reference element 204. Figure 2 also shows a schematic representation of what is shown on the computer 110 screen.
In some embodiments, on the screen are shown one or more of: an image captured by the webcam 112 showing at least the user 114 and the reference element 204 (not limited to the spatial shape shown in Figure 2) performing the ultrasound examination 206; an image of the physician 120 and/or the medical sonographer 122 guiding the user 114 while performing the ultrasound examination 208; and at least one feedback image 210 (see below). In some embodiments, optionally, an ultrasound image of what is being captured is also shown on the screen (not shown in Figure 2). In some embodiments, a kit can be provided to an untrained user. For example, a patient is sent home from the hospital and routine ultrasound examinations are required to monitor the health status of the patient. In some embodiments, a dedicated kit is sent to the patient's home, the kit comprising all the necessary hardware and software to perform remote ultrasound examinations. In some embodiments, once the kit is no longer needed, the kit is sent back.

Referring now to Figure 3, showing a schematic representation of a physician unit 104, according to some embodiments of the invention. The same reference numbers will be used for the same elements in all Figures. In some embodiments, as mentioned above, the physician unit 104 comprises a computer 116 comprising a screen and a webcam 118. In some embodiments, the computer 116 can be a personal computer provided with the physician unit 104 or can be the personal computer of the physician 120 and/or medical sonographer 122. In some embodiments, the physician unit 104 comprises a mock ultrasound probe 302. In some embodiments, attached and/or connected to the probe 302 there is a reference element 204 (see below further explanations regarding the reference element 204). In some embodiments, the webcam 118 is configured to capture at least the physician 120 and/or medical sonographer 122 and the reference element 204. Figure 3 also shows a schematic representation of what is shown on the computer 116 screen. In some embodiments, on the screen are shown one or more of: an image captured by the webcam 118 showing at least the physician 120 and/or medical sonographer 122 and the reference element 204 guiding the user 114 while performing the ultrasound examination 304; the user 114 while performing the ultrasound examination 306; and at least one feedback image 210 (see below). In some embodiments, in addition, the system 100 is configured to show the physician 120 and/or medical sonographer 122 in real-time the ultrasound image 308 that is being acquired by the ultrasound device 108 operated remotely by the user 114.
In some embodiments, the physician unit 104 is configured to remotely control one or more controls at the user unit 102; for example, the physician unit 104 comprises dedicated software that allows activating/deactivating controls in the ultrasound device and/or in the computer to allow better performance of the rUS examination. In some embodiments, when a computer (for example an AI system) is monitoring the rUS examination, the computer can also activate/deactivate commands at the user unit 102.

Exemplary reference element

Referring now to Figures 4a-b, showing schematic representations of an exemplary reference element 204, according to some embodiments of the invention.
In some embodiments, an exemplary external reference element is a physical object comprising one or more reference markings. In some embodiments, software of the system 100 is configured to recognize the markings on the reference element 204 by analyzing the visual information received from the camera 112/118. In some embodiments, an exemplary reference element 204 may comprise any geometrical form. In some embodiments, the geometrical form comprises a 2D form. In some embodiments, the geometrical form comprises a 3D form. In some embodiments, as shown for example in Figures 4a-b, the 3D geometrical form of the exemplary reference element 204 is a cube. In some embodiments, the 3D geometrical form of the exemplary reference element 204 can be other than a cube, for example one or more of a sphere, a cylinder, a torus, a cone, a cuboid, a triangular pyramid, a square pyramid, an octahedron, a dodecahedron, an icosahedron, etc. In some embodiments, the 3D geometrical form of the exemplary reference element 204 is a hexagonal prism, as shown for example in Figure 4e (see also below the description of a reference element having a hexagonal prism form). In some embodiments, the 3D geometrical form of the exemplary reference element 204 may comprise any form as long as it comprises the necessary reference markings. Returning to Figures 4a-b, the exemplary reference element 204, shown for example as a cube, comprises six distinct reference markings in the form of Greek alphabet letters: alpha (α), beta (β), gamma (γ), delta (δ), epsilon (ε) and theta (θ). It should be understood that these are just exemplary reference markings and that other forms of reference markings may be used, for example figures, lines, barcodes, etc. In some embodiments, the software in the system 100 comprises instructions for recognizing the reference markings and providing dedicated directions in the 3D space (for example up, down, front, back, right, left), as shown for example in Figure 4c.
In some embodiments, using the directions as shown in Figure 4c, the software of the system 100 extrapolates movements of the exemplary reference element 204 in space (spatial movements). In some embodiments, exemplary movements in space are schematically shown in Figure 4d. In some embodiments, movements in space are one or more of up, down, forward, back (backwards), right, left, roll, yaw and pitch. In some embodiments, the software utilizes at least one reference marker to extrapolate the spatial movement of the reference element 204. In some embodiments, one reference marker is enough for the software to extrapolate the spatial movements of the reference element 204. It should be understood that the use of more markers to extrapolate the spatial movements of the reference element is included in the scope of the invention.

Exemplary reference element having hexagonal prism form

In some embodiments, the reference element 204 comprises a 3D form of a hexagonal prism, as shown for example in Figure 4e. In some embodiments, the inventors have found that a reference element having a hexagonal prism form provides enough surfaces (with markings) for the system to allow a highly precise identification of the reference element in space and highly precise tracking performance of the reference element, both with minimal or low computational effort. In some embodiments, as mentioned above, the system is configured to identify the markers on the reference element using a camera. In some embodiments, once the markers are identified, the system locks on them and uses the perceived movements of the markers (changes in the spatial location of the markers in space) to translate them into virtual movements. In some embodiments, the virtual movements are used for monitoring the performance of the ultrasound examination performed by the user 114. In some embodiments, there is a direct correlation between the number of markers found by the system and the computational effort required to follow them and translate them into the right virtual position of the transducer. In some embodiments, the more markers there are, the more computational effort is required, and vice versa. In some embodiments, additionally, there is a direct correlation between the number of reference markers and the precision with which the system translates the real-world movements of the reference element into virtual transducer/probe ones. In some embodiments, the more markers there are, the more precise the tracking is, and vice versa.
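The single-camera marker tracking described above can be sketched in code. The following is a minimal illustrative sketch, not the implementation claimed by the patent: it assumes that a marker-detection library (for example OpenCV's ArUco module) has already returned the four corner pixel coordinates of one visible marking, and the function names and tolerance values are hypothetical.

```python
import math

def marker_pose_2d(corners):
    """Centroid, apparent size and roll angle of a square marking, given its
    four corner pixel coordinates ordered clockwise from the top-left."""
    cx = sum(x for x, _ in corners) / 4.0
    cy = sum(y for _, y in corners) / 4.0
    (x0, y0), (x1, y1) = corners[0], corners[1]
    size = math.hypot(x1 - x0, y1 - y0)                # top edge length in pixels
    roll = math.degrees(math.atan2(y1 - y0, x1 - x0))  # top edge orientation
    return (cx, cy), size, roll

def classify_movement(prev, curr, px_tol=2.0, deg_tol=2.0):
    """Translate the change between two detections into the movement
    vocabulary of Figure 4d (left/right, up/down, forward/back, roll)."""
    (pcx, pcy), psize, proll = prev
    (ccx, ccy), csize, croll = curr
    moves = []
    if ccx - pcx > px_tol:
        moves.append("right")
    elif pcx - ccx > px_tol:
        moves.append("left")
    if ccy - pcy > px_tol:
        moves.append("down")      # image y axis grows downward
    elif pcy - ccy > px_tol:
        moves.append("up")
    if csize - psize > px_tol:
        moves.append("forward")   # marker grows when moved toward the camera
    elif psize - csize > px_tol:
        moves.append("back")
    if abs(croll - proll) > deg_tol:
        moves.append("roll")
    return moves
```

For example, a marking whose corners shift twenty pixels to the right between frames classifies as a "right" movement; yaw and pitch would additionally require comparing the perspective distortion of the corners, which a full pose-estimation routine provides.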
In some embodiments, as mentioned above, the inventors have found that the optimal number of reference markers, each located on a specific surface, for allowing highly precise tracking with minimal computational effort is provided by using a reference element having a hexagonal prism form. It should be understood that the invention is meant to cover those embodiments where computational effort is not an issue, for example by using a supercomputer and/or a quantum computer, where the reference element can have as many markings as desired and can have any geometrical form, including a sphere (which has no distinguishable surfaces but one continuous surface).

Exemplary additional identification markers on the reference element

In some embodiments, the reference element 204 comprises an additional marking, not related to the orientation role of the other markings, which provides a unique identification of the user's ultrasound probe. In some embodiments, identification markers allow the system to access the personal file of the specific user and display/provide/update/upload the relevant file/program to that specific user. In some embodiments, at first use, the user will be required to link a specific identification marker, optionally irreversibly attached to a specific reference element, to his user account. In some embodiments, when a user has multiple reference elements, for example different reference elements to practice different ultrasound techniques on different places (for example vaginal or abdominal), each reference element will have a specific identification marker, and all identification markers will be linked to the same account.

Exemplary feedback mechanisms on a reference element

In some embodiments, the reference element 204 comprises one or more feedback mechanisms configured to transmit a type of feedback to the user while using the system. In some embodiments, exemplary feedback mechanisms are one or more of lights, vibration and sounds. For example, the reference element might comprise a plurality of lights that are activated during the use of the system. In some embodiments, when the user is moving the reference element as expected, the reference element will show green lights. In some embodiments, when the user is moving the reference element not as expected, the reference element will show red lights.
In some embodiments, continuing this example, when the user is moving the reference element as expected, the reference element will not vibrate and/or will not emit any sounds. In some embodiments, when the user is moving the reference element not as expected, the reference element will vibrate and/or will emit sounds. It should be understood that the above are just examples to allow a person skilled in the art to understand the invention and that other and/or different uses of the feedback mechanisms are also included within some of the embodiments of the invention. In some embodiments, on the reference element there are one or more buttons configured to activate and/or actuate features in the system, for example activating the communication between the reference element and the electronic device, beginning the simulation, ending the simulation, activating the "PRINT" button on the GUI, opening a control panel of the system, commencing calibration of the system, and others. It should be understood that other actions are also included in the scope of some embodiments of the invention, and that the abovementioned examples are just examples to allow a person skilled in the art to understand the invention.

Exemplary general tracking mechanism of the US probe

In some embodiments, other techniques and/or mechanisms can be used to track the ultrasound probe used by the untrained user. For example, in some embodiments, the US probe is tracked as previously described using optical tracking of markers with a single (optionally fixed) camera. In some embodiments, the US probe is tracked using optical tracking of markers with stereo (or multi-camera) tracking. In some embodiments, the US probe is tracked using optical tracking using infrared (IR) markers, for example as used by OptiTrack©. In some embodiments, trackers are positioned either on or within the US probe, for example built-in cameras (for example Intel® RealSense™ Tracking Cameras), motion sensors, gyroscopes, IMU-based orientation and motion tracking, etc. In some embodiments, tracking of the US probe is done by optical tracking of markerless targets, in which the system tracks a whole object without the need to mark the tracked object (for example as used by Ultraleap©).
In some embodiments, the US probe is tracked by optical flow, similar to the technology used in optical peripherals (for example a computer mouse), together with the use of an IMU. In some embodiments, the US probe is tracked by magnetic field position and orientation technology. In some embodiments, the US probe is tracked by sensor fusion of the above technologies, with or without AI to improve the tracking.
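A classical way to realize the sensor fusion mentioned above is a complementary filter, which blends a smooth-but-drifting gyroscope integration with an absolute-but-noisy optical marker reading. The following single-axis sketch is illustrative only; the weighting factor and the names are assumptions, not values from the patent.

```python
def complementary_filter(prev_deg, gyro_rate_dps, optical_deg, dt, alpha=0.98):
    """One fusion step for a single orientation axis: integrate the gyro rate
    (degrees/second) over dt seconds, then pull the estimate toward the
    absolute optical measurement with weight (1 - alpha)."""
    gyro_deg = prev_deg + gyro_rate_dps * dt   # smooth, but drifts over time
    return alpha * gyro_deg + (1.0 - alpha) * optical_deg
```

Run once per frame: a larger alpha trusts the IMU more between optical updates, while the optical term continuously cancels gyroscope drift.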
Exemplary feedback mechanisms of the system

In some embodiments, the system 100 comprises one or more feedback mechanisms configured to transmit a type of feedback to the user 114 and/or the physician 120 and/or medical sonographer 122 while using the system 100. Examples of how the feedback mechanisms are activated are explained below. In some embodiments, exemplary feedback mechanisms are one or more of lights, vibration and sounds. For example, on the screen there might be some kind of image 210 that shows whether the movements (direction and/or velocity) performed by the user 114 are according to what the physician 120 and/or medical sonographer 122 are showing. For example, when the user is moving the reference element as expected (direction and/or velocity), the feedback image 210 will be shown as green, while when the user 114 is moving the reference element not as expected (direction and/or velocity), the feedback image 210 will be shown as red. In some embodiments, both the user 114 and the physician 120 and/or medical sonographer 122 can see the feedback image 210. In some embodiments, additionally or alternatively, the feedback is provided as written messages, for example "slow down", "move the probe forward", "do not tilt the probe", and more. In some embodiments, similarly, other types of feedback are used, for example sound feedback emitted from the computer, or vibrational feedback from one or more parts of the system, like the reference element, the probe, etc. In some embodiments, during the performance of the ultrasound examination by the untrained user, the system comprises instructions to monitor the performance of the scan at various levels. In some embodiments, the system monitors the performance of the scan at the probe level according to the movements of the ultrasound probe and/or the movements of the reference element.
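The green/red feedback image and the corrective text messages described above can be driven by a small decision rule. The sketch below is a hypothetical illustration: the threshold values and the two deviation inputs (a direction-match flag and a user-to-demonstration velocity ratio) are assumptions, not parameters taken from the patent.

```python
def build_feedback(direction_ok, velocity_ratio, slow=0.5, fast=1.5):
    """Map a probe-motion comparison onto the feedback image 210: green when
    the user's movement matches the demonstration, red plus a short
    corrective message (e.g. "slow down") otherwise."""
    if not direction_ok:
        return {"color": "red", "message": "follow the shown direction"}
    if velocity_ratio > fast:       # user moves much faster than demonstrated
        return {"color": "red", "message": "slow down"}
    if velocity_ratio < slow:       # user moves much slower than demonstrated
        return {"color": "red", "message": "move a little faster"}
    return {"color": "green", "message": ""}
```

The same result can drive the other feedback channels, for example emitting a sound or vibrating the reference element whenever the color is red.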
In some embodiments, the movements comprise one or more of where the probe/reference element is located in relation to the body of the user, the orientation of the probe/reference element at the moment of the scan and the velocity of movement of the probe/reference element during the scan. In some embodiments, the system monitors the performance of the scan at the image level according to the images collected. In some embodiments, the system is configured to assess whether the scan is performed correctly by analyzing the 2D images and/or the 3D images. Figure 5a shows a schematic representation of a volume, which represents the 3D volume generated by the plurality of 2D ultrasound images collected. In some embodiments, when the ultrasound scan is performed correctly, a full volume will be generated. Figure 5b shows a schematic representation of a volume generated by a 2D scan. It can be seen that there are two locations 502/504 missing from the volume. This might happen because the user skipped those locations during the scan, moved the probe too quickly or tilted the probe contrary to the instructions. In some embodiments, the system is configured to assess whether there are missing areas and to request the user to return to those locations and re-scan them until the complete volume is generated properly. In some embodiments, the instructions are provided using graphic instructions, for example showing a body and the locations where the untrained user needs to perform the scans. In some embodiments, the system monitors the performance of the scan using an AI system. In some embodiments, the quality of the scan is assessed either in real-time or later (by either a human or an AI system), and when assessed later, a notification is sent to the user requesting to perform the necessary scans. In some embodiments, the system is configured to assess the quality of the ultrasound examination according to one or more of the following parameters:

- Operation of the ultrasound device: for example in relation to known ultrasound functions: gain, focus, power, freeze, depth and angle.

- Movements of the probe: required velocity of movement, positioning of the probe in relation to the body, including the angle of the probe in relation to the area being scanned and/or keeping the probe in contact with the scanned surface.

- Examined patient: that the patient did not move during the scan, that the necessary stops in breathing were performed in order to achieve an optimal scan, and that the user keeps the probe/reference element in view of the camera.
- Positioning of the probe: that the user used enough ultrasound gel and provided proper contact between the probe and the tissue.

Exemplary 2D ultrasound collection and generating a high quality 3D ultrasound image

In some embodiments, as mentioned above, the untrained user receives a 2D ultrasound device. In some embodiments, the system comprises instructions to transform the 2D ultrasound images into 3D ultrasound images, optionally into high quality 3D ultrasound images. In existing ultrasound devices, 2D ultrasound images are produced in high quality and a plurality of features are available. When a 3D image is requested, the existing ultrasound system automatically reduces the quality of the image and blocks the additional features that are available when taking 2D images, thus providing a low quality 3D image. Existing ultrasound devices do this because providing a 3D image with the full plurality of features would require large amounts of computational resources and time. In some embodiments, the system utilizes mainly a two-dimensional transducer that is moved at a constant time and distance, thus allowing an optimal scanning quality, to collect only 2D images that are then used for the generation of a higher-quality three-dimensional volume. In some embodiments, the system enables the production of three-dimensional volumes with the help of only two-dimensional transducers, even in devices that do not support the generation of native three-dimensional images. In some embodiments, a plurality of volumes generated by the system can be "connected" (unified into one or more volumes) to each other using the external coordinates assigned to them, because they are all referenced to the same set of coordinates used during the recording with the camera, for example referenced to one or more of the probe, the reference element and/or any other external reference used during the scans. In some embodiments, each scanned volume is "tested" to assess whether the recorded coordinates of the current recorded volume match other recorded coordinates/volumes. In some embodiments, a potential advantage of this is that it potentially ensures the correlation and connection of different volumes for the unification and generation of complex volumes. In some embodiments, the system comprises a feature that allows the user to choose points of interest (or sections of interest) that were recorded during the scan and then convert scans into movies that can be later displayed.
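The idea of placing tracked 2D frames into a common coordinate system and checking volume completeness (cf. the missing regions 502/504 of Figure 5b) can be sketched as follows. This is an illustrative simplification under stated assumptions: the probe position is reduced to one tracked axis, the volume is a simple slice-index map, and all names are hypothetical.

```python
def slice_index(probe_mm, start_mm, step_mm):
    """Quantize a tracked probe position (mm along the sweep) into the index
    of the volume slice it fills."""
    return round((probe_mm - start_mm) / step_mm)

def assemble_volume(tracked_frames, n_expected):
    """Place each (slice_index, frame) pair into the volume; re-scanned
    slices simply overwrite earlier ones. Returns the volume and the list
    of still-missing slice indices the user should be asked to re-scan."""
    volume = {}
    for idx, frame in tracked_frames:
        volume[idx] = frame
    missing = [i for i in range(n_expected) if i not in volume]
    return volume, missing
```

A run over five expected slices where positions 2 and 4 were skipped returns missing == [2, 4], which the system could translate into graphic re-scan instructions.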
In some embodiments, videos can also be of a defined section, in black and white and/or Doppler and/or color Doppler. In some embodiments, the scan products that can be generated using the system include, for example, a collection of high quality two-dimensional images defined in a system of tested coordinates, as well as videos without transducer motion in the same coordinate system.
Exemplary methods

Referring now to Figure 6, showing a flowchart of an exemplary method of performing remote ultrasound examinations, according to some embodiments of the invention. In some embodiments, once the user 114 receives the user unit 102, the user 114 turns on the unit to begin establishing a connection between the user unit 102 and the physician unit 104 (602). In some embodiments, the connection between the user unit 102 and the physician unit 104 is active, which allows the trained personnel to control one or more commands at the user unit 102, for example commands in the ultrasound machine and/or in the computer. In some embodiments, the system comprises instructions to check that all the requirements are met for performing a remote ultrasound examination, for example that the US device is connected to the user unit 102, that the webcam 112 is working, that the system recognizes the reference element 204 and, optionally, that a reference element setup has been performed. In some embodiments, establishing a connection comprises contacting the server 106. In some embodiments, once the communication has been established, the physician 120 and/or medical sonographer 122 begins instructing the user 114 how to move the probe 202 comprising the reference element 204. In some embodiments, instructing the user 114 how to move the probe 202 comprising the reference element 204 comprises using a mock probe 302 also comprising a reference element 204. In some embodiments, when the physician 120 and/or medical sonographer 122 moves the mock probe 302 with the reference element 204, the system generates the data related to the movements of the reference element 204, according to the movements of the mock probe 302 made by the physician 120 and/or medical sonographer 122, captured by the webcam 118 of the physician unit 104. Therefore, the method comprises receiving reference element data from the physician unit (604).
In some embodiments, the video captured by the webcam 118 of the physician unit 104 is shown on the screen 110 of the user unit 102 and, optionally, also on the screen 116 of the physician unit 104. In some embodiments, the user 114 follows the movements of the mock probe 302, shown on the screen 110 of the user unit 102, using the probe 202 (comprising the reference element 204) of the ultrasound device 108. In some embodiments, when the user 114 moves the probe 202 with the reference element 204, the system generates the data related to the movements of the reference element 204, according to the movements of the probe 202 made by the user 114, captured by the webcam 112 of the user unit 102. Therefore, the method comprises receiving reference element data from the user unit (606). In some embodiments, the video captured by the webcam 112 of the user unit 102 is shown on the screen 116 of the physician unit 104 and, optionally, also on the screen 110 of the user unit 102. In some embodiments, the method comprises comparing between the reference element data (referred to hereinafter as "physician data") received from the physician unit 104 and the reference element data (referred to hereinafter as "user data") received from the user unit 102 (608). In some embodiments, the system comprises instructions to provide feedback according to the result of the comparison (610) between the physician data and the user data. In some embodiments, when the comparison between the physician data and the user data shows that the data is similar, meaning that the movements of the reference element of the probe moved by the user are similar or the same as the movements of the reference element of the mock probe moved by the physician/medical sonographer, then the feedback will be a positive feedback, for example a green probe on the screen (as explained above). In some embodiments, when the comparison between the physician data and the user data shows that the data is different, meaning that the movements of the reference element of the probe moved by the user are different from the movements of the reference element of the mock probe moved by the physician/medical sonographer, then the feedback will be a negative feedback, for example a red probe on the screen (as explained above). In some embodiments, the system comprises instructions to continue with the comparison process until all the required data has been collected.
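The comparison between the physician data and the user data (step 608) can be sketched as a tolerance test over two equally sampled streams of reference-element readings. The (x, y, angle) representation and the tolerance values below are illustrative assumptions, not the patent's data format.

```python
import math

def compare_motion(physician_seq, user_seq, pos_tol=10.0, ang_tol=15.0):
    """Return "similar" when every user sample stays within a positional and
    angular tolerance of the corresponding physician sample, otherwise
    "different" (which would trigger the red negative feedback)."""
    for (px, py, pa), (ux, uy, ua) in zip(physician_seq, user_seq):
        if math.hypot(ux - px, uy - py) > pos_tol:
            return "different"
        if abs(ua - pa) > ang_tol:
            return "different"
    return "similar"
```

A production comparator would also align the two streams in time and weight direction and velocity separately, as described in the feedback discussion above.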
In some embodiments, optionally, once the trained personnel (either human or AI) identifies that the probe is directed at the right location (for example at the right organ), the trained personnel can mark the area of interest (for example the borders of the organ). In some embodiments, marking the area of interest allows the system to generate a dedicated volume of interest in the system, which is then saved in the system for future use, for example for comparison analysis.
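Generating a dedicated volume of interest from the marked borders can be sketched as a bounding-box crop of the scanned volume. The nested-list volume layout and the half-open bounds below are assumptions made for illustration only.

```python
def crop_volume_of_interest(volume, bounds):
    """Extract the marked region (e.g. an organ delimited by its borders)
    from a scanned volume as a sub-volume that can be saved for later
    comparison; bounds = ((z0, z1), (y0, y1), (x0, x1)), half-open."""
    (z0, z1), (y0, y1), (x0, x1) = bounds
    return [[row[x0:x1] for row in plane[y0:y1]] for plane in volume[z0:z1]]
```

The saved sub-volume can then be compared against a volume of interest cropped with the same bounds in a later examination.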
In some embodiments, the required data is set a priori; for example, the physician/medical sonographer inserts into the physician unit 104 information regarding the specific ultrasound examination that will be performed, for example a kidney ultrasound examination (see below regarding using a reference library for remote examinations).

Referring now to Figure 7, showing a flowchart of an exemplary method of performing a remote ultrasound examination from the side of the user, according to some embodiments of the invention. In some embodiments, same as explained for Figure 6, once the user 114 receives the user unit 102, the user 114 turns on the unit to begin establishing a connection between the user unit 102 and the physician unit 104 (702). In some embodiments, the system comprises instructions to check that all the requirements are met for performing a remote ultrasound examination, for example that the US device is connected to the user unit 102, that the webcam 112 is working, that the system recognizes the reference element 204 and, optionally, that a reference element setup has been performed. In some embodiments, establishing a connection comprises contacting the server 106. In some embodiments, once the communication has been established, the physician 120 and/or medical sonographer 122 begins instructing the user 114 how to move the probe 202 comprising the reference element 204, optionally while controlling one or more commands at the user unit 102. In some embodiments, instructing the user 114 how to move the probe 202 comprising the reference element 204 comprises using a mock probe 302 also comprising a reference element 204. In some embodiments, the user 114 follows the movements of the mock probe 302, shown on the screen 110 of the user unit 102, using the probe 202 (comprising the reference element 204) of the ultrasound device 108.
In some embodiments, when user 114 moves the probe 202 with the reference element 204 , the system generates the data related to the movements of the reference element 204 , according to the movements of the probe 202 made by the user 114 captured by the webcam 112 of the user unit 102 and sends it to the physician unit 104 (optionally through the server). Therefore, the method comprises sending reference element data from the user unit ( 704 ). In some embodiments, the method comprises receiving feedback on the reference element data sent ( 706 ). In some embodiments, as mentioned above, the
feedback is provided by comparing the sent data from the user unit 102 with data received from the physician unit 104 , as explained above. In some embodiments, the method comprises continuing sending reference element data until the ultrasound examination is complete ( 708 ). Referring now to Figure 8, showing a flowchart of an exemplary method of performing a remote ultrasound examination from the side of the physician/medical sonographer, according to some embodiments of the invention. In some embodiments, same as explained for Figure 6 and/or Figure 7, once the user 114 receives the user unit 102 , the user 114 turns on the unit to begin establishing a connection between the user unit 102 and the physician unit 104 ( 802 ). In some embodiments, the system comprises instructions to check that all the requirements are met for performing a remote ultrasound examination. For example, that the US device is connected to the user unit 102 , that the webcam 112 is working, that the system recognizes the reference element 204 and, optionally, that a reference element setup has been performed. In some embodiments, the connection between the user unit 102 and the physician unit 104 is active, which allows the trained personnel to control one or more commands at the user unit 102 , for example, commands in the ultrasound machine and/or in the computer. In some embodiments, establishing a connection comprises contacting the server 106 . In some embodiments, once the communication has been established, the physician 120 and/or medical sonographer 122 begins instructing the user 114 how to move the probe 202 comprising the reference element 204 . In some embodiments, instructing the user 114 how to move the probe 202 comprising the reference element 204 comprises using a mock probe 302 also comprising a reference element 204 . 
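As a minimal sketch, the feedback mechanism above, which compares reference element data from the user unit with the corresponding data from the physician unit, can be expressed as a per-sample distance check. The (x, y, z) pose representation, the 10 mm tolerance, and all function names here are illustrative assumptions, not the claimed implementation:

```python
import math

def pose_error(user_pose, ref_pose):
    """Euclidean distance between two (x, y, z) probe positions (illustrative)."""
    return math.dist(user_pose[:3], ref_pose[:3])

def feedback(user_poses, ref_poses, tol=10.0):
    """Compare the user's tracked probe poses against the physician's
    mock-probe poses and flag each sample (tol assumed in millimetres)."""
    return ["ok" if pose_error(u, r) <= tol else "adjust"
            for u, r in zip(user_poses, ref_poses)]

user = [(0, 0, 0), (5, 0, 0), (30, 0, 0)]
ref  = [(0, 0, 0), (4, 1, 0), (10, 0, 0)]
print(feedback(user, ref))  # → ['ok', 'ok', 'adjust']
```

A real comparison would also include probe orientation, which the same per-sample structure accommodates.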
In some embodiments, optionally, once the trained personnel (either human or AI) identifies that the probe is directed at the right location (for example at the right organ), the trained personnel can mark the area of interest (for example the borders of the organ). In some embodiments, marking the area of interest allows the system to generate a dedicated volume of interest that is then saved in the system for future use, for example for comparison analysis. In some embodiments, when the physician 120 and/or medical sonographer 122 moves the mock probe 302 with the reference element 204 , the system generates the
data related to the movements of the reference element 204 , according to the movements of the mock probe 302 made by the physician 120 and/or medical sonographer 122 captured by the webcam 118 of the physician unit 104 . In some embodiments, the reference element data is sent to the system to be shown to the user 114 in the user unit 102 . Therefore, the method comprises sending reference element data from the physician unit ( 804 ). In some embodiments, the video captured by the webcam 118 of the physician unit 104 is shown on the screen 110 of the user unit 102 and, optionally, also on the screen 116 of the physician unit 104 . In some embodiments, the user 114 follows the movements of the mock probe 302 using the probe 202 (comprising the reference element 204 ) of the ultrasound device 108 shown on the screen 110 of the user unit 102 . In some embodiments, when user 114 moves the probe 202 with the reference element 204 , the system generates the data related to the movements of the reference element 204 , according to the movements of the probe 202 made by the user 114 captured by the webcam 112 of the user unit 102 and sends it to the physician unit 104 (optionally through the server). In some embodiments, the method comprises receiving feedback on the reference element data sent ( 806 ). In some embodiments, as mentioned above, the feedback is provided by comparing the sent data from the user unit 102 with data received from the physician unit 104 , as explained above. In some embodiments, the method comprises continuing sending reference element data until the ultrasound examination is complete ( 808 ). Exemplary system for unsupervised remote ultrasound examinations Referring now to Figure 9a, showing a schematic representation of a system to perform unsupervised ultrasound examinations, according to some embodiments of the invention. Same reference numbers will be used for same elements in all Figures. 
In some embodiments, the system is similar to that shown in Figure 1, with the difference that there is no communication between the user unit 102 and the physician unit 104 . In some embodiments, the user unit 102 comprises dedicated software configured to guide the untrained user 114 to perform a specific ultrasound examination. For example, as shown in Figure 9b, a protocol for a specific ultrasound examination is shown to an untrained user. In this example, a protocol for a lung ultrasound examination
is shown.
At Step I, the user is requested to hold the probe horizontally and scan the front right side of his/her body.
At Step II, the user is requested to hold the probe vertically and scan the front right side of his/her body.
At Step III, the user is requested to hold the probe horizontally and scan the front left side of his/her body.
At Step IV, the user is requested to hold the probe vertically and scan the front left side of his/her body.
At Step V, the user is requested to hold the probe horizontally and scan the left side of his/her body.
At Step VI, the user is requested to hold the probe vertically and scan the left side of his/her body.
At Step VII, the user is requested to hold the probe horizontally and scan the right side of his/her body.
At Step VIII, the user is requested to hold the probe vertically and scan the right side of his/her body.
At Step IX, the user is requested to hold the probe horizontally and scan the back left side of his/her body.
At Step X, the user is requested to hold the probe vertically and scan the back left side of his/her body.
At Step XI, the user is requested to hold the probe horizontally and scan the back right side of his/her body.
At Step XII, the user is requested to hold the probe vertically and scan the back right side of his/her body.
In some embodiments, the untrained user can use the help of another person to perform difficult scans, like scanning the back of the patient. In some embodiments, when the system comprises an AI system monitoring the rUS, the untrained user can position the ultrasound probe at any location in the body and the AI is configured to identify the current location in the body of the user where the probe is positioned, for example by identifying one or more elements in the ultrasound image, and provide real-time instructions to the untrained user in order to bring the probe to the required location that is needed to be examined. 
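The twelve-step lung protocol above is regular enough to be generated from a short table. A sketch, with illustrative names, of how a guidance program might store and replay it:

```python
# Each step pairs a probe orientation with a body region, in protocol order.
regions = ["front right", "front left", "left", "right", "back left", "back right"]
LUNG_PROTOCOL = [(orientation, region)
                 for region in regions
                 for orientation in ("horizontal", "vertical")]

def next_instruction(step_index):
    """Render the instruction text shown to the untrained user for one step."""
    orientation, region = LUNG_PROTOCOL[step_index]
    return f"Hold the probe {orientation}ly and scan the {region} side of the body"

print(len(LUNG_PROTOCOL))   # → 12
print(next_instruction(0))  # → Hold the probe horizontally and scan the front right side of the body
```

Other examination protocols would simply be other tables of the same shape.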
In some embodiments, as explained above, during the ultrasound examination, the system monitors the movements of the probe to assess the performance of the examination, optionally while controlling one or more commands at the user unit 102 . In some embodiments, in order for the system to assess the performance of the examination, the real-time tracked movements of the ultrasound probe/reference element 204 are compared with historical examinations comprising revised movements of ultrasound probes/reference elements of relevant ultrasound examination protocols. In some embodiments, once the untrained user (patient) has finished the requested ultrasound examination, a professional (physician and/or medical
sonographer) can review the results offline, i.e. not in real-time. In some embodiments, the system is configured to add relevant information to the scan, for example, the location where the scan was performed, optionally in relation to one or more of the reference element, the camera and the patient. In some embodiments, the system is also configured to provide a scan that can be analyzed by the physician and/or medical sonographer while changing the view format, for example from 3D volume to 2D image, Color Doppler image, etc. Referring now to Figure 9c, showing a flowchart of an exemplary method of performing remote ultrasound examinations, according to some embodiments of the invention. In some embodiments, the user turns on the user unit ( 902 ). In some embodiments, the system provides instructions to the untrained user on how to move the ultrasound probe according to established ultrasound examination protocols ( 904 ). In some embodiments, the system receives the ultrasound data while also receiving the reference element tracking data. In some embodiments, optionally, the system also receives data about the relative location of the ultrasound probe in relation to the body of the patient ( 906 ). In some embodiments, the system assesses the quality of the ultrasound data, with or without referencing it against the reference element tracking data ( 908 ). In some embodiments, the system provides feedback according to the result of the assessment ( 910 ). For example, when the ultrasound scan has been performed correctly, the system will notify the user that the ultrasound examination has finished. When the ultrasound scan has not been performed correctly, the system will instruct the user to perform additional scans until reaching the required information. Exemplary use of reference library for remote examinations In some embodiments, the server comprises a library (reference library) of historical ultrasound examinations used as reference for remote ultrasound examinations. 
For example, using the example above, the physician/medical sonographer inserts into the system that the remote ultrasound examination is related to kidneys. In some embodiments, the reference library comprises a plurality of historical ultrasound examinations performed for kidneys. In some embodiments, when a kidney ultrasound remote examination is performed, the system comprises instructions to
compare the ultrasound data collected by the user 114 with one or more reference kidney ultrasound examinations located in the reference library. In some embodiments, the system further comprises instructions to provide feedback on the result of the comparison, for example, providing feedback on whether the data collected by the user is complete and/or of good quality. In some embodiments, optionally, the system further comprises instructions to guide the user and/or the physician/medical sonographer as to what specific examinations (translated into specific movements of the probe) are required to complete the required ultrasound examination. It should be understood that the kidney example was provided merely as an example to allow a person having skill in the art to understand the invention, and that other ultrasound examinations are also part of the invention, for example, ultrasound examinations of the heart, the intestine, a fetus, a liver, a kidney, any other organ and any combination thereof. Use of AI in monitoring ultrasound examinations by untrained users In some embodiments, the system comprises instructions for monitoring the performance of the ultrasound examinations by untrained users using one or more artificial intelligence (AI) systems. In some embodiments, the AI comprises instructions to assess in real-time the images being received during the scan and provide feedback and/or instructions to the untrained user. In some embodiments, the AI system resides in the server. 
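The reference-library comparison described earlier in this section can be sketched minimally as follows; the idea that completeness can be judged by which standard views are present, and the view names themselves, are illustrative assumptions:

```python
REFERENCE_LIBRARY = {
    # Hypothetical entry: views a complete kidney examination should contain.
    "kidney": {"longitudinal", "transverse", "coronal"},
}

def compare_to_reference(exam_type, collected_views):
    """Compare collected data against the library and return what is missing,
    so the system can guide further probe movements."""
    required = REFERENCE_LIBRARY[exam_type]
    missing = required - set(collected_views)
    if not missing:
        return "complete", []
    return "incomplete", sorted(missing)

print(compare_to_reference("kidney", ["longitudinal", "transverse"]))
# → ('incomplete', ['coronal'])
```

Entries for heart, intestine, fetus, liver, or other organs would follow the same pattern.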
In some embodiments, when the system comprises an AI system monitoring the rUS, the untrained user can position the ultrasound probe at any location in the body and the AI is configured to identify the current location in the body of the user where the probe is positioned, for example by identifying one or more elements in the ultrasound image, and provide real-time instructions to the untrained user in order to bring the probe to the required location that is needed to be examined. In some embodiments, optionally, once the AI system identifies that the probe is directed at the right location (for example at the right organ), the AI system automatically marks the area of interest (for example the borders of the organ). In some embodiments, marking the area of interest allows the system to generate a dedicated volume of interest that is then saved in the system for future use, for example for comparison analysis.
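Deriving a stored volume of interest from a marked area can be sketched as an axis-aligned bounding box over the marked border points; the point-list representation and names are illustrative assumptions:

```python
def volume_of_interest(border_points):
    """Axis-aligned bounding box around marked (x, y, z) border points,
    suitable for saving and later comparison analysis."""
    xs, ys, zs = zip(*border_points)
    return {"min": (min(xs), min(ys), min(zs)),
            "max": (max(xs), max(ys), max(zs))}

voi = volume_of_interest([(1, 2, 0), (4, 6, 2), (2, 3, 1)])
print(voi)  # → {'min': (1, 2, 0), 'max': (4, 6, 2)}
```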
Exemplary information regarding the ultrasound acquisition of images and their transformation into 3D volumes Common ultrasound imaging devices visualize a 2D cross-section of a 3D body. Usually, the cross-section is perpendicular to the transducer probe and is of arbitrary orientation since it depends on how the user is holding the transducer. For example, a representation of the 2D cross-section image can be explained as a slicing area 1002 of the total volume 1004 , as shown for example in Figure 10. Definitions: In scientific visualization and computer graphics, volume rendering is a set of techniques used to display a 2D projection of a 3D discretely sampled data set, typically a 3D scalar field. A typical 3D data set is a group of 2D slice images acquired for example by an ultrasound, CT, MRI, or MicroCT scanner. Usually these are acquired in a regular pattern (e.g., one slice every millimeter) and usually have a regular number of image pixels in a regular pattern. This is an example of a regular volumetric grid, with each volume element, or voxel, represented by a single value that is obtained by sampling the immediate area surrounding the voxel. Voxel: short for volume element (also known as volume pixel); it is the smallest unit of a three-dimensional volume, the equivalent of a pixel in a 2D image. Volumetric buffer: the total volume of the 3D body (or a large 3D array) which comprises a plurality of voxels, each of which represents a view-independent ultrasound sample. An arbitrary slice is a virtual image frame buffer defined in a local independent coordinate system. In some embodiments, an arbitrary slice is set and the voxels pierced by the virtual frame are sampled, mapped and displayed in their image coordinate system after the virtual image frame is clipped against the volume buffer. 
In some embodiments, the algorithm is an extension of the widely known 2D scan-line algorithm where at each scan-line the third dimension is also interpolated. Referring now to Figure 11 showing a block diagram of exemplary technical issues involved in the field of ultrasound simulation and examples of how the system of the invention resolves these technical issues, according to some embodiments of the
invention. These were also further described by Aiger et al in Real-Time Ultrasound Imaging Simulation, Real-Time Imaging 4, 263–274 (1998). The Principles of Ultrasound Devices The ultrasound input device is the transducer which is manually positioned by the physician. The transducer converts electric energy into ultrasound energy and vice versa. It produces pulses of sound waves and sends them to the patient’s body. It also receives the echoes from the patient and converts them to electric energy. This energy is translated into an image that consists of gray level pixels which represent the structure of the body image in the ultrasound display. There are different kinds of transducers at different frequencies with which it is possible to determine the depth and the resolution of the image. The physical principle of the ultrasound is as follows. Pulses of ultrasound, which are short pulses of sound wave at high frequency (1–15 MHz), are generated by the transducer (called pulse beam) and sent into the patient’s body. They produce echoes at organ boundaries and within tissues. These echoes return to the transducer and are detected, processed and translated into appropriate gray level pixels which form the image on the ultrasound display. The gray level is a function of the reflection coefficient of the body at the appropriate location. The reflection coefficient is an attribute of the tissue depending on its physical, chemical and other characteristics. The location of the pixels corresponds to the anatomic location of the echo-generating structure determined by knowing the direction of the pulse when it enters the patient and measuring the time for its echo to return to the transducer. From an assumed starting point on the display, the proper location for presenting the echo can then be derived, provided the direction in which to travel from that starting point to the appropriate distance is known. 
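The echo-timing principle just described converts round-trip time into reflector depth. A sketch, assuming the conventional soft-tissue sound speed of roughly 1540 m/s:

```python
SPEED_OF_SOUND = 1540.0  # m/s, conventional assumed value for soft tissue

def echo_depth_mm(round_trip_time_s):
    """One-way depth of the echo-producing structure: the pulse travels
    there and back, so depth = c * t / 2 (converted here to millimetres)."""
    return SPEED_OF_SOUND * round_trip_time_s / 2.0 * 1000.0

# An echo returning after 65 microseconds originates roughly 50 mm deep.
print(echo_depth_mm(65e-6))  # ≈ 50 mm
```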
With knowledge of the speed of sound, the echo arrival time can be converted to distance to the structure that produces this echo. Ultrasound Imaging 1102 In some embodiments, the system, as an ultrasound simulation system, generates images in real-time that resemble real ultrasound images, including the typical ultrasound functions, depth gain compensation (DGC) and gain. In some embodiments, real-time imaging means a frame rate of at least 10 Hz (over 10 frames per second). In
some embodiments, the system forms a volume from real ultrasound images (received by the data collection module 1120 ) in an off-line pre-process (in the processing module 1122 ), and then slices the volume (on-line) to display (by the display module 1124 ) a processed image of the oblique slice. In some embodiments, such images can be generated very rapidly, including post-processing enhancements, and can produce images which are, in most cases, indistinguishable from real ultrasound images. However, the inventors have found that this method of generating images from a pre-sampled ultrasound volume has some inherent problems, due to the fact that an ultrasound image has view-dependent features and an acquisition-parameter-dependent character. This fact is two-fold: firstly, the pre-processed volume dataset includes some unwanted view-dependent features that should be removed. Second, the generated simulated image from a given arbitrary direction should be enhanced and should include the appropriate view dependent features. These and other inherent problems 1104 are listed below. Shadows: In some embodiments, the ultrasound image exhibits shadows when closer objects obscure the sound waves from further objects. In some embodiments, the shadows of a given image are correlated with the specific sampling direction. In some embodiments, this effect is minimized by the software during the data collection, because this feature is not reversible. In some embodiments, the data at the shadow are lost and cannot be recovered unless the same area is sampled from a different viewing direction which views the shadow areas. Gain: In some embodiments, the Gain control determines how much amplification is accomplished in the ultrasound receiver. In some embodiments, since the Gain operates on the image globally and has a uniform effect on the entire voltage received, it is not correlated with the specific sampling direction. 
In some embodiments, the Gain is easily simulated, but problematic during data collection. In some embodiments, if the data are sampled with too little Gain, weak echoes are not registered and these echoes are lost. On the other hand, in some embodiments, too much Gain causes saturation; that is, most echoes appear bright, and contrast resolution is lost. In
some embodiments, since Gain affects the sampling volume in an irreversible manner, the sampling is performed with an appropriate Gain level. Depth gain compensation (DGC): In some embodiments, the DGC equalizes differences in received echo amplitudes as a function of the reflector depth. In some embodiments, reflectors at different depths with equal reflection coefficients produce different return amplitude echoes arriving at the transducer. In some embodiments, echoes are displayed from similar reflectors in a similar way. In some embodiments, the DGC functions as the Gain does, but at different levels as a function of the depth (the distance from the transducer). In some embodiments, the user sets different Gain controls for different depths. In some embodiments, most ultrasound devices can set eight control points which define the DGC behavior. In some embodiments, like the Gain, the DGC is correlated with the sampling direction. In some embodiments, during data collection, given the dependence on the sampling direction, the image is kept as homogeneous and view independent as possible. In some embodiments, the main problem with DGC and Gain is that they are irreversible, and some data are always lost during the collection and cannot be recovered from the sampled volume. However, in some embodiments, with a good setup of the DGC and Gain levels it is possible to generate a volume buffer from which simulated images almost indistinguishable from real ultrasound images are obtained. Focus: In some embodiments, the width of the pulse beam generated by the transducer increases with depth, i.e. the beam has the shape of a cone whose apex is at the transducer. In some embodiments, the pixel resolution is a function of the beam width. Thus, in some embodiments, an ultrasound image exhibits varying resolutions at different depths. 
In some embodiments, the first problem is to simulate this different resolution based on one sampled volume taken with a specific machine and a specific transducer. Thus, in some embodiments, it is necessary to use an ultrasound machine with a narrow beam to get an almost homogeneous sampled volume. In some embodiments, in high end machines the beam size is small and it is neglected in the simulation. In some embodiments, very much like the operation of a camera and the physics of light that passes through lenses, the ultrasound beam can also be focused at an arbitrary field
of view. In some embodiments, the focus is set at an arbitrary depth to get the highest resolution at that depth. In some embodiments, the second problem related to focus is to simulate the arbitrary focal depth by changing the resolution at the related focal depth. In some embodiments, one way to do this is to change the sample rate while generating the simulation image depending on the depth of the scan line (see the later section on ‘Real-Time image generation’). In some embodiments, multiple focuses use a different pulse for each one of the required focuses, and the generated image has high resolution at several depths. However, in some embodiments, using multiple focuses results in longer refresh rates. In some embodiments, the collection sampling time remains short to avoid the introduction of undesired movements. Thus, in some embodiments, the volume is sampled in a single focus. Resolution: In some embodiments, the resolution of the acquired data is defined by the user who sets the magnification size. In some embodiments, the acquired resolution is, of course, constant in the sense that it may be either over or under sampled in the on-line process. In some embodiments, during the collection phase the sampled resolution also affects the size of the entire image. In some embodiments, if magnification is applied then we get a smaller area of higher resolution. In some embodiments, this trade-off implies that acquiring data of higher resolution takes more time. In some embodiments, in most cases the sampling is not performed at higher resolution and it is preferred to minimize the collection phase by sampling larger areas. However, in some embodiments, certain pathologies are better learned from a smaller volume of higher resolution. In some embodiments, another related problem is the uneven resolution of the sampled volume. In some embodiments, the x–y slices (the sampled images) have a different resolution to the z–x planes (the inter-slice dimension). 
In some embodiments, the shape of the ultrasound beam is not symmetric. In some embodiments, it gets wider along the z axis than in the x–y plane (x–y is the ultrasound image plane). Thus, in some embodiments, the x–y planes have a higher resolution than other planes. The inventors' experience shows that this is not an acute problem and the resolution variations are hardly noticeable during the simulation.
Noise: In some embodiments, the ultrasound images are very blurred and noisy. However, in some embodiments, this is not a real problem, since the simulation should retain these characteristics. In some embodiments, they are also not view-dependent, and thus these attributes require no special treatment. It should be understood that the above mentioned are just examples, that more and/or other functions can be changed using one or more buttons provided by the system, and that those are also included in the scope of some embodiments of the invention. In summary, in some embodiments, some of the above ultrasound features (for example DGC, Gain and/or Focus) are alleviated by tuning down the acquisition parameters. However, in some embodiments, it is not possible to remove them in a post-process. The following section describes the on-line imaging process, performed in some embodiments of the invention, in which all the above ultrasound features are simulated over the image with respect to the view direction and in accordance with the user’s specific parameters. Real-Time Image Generation: As explained above, the common ultrasound imaging devices visualize a 2D cross-section of the 3D body. The cross-section is perpendicular to the probe and is of arbitrary orientation. The simulation of this image is basically a slicing algorithm of the volumetric buffer (Figure 10). The assumption is that the volume buffer is a large 3D array of voxels, each of which represents a view-independent ultrasound sample. The arbitrary slice is a virtual image frame buffer defined in a world coordinate system. The voxels pierced by the virtual frame are sampled, mapped and displayed in their image coordinate system after the frame is clipped against the volume buffer. In many cases the mapping between world and image space does not involve scaling, and the virtual frame can be voxelized with no filtering. 
In some embodiments, a voxelization algorithm for planar polygons is used, which is basically an extension of the widely known 2D scan-line algorithm where at each scan-line the third dimension is also interpolated. In some embodiments, a sweeping technique where a polygon can be generated by replicating one discrete line over the other and saving most of the computations involved in the discretization process of the plane is used. In some embodiments, the sweeping technique is fast enough to
voxelize slices in real-time on a Pentium processor (a 300x300 voxel slice can be scaled and displayed in a 400x400 pixel image in less than 1 second). However, in some embodiments, when scaling is required, voxel oversampling and filtering are necessarily involved. In some embodiments, voxelization with scaling calls for the development of fast sampling algorithms of 3D polygons. A brute force oversampling algorithm would use a trilinear interpolation of the dense lattice of sampling points. However, even an incremental computation of the trilinear function along a straight line would not avoid the excess of memory access, much of which is repeated. In some embodiments, this sampling process should incorporate the Gain and DGC functions, as well as other functions like Focus and casting shadows. The effect of the Gain and the DGC on the image is basically the same. The Gain affects the whole image simultaneously. The DGC function is set by eight potentiometers which control the effect on different areas of the image. The two values are combined to modify the pixel value as a function of its range in image space. Given the four points which define the image frame in the world coordinates, the slicing algorithm scans all the voxels intersected by the frame and maps them to the image coordinate systems. The following algorithm, as described also by Aiger et al, is based on the weaving method in which the voxelized plane is generated by replicating a voxelized line, called a template, along a base voxelized line. In some embodiments, weaving is a natural technique for voxelizing surfaces which can be obtained by sweeping some fixed curve through space in such a way that the orientation of the swept curve remains unchanged throughout. In some embodiments, the voxelized plane should be free of holes, which means a correct sample of the slice. In some embodiments, weaving can be implemented very efficiently. If (xi, yi, zi) are the coordinates of the ith voxel in the template T, then the sequence of offsets from a given reference point, say the template starting point (x0, y0, z0), is called the template offset form. The offset value dTi is defined by: dTi = (xi - x0) + (yi - y0) * sizeX + (zi - z0) * sizeX * sizeY, where sizeX and sizeY are the volume array dimensions. In other words, the dT value is the unfolded offset inside a linear volume. The template offset form, denoted
by T, is computed once at the outset and stored in an array. Then, for each voxel u in the base, denoted by B, the T array is used to incrementally construct a translate of the template, starting at u. The basic algorithm that maps the voxels to the image buffer, I[i][j], is a double loop which runs over all the uj values of the base and all the vi values of the template: the inner loop runs over the i while j is constant. A pointer ptr = &(Volume[uj]) is used to further simplify the computation. A clipping process is necessary to avoid overflow of the offsets, since the offset equation above holds only for voxels inside the volume. This is done by clipping each row (template) against the volume so that the i index runs between the clipped boundaries. The clipping cost is insignificant since it is applied only once in a row, while the memory access time to the voxels dominates the cost. Thus, it is most important to minimize the number of retrievals. Note that the preceding statement assumes that there is a one-to-one mapping between the voxel space and the pixel space, which is not necessarily true. The image space resolution is constant and defined by the display size. However, the slice size is defined on-line by the user, who can either zoom in or out. For efficiency, in some embodiments, it is important to avoid unnecessary access to the volume buffer. In some embodiments, the access time to the volume memory is dependent on the volume size, since adjacent voxels in large volumes have longer offsets, which tend to have a low cache hit ratio. In some embodiments, in the case of a zoom out, the voxels are smaller than the pixels. In some embodiments, stepping inside the volume in a pixel-sized step is an under-sampling of the voxels, and yields an image of reasonable quality. In some embodiments, in the case of a zoom in, the voxels are over-sampled and yield a displeasingly blocky and jaggy image. 
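A minimal sketch of the weaving scheme just described: the template offset form is computed once and translated along the base. The tiny 4x4x4 volume and the particular diagonal base line are illustrative choices, and clipping is omitted for brevity:

```python
# Flat volume array indexed by offset = x + y*sizeX + z*sizeX*sizeY.
sizeX, sizeY, sizeZ = 4, 4, 4
volume = [x + y * sizeX + z * sizeX * sizeY   # each voxel stores its own offset
          for z in range(sizeZ) for y in range(sizeY) for x in range(sizeX)]

def offset(x, y, z):
    """Unfolded offset of voxel (x, y, z) inside the linear volume array."""
    return x + y * sizeX + z * sizeX * sizeY

# Template T: a voxelized line along x, stored as offsets from its start (dTi).
template = [offset(i, 0, 0) - offset(0, 0, 0) for i in range(sizeX)]

# Base B: a voxelized diagonal line in the y-z plane; the slice is swept by
# translating the template to every base voxel.
base = [offset(0, j, j) for j in range(4)]

# Double loop of the weaving algorithm: image[j][i] = volume[base[j] + T[i]].
image = [[volume[u + dT] for dT in template] for u in base]
print(image[0])  # → [0, 1, 2, 3]
```

Because each voxel stores its own offset here, the sampled slice makes the template replication directly visible.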
In some embodiments, to avoid the redundant oversampling of the same voxels, the zoom in image is generated by two interleaved processes. In some embodiments, the first samples the volume, and generates an intermediate image whose pixels are of voxel size. In some embodiments, the other process scales up the image into the final size. In some embodiments, since the image frame is scanned in scan line order, each intermediate row can be stretched before scanning the next row. In some embodiments, the intermediate image then needs to be stretched vertically along its columns to the final size. In some embodiments, the scale process is decomposed into two series of one-dimensional stretches. This implies that the stretch function operates only on one-dimensional vectors and improves the efficiency. In some embodiments, the stretch
function must be fast enough to operate in real time. In some embodiments, the stretch algorithm minimizes access to the input and output buffers, since no buffer value is accessed twice. In some embodiments, a direct advantage of the row-by-row stretch principle is the simulation of the focus. As described before, the focus increases the image resolution at a given depth. In some embodiments, the stretch function can be used to augment or reduce the resolution of different rows, giving the user the impression of higher resolution at a given depth, while rows out of the focus depth are blurred by under-sampling them and stretching them back to the image size. For example, assume that the ultrasound device has N potentiometers for the DGC function and one potentiometer for the Gain function. As explained previously, these two functions are essentially the same: they amplify the signal reflected from the body according to the potentiometer values. Thus, the gray-level value of the voxel sampled from the volume buffer, denoted by V, is scaled by a scalar value Gain to simulate the effect:

NewGray = MIN(Gain * V, 255)

This effect can be applied by simply modifying the color lookup table of the display system. For the DGC effect, the N values are interpolated by a spline to the length of the image column and stored in a lookup table DGC[], so for row y:

NewGray = MIN(DGC[y] * V, 255)

Combining the effect of the two functions, we get:

NewGray = MIN(Gain * MIN(DGC[y] * V, 255), 255)

In some embodiments, to save on computation, a 2D lookup table is used. In some embodiments, the indices into the table are the gray value and the row number. In some embodiments, each table entry contains the gray-level value to be displayed for a given sample value at a given depth, for a preset Gain and DGC setup. In some embodiments, this table is updated each time one of the potentiometer values is modified, or whenever the image is magnified.
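The combined Gain/DGC formula lends itself to the 2D lookup table the text describes. The sketch below assumes the spline-interpolated DGC values, one per image row, are already available; `build_lut` is an illustrative name, not the patent's.

```python
def build_lut(gain, dgc, height):
    """lut[y][v] = MIN(Gain * MIN(DGC[y] * v, 255), 255), precomputed for
    every row y and every 8-bit sample value v, so the per-voxel cost at
    display time is a single table lookup. dgc holds one interpolated
    DGC value per image row."""
    return [[min(int(gain * min(dgc[y] * v, 255)), 255) for v in range(256)]
            for y in range(height)]
```

A sample value V at row y is then displayed as `lut[y][V]`, and the table is rebuilt whenever a potentiometer value changes or the image is magnified, as the text notes.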
Data Acquisition 1106

In some embodiments, the volume buffer which stores the ultrasonic data is large enough to represent a large portion of the human body, to permit the unconstrained practice of a real-life diagnosis. Contemporary ultrasound devices do not provide the capability of obtaining the entire volume in a single acquisition. This implies that the volume buffer has to be reconstructed from several sub-volumes obtained from different viewpoints. The registration of mono-modal datasets has been extensively investigated elsewhere in medical applications where atlas data are used. However, ultrasonic datasets are far more problematic than other medical modalities, such as computed tomography (CT) or magnetic resonance imaging (MRI), since the ultrasound values are significantly noisy, blurred and have many more view-dependent variations, as mentioned above. Moreover, the data sampled from a real patient is usually deformed, as will be explained below. In some embodiments, given two volumes with a significant overlap, a spatial transformation is found which aligns and registers the two volumes into a single volume that smoothly combines the information from both. In some embodiments, the type of registration technique that can appropriately be applied depends directly on the type of variation between the two volumes. Thus, to design a registration method it is necessary to know the type of variation exhibited by ultrasonic volumes. The typical size of an ultrasound image generated by common ultrasonic devices is limited to 12–15 cm at the wide zone. The acquisition of a volume is thus reconstructed from a series of 2D slices. There are two main methods to collect the series of slices: freehand collection and mechanical collection. In some embodiments, in a freehand collection the location and orientation of each slice is tracked by a six-degree-of-freedom (6DOF) device (e.g. Isotrack). 
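The freehand placement of tracked slices can be sketched as follows, with the 6DOF pose reduced to a 3x3 rotation matrix `R` and a translation `t` that map slice-pixel coordinates into voxel coordinates. The nearest-voxel write and the names used are illustrative assumptions; gap-filling between slices would run as a separate interpolation pass.

```python
def place_slice(volume, slice_img, R, t):
    """Write one tracked 2D slice into a 3D volume, indexed volume[z][y][x].
    Each pixel (c, r) of the slice is mapped through the tracked pose
    (rotation R, translation t) and stored at the nearest voxel."""
    depth, height, width = len(volume), len(volume[0]), len(volume[0][0])
    for r, row in enumerate(slice_img):
        for c, val in enumerate(row):
            sx, sy, sz = float(c), float(r), 0.0   # slice-plane coordinates
            x = R[0][0] * sx + R[0][1] * sy + R[0][2] * sz + t[0]
            y = R[1][0] * sx + R[1][1] * sy + R[1][2] * sz + t[1]
            z = R[2][0] * sx + R[2][1] * sy + R[2][2] * sz + t[2]
            xi, yi, zi = round(x), round(y), round(z)
            if 0 <= xi < width and 0 <= yi < height and 0 <= zi < depth:
                volume[zi][yi][xi] = val
    return volume
```

With an identity rotation and a translation of two voxels along z, for instance, a 2x2 slice lands in the z = 2 plane of the volume.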
In some embodiments, the slices are stored in the volume, and the gaps between the slices are filled by interpolation. In some embodiments, another approach is to attach the transducer probe to a mechanical motor that sweeps the slice along some type of trajectory (e.g. fan, rotation). In particular, an example of such an ultrasound device is the TomTec device, which offers a parallel sweep producing a series of parallel, uniformly spaced slices that leave no gaps. The image resolution can be user-defined, trading resolution for acquisition speed. The TomTec also includes three types of motors (parallel, fan and rotational) and gating equipment for periodic movements. The parallel dense slices
generated by the TomTec provide small volumes of good quality. A series of such volumes needs to be collected and assembled to form a large volume 1110. The registration of two volumes requires one to detect the changes between the two images and to design a transformation that deforms them in order to remove or reduce the variations between them. The sources of variation can be classified into the following three types.

Directional variations 1114: These variations are due to changes in the viewpoint. They cause a misalignment that can be simply corrected by a rigid transformation. However, as shown above, the acquisition of the same volume from a different viewpoint causes other effects that are not compensated for by a spatial transformation. For example, shadows are cast with strong correlation to the probe viewing direction.

Volumetric variations 1116: These are caused by the characteristics of the ultrasonic technology, for example the DGC and Gain distortions and the inherently noisy and blurred ultrasound signal. These effects are difficult to model and to remove. One can attempt to reduce them by tuning the acquisition parameters.

Geometric variations 1118: Geometric deformations are caused by movements of the body during the time of acquisition. Some movements are forced by the acquisition device, since the ultrasound probe must maintain good contact with the body. The human body is soft and not flat, and it is rather difficult to maintain contact without causing forced movements by the contracting muscles. Immersing the body in a tube of water can avoid probe contact and eliminate the muscular contractions. Another unavoidable deformation is caused by breathing and other natural behavior of the sampled body. Periodic deformation (like that of the heart) can be overcome by gating. 
In gating, the acquisition is synchronized with the period and the slices are acquired at the same phase of the period, using equipment similar to an ECG, which monitors heart activity.

Volume Registration 1108

In some embodiments, large ultrasound volumetric buffers are constructed from a series of volumes acquired by one or more ultrasound devices, for example the TomTec, combined with motion tracking such as the Polhemus 1110. In some embodiments, the ultrasound
device is attached to a mechanical arm with which different volumes are obtained. In some embodiments, the ultrasound device position and orientation are recorded using, for example, a 6DOF device, with which the global misalignment can be corrected by a simple rigid transformation that maps the volumes back to a common (world-coordinate) space. However, in some embodiments, the global rigid transformation is coarse, and a fine elastic deformation is needed to obtain a good registration that compensates for local shape deformations and acquisition variations. In some embodiments, the elastic deformation is local and is based on the overlapping portion of two given volumes. In some embodiments, the rigid transformation is too coarse and, even if exact, the two volumes have variations which are apparent mainly where the two volumes are in contact. In some embodiments, a direct registration method is used to automatically correct small spatial variations caused by geometric deformations. In some embodiments, the method is based on the gradient values. In some embodiments, the registration method further comprises a multi-resolution method to better deal with large misalignments. In some embodiments, the transformation is computed on a resolution pyramid, and the results from the low-resolution transformation are used to guide the computation at the finer levels. In some embodiments, a resolution pyramid consists of the original image and a number of copies at lower resolutions. In some embodiments, at lower resolutions adjacent pixels and local gradients represent large distances in the original image. In some embodiments, a displacement computed on a low-resolution image indicates a larger displacement at the highest resolution of the original image. In some embodiments, these larger displacements may yield transformations that compensate for larger misalignments. 
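The coarse-to-fine pyramid principle can be illustrated with a deliberately simplified 1D sketch: a shift is estimated on a downsampled copy, doubled at the next level, and refined by a small local search. The patent's method is volumetric and gradient-based; the sum-of-squared-differences search below is only an assumed stand-in for it, and all names are illustrative.

```python
def downsample(sig):
    """Halve the resolution by averaging adjacent pairs of samples."""
    return [(sig[i] + sig[i + 1]) / 2 for i in range(0, len(sig) - 1, 2)]

def best_shift(src, tgt, center, radius):
    """Local search: the integer shift s near center that minimizes the
    mean squared difference between src shifted by s and tgt."""
    best, best_err = center, float("inf")
    n = len(src)
    for s in range(center - radius, center + radius + 1):
        err, cnt = 0.0, 0
        for i in range(n):
            if 0 <= i + s < n:
                err += (src[i + s] - tgt[i]) ** 2
                cnt += 1
        err = err / cnt if cnt else float("inf")
        if err < best_err:
            best, best_err = s, err
    return best

def pyramid_shift(src, tgt, levels):
    """Coarse-to-fine: a displacement found at low resolution doubles at
    the next level and is then refined by a small local search."""
    if levels == 0:
        return best_shift(src, tgt, 0, 2)
    coarse = pyramid_shift(downsample(src), downsample(tgt), levels - 1)
    return best_shift(src, tgt, 2 * coarse, 2)
```

In this way a misalignment of six samples is recovered even though each level searches only a radius of two, which is the point of the multi-resolution scheme.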
However, in some embodiments, those are only rough transformations, since they are based on coarse representations of the original images. In some embodiments, the computation at the higher levels is based on the displacements of the lower levels and refines them. In some embodiments, the multi-resolution method improves the performance of the registration with respect to the initial misalignment of the source and target images.

As used herein with reference to quantity or value, the term “about” means “within 20% of”.
The terms “comprises”, “comprising”, “includes”, “including”, “has”, “having” and their conjugates mean “including but not limited to”. The term “consisting of” means “including and limited to”. The term “consisting essentially of” means that the composition, method or structure may include additional ingredients, steps and/or parts, but only if the additional ingredients, steps and/or parts do not materially alter the basic and novel characteristics of the claimed composition, method or structure. As used herein, the singular forms “a”, “an” and “the” include plural references unless the context clearly dictates otherwise. For example, the term “a compound” or “at least one compound” may include a plurality of compounds, including mixtures thereof. Throughout this application, embodiments of this invention may be presented with reference to a range format. It should be understood that the description in range format is merely for convenience and brevity and should not be construed as an inflexible limitation on the scope of the invention. Accordingly, the description of a range should be considered to have specifically disclosed all the possible subranges as well as individual numerical values within that range. For example, description of a range such as “from 1 to 6” should be considered to have specifically disclosed subranges such as “from 1 to 3”, “from 1 to 4”, “from 1 to 5”, “from 2 to 4”, “from 2 to 6”, “from 3 to 6”, etc.; as well as individual numbers within that range, for example, 1, 2, 3, 4, 5, and 6. This applies regardless of the breadth of the range. Whenever a numerical range is indicated herein (for example “10-15”, “10 to 15”, or any pair of numbers linked by another such range indication), it is meant to include any number (fractional or integral) within the indicated range limits, including the range limits, unless the context clearly dictates otherwise. 
The phrases “range/ranging/ranges between” a first indicated number and a second indicated number and “range/ranging/ranges from” a first indicated number “to”, “up to”, “until” or “through” (or another such range-indicating term) a second indicated number are used herein interchangeably and are meant to include the first and second indicated numbers and all the fractional and integral numbers therebetween.
Unless otherwise indicated, numbers used herein and any number ranges based thereon are approximations within the accuracy of reasonable measurement and rounding errors as understood by persons skilled in the art. It is appreciated that certain features of the invention, which are, for clarity, described in the context of separate embodiments, may also be provided in combination in a single embodiment. Conversely, various features of the invention, which are, for brevity, described in the context of a single embodiment, may also be provided separately or in any suitable subcombination or as suitable in any other described embodiment of the invention. Certain features described in the context of various embodiments are not to be considered essential features of those embodiments, unless the embodiment is inoperative without those elements. Although the invention has been described in conjunction with specific embodiments thereof, it is evident that many alternatives, modifications and variations will be apparent to those skilled in the art. Accordingly, it is intended to embrace all such alternatives, modifications and variations that fall within the spirit and broad scope of the appended claims. It is the intent of the applicant(s) that all publications, patents and patent applications referred to in this specification are to be incorporated in their entirety by reference into the specification, as if each individual publication, patent or patent application was specifically and individually noted when referenced that it is to be incorporated herein by reference. In addition, citation or identification of any reference in this application shall not be construed as an admission that such reference is available as prior art to the present invention. To the extent that section headings are used, they should not be construed as necessarily limiting. 
In addition, any priority document(s) of this application is/are hereby incorporated herein by reference in its/their entirety.
Claims (31)
- WHAT IS CLAIMED IS: 1. A system for performing remote ultrasound examinations, comprising: a. an ultrasound device comprising a trackable probe; b. a computer in communication with said ultrasound device and comprising instructions for: i. providing instructions to a user for performing an ultrasound examination; ii. receiving movement information about said trackable probe; iii. providing feedback to said user regarding said performing of said ultrasound examination based on said received movement information of said trackable probe; wherein said computer further comprises instructions for evaluating said ultrasound examination according to at least one ultrasound image received from said ultrasound device; and wherein said evaluating comprises detecting “dead spots” in a 3D scanned volume.
- 2. The system according to claim 1, wherein said trackable probe comprises a first hardware configured to allow active tracking of said trackable probe.
- 3. The system according to claim 1 or claim 2, wherein said trackable probe comprises a second hardware configured to allow passive tracking of said trackable probe, said passive tracking being performed by said computer.
- 4. The system according to claim 3, wherein said second hardware is a reference element mounted and/or incorporated in said trackable probe.
- 5. The system according to claim 4, wherein said computer comprises a camera and said computer is configured to track said reference element of said trackable probe by means of said camera.
- 6. The system according to any one of claims 1-5, wherein said providing instructions comprises showing a video to said user.
- 7. The system according to any one of claims 1-6, wherein said providing instructions comprises connecting said computer with an external source that provides said instructions for performing said ultrasound examination.
- 8. The system according to claim 7, wherein said external source is a physician and/or a medical sonographer.
- 9. The system according to any one of claims 1-8, wherein said providing feedback comprises comparing said received movement information with at least one reference movement information.
- 10. The system according to claim 9, wherein said at least one reference movement information comprises one or more of reference movement information from a database and reference movement information in real-time from a physician and/or a medical sonographer.
- 11. The system according to any one of claims 1-10, wherein said computer comprises dedicated software configured to allow 3D geometric labelling points of interest on a 2D ultrasound image.
- 12. The system according to claim 11, wherein said points of interest comprise anomalies found in said 2D image of the ultrasound.
- 13. The system according to any one of claims 1-12, wherein said computer comprises instructions for generating a 4D volume reconstruction based on a plurality of 2D images.
- 14. The system according to claim 13, wherein said reconstruction is based on 2D images in time.
- 15. The system according to claim 13, wherein said reconstruction is based on 2D images with at least one geometric embedding.
- 16. The system according to any one of claims 1-15, wherein said computer further comprises instructions for generating a 4D Doppler approximation based on a 2D Doppler in time.
- 17. The system according to claim 16, wherein said 4D Doppler approximation is generated based on same voxels from different angles and different time samples.
- 18. The system according to any one of claims 1-17, wherein said computer further comprises instructions for associating a received ultrasound image with said received movement information of said trackable probe.
- 19. The system according to claim 18, wherein data generated by said associating is used as data collection for Machine Learning.
- 20. The system according to any one of claims 1-19, wherein said computer further comprises instructions for embedding one or more frames of an ultrasound stream within a geometric space.
- 21. The system according to claim 20, further comprising generating a geometric dataset of ultrasound images based on said embedding.
- 22. A system for performing remote ultrasound examinations, comprising: a. a user unit, comprising: i. an ultrasound device comprising a trackable probe; and ii. a first computer in communication with said ultrasound device; b. a physician unit, comprising: iii. a trackable mock ultrasound probe; iv. a second computer with a camera.
- 23. The system according to claim 22, wherein said user unit is as said system according to claim 1.
- 24. The system according to claim 23, wherein said external source is said physician unit, said physician unit being used by a physician and/or a medical sonographer.
- 25. The system according to any one of claims 22-24, wherein said trackable mock ultrasound probe comprises a reference element mounted and/or incorporated in said trackable mock ultrasound probe.
- 26. A method for performing a remote ultrasound examination, comprising: a. providing to a user instructions comprising one or more movements of a first ultrasound probe; b. tracking movements performed by a second ultrasound probe used by said user during said remote ultrasound examination; c. comparing said performed movements with said provided instructions; d. providing feedback to said user in view of results of said comparing.
- 27. The method according to claim 26, wherein said providing to a user instructions comprises providing instructions in real-time by an external source.
- 28. The method according to claim 26 or claim 27, wherein said providing to a user instructions comprises providing instructions by means of a video shown on an electronic device.
- 29. The method according to any one of claims 26-28, further comprising providing said second ultrasound probe with one or more of active and passive tracking hardware.
- 30. The method according to claim 29, wherein said passive tracking hardware is a reference element mounted and/or incorporated in said second ultrasound probe.
- 31. The method according to claim 30, wherein said tracking movements performed by a second ultrasound probe are performed by tracking movements performed by said reference element. Mr. Maier Fenster Patent Attorney G.E. Ehrlich (1995) Ltd. 35 HaMasger Street Sky Tower, 13th Floor Tel Aviv 6721407
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| IL301424A IL301424B2 (en) | 2023-03-15 | 2023-03-15 | System and methods for performing remote ultrasound examinations |
Publications (3)
| Publication Number | Publication Date |
|---|---|
| IL301424A IL301424A (en) | 2024-08-01 |
| IL301424B1 IL301424B1 (en) | 2024-08-01 |
| IL301424B2 true IL301424B2 (en) | 2024-12-01 |
Family
ID=92300705