EP3655919A1 - Systems and methods for determining three dimensional measurements in telemedicine application - Google Patents
Systems and methods for determining three dimensional measurements in telemedicine application
Info
- Publication number
- EP3655919A1 (application EP17912224.7A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- points
- point cloud
- image
- point
- coordinates
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H30/00—ICT specially adapted for the handling or processing of medical images
- G16H30/40—ICT specially adapted for the handling or processing of medical images for processing medical images, e.g. editing
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/30—Determination of transform parameters for the alignment of images, i.e. image registration
- G06T7/33—Determination of transform parameters for the alignment of images, i.e. image registration using feature-based methods
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/60—Analysis of geometric attributes
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H20/00—ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance
- G16H20/40—ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance relating to mechanical, radiation or invasive therapies, e.g. surgery, laser therapy, dialysis or acupuncture
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H30/00—ICT specially adapted for the handling or processing of medical images
- G16H30/20—ICT specially adapted for the handling or processing of medical images for handling medical images, e.g. DICOM, HL7 or PACS
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H40/00—ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices
- G16H40/40—ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the management of medical equipment or devices, e.g. scheduling maintenance or upgrades
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H40/00—ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices
- G16H40/60—ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices
- G16H40/67—ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices for remote operation
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H80/00—ICT specially adapted for facilitating communication between medical practitioners or patients, e.g. for collaborative diagnosis, therapy or health monitoring
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10024—Color image
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10028—Range image; Depth image; 3D point clouds
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30004—Biomedical image processing
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30196—Human being; Person
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30204—Marker
Definitions
- The present concepts relate generally to telemedicine and, more specifically, to a system and method for calculating a length, depth, or related measurement in a three-dimensional point cloud generated in a telemedicine application.
- Telemedicine refers to the practice of medicine at a distance where the patient and the medical professional are at different locations, and communicate via a computer network, or telecommunications system, which provides for the live exchange of information between the patient and medical professional locations.
- Applications may include tele-surgery, tele-mentoring, or related medical exchanges between users at different locations.
- A typical telemedicine environment includes a camera at the patient location that captures live or near real-time images of the patient and transmits the images electronically to a computer at a remote location, where a doctor or other medical professional may view the images on the computer's display screen and provide medical services for the patient such as diagnosis, virtual assistance to a local surgeon, or even surgery, for example, with the assistance of robotic medical devices co-located with the patient and remotely controlled by the doctor.
- Telemedicine operations such as diagnosis, surgery, teaching, and so on often present a challenge to the remote viewer of the displayed electronic images of the patient, in particular, with respect to determining from the images a length or depth of a wound, incision, skin lesion, or other region of interest to the remote medical professional viewing the images on the display screen.
- A system for measuring a depth or length of a wound of a telemedicine patient comprises a first image capturing device that captures a two-dimensional (2D) image of a region of interest of a patient; a second image capturing device that generates a three-dimensional (3D) point cloud of the region of interest of the patient; a rendering system that processes a unified view for both the first and second image capturing devices, in which the 2D image and 3D point cloud are generated and registered; and a remote measurement processing system that determines a depth or length between two points selected from the 2D image of the region of interest by identifying associated points in the 3D point cloud and performing a measurement using the identified associated points.
- The system further comprises an overlay system that creates a render point map image computed using data from the first and second image capturing devices.
- The first and second image capturing devices are calibrated to produce intrinsic and extrinsic parameters, which in turn are used to create the render point map image.
- The rendering system generates the render point map to map a 3D point cloud of a fiducial marker of a set of fiducial markers to a shared field of view (FOV) of the first and second image capturing devices, to register 2D video of the markers with the render point map image and determine a set of 2D-to-2D transformation parameters.
- The first and second image capturing devices are co-located at a rigid frame.
- The first image capturing device includes a monocular camera and the second image capturing device includes a 3D camera.
- The rigid frame fixes a relative position of the first and second image capturing devices.
- FIG. 1 is a diagram illustrating a telemedicine environment in which aspects of the present inventive concepts may be practiced.
- FIG. 2 is a flow diagram of a method for initializing a remote measurement processing system to determine a measurement taken from a digital image in a telemedicine operation, in accordance with some embodiments.
- FIG. 5 is a view of a displayed measurement result produced in accordance with some embodiments.
- Two image capturing devices, in particular a monocular camera 22 and a three-dimensional (3D) camera 24, are co-located with a patient 14 at a location remote from a doctor 12 or other medical professional.
- The monocular camera 22 and 3D camera 24 are preferably co-located under a common frame, housing, enclosure, or the like, for example, a rigid frame, so that the relative positions of the cameras 22, 24 are fixed and remain constant during movement, for example, where the cameras 22, 24 capture various images of an object scene.
- The calibration of the cameras 22, 24 permits the system to find parameters and establish a unified coordinate system.
- The monocular camera 22 and 3D camera 24 operate collectively to capture 2D and 3D images, or video, during a measurement scan of an object scene, for example, a region of interest of a patient, more specifically, images of a wound, incision, skin lesion, or the like.
- The monocular camera 22 produces real-time or near real-time 2D images, e.g., high-resolution 2D video, during a medical procedure, which may be transmitted via a network 16 to a computer 26 having a display at a remote location. There, a medical practitioner such as a doctor, nurse, or the like can observe the displayed 2D video and may annotate or otherwise modify it using augmented reality tools or the like, operated through a computer peripheral such as a mouse/cursor, a touchscreen, voice-activated commands, and so on. In particular, the user can select points on the displayed 2D video during a remote medical procedure, for example, to identify points on the video, or on one or more images or frames of the video, for determining a depth, length, or related measurement.
- The telemedicine environment shown in FIG. 1 includes a remote measurement processing unit 50 that communicates with the cameras 22, 24 and/or other electronic devices, such as a database 60, rendering system 70, calibration system 80, and/or user computers 26, via a communications network 16.
- The rendering system 70 is part of, i.e., stored and executed at, one of the cameras 22, 24 or the remote measurement processing unit 50.
- The network 16 may be a public switched telephone network (PSTN), a mobile communications network, e.g., a 3G/4G/5G network, a data network such as a local area network (LAN) or wide area network (WAN), a combination thereof, or another data communication network known to those of ordinary skill in the art.
- The remote measurement processing unit 50 performs a measurement of a length and/or depth of an object, such as a wound of a patient, for the remote doctor or other viewer and presents the result on the display of computer 26.
- FIG. 2 is a flow diagram of a method 200 for initializing the remote measurement processing system 50 of FIG. 1 to determine a measurement taken from a digital image in a telemedicine operation, in accordance with some embodiments.
- The monocular camera 22 and 3D camera 24 are coupled to and co-located at a rigid frame or other apparatus that fixes and maintains a relative position of the monocular camera 22 and 3D camera 24.
- The monocular camera 22 and 3D camera 24, once in a fixed position, are calibrated to produce intrinsic and extrinsic parameters, which in turn are used to create an accurate render point map.
- A well-known camera calibration technique may be performed, for example, using a checkerboard target and non-linear optimization.
- Intrinsic parameters may include, but are not limited to, camera parameters such as focal length, principal point (optical center), skew coefficients, or other camera characteristics or features.
- Extrinsic parameters correspond to rotation and translation vectors used to transform 3D point coordinates to a point in another coordinate system, for example, transforming a point from the world coordinate system to the camera coordinate system.
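The roles of the two parameter sets can be illustrated with a standard pinhole-camera projection: the extrinsics (R, t) move a world point into camera coordinates, and the intrinsic matrix K projects it to pixels. This is a generic sketch, not the patent's implementation, and all numeric values are illustrative assumptions.

```python
import numpy as np

# Intrinsic parameters: focal lengths (fx, fy), principal point (cx, cy), skew s
fx, fy, cx, cy, s = 800.0, 800.0, 320.0, 240.0, 0.0
K = np.array([[fx, s,  cx],
              [0., fy, cy],
              [0., 0., 1.]])

# Extrinsic parameters: rotation R and translation t, world -> camera coords
theta = np.deg2rad(10.0)                       # illustrative rotation about y
R = np.array([[ np.cos(theta), 0., np.sin(theta)],
              [ 0.,            1., 0.           ],
              [-np.sin(theta), 0., np.cos(theta)]])
t = np.array([0.05, 0.0, 0.5])                 # metres, illustrative

def project(p_world):
    """Transform a 3D world point to camera coordinates, then project to pixels."""
    p_cam = R @ p_world + t                    # extrinsic transform
    uvw = K @ p_cam                            # intrinsic projection
    return uvw[:2] / uvw[2]                    # perspective divide

print(project(np.array([0.0, 0.0, 1.0])))      # pixel coordinates (u, v)
```

A point on the camera's optical axis in world coordinates lands near, but not exactly at, the principal point here, because the extrinsics rotate and shift it first.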
- The intrinsic and extrinsic parameters of the cameras 22, 24, along with calibration data, can be stored at the database 60.
- The calibration method starts by using the cameras 22, 24 to capture a set of fiducial markers.
- The fiducial markers can be any 3D objects placed in the field of view of each of the monocular camera 22 and 3D camera 24, and which appear in the image produced, for use as a reference. Fiducial markers positioned on a calibration board or the like may permit semi-automatic or fully automatic registration.
- The render point map image maps a 3D point cloud of a marker produced by the 3D camera 24 to a shared field of view (FOV) of the cameras 22, 24 to register the video with the render point map image and determine the set of 2D-to-2D transformation parameters.
- The 2D video of the markers taken by the monocular camera 22 is used by the rendering system 70 to transform the point coordinates selected from the video to the render point map image and find them in the 3D point cloud using a search algorithm and the 2D-3D transformation parameters described in block 210.
- A registration procedure may be performed between two captured images.
- Two images can be registered, i.e., the render point map is registered with a 2D image or video frame, and transformation parameters may be determined using the corresponding point coordinates.
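One simple way to realize such a 2D-to-2D registration from corresponding point coordinates is a least-squares affine fit. The patent does not specify the estimation method; this is a sketch under that assumption, with synthetic marker points standing in for detected fiducials.

```python
import numpy as np

def fit_affine_2d(src, dst):
    """Least-squares 2x3 affine transform mapping src (N,2) points onto dst (N,2)."""
    n = src.shape[0]
    # Each correspondence (x,y)->(x',y') contributes two rows of A @ params = b,
    # with params = [a, b, c, d, e, f] for x' = ax+by+c, y' = dx+ey+f.
    A = np.zeros((2 * n, 6))
    A[0::2, 0:2] = src; A[0::2, 2] = 1.0       # rows for x'
    A[1::2, 3:5] = src; A[1::2, 5] = 1.0       # rows for y'
    b = dst.reshape(-1)
    params, *_ = np.linalg.lstsq(A, b, rcond=None)
    return params.reshape(2, 3)                # [A | t]

def apply_affine_2d(M, pts):
    return pts @ M[:, :2].T + M[:, 2]

# Synthetic check: transform four marker points, then recover the mapping
src = np.array([[0., 0.], [1., 0.], [0., 1.], [1., 1.]])
true_M = np.array([[0.9, -0.2,  5.0],
                   [0.2,  0.9, -3.0]])
dst = apply_affine_2d(true_M, src)
M = fit_affine_2d(src, dst)
print(np.allclose(M, true_M))                  # recovered transform matches
```

A full homography (8 parameters) could be fitted the same way if the marker plane is not fronto-parallel; the affine case keeps the linear system small.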
- The highly precise calibration yields submillimeter-to-millimeter, or better, accuracy in measurements.
- The calibration parameters are used to create the render point map image with high precision.
- The registration creates parameters to transform coordinates of the points from the 2D camera to the 3D point cloud.
- A point cloud of the wound or other area of interest is produced by the 3D camera 24, which captures 3D images in the same FOV as the 2D camera 22.
- The measurement processing system 50 searches the point cloud to find the points in the point cloud generated by the 3D camera during the surgery and identifies the associated 2D points along a curve in the 3D point cloud. More specifically, the coordinates of the two selected points are transformed to the associated coordinates of the render point map for the monocular camera 22 processed in method 200 or 300. Here, adjusted x and y coordinates of the selected points are identified.
- A transformation algorithm is applied to find the associated points in the render point map generated in the method 200. The algorithm searches the generated point cloud to find the corresponding points in the point cloud generated by the 3D camera during the surgery, for example, in the same FOV as the 2D camera 22 producing the video shown in FIG. 5.
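The search step can be sketched as a nearest-neighbour lookup: given the transformed (x, y) coordinates of a selected point, find the cloud entry whose projected 2D coordinates are closest. The patent does not name a particular search algorithm; a brute-force NumPy version over a toy cloud looks like this.

```python
import numpy as np

def find_in_cloud(xy, cloud_xy, cloud_xyz):
    """Return the 3D point whose projected 2D coordinates best match xy.

    xy        : (2,)  selected coordinates in the render point map
    cloud_xy  : (N,2) projected 2D coordinates of each cloud point
    cloud_xyz : (N,3) the 3D point cloud itself
    """
    d2 = np.sum((cloud_xy - xy) ** 2, axis=1)   # squared 2D distances
    return cloud_xyz[np.argmin(d2)]

# Toy cloud: three 3D points with known 2D projections (illustrative values)
cloud_xyz = np.array([[0.0, 0.0, 1.0],
                      [0.1, 0.0, 1.2],
                      [0.0, 0.1, 0.8]])
cloud_xy = np.array([[320., 240.], [400., 240.], [320., 160.]])

p = find_in_cloud(np.array([398., 241.]), cloud_xy, cloud_xyz)
print(p)   # nearest projection is (400, 240), so the match is [0.1, 0.0, 1.2]
```

For realistic cloud sizes a k-d tree (e.g. `scipy.spatial.cKDTree`) would replace the brute-force distance scan.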
- A measurement is determined.
- The system may provide a selection option, for example, provided to a user computer for a user to select between a depth measurement (e.g., points P1 and P2 shown in FIG. 4) and a length measurement (e.g., points P3 and P4 shown in FIG. 4).
- The difference between the z coordinates of the two points in the point cloud represents the depth of the selected point, for example, selected points P1 and P2 shown in FIG. 4.
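Under that description, the depth computation reduces to a z-coordinate difference between the two matched cloud points. The coordinate values below are illustrative assumptions, not data from the patent.

```python
import numpy as np

def depth_between(p1, p2):
    """Depth as the absolute difference of the z coordinates of two 3D points."""
    return abs(p1[2] - p2[2])

# e.g. a point on the skin surface vs. a point at the wound bed (illustrative)
p1 = np.array([0.10, 0.05, 0.500])   # metres
p2 = np.array([0.10, 0.05, 0.512])
print(depth_between(p1, p2))          # ≈ 0.012 m, i.e. 12 mm
```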
- A depth result 501 and length result 502 are presented in a display, for example, as shown in FIG. 5. Also displayed are the z coordinates (pzb, pzc) of the two points.
- The length between two points is determined by identifying the associated 2D points along a curve in the generated 3D point cloud in order to find the distance between the points in the 3D point cloud, where the distance follows a curve in the 3D environment.
- The user 12 may select the two points by clicking on two points of interest in the display of computer 26.
- The length is calculated by finding all points in the 3D point cloud which are located along the line connecting the two points in the 2D video. Then the length between every consecutive pair (e.g., point j and point j+1) is calculated using their x, y, z coordinates in the 3D point cloud. All the lengths are added together to finalize the measurement of the length in 3D.
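The summation described above can be sketched directly, assuming the cloud points along the connecting line have already been collected in order; the example path values are illustrative.

```python
import numpy as np

def length_along_cloud(points_xyz):
    """Sum of Euclidean segment lengths between consecutive 3D points.

    points_xyz : (N, 3) ordered cloud points lying along the selected line.
    """
    diffs = np.diff(points_xyz, axis=0)          # point j+1 minus point j
    return float(np.sum(np.linalg.norm(diffs, axis=1)))

# Toy example: a path over a raised surface, so the curve length
# exceeds the straight 3D distance between the endpoints
path = np.array([[0.0, 0.0, 0.0],
                 [1.0, 0.0, 1.0],
                 [2.0, 0.0, 0.0]])
curve_len = length_along_cloud(path)
straight = float(np.linalg.norm(path[-1] - path[0]))
print(curve_len, straight)   # curve ≈ 2.828, straight chord = 2.0
```

This is why the patent measures along the cloud rather than between the two endpoints alone: over a wound's surface the two values can differ substantially.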
Abstract
Description
Claims
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/US2017/035147 WO2018222181A1 (en) | 2017-05-31 | 2017-05-31 | Systems and methods for determining three dimensional measurements in telemedicine application |
Publications (2)
Publication Number | Publication Date |
---|---|
EP3655919A1 (en) | 2020-05-27 |
EP3655919A4 (en) | 2021-06-16 |
Family
ID=64455467
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP17912224.7A Pending EP3655919A4 (en) | 2017-05-31 | 2017-05-31 | Systems and methods for determining three dimensional measurements in telemedicine application |
Country Status (2)
Country | Link |
---|---|
EP (1) | EP3655919A4 (en) |
WO (1) | WO2018222181A1 (en) |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111613301B (en) * | 2019-02-22 | 2023-09-15 | 曹生 | Arterial and venous Ai processing method and product based on VRDS 4D medical image |
CN110634551A (en) * | 2019-10-15 | 2019-12-31 | 北京爱康宜诚医疗器材有限公司 | Osteotomy amount measuring method, measuring device, storage medium, and processor |
Family Cites Families (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8290305B2 (en) * | 2009-02-13 | 2012-10-16 | Harris Corporation | Registration of 3D point cloud data to 2D electro-optical image data |
EP4140414A1 (en) * | 2012-03-07 | 2023-03-01 | Ziteo, Inc. | Methods and systems for tracking and guiding sensors and instruments |
US9298884B1 (en) * | 2014-12-17 | 2016-03-29 | Vitaax Llc | Remote instruction and monitoring of health care |
-
2017
- 2017-05-31 EP EP17912224.7A patent/EP3655919A4/en active Pending
- 2017-05-31 WO PCT/US2017/035147 patent/WO2018222181A1/en unknown
Also Published As
Publication number | Publication date |
---|---|
EP3655919A4 (en) | 2021-06-16 |
WO2018222181A1 (en) | 2018-12-06 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11025889B2 (en) | Systems and methods for determining three dimensional measurements in telemedicine application | |
US8147503B2 (en) | Methods of locating and tracking robotic instruments in robotic surgical systems | |
US8073528B2 (en) | Tool tracking systems, methods and computer products for image guided surgery | |
Harders et al. | Calibration, registration, and synchronization for high precision augmented reality haptics | |
US20090088897A1 (en) | Methods and systems for robotic instrument tool tracking | |
US20100249595A1 (en) | System and method for automatic calibration of tracked ultrasound | |
US20220110684A1 (en) | Method and assembly for spatial mapping of a model, such as a holographic model, of a surgical tool and/or anatomical structure onto a spatial position of the surgical tool respectively anatomical structure, as well as a surgical tool | |
Hu et al. | Head-mounted augmented reality platform for markerless orthopaedic navigation | |
US10078906B2 (en) | Device and method for image registration, and non-transitory recording medium | |
US20230114385A1 (en) | Mri-based augmented reality assisted real-time surgery simulation and navigation | |
CN111658142A (en) | MR-based focus holographic navigation method and system | |
EP3655919A1 (en) | Systems and methods for determining three dimensional measurements in telemedicine application | |
CN109934798A (en) | Internal object information labeling method and device, electronic equipment, storage medium | |
CN115919461B (en) | SLAM-based surgical navigation method | |
US20210128243A1 (en) | Augmented reality method for endoscope | |
JP2017136275A (en) | Image registration apparatus, method, and program | |
KR101596868B1 (en) | Camera parameter computation method | |
US10832422B2 (en) | Alignment system for liver surgery | |
WO2016042297A1 (en) | Computer and computer-implemented method for supporting laparoscopic surgery | |
US20230355319A1 (en) | Methods and systems for calibrating instruments within an imaging system, such as a surgical imaging system | |
Kemper et al. | Open source video-based hand-eye calibration | |
US20240016550A1 (en) | Scanner for intraoperative application | |
CN111481293B (en) | Multi-viewpoint optical positioning method and system based on optimal viewpoint selection | |
Trevisan et al. | Towards markerless augmented medical visualization | |
Trevisan et al. | Augmented vision for medical applications |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE |
PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase |
Free format text: ORIGINAL CODE: 0009012 |
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE |
17P | Request for examination filed |
Effective date: 20200401 |
AK | Designated contracting states |
Kind code of ref document: A1 Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
AX | Request for extension of the european patent |
Extension state: BA ME |
DAV | Request for validation of the european patent (deleted) | ||
DAX | Request for extension of the european patent (deleted) | ||
A4 | Supplementary search report drawn up and despatched |
Effective date: 20210518 |
RIC1 | Information provided on ipc code assigned before grant |
Ipc: G16H 30/40 20180101AFI20210511BHEP |
Ipc: G06T 7/00 20170101ALN20210511BHEP |