WO2014194276A2 - Multi-site data sharing platform - Google Patents

Multi-site data sharing platform

Info

Publication number
WO2014194276A2
Authority
WO
WIPO (PCT)
Prior art keywords
frame
biometric imaging
frames
data
stream
Application number
PCT/US2014/040363
Other languages
English (en)
Other versions
WO2014194276A3 (fr)
Inventor
Ravi N. Amble
Harish P. Hiriyannaiah
Farooq Mirza Mohammad Raza
Steven J. Salve
Original Assignee
eagleyemed, Inc.
Application filed by eagleyemed, Inc.
Publication of WO2014194276A2
Publication of WO2014194276A3

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/80Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
    • H04N21/81Monomedia components thereof
    • H04N21/8146Monomedia components thereof involving graphical data, e.g. 3D object, 2D graphics
    • H04N21/8153Monomedia components thereof involving graphical data, e.g. 3D object, 2D graphics comprising still images, e.g. texture, background image
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/4302Content synchronisation processes, e.g. decoder synchronisation
    • H04N21/4307Synchronising the rendering of multiple content streams or additional data on devices, e.g. synchronisation of audio on a mobile phone with the video output on the TV screen
    • H04N21/43072Synchronising the rendering of multiple content streams or additional data on devices, e.g. synchronisation of audio on a mobile phone with the video output on the TV screen of multiple content streams on the same device
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/23Processing of content or additional data; Elementary server operations; Server middleware
    • H04N21/234Processing of video elementary streams, e.g. splicing of video streams or manipulating encoded video stream scene graphs
    • H04N21/2347Processing of video elementary streams, e.g. splicing of video streams or manipulating encoded video stream scene graphs involving video stream encryption
    • H04N21/23476Processing of video elementary streams, e.g. splicing of video streams or manipulating encoded video stream scene graphs involving video stream encryption by partially encrypting, e.g. encrypting the ending portion of a movie
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/25Management operations performed by the server for facilitating the content distribution or administrating data related to end-users or client devices, e.g. end-user or client device authentication, learning user preferences for recommending movies
    • H04N21/266Channel or content management, e.g. generation and management of keys and entitlement messages in a conditional access system, merging a VOD unicast channel into a multicast channel
    • H04N21/2665Gathering content from different sources, e.g. Internet and satellite
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/80Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
    • H04N21/81Monomedia components thereof
    • H04N21/8126Monomedia components thereof involving additional data, e.g. news, sports, stocks, weather forecasts
    • H04N21/8133Monomedia components thereof involving additional data, e.g. news, sports, stocks, weather forecasts specifically related to the content, e.g. biography of the actors in a movie, detailed information about an article seen in a video program
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/80Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
    • H04N21/85Assembly of content; Generation of multimedia applications
    • H04N21/854Content authoring
    • H04N21/8547Content authoring involving timestamps for synchronizing content

Definitions

  • The present invention relates generally to medical technologies. More specifically, the present invention involves a telemedicine platform that allows medical professionals to share data, images and video.
  • Radiology is a medical specialty that employs the use of imaging to diagnose and/or treat disease or trauma within the body. Radiologists use an array of imaging technologies including ultrasound, X-ray radiography, computed tomography (CT), nuclear medicine, positron emission tomography (PET) and magnetic resonance imaging (MRI) to diagnose or treat ailments.
  • The images must be sent to the radiologist for analysis after the session has been completed. Once the radiologist has completed their analysis, a report is typically transmitted to the ordering physician - who schedules an even later appointment to meet with the patient to discuss the results. If the radiologist sees something that requires further imaging (e.g., to get a different view of a region of interest) - a new scan is ordered and the process is repeated. This substantially increases the time and costs involved in obtaining a diagnosis of a medical condition.
  • The process can be sped up if the radiologist personally conducts the radiological examination or is present during such examination.
  • Telemedicine has the potential to substantially improve patient care by facilitating more immediate access to highly trained specialists at a variety of stages in the health care process. Accordingly, systems that can improve the efficacy of remote medicine are of great interest.
  • The Applicant has developed a collaborative telemedicine platform that allows a remote medical specialist (such as a radiologist) to participate in an examination in real time.
  • Although radiological applications are often used as a representative use of the technology, it should be apparent from the following description that the described collaborative telemedicine platform can also be used in a wide variety of other remote medicine applications.
  • Radiologists and other specialists sometimes like to refer to case studies, professional references and/or the medical literature to identify similar cases when making a diagnosis. Therefore, there are continuing efforts to provide more useful online tools to make such resources readily available and/or to make it easier to access the desired information. In radiology, it can be helpful to find similar radiologic images to help make, or confirm, a diagnosis.
  • Multiple data streams are received from one or more medical imaging/sensing devices or other types of devices (e.g., a video camera).
  • Frames are obtained from the data streams.
  • A part of each frame and/or only particular frames are selectively encrypted.
  • The frames are transmitted to a remote device.
  • The frames for the data streams are reconstructed, rendered and/or displayed at the remote device.
  • The frames of different streams are synchronized.
  • The streams may involve a variety of different types of media and data, depending on the needs of a particular application.
  • In some embodiments, the streams are a video stream and a biometric imaging stream.
  • One example approach involves performing a biometric imaging scan (e.g., an ultrasound scan) on a patient, which generates a biometric image stream.
  • A video camera is directed at a technician that is using the biometric imaging device, which indicates how the device is being handled and positioned.
  • The video camera generates a video stream.
  • Frames are obtained from the video and biometric imaging streams.
  • The frames are transmitted to a remote device, e.g., as a packet sequence.
  • At the remote device, the frames are reconstructed and synchronized.
  • A user of the remote device can then display and observe the video and biometric imaging in (near) real time. The synchronization helps ensure that the video and biometric imaging are properly timed and coordinated when viewed at the remote device.
  • Any suitable data streams may be synchronized.
  • In one example, frames are obtained from biometric waveform data and biometric imaging streams received from one or more medical imaging/scanning devices. The frames are transmitted and synchronized.
  • In another example, annotation data is received from a specialist or medical professional and biometric imaging data is received from a medical imaging/sensing device. Frames are obtained from the annotation data and the biometric imaging data, which are then transmitted and synchronized.
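The synchronization described above can be sketched as timestamp-based alignment: frames from each stream carry capture timestamps, and the receiver pairs each imaging frame with the frame from the other stream that is closest in time. The patent does not specify a frame format or alignment algorithm, so the `(timestamp_ms, payload)` tuples and the nearest-neighbour strategy below are illustrative assumptions only:

```python
from bisect import bisect_left

def synchronize(imaging_frames, video_frames, max_skew_ms=50):
    """Pair each imaging frame with the video frame whose capture
    timestamp is closest, skipping pairs that drift too far apart.

    Frames are (timestamp_ms, payload) tuples sorted by timestamp.
    This is one illustrative alignment strategy, not the patented one.
    """
    video_ts = [ts for ts, _ in video_frames]
    pairs = []
    for ts, img in imaging_frames:
        i = bisect_left(video_ts, ts)
        # Candidate neighbours: the video frame at/after ts and the one before it.
        candidates = [j for j in (i - 1, i) if 0 <= j < len(video_frames)]
        if not candidates:
            continue
        j = min(candidates, key=lambda k: abs(video_ts[k] - ts))
        if abs(video_ts[j] - ts) <= max_skew_ms:
            pairs.append(((ts, img), video_frames[j]))
    return pairs
```

A skew bound (`max_skew_ms`) prevents pairing frames that are too far apart in time, which would otherwise misrepresent how the probe was positioned when an image was captured.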
  • A data stream is received from a medical imaging/sensing device or another type of device.
  • The data stream may be any suitable type of data stream, including a biometric imaging, biometric data, video, audio, annotation data or other type of data stream.
  • Frames are obtained from the stream. At least some of the frames are (partially) encrypted. In some embodiments, only a part of each frame is encrypted. Some implementations involve encrypting only a header of the frame, at least part of a header of a frame or only part of the header and part of the media data (payload) of the frame. The frames are then transmitted to a remote device.
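Encrypting only the frame header (rather than the whole frame) reduces per-frame processing cost while still making the stream unusable without the key. A minimal sketch of header-only encryption follows; the 16-byte header length is an assumption, and the hashlib-derived XOR keystream merely stands in for a real stream cipher such as AES-CTR:

```python
import hashlib

HEADER_LEN = 16  # assumed fixed header size; real frame formats may differ

def _keystream(key: bytes, nonce: bytes, n: int) -> bytes:
    """Derive n pseudo-random bytes from key+nonce. A placeholder for a
    real stream cipher (e.g., AES-CTR); not for production security."""
    out = b""
    counter = 0
    while len(out) < n:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:n]

def encrypt_header_only(frame: bytes, key: bytes, frame_no: int) -> bytes:
    """Selectively encrypt just the first HEADER_LEN bytes of the frame,
    leaving the media payload in the clear."""
    nonce = frame_no.to_bytes(8, "big")  # per-frame nonce so keystreams differ
    ks = _keystream(key, nonce, min(HEADER_LEN, len(frame)))
    header = bytes(a ^ b for a, b in zip(frame, ks))
    return header + frame[len(header):]

# XOR with the same keystream is its own inverse, so decryption
# is the identical operation.
decrypt_header_only = encrypt_header_only
```

The same structure extends to the other variants the text mentions (part of the header, or part of the header plus part of the payload) by changing which byte ranges are passed through the keystream.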
  • FIG. 1 is a block diagram of a multi-site data sharing platform according to a particular embodiment of the present invention.
  • FIG. 2 is a block diagram of a patient telemedicine system according to a particular embodiment of the present invention.
  • FIG. 3 is a block diagram of a specialist telemedicine device according to a particular embodiment of the present invention.
  • FIG. 4 is an example user interface for the patient telemedicine device illustrated in FIG. 2.
  • FIG. 5 is an example user interface for the specialist telemedicine device illustrated in FIG. 3.
  • FIGS. 6 and 7 are flow diagrams illustrating a method for receiving, encoding, encrypting and transmitting data streams according to a particular embodiment of the present invention.
  • FIG. 8 is a data encoding module according to a particular embodiment of the present invention.
  • FIGS. 9A-9D are block diagrams illustrating possible encryption schemes for frames according to various embodiments of the present invention.
  • The present invention relates generally to methods and arrangements for supporting collaborative telemedicine.
  • The Applicant has developed a collaborative telemedicine platform that allows a remote medical practitioner (who may be a specialist such as a radiologist, a general practitioner, etc.) to participate in an examination in (near) real time. Several unique aspects of that platform are described herein.
  • One aspect is a platform that allows a practitioner conducting a medical examination and a remote medical practitioner to concurrently share a plurality of different views in real time.
  • One of the shared views may be live streamed biometric information (such as radiological images, biometric waveforms, etc.).
  • Another shared view may show a relevant view of the patient being examined.
  • One example of this might be a view showing the placement and orientation of an ultrasonic probe being used in a sonographic examination.
  • Other shared views may include items such as a video conference type view of one of the participants, a replay of an imaging stream selected by one of the participants, reference materials that have been selected by one of the participants, etc.
  • A particular strength of the platform is the ability to share medical imaging streams (such as the output of an ultrasonic probe) with remote collaborators in real time as the examination is taking place.
  • Although radiological applications are often used as a representative use of the technology, it should be apparent from the following description that the described collaborative telemedicine platform can also be used in a wide variety of other remote medicine applications as well.
  • The platform 100 includes a patient telemedicine system 102, which includes a first telemedicine device (workstation) 104.
  • The platform 100 further includes second and third telemedicine devices (workstations) 108 and 110.
  • The first telemedicine device 104 is sometimes referred to herein as the "patient" or "local" telemedicine system since it is preferably positioned at the location of a patient and is used by a technician or practitioner that is interacting with the patient.
  • The second telemedicine device 108 is sometimes referred to herein as the "remote" or "specialist" telemedicine device since it is typically positioned at a location that is apart from the patient and is most often used by a medical practitioner (such as a specialist, the ordering doctor, etc.) that is participating in the telemedicine session.
  • One or more additional remote telemedicine devices 110 may be included for use by others that are participating in or viewing the telemedicine session.
  • Such remote participants may include other specialists that are participating to provide a second opinion, other medical practitioners involved in the patient's care (e.g., the ordering physician/veterinarian, a surgeon that will be operating on the patient, the patient's primary care physician, etc.), parties that are observing the examination for training or educational reasons (e.g., interns, residents, students, trainees, etc.) or anyone else that has a reason to participate in the examination.
  • The patient telemedicine system 102 and the telemedicine devices 104, 108 and 110 are connected to one another through one or more networks 112 and may optionally also be connected to a cloud-based architecture 114.
  • The cloud-based architecture 114 preferably includes a server 116 and a database 118, which optionally include a medical records store and various other databases. Any suitable network(s) may be used to connect the devices of the platform, including but not limited to local area networks, wide area networks, intranets, the Internet, etc.
  • The patient telemedicine system 102 includes one or more medical imaging/sensing devices 106 and a patient telemedicine device 104.
  • The telemedicine device 104 typically takes the form of a general purpose computing device having software configured to perform the described functions, although special purpose devices can be used as well. Suitable computing devices include desktop computers, laptop computers, tablet computing devices, mobile devices, etc.
  • The patient telemedicine device 104 is arranged to obtain (and optionally store) data from each connected medical imaging/sensing device that is being used for diagnostic testing.
  • The data received from a particular source will be received in the form of one or more live streams (e.g., a sonographic stream, or a multiplicity of sensor outputs from an EKG machine).
  • The patient telemedicine system 102 is situated near a patient who is currently undergoing the diagnostic testing, although this is not a requirement.
  • The patient telemedicine device 104 encodes the data streams from the diagnostic testing and transmits them to other participating telemedicine devices (e.g., specialist telemedicine device 108).
  • The patient telemedicine system 102 also allows a medical professional at the site of the diagnostic test to communicate with professionals who are participating in the telemedicine session remotely.
  • The patient telemedicine device 104 is preferably arranged so that it may be coupled to (or otherwise receive inputs from) a variety of different types of biometric diagnostics machines. These may include various types of imaging devices (e.g., ultrasound probes, X-ray machines, MRI devices, CT scanners, etc.) and/or biometric measurement devices (e.g., EKG, EEG, or ECG devices, pulse oximeters, thermal/temperature sensors, blood pressure monitors, glucose level monitors, pulmonary function testers, etc.).
  • The patient telemedicine device 104 is also arranged to provide an audio link and one or more video links with the remote telemedicine devices.
  • The audio link allows the practitioner(s) who are working with the patient to talk with the remote participants.
  • The video links originating from the patient side allow remote participants to view relevant aspects of the examination.
  • A video camera is attached to the device 104 to provide video of the procedure that may be shared with the remote participants.
  • The appropriate focal point of the camera will vary with the needs of any particular examination. Most often, a camera will be focused on the examination target. For example, during an ultrasound examination, a camera may be focused on the area that the ultrasound probe is being applied to so that remote participants can see how the operator is positioning and otherwise using the probe simultaneously with viewing the images produced by the probe.
  • The telemedicine device 104 may be arranged to receive a video feed from a camera associated with an endoscope during an endoscopic procedure. Sharing such streams allows the remote participants to see the video output of the endoscope.
  • The patient telemedicine device 104 has a graphical user interface that allows the operator to select and simultaneously display a plurality of different items as different views.
  • The different views may be rendered in different windows, in different panes of a window, or in any other suitable manner.
  • The operator may select the views that are deemed appropriate at any time.
  • One window may display the ultrasound image live.
  • A second window may show the output of video camera 106a and a third window may display a video conference feed from a remote participant.
  • A fourth window may be arranged to show a patient's medical record or other information of use to the operator.
  • For example, biometric images (e.g., an ultrasound scan displayed in real time) might be presented in one window and biometric data (e.g., waveform results from an EEG) in a second window, while a video of a remote collaborator might be presented in a third window, and a fourth window can include a medical record and/or reference imagery (e.g., if the ultrasound scan shows a diseased heart, the fourth window may include a reference image of a healthy heart).
  • An operator of the patient telemedicine system 102 is able to annotate any of the displayed images, waveforms or test results.
  • The operator can provide input (e.g., circle or otherwise mark a portion of an image displayed on a touch-sensitive screen) indicating that a particular image part should be highlighted or examined further.
  • This annotation data is also selectively transmitted to other remote collaborators in the platform.
  • The operator can selectively designate which of the above types of data should be transmitted or shared with other remote collaborators in the multi-site data sharing platform 100.
  • The patient telemedicine device 104 obtains or generates frames from the shared data, selectively encrypts, multiplexes and/or transmits them through the network to the other collaborator(s) and telemedicine device(s).
  • Each remote telemedicine device 108/110 receives the frames and is arranged to render them in real time.
  • The various media streams may be shown on one or more display screens at the remote telemedicine device.
  • The operator of each remote telemedicine device 108/110 can configure how the various types of media are displayed. For example, if the patient telemedicine device transmits biometric imaging data, waveform data, video data, an audio stream and patient records to the remote telemedicine device 108/110, the operator of the remote telemedicine device can display the waveform data in one window on the display screen, the biometric imaging data in another window, the video data in a third window, and the patient records in a fourth window. Received audio communications will be played through a speaker associated with the remote telemedicine device 108/110.
  • The various media streams are received simultaneously, in real time, and rendered at nearly the same time and order as they are received or displayed at the patient telemedicine device 104.
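Multiplexing several streams over one connection, as the patient device does before transmitting, can be sketched by tagging each frame with a stream id and sequence number and then demultiplexing by id at the receiver. The packet layout below (1-byte stream id, 4-byte sequence number, 4-byte length) is purely illustrative; the patent does not define a wire format:

```python
import struct

# Illustrative packet header: stream id (1 byte), sequence number (4 bytes),
# payload length (4 bytes), big-endian, followed by the payload itself.
_HDR = struct.Struct(">BII")

def mux(frames):
    """frames: iterable of (stream_id, seq, payload_bytes) -> one byte string."""
    out = bytearray()
    for sid, seq, payload in frames:
        out += _HDR.pack(sid, seq, len(payload)) + payload
    return bytes(out)

def demux(data):
    """Split a muxed byte string back into per-stream frame lists."""
    streams = {}
    offset = 0
    while offset < len(data):
        sid, seq, length = _HDR.unpack_from(data, offset)
        offset += _HDR.size
        streams.setdefault(sid, []).append((seq, data[offset:offset + length]))
        offset += length
    return streams
```

The per-stream sequence numbers recovered by `demux` are what would let a receiver reorder late packets and align streams for the synchronized rendering described above.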
  • The operator of the remote telemedicine device 108/110 can also provide input to the device, which is transmitted in real time to other collaborators in the platform 100.
  • The operator can annotate any received biometric image, waveform or test result or select a relevant segment of a stream for review (e.g., create a short video segment or "cine-loop") and any of these items can be shared with other collaborators.
  • Stream segments that are typically rendered in the form of a short video sequence that is repeatedly replayed are sometimes referred to herein as "cine-loops."
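Creating a cine-loop as just described amounts to selecting the frames whose timestamps fall inside the chosen interval and replaying them cyclically. A small sketch, again assuming hypothetical `(timestamp_ms, payload)` frame tuples:

```python
from itertools import cycle, islice

def make_cine_loop(frames, start_ms, end_ms):
    """Select the frames of a stream whose timestamps fall within
    [start_ms, end_ms]; frames are (timestamp_ms, payload) tuples."""
    return [f for f in frames if start_ms <= f[0] <= end_ms]

def replay(loop_frames, n):
    """Yield n frames, cycling through the selected segment repeatedly,
    which produces the short looping video sequence."""
    return list(islice(cycle(loop_frames), n))
```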
  • The operator can also speak with other participants over an audio channel.
  • The operator can also obtain use cases, medical records, reference imagery and any other suitable diagnostic aids or information from the cloud-based server 116 and/or the database 118. When appropriate, the operator can elect to share any of these inputs with other participants in the telemedicine session. Shared media (i.e., annotation data, medical records, use cases, audio messages, etc.) are then transmitted to the patient telemedicine device 104 and/or any other designated devices in the platform, so that they can be rendered and displayed in real time at those devices.
  • The specialist telemedicine device 108/110 allows a specialist to fully participate in the aforementioned diagnostic procedure.
  • The specialist telemedicine device 108/110 receives ultrasound imagery. Simultaneously, the specialist also receives a video indicating how an ultrasound probe is being handled to produce the ultrasound imagery. The specialist is thus able to review the procedure in real time.
  • The specialist has the ability to provide feedback to the medical professional who is handling the ultrasound probe or equipment. For example, the specialist can request that the attending medical professional reposition an ultrasound probe on a particular portion of the patient's body to get a better or different view of a region of interest.
  • The server 116 is arranged to facilitate communication between participants in a telemedicine session.
  • The server 116 is not necessary in all embodiments.
  • Some or all traffic between multiple telemedicine devices passes through the server.
  • The server 116 helps ensure that each participating device has access to any data that is shared by any other device.
  • The server 116 can provide a variety of additional features, depending on the needs of a particular application. Various implementations involve the server 116 providing store and forward and broadcasting functionality. That is, any data that is to be shared and is transmitted from a device is first transmitted to the server, which stores the data and then forwards copies of the data to the intended recipients. In some embodiments, the server 116 and/or its connected database(s) 118 store copies of some or all traffic transmitted between the devices in the platform. Upon request from any properly authorized device (e.g., patient telemedicine device 104 or specialist telemedicine device 108/110), the server 116 and/or the database 118 can provide previously transmitted imagery, test results, annotations or any other transmitted data, which can be a powerful diagnostic tool.
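The store-and-forward role described for the server 116 — persist everything that is shared, forward copies to the other participants, and replay history to any authorized device on request — can be sketched as follows. The class and method names are hypothetical, and transport and authorization are omitted:

```python
class StoreAndForwardServer:
    """Minimal in-memory sketch of a store-and-forward broadcast server:
    every shared frame is persisted, copies are forwarded to all other
    participants, and the full history can be replayed on request."""

    def __init__(self):
        self.history = []      # every (sender, frame) ever shared
        self.subscribers = {}  # device_id -> inbox list of forwarded frames

    def register(self, device_id):
        self.subscribers[device_id] = []

    def share(self, sender_id, frame):
        # Store first, then forward a copy to every participant except the sender.
        self.history.append((sender_id, frame))
        for dev, inbox in self.subscribers.items():
            if dev != sender_id:
                inbox.append((sender_id, frame))

    def replay(self, device_id):
        """Return all previously shared data, e.g., for later diagnostic review."""
        return list(self.history)
```

A real deployment would replace the in-memory lists with the database 118 and add the per-device authorization checks the text implies.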
  • The patient telemedicine system 102 is arranged to collect data streams from various medical imaging/sensing devices 106a-106e that are being used on a patient, process the streams, and transmit them to designated recipients, such as a specialist.
  • The patient telemedicine system 102 has the ability to receive inputs from a variety of different devices such as a video camera 106a, a probe 106b, a probe platform 106e, an image acquisition device 202, an electrocardiogram (EKG) device 106c, an X-ray device 106d and a display device 210.
  • The video camera 106a represents one or more video cameras used to take video footage of an area of interest, a medical scanning/sensing device, a patient, a technician and/or an operator of the telemedicine device 104.
  • The video camera 106a is directed at a medical scanning/sensing device (e.g., an ultrasound probe) and/or a particular region (e.g., the hand of a technician that is gripping the probe and/or a portion of the patient's body where the probe is being applied, such as the chest or abdominal area).
  • The video footage can be used to observe how the medical scanning/sensing device or probe is being applied to a patient, and a viewer of the footage can make recommendations on how the device should be handled or repositioned.
  • A video camera 106a is directed at an operator, medical professional or other person who is participating in the telemedicine session from the side of the patient.
  • A video camera 106a may also be taking video footage of the patient.
  • The video data is streamed to the patient telemedicine device 104 in (near) real time.
  • The probe 106b represents any type of medical imaging/sensing device that is operated and handled by a medical professional.
  • An example of such a probe is an ultrasound probe, which is a wand or other device that emanates ultrasound waves and is typically placed against the body of a patient to provide a scan of a particular portion of the patient's body.
  • The probe 106b is attached to a probe platform 106e, which is arranged to collect the image data.
  • An image acquisition device 202 is attached to the probe platform 106e and is arranged to obtain the biometric image stream generated by the probe platform 106e.
  • The image acquisition device 202 is in turn connected to the patient telemedicine device 104 and transfers the biometric image stream to the telemedicine device 104, so that it can be encoded and transmitted, as desired, to a remote telemedicine device (e.g., specialist telemedicine device 108/110) and other participants in the telemedicine session.
  • The electrocardiogram device 106c is arranged to monitor the electrical activity of the heart of the patient.
  • An electrocardiogram involves attaching multiple electrodes to the body of the patient.
  • The electrodes monitor the electrical activity in the body and an electrocardiogram device generates waveforms to indicate this electrical activity.
  • The electrocardiogram device transmits this waveform data to the telemedicine device 104.
  • The electrocardiogram device 106c may represent any number of suitable devices that are arranged to collect biometric data that tracks or reflects physiological changes in the patient. Such devices include but are not limited to an electroencephalography (EEG) device, a temperature detection device, a blood pressure device, a glucose level detector, a weighing device, a pulmonary function test device and a pulse oximeter.
  • The X-ray device 106d is arranged to project X-rays towards a patient and obtain X-ray images of portions of the patient's body. Any device suitable for obtaining biometric image data may be added to or replace the X-ray device. Such devices include but are not limited to a CT scanner, an MRI device, a retinal scanner, an ultrasound scanning device or any nuclear medicine-related device.
  • The X-ray device generates biometric imaging data, which is also transmitted to the telemedicine device 104.
  • The patient telemedicine device 104 coordinates the operation of the other components of the patient telemedicine system 102.
  • The patient telemedicine device 104 collects data from multiple sources and medical imaging/sensing devices, displays the data, makes requests for additional data, receives input from an operator and shares selected types of data with other participants in a telemedicine session.
  • Any suitable type of computing device may be used.
  • The patient telemedicine device 104 is a laptop, a computer tablet, a mobile device and/or a computer.
  • The patient telemedicine device 104 can also receive a variety of other types of data from the cloud-based architecture 114 and/or from an operator of the device 104.
  • there is an audio channel or messaging system that allows an operator to communicate with other participants in the telemedicine session by speaking into a microphone 208 that is connected to the patient telemedicine device 104.
  • the operator can provide input to the telemedicine device 104 to request supplementary data 206.
  • the telemedicine device then transmits a request for such data to the cloud-based server 116 and/or database 118.
  • the supplementary data is any suitable diagnostic or reference data that will assist in the diagnostic procedure.
  • the supplementary data 206 includes but is not limited to use cases, reference imagery (e.g., ultrasound images of healthy or unhealthy tissues in the body, etc.), medical records for the patient, descriptions of various medical diseases and conditions, etc.
  • the server 116 transmits the requested supplementary data to the patient telemedicine device 104.
  • the patient telemedicine device 104 may also receive collaboration data 204 from other devices in the same telemedicine session (e.g., specialist telemedicine device 108/110 of FIG. 1.)
  • the collaboration data 204 includes any suitable data that a remote specialist or operator chooses to share with the rest of the participants in the telemedicine session, including but not limited to annotations, audio messages, selected use cases, reference imagery and medical records.
  • the patient telemedicine system 102 includes a display device 210, which is part of or connected to the patient telemedicine device 104 and may be any suitable video screen or user interface suitable for presenting test results and media.
  • the operator of the telemedicine device 104 can select which, if any, of the above types of data should be displayed at the display device 210.
  • One example user interface 400, displayed in the display device 210, is illustrated in FIG. 4.
  • the user interface 400 includes multiple windows, including windows 402, 404, 406, 408 and 410.
  • the operator of the telemedicine system 104 is able to configure the user interface 400 in accordance with his or her preferences. For example, some of the received imaging, waveform, supplementary or collaboration data may be excluded from the user interface, while other selected media is shown. Each type of media can be shown in a separate window, which can be resized and moved as desired by the operator.
  • an image from an ultrasonic scan, taken using the probe 106b, is displayed in the window 402.
  • the image is a snapshot from an ongoing ultrasound scan of a dog.
  • the image is constantly received and updated in (near) real time as the scan continues.
  • the medical records for the pet, which were downloaded as supplementary data from the cloud-based server, are presented in window 406.
  • a (near) real time video of a technician performing the ultrasonic scan is shown in window 410. This video was obtained from a video stream generated by the video camera 106a.
  • in window 408, a real time video of a specialist using the specialist telemedicine device 108 is shown.
  • This video is collaboration data that was transmitted by the specialist telemedicine device 108 for display at the patient telemedicine system 102.
  • various reference ultrasound images are presented that are used to provide a comparative model for the ultrasound imagery in window 402.
  • the patient telemedicine device 104 requests and receives such images from the cloud-based server 116 or database 118.
  • the patient telemedicine device 104 receives a message from a specialist (i.e., through a specialist telemedicine device 108/110), which identifies imagery or other data that is stored in the cloud and should be reviewed.
  • the user can provide input to the patient telemedicine device 104, which triggers the downloading of the images from the cloud-based database 118 for display at the display device 210.
  • the media in windows 402, 408 and 410 are received in real time from various connected medical imagery/sensing devices (e.g., the probe, the video camera) or from a specialist telemedicine device 108/110.
  • any suitable type of collaboration data received from a specialist is generally also received and displayed in real time.
  • the video of the specialist in window 408 can display, in real time, the face of the specialist, since face-to-face conversations are sometimes desirable and can facilitate communications between the participants in the telemedicine session. This video is constantly and progressively transmitted by the specialist telemedicine device 108 to the patient telemedicine device 104 over the network.
  • Some implementations allow an operator to annotate any of the images displayed in the user interface.
  • the operator is able to mark, circle or highlight any of the windows.
  • the operator can provide such marks by, for example, touching or swiping a touch sensitive screen, applying a pen to the screen, or by providing keyboard or other types of input.
  • the patient telemedicine device 104 receives the annotation data 212 and associates it with the frames, data or images upon which the marks were made. This information can then be transmitted in real time to other devices in the multi-site data sharing platform (e.g., the specialist telemedicine device), so that they may review the markings as they are made.
  • the operator of the patient telemedicine device 104 can then provide input to the device 104, designating which types of the aforementioned data (e.g., biometric imaging data, annotation, video, audio, etc.) should be shared with one or more other devices in the platform. Alternatively, this sharing selection process may be automated, possibly based on settings that were configured by the operator beforehand.
  • the patient telemedicine device 104 then obtains the designated data streams and selectively encrypts frames and/or parts of frames of the data streams.
  • the frames are broken down into packet sequences 214, which are multiplexed and/or transmitted to a device 108/110 and/or to the server 116 for distribution throughout the multi-site data sharing platform 100. The frames are then reconstructed at those remote devices for rendering.
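The packetization and reconstruction steps above can be sketched as follows. This is a minimal illustration, not the platform's actual wire format: the 5-byte `!BHH` header (stream ID, sequence number, fragment count), the 1200-byte fragment size and the function names are all invented for the example.

```python
import struct

MAX_PAYLOAD = 1200  # illustrative MTU-safe fragment size (assumption)

def packetize(stream_id: int, frame_bytes: bytes) -> list:
    """Break one frame into a numbered packet sequence for one stream."""
    chunks = [frame_bytes[i:i + MAX_PAYLOAD]
              for i in range(0, len(frame_bytes), MAX_PAYLOAD)] or [b""]
    packets = []
    for seq, chunk in enumerate(chunks):
        # header: stream id, fragment sequence number, total fragment count
        header = struct.pack("!BHH", stream_id, seq, len(chunks))
        packets.append(header + chunk)
    return packets

def reconstruct(packets: list) -> bytes:
    """Reassemble a frame from its packet sequence at the remote device."""
    frags = {}
    for pkt in packets:
        stream_id, seq, total = struct.unpack("!BHH", pkt[:5])
        frags[seq] = pkt[5:]
    return b"".join(frags[i] for i in range(total))
```

Multiplexing several streams then amounts to interleaving their packet lists before transmission and routing each received packet by the stream ID carried in its header.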
  • the specialist telemedicine device 108 illustrated in FIG. 1 will be described.
  • the specialist telemedicine device includes or is connected to a video camera 110, a microphone 302 and a display device 312.
  • the specialist telemedicine device 108 is used by a specialist (e.g., a radiologist) whose expertise is desired by a user of the patient telemedicine device 104.
  • alternatively, the specialist telemedicine device 108 may be operated by another medical practitioner (e.g., a family doctor for the patient).
  • the specialist telemedicine device 108 may be any suitable computing device, including but not limited to a computer, a laptop, a computer tablet and a mobile device.
  • the specialist telemedicine device 108 allows its operator to view biometric and medical data in (near) real time and participate in a diagnostic procedure that is being currently performed on a patient at the site of the patient telemedicine system 102.
  • the specialist telemedicine device 108 includes a network interface that receives the aforementioned packet sequences sent from the patient telemedicine device 104 (e.g., biometric imaging data 318, biometric data 320, video 324, audio 322, collaboration data 326, supplemental data 316, annotations 314, etc.) Additionally, any authorized device in the multi-site data sharing platform 100 can selectively transmit data for review and rendering to the specialist telemedicine device 108.
  • the specialist telemedicine device 108 receives the packet sequences 214 and reconstructs the frames of the data (e.g., biometric imaging data, annotation, video, audio, etc.). The frames can then be rendered and the various types of data can be displayed.
  • the operator of the specialist telemedicine device 108 can provide input to the device, indicating which types of media and data should be displayed at the display device 312.
  • the display device 312 may use any suitable user interface, screen or display technology, including but not limited to a video screen, a touch sensitive screen and an e-ink display. In various implementations, different types of data are shown in separate windows or sections on a user interface 500 displayed on the display device 312. An example user interface 500 is illustrated in FIG. 5.
  • the user interface 500 includes multiple windows 502, 504, 506, 508 and 510. Different types of media are shown in separate windows.
  • an ultrasound image is shown in window 502.
  • a video of an ultrasonic probe being applied to the body of a patient is shown in window 508.
  • a video of the specialist at the specialist telemedicine system 108 is shown in window 510.
  • the operator of the specialist telemedicine device 108 may configure, resize, move and select media for each window as described in connection with the user interface 400 of FIG. 4.
  • the operator can remotely control the camera generating the video in window 508, allowing the operator to zoom in or out or focus the video camera at different areas of interest, which then adjusts the video in window 508 accordingly and in (near) real time.
  • the display device 210 for the patient telemedicine system 102 is displaying an ultrasound image and a video of an ultrasound scanning procedure in (near) real time as the image is being generated at the probe 106b and as the ultrasonic scan is being performed.
  • These biometric imaging and video streams have been transmitted to the specialist telemedicine device 108, so that they may be displayed (nearly) simultaneously in windows 502 and 508 of the user interface 500.
  • an operator of the specialist telemedicine device is able to observe the biometric images and the handling of the biometric imaging equipment at the patient telemedicine system 102 in (near) real time.
  • Windows 504 and 506 display the supplementary data 316 (reference images and medical records) described in connection with the user interface 400 of FIG. 4.
  • the specialist using the specialist telemedicine device 108 initially requests the supplementary data.
  • the specialist provides input to the specialist telemedicine device 108, which causes the device 108 to send a request for the selected data to the cloud-based server 116 and/or database 118.
  • the desired supplementary data is then downloaded into the device 108 and presented in a user selected window of the user interface 500.
  • another professional at a remote device (e.g., the patient telemedicine device 104) may have been the first to suggest the use of the supplementary data.
  • the remote device (e.g., the patient telemedicine device 104) transmits a message to the specialist telemedicine device 108 recommending particular types of supplementary data that are available in the cloud-based architecture 114.
  • the operator of the specialist telemedicine device 108 provides input to the device 108, which causes the device 108 to retrieve the suggested data and display it at the display device 312.
  • the specialist telemedicine device 108 allows the specialist to create and share audio messages, make annotations, obtain and suggest supplementary data. Generally, such operations are handled in a similar or the same manner as with the patient telemedicine device 104 of FIG. 2. That is, the operator of the specialist telemedicine device 108 can mark or annotate any displayed images or information, recommend use cases, reference images or other types of supplementary data, and create audio messages using the microphone 302. Additionally, the telemedicine device 108 can include a video camera 310, which can take video footage of the specialist or any other suitable target. The operator of the specialist telemedicine device 108 can provide input to the device, identifying which of the above data that should be shared.
  • the operator can further provide input to the device 108 indicating which devices (e.g., another specialist telemedicine device 110, the patient telemedicine device 104, etc.) in the telemedicine session should receive the selected data.
  • the operator makes such selections prior to the beginning of the telemedicine session.
  • different types of data are then automatically shared and transmitted based on the selections.
  • the selected data (e.g., annotations, recommended supplementary data, etc.) is then transmitted to the designated devices.
  • a method 600 for receiving, encoding and transmitting data streams will be described.
  • the method 600 is performed where diagnostic testing is taking place (i.e., at the patient telemedicine system 102 of FIG. 2).
  • Selected streams can also be synchronized, so that telemedicine participants at remote devices can view multiple images, waveforms and video in the appropriate order and in (near) real time.
  • the patient telemedicine device receives one or more types of data streams simultaneously from various imaging/sensing/video devices.
  • a video camera 106a takes video footage of an area of interest (e.g., a part of the body where an ultrasound probe is being applied) and streams the footage to the device (step 602).
  • a suitable medical sensing device (e.g., an EKG 106c, a heart monitor, a blood pressure monitor, etc.) monitors a medical condition of a patient and transmits biometric waveform data to the device 104 (step 604).
  • a medical imaging device (e.g., an ultrasound scanner and probe 106b) collects images from the patient and sends them to the device 104 (step 606).
  • any combination of data streams may be received at the device.
  • one useful combination involves an ultrasound probe and a live video camera feed.
  • the video camera that provides the feed is directed at an ultrasound probe and/or a part of a patient's body where the ultrasound probe is being applied. That is, the video camera is oriented so that it can capture where and how the probe is being positioned on the body of the patient.
  • ultrasound imagery that is generated by the ultrasound probe and its associated equipment is also transmitted in (near) real time to the device 104.
  • a specialist at the remote device can observe the medical procedure and make useful suggestions (e.g., request a repositioning of the probe or a different use of the probe) that can immediately be acted upon by the medical professional who is handling the probe.
  • the above data streams are generally transmitted in (near) real time. That is, data streams are progressively transmitted while the streams are generated and one or more diagnostic tests are ongoing.
  • a medical imagery/sensing device detects a change in a physiological condition of the patient, this event is immediately registered with the patient telemedicine device in the form of a change in a transmitted medical image, waveform or data.
  • These images, waveforms or data are also selectively displayed in real time on a display device 210 (step 620).
  • a waveform generated by the heart rate monitor will indicate a rise in beats per minute.
  • the video camera footage indicates a tremor in the patient.
  • the ultrasound scan reveals a quickening in the activity of the heart.
  • Data indicating these changes are received simultaneously at the patient telemedicine device 104 and the aforementioned changes are immediately and simultaneously represented at the display device 210 in the form of changes in the heart rate waveform, the video footage and the ultrasound imagery.
  • the frames of two or more of these data streams will be encoded and synchronized, so that this timing is also conveyed to any remote participants and devices in the telemedicine session.
  • a medical professional who is handling one of the scanning devices or another participant may wish to send audio messages and commentary to other remote participants in the telemedicine session.
  • a technician who is handling an ultrasonic probe may wish to ask a remote specialist whether the probe is properly being applied to the patient, or to comment on an anomaly he or she noticed in the medical images.
  • the participant speaks into a microphone 208 to create the message.
  • the audio message is then transmitted to and received by the patient telemedicine device 104 (step 608).
  • the above data and media is selectively transmitted in (near) real time to remote specialists and other participants (e.g., specialist telemedicine device 108).
  • upon viewing the images and the diagnostic procedure in real time, a specialist may wish to provide audio commentary or requests, annotate some of the received images, or suggest use cases, reference imagery or other resources.
  • Such collaboration data is transmitted from the specialist telemedicine device(s) 108/110, received, rendered and/or displayed in (near) real time at the patient telemedicine device 104 (step 610).
  • an operator of the patient telemedicine device 104 obtains supplementary data (e.g., use cases, reference imagery, medical records, etc.) from a cloud-based server or database (step 612). Additionally, in various implementations, the operator of the patient telemedicine device 104 annotates or marks any displayed images, waveforms or data (step 614). Any of steps 602, 604, 606, 608, 610, 612 and 614 may be performed using any of the techniques and features previously discussed in connection with FIGS. 2 and 3.
  • Some designs involve storing any of the above received data for later reference (step 616).
  • Such data can be stored in any suitable storage medium (e.g., a flash drive, a hard drive, a connected or remote database, etc.).
  • the operator of the patient telemedicine device 104 can provide input to the device, causing the device to obtain and display any stored data.
  • One or more of the above received data types are selectively rendered and displayed in real time at the display device 210 (step 620).
  • the operator of the patient telemedicine device 104 can configure the device 104 to remove data from the display, to add data to the display, or otherwise arrange the displayed media (step 618).
  • biometric waveforms, patient records, biometric images, video and supplementary data can be presented in separate, resizable and movable windows, as shown in the user interface 400 of FIG. 4.
  • the operator of the patient telemedicine device can determine data sharing preferences.
  • the operator of the patient telemedicine device provides input to the device, indicating what kinds of data (e.g., biometric imaging, biometric data, audio, video, collaboration data, supplementary data, etc.) should be shared and what telemedicine devices and professionals should receive the data (step 622).
  • an operator could indicate that all biometric imaging, waveform and biometric data received from the medical imaging/sensing devices should be shared with all other participants (e.g., specialist telemedicine devices) in a telemedicine session, but that any annotations and selected medical records only be shown locally at the display device 210.
  • the data to be shared is progressively encoded, encrypted and transmitted as it is received or created. In various embodiments, these steps are performed by a data encoding module 800, which is stored at the patient telemedicine device 104.
  • the data encoding module 800 is any software or hardware that is suitable for encoding and/or encrypting data streams.
  • An example of a data encoding module 800 is illustrated in FIG. 8.
  • the data encoding module 800 is arranged to receive multiple, different types of data streams from medical imaging/sensing devices or other types of devices (e.g., a microphone, a video camera, etc.) and to separately encode and/or process the data streams.
  • the module receives audio data, video data, biometric imaging data, biometric data and annotation data, although any of the aforementioned data streams received by the patient telemedicine device 104 may also be processed using the data encoding module.
  • the data encoding module 800 includes multiple submodules 802a-802e that separately process each data type in parallel. That is, submodules 802a-802e receive and process an audio data stream, video data stream, biometric imaging stream, biometric data and annotation data, respectively.
  • Each submodule includes an authenticator 806, an encoder 808 and an encrypter 810.
  • the authenticator 806 helps ensure that only authorized users are able to direct the encoding process and obtain access to the received data streams.
  • the encoder 808 processes, compresses and/or encodes the associated data stream.
  • the encoder 808 outputs frames obtained from the associated data stream.
  • the encrypter 810 is arranged to at least partially encrypt some or all of the frames. The functionality of these various components will be discussed in greater detail below in the context of method 600 of FIGS. 6 and 7.
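A minimal sketch of one such submodule is shown below. The class and method names are hypothetical, and the mechanisms are stdlib stand-ins (a shared token for the authenticator 806, zlib for the encoder 808, a toy XOR keystream for the encrypter 810); the actual codecs, ciphers and authentication scheme are not specified by the text.

```python
import hashlib
import hmac
import zlib

class SubModule:
    """One per-stream pipeline: authenticate, then encode, then encrypt.

    Stand-in mechanisms only: a real deployment would use proper
    credentials, a media codec, and an authenticated cipher.
    """

    def __init__(self, key: bytes, token: bytes):
        self.key = key
        self.token = token

    def authenticate(self, token: bytes) -> bool:
        # constant-time comparison of the caller's token
        return hmac.compare_digest(token, self.token)

    def encode(self, data: bytes) -> bytes:
        # placeholder for compressing/encoding the stream's frames
        return zlib.compress(data)

    def encrypt(self, frame: bytes) -> bytes:
        # toy XOR keystream derived from the key -- NOT a real cipher
        ks = hashlib.sha256(self.key).digest()
        return bytes(b ^ ks[i % len(ks)] for i, b in enumerate(frame))

    def process(self, token: bytes, data: bytes) -> bytes:
        if not self.authenticate(token):
            raise PermissionError("unauthorized access to data stream")
        return self.encrypt(self.encode(data))
```

One submodule instance per stream type gives the parallel, independently configured pipelines that the figure describes.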
  • the data encoding module 800 separately encodes each type of data.
  • the encoding process involves several operations. Frames are obtained from each associated data stream.
  • frames are provided by an external device (e.g., a medical imaging/sensing device).
  • a data stream is received and frames are encoded from the data.
  • a frame is a segment, packet or amount of data of any suitable size.
  • a frame of a video stream or a biometric imaging stream includes an image, although this is not a requirement.
  • each encoder 808 compresses (if appropriate) the frames of its associated data stream.
  • ultrasound image streams are compressed at a ratio of approximately 1:40 to 1:90, while video is compressed at a ratio of approximately 1:250 to 1:1000, although higher and lower compression levels may also be used for particular applications.
  • the level and type of compression for some data streams is determined dynamically. That is, feedback is received from one or more quality of service agents.
  • the compression scheme used on the data stream is based at least in part on this feedback.
  • Various techniques and arrangements relating to such compression schemes are described in U.S. Patent Application No. 14/291,567, entitled “Dynamic Adjustment of Image Compression for High Resolution Live Medical Image Sharing,” filed May 30, 2014, which is incorporated herein in its entirety for all purposes. Any method, component, system, operation or arrangement described in this application may be used to compress a suitable biometric imaging stream or other data stream in step 623.
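One way such feedback-driven selection could look is sketched below, staying inside the ratio ranges quoted above. The 5 Mbps normalization, the linear mapping and the function name are invented heuristics for illustration; they are not taken from the text or the referenced application.

```python
def choose_ratio(stream_type: str, measured_bandwidth_kbps: float) -> int:
    """Pick a compression ratio from quality-of-service feedback.

    Less available bandwidth -> compress harder, within the approximate
    ranges quoted for each stream type (1:40-1:90 for ultrasound,
    1:250-1:1000 for video).
    """
    if stream_type == "ultrasound":
        lo, hi = 40, 90
    elif stream_type == "video":
        lo, hi = 250, 1000
    else:
        raise ValueError("unknown stream type: " + stream_type)
    # clamp feedback into [0, 1]; 5 Mbps treated as "full quality"
    quality = max(0.0, min(1.0, measured_bandwidth_kbps / 5000.0))
    return round(hi - quality * (hi - lo))
```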
  • the encoder 808 adds a timestamp to each frame.
  • this timestamp is inserted into a header of the frame.
  • the timestamp represents an approximate time at which the frame was processed, generated or received at a local device (e.g., the patient telemedicine device 104.)
  • the timestamp can involve any time, value or code that helps indicate time, timing or an order in which frames should be rendered.
  • the timestamp is used to help ensure that frames of particular data streams that are reconstructed at a remote device are properly synchronized and coordinated, as will be discussed in greater detail below.
  • the timestamp may be derived from any suitable source.
  • the timestamp can be based on a timer, clock or CPU clock of the computing device.
  • the timestamp can be based on a time received through the network (e.g., using the Network Time Protocol (NTP), from an NTP time server or other type of time server, etc.).
  • the timestamp is based on time data in a GPS signal received from a GPS satellite.
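A minimal sketch of inserting a timestamp into a frame header follows. The length-prefixed JSON header is an invented layout, not the document's format; any of the time sources above (a local clock, NTP, GPS) could supply the `clock` callable.

```python
import json
import time

def stamp_frame(payload: bytes, clock=time.time) -> bytes:
    """Prefix a frame payload with a small header carrying a
    millisecond timestamp (illustrative header layout)."""
    header = json.dumps({"ts_ms": int(clock() * 1000)}).encode()
    # 2-byte big-endian header length, then header, then payload
    return len(header).to_bytes(2, "big") + header + payload

def read_stamp(frame: bytes):
    """Recover (timestamp_ms, payload) from a stamped frame."""
    n = int.from_bytes(frame[:2], "big")
    header = json.loads(frame[2:2 + n])
    return header["ts_ms"], frame[2 + n:]
```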
  • the encrypter 810 in each submodule receives the associated frames from the encoder and at least partially encrypts them. Any known encryption technology may be used. In various embodiments, frames from different types of data streams are separately encrypted using different encryption techniques, depending on the characteristics of the data.
  • FIGS. 9A-9D illustrate frames 900.
  • Each frame includes a header 902 and media data/payload 904.
  • the media payload 904 contains a particular media type (e.g., audio, video, biometric imagery, biometric data, etc.)
  • the header 902 includes metadata relating to the media in the media payload 904 of the frame.
  • the header 902 indicates characteristics of the media payload 904 and/or includes information on how to access, process and/or render the media data.
  • the shaded regions represent parts of the frames that are encrypted, while the white areas of the frames are unencrypted parts.
  • in FIG. 9A, the entire header 902, and nothing else, is encrypted.
  • in FIG. 9B, only a portion of the header is encrypted.
  • in FIG. 9C, a portion of the header 902 and a portion of the payload 904 are encrypted.
  • in FIG. 9D, the entire frame or almost the entire frame is encrypted. In some embodiments, only a particular type of frame is encrypted while other types of frames are not encrypted.
  • the advantage of encrypting only a portion of a frame, or only particular types of frames, is that it substantially reduces overhead.
  • selected portions of each frame are encrypted (and the other portions are left unencrypted) such that an interceptor of the frame would not be able to make sense of the contents of the frame without decrypting those selected portions.
  • the encrypted portion can thus vary depending on the data type and packet structure. That is, video frames may be encrypted in a different manner than biometric imaging frames, etc.
  • the type of data that is encrypted in each frame can vary, depending on the type of data and the needs of a particular application.
  • the encrypted portion(s) of a header may indicate a channel ID and/or sampling rate.
  • the encrypted portion(s) of a header of each frame can indicate a number of macroblocks in a portion of the frame, a quantization table, and/or a DC coefficient of each of the macroblocks.
  • a particular type of frame (e.g., an I-, P- or B-frame for video streams) is encrypted and another type of frame is left unencrypted.
  • the media payload of a biometric imaging or video frame contains an image, which is divided into multiple slices. Each slice has a header, which is encrypted, although the slices themselves are not.
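A FIG. 9A-style partial encryption can be sketched as below: only the first `header_len` bytes are scrambled, so the bulk of the payload incurs no encryption overhead yet cannot be interpreted without the header. The XOR keystream is a stand-in for a real cipher, and the function names are hypothetical.

```python
import hashlib

def keystream(key: bytes, n: int) -> bytes:
    """Derive n pseudo-random bytes from the key (illustrative only)."""
    out = b""
    counter = 0
    while len(out) < n:
        out += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:n]

def encrypt_header_only(frame: bytes, header_len: int, key: bytes) -> bytes:
    """Scramble only the header portion of a frame; the payload is
    transmitted in the clear. Applying the function again decrypts,
    since XOR is its own inverse."""
    ks = keystream(key, header_len)
    enc = bytes(b ^ k for b, k in zip(frame[:header_len], ks))
    return enc + frame[header_len:]
```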
  • the data stream encoding module breaks the frames from the various data types down into packet sequences, multiplexes the packet sequences and transmits them (step 638) to a remote device (e.g., to a telemedicine device 108/110).
  • the transmission may be performed using any suitable network protocol, such as UDP or TCP/IP.
  • if a server 116 is available in the multi-site data sharing platform, all traffic may pass first through one or more servers and then be broadcast to any participating specialist telemedicine devices from the server(s). Alternatively, the packet sequences may be sent directly to one or multiple telemedicine devices.
  • an encrypted message is transmitted to the server or the remote devices, which indicates what portions of each frame are encrypted. This allows an appropriately authorized device at the receiving end to access the frames that follow the encrypted message.
  • the remote device reconstructs frames of the various data streams based on the packet sequences (step 640). For example, the biometric imaging packet sequence is used to reconstruct the frames of the biometric imaging stream, the video packet sequence is used to reconstruct the frames of the video stream, and so on. Each reconstructed frame includes a timestamp, as noted in step 624.
  • Steps 644, 646 and 648 pertain to the synchronization of different combinations of data streams for different applications. It should be appreciated, however, that any suitable group of different data streams may be synchronized.
  • the synchronization of two data streams involves rendering and displaying the frames of the two data streams in order based on their associated timestamps.
  • the frames are rendered and displayed at the remote device in the order that they were generated or originally received at the patient telemedicine device 104. If a frame of one data stream was received at the patient telemedicine device at the same time as a frame of another data stream, then, when the frames are properly synchronized, the two frames will be rendered and displayed simultaneously at the remote device as well, since they should have similar or identical timestamps.
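The timestamp-ordered rendering described above can be sketched as a merge of per-stream frame lists. Frames are modeled here as `(timestamp, data)` pairs, an assumption made for illustration.

```python
import heapq

def synchronize(*streams):
    """Merge reconstructed frames from several data streams into one
    render order by timestamp. Each stream is a timestamp-sorted list
    of (timestamp, data) pairs; frames with equal timestamps come out
    adjacent, i.e. they are rendered (approximately) together."""
    return list(heapq.merge(*streams, key=lambda frame: frame[0]))
```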
  • Step 644 pertains to the optional synchronization of frames of a video stream and a biometric imaging stream.
  • the video stream pertains to live video footage of a medical professional handling an ultrasound probe or another medical imaging/sensing device.
  • the biometric imaging stream is received from the medical imaging/sensing device that the professional is handling. That is, the actions of the professional directly and immediately affect what is shown in the biometric imaging stream.
  • Step 646 pertains to the optional synchronization of frames of biometric waveform and biometric imaging streams.
  • Such synchronization is useful in applications where different diagnostic tests are being applied to the same patient and provide different types of information about the same physiological changes.
  • an ultrasound probe is being applied to a patient and is generating an ultrasound image of the patient's heart.
  • a heart rate monitor is applied to the patient, which is continuously monitoring heart activity and outputting a waveform that tracks the heart activity.
  • Particular changes in the activity of the heart (e.g., a sudden burst of palpitations or a seizure) are thus reflected at (nearly) the same time in both the ultrasound imagery and the waveform.
  • the synchronization of the frames of the biometric waveform and biometric imaging streams allows telemedicine participants to view the waveform and images in (near) real time with the correct timing and order.
  • Step 648 pertains to the optional synchronization of frames of annotation data and the biometric imaging stream.
  • any operator of a specialist telemedicine device 108/110 or the patient telemedicine device 104 can annotate frames of a displayed biometric image (e.g., an ultrasound image).
  • the annotation takes the form of a circle, line, an underline, a highlight or any other suitable mark.
  • Each frame of annotation data represents or includes an annotation that was applied to a particular frame of a biometric imaging stream.
  • in step 624, very similar or identical timestamps are added to the annotation frame and to the biometric imaging frame that the annotation was applied to.
  • the frames of the annotation data and biometric imaging stream are used to form packets.
  • the packets are then transmitted and received at a remote device. At the remote device, the packets are used to reconstruct the frames of the annotation data and biometric imaging streams (step 640), which include the aforementioned timestamps.
  • the reconstructed frames of the annotation data and biometric imaging streams are rendered and displayed in the order of their associated timestamps (i.e., since their timestamps are very similar or the same, the annotation and biometric imaging frames are rendered and displayed (approximately) simultaneously.)
  • an annotation that was applied to a frame of a biometric imaging stream at a local device can be displayed together with the same frame of the biometric imaging stream at a remote device. That is, at the remote device there should be no or minimal delay between the display of the annotation and the frame of biometric imaging to which the annotation was applied.
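At the receiving device, the timestamp-based matching of annotation frames to imaging frames described above could look like the following sketch. The dictionary-based frame representation and the tolerance value are illustrative assumptions, not part of the specification.

```python
def pair_annotations(image_frames, annotation_frames, tolerance=0.005):
    """Attach each reconstructed annotation frame to the biometric imaging
    frame whose timestamp is closest (within the given tolerance), so the
    annotation is displayed together with the frame it was applied to."""
    pairs = []
    for ann in annotation_frames:
        # Find the imaging frame with the nearest capture timestamp.
        nearest = min(image_frames, key=lambda f: abs(f["ts"] - ann["ts"]))
        if abs(nearest["ts"] - ann["ts"]) <= tolerance:
            pairs.append((nearest, ann))
    return pairs
```

An annotation whose timestamp falls outside the tolerance of every imaging frame is simply left unpaired, which mirrors the requirement that an annotation never be displayed against the wrong frame.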
  • steps 602, 604, 606, 608, 610, 612 and 614 can occur simultaneously
  • steps 608-638 can also be performed by any specialist telemedicine device 108/110.
  • each step involves particular types of operations, these operations may be modified as appropriate using any suitable techniques known to persons of ordinary skill in the art.
  • any of the methods (e.g., methods 600 and 700 of FIGS. 6 and 7)
  • processes, actions and techniques (e.g., operations performed by the patient telemedicine device 104 and/or the specialist telemedicine device 108 in connection with any of the figures)
  • the computer code or instructions are stored in at least one memory of a device (e.g., a patient telemedicine device 104 or a specialist telemedicine device 108/110).
  • the device also includes at least one processor.
  • the computer code or instructions, when executed by the at least one processor, cause the device to perform any of the operations or methods described herein.
  • the local device can refer to any device in the multi-site data sharing platform 100 (e.g., a patient telemedicine device 104, a specialist telemedicine device 108/110).
  • the remote device refers to any other device that is connected with the local device through the cloud and/or a network and that is also in the multi-site data sharing platform 100 (e.g., a patient telemedicine device 104, a specialist telemedicine device 108/110, etc.).
  • FIG. 2 illustrates a patient telemedicine device 104 that receives simultaneous data streams from a probe 106b, a biometric waveform data source 106c and a biometric imaging source 106d.
  • biometric data is received from the probe and probe platform (e.g., an ultrasound probe)
  • video is received from the video camera (e.g., video footage of the application and use of the probe on the body of a patient) and no biometric waveform data source and/or biometric imaging source is used.
  • a telemedicine device e.g., a specialist telemedicine device, a patient telemedicine device, etc.
  • the telemedicine device is depicted as having particular functions.
  • the patient telemedicine device is depicted as receiving data from multiple components and sources.
  • the patient telemedicine device is not necessarily a single structure and in various embodiments is a system that includes multiple connected components that provide additional features or functionality.
  • the patient telemedicine device can incorporate or include additional adapters, connectors, modules, devices and/or any component described in the telemedicine system 102.
  • the patient telemedicine device incorporates an image acquisition device 202, a microphone 208, a video camera 106a, a diagnostic device, etc. Therefore, the present embodiments should be considered illustrative and not restrictive, and the invention is not to be limited to the details given herein.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Computer Security & Cryptography (AREA)
  • Physics & Mathematics (AREA)
  • Astronomy & Astrophysics (AREA)
  • General Physics & Mathematics (AREA)
  • Databases & Information Systems (AREA)
  • Computer Graphics (AREA)
  • Measuring And Recording Apparatus For Diagnosis (AREA)

Abstract

Various methods and arrangements for sharing medical data are described. In one aspect, one or more data streams are received from one or more medical imaging/sensing devices or other types of devices. Frames are obtained from the streams. In some embodiments, particular frames and/or portions of frames are selectively encrypted. The frames are transmitted to a remote device, where they are rendered and/or displayed. In various embodiments, the frames of the different streams are synchronized.
PCT/US2014/040363 2013-05-31 2014-05-30 Multi-site data sharing platform WO2014194276A2 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201361829905P 2013-05-31 2013-05-31
US61/829,905 2013-05-31

Publications (2)

Publication Number Publication Date
WO2014194276A2 true WO2014194276A2 (fr) 2014-12-04
WO2014194276A3 WO2014194276A3 (fr) 2015-01-22

Family

ID=51989546

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2014/040363 WO2014194276A2 (fr) Multi-site data sharing platform

Country Status (1)

Country Link
WO (1) WO2014194276A2 (fr)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070217501A1 (en) * 2005-09-20 2007-09-20 A4S Security, Inc. Surveillance system with digital tape cassette
US20110134203A1 (en) * 2009-12-03 2011-06-09 Vladimir Smelyansky Medical video call distributor
US20120173282A1 (en) * 2011-01-01 2012-07-05 Kelley Timothy L Processing a patient study
US20130093829A1 (en) * 2011-09-27 2013-04-18 Allied Minds Devices Llc Instruct-or

Also Published As

Publication number Publication date
WO2014194276A3 (fr) 2015-01-22

Similar Documents

Publication Publication Date Title
US20140275851A1 (en) Multi-site data sharing platform
US9092556B2 (en) Multi-site data sharing platform
US8313432B2 (en) Surgical data monitoring and display system
US8797155B2 (en) Distributed medical sensing system and method
Petruzzi et al. WhatsApp: a telemedicine platform for facilitating remote oral medicine consultation and improving clinical examinations
US20110270631A1 (en) Remote healthcare data-gathering and viewing system and method
JP2018526180 (ja) Integrated medical device and home-based system for measuring and reporting a patient's vital physiological data via telemedicine
Kurillo et al. New emergency medicine paradigm via augmented telemedicine
Della Mea Prerecorded telemedicine
Ramdurg et al. Smart app for smart diagnosis: Whatsapp a bliss for oral physician and radiologist
JP2019197271 (ja) Medical information processing system
JP2007296079 (ja) Medical image processing apparatus and program
WO2014194276A2 (fr) Multi-site data sharing platform
Sapkal et al. Telemedicine in India: a review challenges and role of image compression
KR20190138106 (ko) Medical image storage and transmission system
US20110275924A1 (en) Method and System for Assimilating and Transmitting Medical Imaging and Associated Data to a Remote User
US20210021787A1 (en) Audio-video conferencing system of telemedicine
Rizou et al. TraumaStation: A portable telemedicine station
Bynum et al. Brief Report: Participant Satisfaction in an Adult Telehealth Education Program Using Interactive Compressed Video Delivery Methods in Rural Arkansas
Wei et al. A secure and synthesis tele-ophthalmology system
US20240323169A1 (en) System and device for display sharing
CN109754651 (zh) Surgical teaching method and system
Fouad Implementation of Remote Health Monitoring in Medical Rural Clinics for Web Telemedicine System
Łabno et al. Telemedicine applications in modern medicine, the possibilities and limitations
Reimer et al. Beyond videoconference: A literature review of store-and-forward applications in telehealth

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 14803974

Country of ref document: EP

Kind code of ref document: A2

122 Ep: pct application non-entry in european phase

Ref document number: 14803974

Country of ref document: EP

Kind code of ref document: A2