GB2410390A - Transmitting image data processed in accordance with image processing parameters received from the receiving device - Google Patents

Info

Publication number
GB2410390A
GB2410390A (application GB0401299A)
Authority
GB
United Kingdom
Prior art keywords
data
image
image data
processing parameters
image processing
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
GB0401299A
Other versions
GB0401299D0 (en)
Inventor
Gerard Wimpenny
Philip Wakely
Richard Titmuss
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
XIOMED Ltd
Original Assignee
XIOMED Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by XIOMED Ltd
Priority to GB0401299A
Publication of GB0401299D0
Publication of GB2410390A
Legal status: Withdrawn

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/4728 End-user interface for interacting with content, for selecting a Region Of Interest [ROI], e.g. for requesting a higher resolution version of a selected region
    • H04N21/234345 Reformatting of video elementary streams for compliance with end-user requests or end-user device requirements, performed only on part of the stream, e.g. a region of the image or a time segment
    • H04N21/234363 Reformatting of video elementary streams by altering the spatial resolution, e.g. for clients with a lower screen resolution
    • H04N21/234381 Reformatting of video elementary streams by altering the temporal resolution, e.g. decreasing the frame rate by frame skipping
    • H04N21/6405 Multicasting
    • H04N21/6437 Real-time Transport Protocol [RTP]
    • H04N7/181 Closed-circuit television [CCTV] systems for receiving images from a plurality of remote sources

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Databases & Information Systems (AREA)
  • Human Computer Interaction (AREA)
  • Measuring And Recording Apparatus For Diagnosis (AREA)

Abstract

Embodiments of the invention are concerned with transmitting image data for review at remote locations, in particular with identifying a means of processing medical image data received from a plurality of imaging devices. Accordingly embodiments provide a method of transmitting medical image data from a first system attached to a communications network to a second system requesting medical image data, the second system being remotely connected to the first system via the communications network, the method comprising the steps of: receiving first image data from a medical imaging device, the image data having a video format; receiving data identifying one or more image processing parameters from the second system; in response to receipt of said image processing parameters, selectively processing the first image data in accordance therewith so as to create second image data at the first system; and transmitting the second image data to the second system. Embodiments of the invention therefore operate on data having a video format, which means that they advantageously operate independently of codec type, and can be used in conjunction with any type of imaging device that is arranged to generate a video signal. An independent claim is included for receiving a plurality of data streams and associated image processing parameters wherein a display mode is selected from a plurality of modes in dependence on the image processing parameters.

Description

2410390 Method and Apparatus for Transmitting Image Data
Field of the Invention
The present invention relates to a method and apparatus for transmitting image data and, in one aspect, is concerned with remotely transmitting image data received from a plurality of disparate imaging devices, and in another aspect, is concerned with remotely transmitting image data over bandwidth limited communications channels. Embodiments of the invention are particularly, but not exclusively, concerned with processing and transmitting medical image data.
Background of the Invention
Imaging techniques such as computed tomography imaging (CTI), magnetic resonance imaging (MRI), echocardiography and the like are essential tools for diagnosing a range of medical conditions including heart disease, heart attacks, sports injuries and cancer, and for screening procedures such as mammography. Current trends in imaging include higher resolution, 3D imaging, and faster image capture time, with physicians routinely viewing high resolution displays of beating hearts to help diagnose cardiovascular disease.
Such images are typically stored in digital format for viewing on computers and the like by means of proprietary software. Transferring such files from one file system to another, remote, file system is extremely attractive from the point of view of providing an efficient and prompt healthcare service, but achieving this within a practically useful time frame is a technically challenging problem, due to constraints on available bandwidth. Essentially both the quality of imaging, and thus sizes of the files for transmission, are increasing, as are demands from other data transfer services, which means that there is an ever present need to improve methods of data transmission.
International patent application PCT/US98/19065, publication number WO99/49412, presents one approach to this problem, namely selecting a region or regions of interest in digital image data, and attributing a priority to each region. A wavelet transform of the pixel values of an entire image is performed in order to obtain a complete set of transform coefficients, whereupon the transform coefficients corresponding to each region of interest are identified. The transform coefficients for these regions of interest are emphasized either by scaling them up, such that more bits are allocated to these transform coefficients, or by advancing the encoding ordering of these coefficients. This technique effectively allows for de-prioritisation of areas via the encoding step, which results in a reduction in the overall data rate since less information about the de-prioritised areas is sent. However, a drawback with this technique is that it is only applicable to codecs using wavelet transform encoding.
US patent US 6,314,452 describes another approach, namely a progressive image encoding method, where an image is compiled as a series of layers comprising a poor-quality base layer and a plurality of further layers, each successively more detailed than the last. Transmission of the encoded data involves several stages: firstly transmitting one, or an initial number of, layers, and thereafter selectively transmitting areas of the higher-detail layers in response to a user request. Whilst this reduces bandwidth usage, there are nevertheless latency problems associated with receiving requests for layer data and sending data in response thereto.
Summary of the Invention
In accordance with a first aspect of the present invention, there is provided a method of transmitting medical image data from a first system attached to a communications network to a second system requesting medical image data, the second system being remotely connected to the first system via the communications network, the method comprising the steps of: receiving first image data from a medical imaging device, the first image data having a video format; receiving data identifying one or more image processing parameters, from the second system; in response to receipt of said image processing parameters, selectively processing the received image data in accordance therewith so as to create second image data at the first system; and transmitting the second image data to the second system.
Embodiments of the invention therefore operate on data having a video format, which means that they advantageously operate independently of codec type, and, unlike the method described in WO99/49412 (which works at the codec level), can be used in conjunction with any type of imaging device that is arranged to generate a video signal.
Preferably the first image data comprise analogue video format data, and the method includes converting the analogue video format data into initial digitised data, and selectively processing the initial digitised data so as to create said second image data.
The first image data can comprise one or a plurality of images.
Advantageously embodiments of the invention can be used to process a plurality of still images, such as are generated by an MRI imaging device, and a sequence of moving images, such as are generated by an ultrasound scanning device. In the case of still images, individual ones of the plurality vary with respect to one another spatially, whereas in the case of moving images, individual images vary with respect to one another temporally. Thus, for cases where first image data comprise a plurality of images, embodiments of the invention could be used to process individual images that vary spatially and/or temporally with respect to one another.
In at least one arrangement the image processing parameters include one or some of frame rate, resolution, and/or data rate, so that processing of the received image data effectively involves modifying the bandwidth requirements of the second image data. Since the bandwidth requirements can be set on a per-image basis, this provides an extremely flexible way of controlling bandwidth requirements in dependence on the nature of the first image data and the type of communications network through which the second image data is to be transmitted.
The image processing parameters can additionally include data specifying extents of an area of interest within said first image data, which enables users of the second system to specify and select different regions within the image, and, since frame rate, resolution and/or data rate (generally referred to herein as image reproduction parameters) can be specified on a per-image basis, the speed and resolution of these different images can easily be modified.
Preferably the second image data comprise a plurality of digitised images, and image processing parameters can be specified for each of the plurality. This means that users of the second system can view a plurality of images, each having different sizes and/or being reproduced at different rates and/or resolutions.
In accordance with a second aspect of the present invention, there is provided a first system and a second system arranged to perform the method described above. The second system comprises one or more terminals that are logically remote from the first system. In at least one arrangement at least one of the terminals of the second system can be physically located in close proximity to the first system, so that an operator of the first system can review the data transmitted to the second system.
In accordance with a further aspect of the invention, there is provided a data processing system for use in displaying medical images, the data processing system comprising: data receiving means arranged to receive a plurality of data streams and image processing parameters corresponding thereto, each data stream corresponding to a medical image; display means arranged to display a display mode selected from a plurality of display modes, the display modes comprising a display mode having at least two regions each corresponding to one of said received data streams, wherein the system is arranged to select a display mode in dependence on the image processing parameters.
In one arrangement the data streams are video data streams and the image processing parameters include position, extents (or size of image), frame rate, resolution and data rate, so that selection of display mode is dependent on, for example, the positions and frame rates of the medical images contained within the data streams.
This aspect of the invention can be seen as relating to processing performed at the second processing system, which preferably includes at least one multimodal input device. During display of a selected display mode, the display means is advantageously arranged to receive data indicative of a portion of interest via the multimodal input device, e.g. via a mouse input from a physician viewing the medical images. The data processing system can then transmit said data indicative of a portion of interest to a terminal located remote from the data processing system, which in one arrangement is the first system.
The first system can then use these image processing parameters to create further medical images, and transmit these remotely to the second system.
Whilst embodiments of the invention are particularly concerned with processing and transmission of medical image data, the method could be applied to other types of image data.
Further features and advantages of the invention will become apparent from the following description of preferred embodiments of the invention, given by way of example only, which is made with reference to the accompanying drawings.
Brief Description of the Drawings
Figure 1 is a schematic diagram showing an environment in which embodiments of the invention operate, comprising a medical imaging device, a first system and a second system;
Figure 2 is a schematic diagram showing components of a terminal forming part of the first system shown in Figure 1;
Figure 3 is a schematic block diagram showing, in greater detail, some of the components shown in Figure 2 and the flow of data therebetween;
Figure 4 is a flow diagram showing steps performed by the components shown in Figure 2 in processing and encoding data captured by the medical imaging device shown in Figure 1;
Figure 5 is a schematic diagram illustrating examples of constituent parts of image processing parameters;
Figure 6 is a schematic diagram showing, in greater detail, some of the components shown in Figure 2 and the flow of data therebetween;
Figure 7 is a schematic diagram showing components of a terminal forming part of the second system shown in Figure 1;
Figure 8 is a flow diagram showing steps performed by the components shown in Figure 7; and
Figure 9 is a schematic diagram showing the effects of the steps shown in Figure 8 on received video streams.
Detailed Description of the Invention
Embodiments of the invention are concerned with controlling the processing of image data, which have been captured at an imaging device, by terminals located remote from the imaging device. In particular, embodiments of the invention are concerned with remotely controlling the format, reproduction and content of data that are transmitted to one or more such terminals through encoding or re-encoding of data.
An environment 10 within which the embodiments operate is shown in Figure 1, comprising a first system 1 attached to a communications network 5 and arranged to receive medical image data in video format from a medical imaging device 3. The environment 10 also includes a second system 7, which is remotely connected to the first system 1 via the communications network 5 and is arranged to control the format and content of the data transmitted from the first system 1 to the second system 7.
The second system 7 comprises one or more terminals capable of receiving and displaying streamed video data, such as desktop computers, laptop computers, mobile terminals and the like; in addition the second system 7 can comprise a data storage system DB for storing the streamed video data. The communications network 5 can include a plurality of interconnected networks such as mobile networks (for example a Public Land Mobile Network) and/or fixed line networks (for example one or more Integrated Services Digital Networks and the Public Switched Telephone Network), in which case the environment 10 can also include various gateways (not shown) interconnecting these networks. The medical imaging device 3 is preferably a scanning device, such as an ultrasound machine, a magnetic resonance imaging (MRI) device or a computed tomography imaging (CTI) device, and the first system 1 and second system 7 are connected via a public network or private network such as a Virtual Private Network (VPN), which may, or may not, involve connection over the Internet. Alternatively the medical imaging device 3 could comprise camera-based equipment such as is used to perform endoscopy procedures.
One particular application of embodiments of the invention is remote diagnosis and treatment of potential donor hearts for transplant. Referring to Figure 1, in such an application, video images of the heart of a potential donor are generated by an ultrasound machine 3 under control of an operator, and passed to the first system 1. The first system 1 processes the images into a format suitable for transmission to the second system 7, as is described in detail below, and then transmits the same to the second system, for example to a desktop computer T1 of the second system, thereby enabling an expert located at one of the clients of the second system to view a remote video image based on the ultrasound machine video output. The expert is able to send various control data to the first system, as is described in detail below, effectively guiding the operator through sufficient Trans Oesophageal Echocardiography (TOE) examination to perform clinical diagnosis so as to identify a heart's suitability for transplant and institute therapies to optimise the heart function prior to retrieval. Thus a system configured in accordance with embodiments of the invention can be used by an expert, who is located physically and logically remote from the medical imaging device 3, to control progress of a clinical examination.
Referring now to Figure 2, features and functionality of the first system 1 will now be described in more detail. The first system 1 can comprise one or more computers, and in the representation shown in Figure 2 is embodied as a terminal comprising conventional computer components such as memory, CPU, Operating System (OS) programs, an Input/Output port and a fixed storage disc, together with bespoke components arranged to manipulate, and ultimately transmit, data that originated from the ultrasound machine 3. More specifically, these components comprise an Analogue-to-Digital Converter 201, pre-encoding processing means 203, encoding means 205 and stream manager 207; at least some of these components are preferably embodied as a computer program, or a suite of computer programs, loaded onto the terminal, whilst others are embodied in hardware.
Considering firstly the analogue-to-digital converter (ADC) 201, data received from the ultrasound imaging device are in standard video format, most commonly PAL (specifically PAL-I) or NTSC, over Composite or S-Video interfaces, and need to be digitised prior to any processing or encoding. The ADC 201 can be embodied as a capture card arranged to interface with the terminal 1 and capable of digitizing a single video signal; of digitizing and encoding a single video signal; or of digitizing and encoding multiple input video signals. Alternatively, the ADC 201 could be embodied in a capture unit external to the terminal 1, arranged to digitize a single video signal and transfer the digitized data, in real time, to the terminal.
(An example of this would be the Canopus ADVC100, which digitises the S-Video or Composite video input, encodes the video into DV stream format (a very low loss, high bit rate compression method) and streams this to the connected media server via a FireWire (IEEE 1394) interface.)
Having digitized the input video image data, the ADC 201 passes the data on to the pre-encoding processing means 203, an arrangement of which is shown in more detail in Figure 3. The steps carried out by the pre-encoding processing means 203 are, broadly speaking, dependent on the type of information that is being, or is likely to be, requested from the second system 7.
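By way of illustration, a minimal sketch of this capture stage is given below, assuming the capture hardware is exposed to the operating system as an ordinary video device and using the OpenCV library as a stand-in for the capture card's driver API; the device index and frame geometry are illustrative only, not taken from the patent.

```python
import cv2  # OpenCV; stands in for whatever capture API the ADC 201 exposes

# Open the capture device (index 0 is an assumption) and request the
# PAL-sized frames used in the worked example below (720x576 at 25 fps).
cap = cv2.VideoCapture(0)
cap.set(cv2.CAP_PROP_FRAME_WIDTH, 720)
cap.set(cv2.CAP_PROP_FRAME_HEIGHT, 576)

ok, frame = cap.read()   # one digitised frame, as a numpy array
if ok:
    print(frame.shape)   # e.g. (576, 720, 3)
cap.release()
```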
For example, an expert associated with the second system 7 may wish to simultaneously view different regions of the heart, some in greater detail than others, to enable remote diagnosis and analysis of the condition of the heart. In such an example, a plurality of images could be requested by the second system 7 - perhaps a first stream corresponding to the entire image captured by the medical imaging device 3 and a second stream corresponding to a specific region within the captured image - each at a different resolution, and/or frame rate.
Accordingly, in one arrangement, the pre-encoding processing means 203 replicates the image digitized by the ADC 201 so as to form a plurality of digitized images, and, for each of the plurality, selectively modifies the extents of the image region so that each corresponds to a different region of the image captured by the medical imaging device. The actual extents of these regions can either be pre-specified, specified by users of the second system 7 (i.e. by the doctor reviewing the image of the heart in real-time), or a combination of both, and are embodied in the form of image sizing parameters, as will be described in more detail below; the extents could also be specified by an operator associated with the first system 1 (i.e. local to the first system 1 and medical imaging device 3). Thus the pre-encoding processing means 203 is responsive both to data received from the second system 7 and to data entered via a local input device associated with the first system 1.
In addition to performing the steps of replication, the pre-encoding processing means 203 can also filter the digitized images, so as to effect brightness adjustment, noise reduction, contrast/saturation management, and other types of image adjustment that will maximise the quality of images received and processed by the second system 7.
Referring to Figure 4, an example of the steps carried out by the ADC 201 and pre-encoding processing means 203 will now be described for the case where the pre-encoding processing means 203 creates three image portions 301, 303, 305. At step 401, an image captured by the medical imaging facility 3 is received by the ADC 201, which digitizes the same to create a digital video image having a frame size of 720x576 and a frame rate of 25 frames per second (fps).
The digitized image is then input to the pre-encoding processing means 203, which digitally resizes (step 403) the image to a frame size of 360x288.
This resized image is then used to create a plurality of image portions (in this example three) on the basis of specified image processing parameters. Referring to Figure 5, the image processing parameters 500 comprise sizing parameters 501, such as extents, centre and resolution of the image portion, which are to be used directly by the pre-encoding processing means 203, and image reproduction parameters 503, such as frame rate and data rate, which are to be used by the encoding means 205. At least some of the image processing parameters 500 can be received from the second system 7, in real-time, and stored in a buffer for retrieval by the pre-encoding processing means 203, as will be described in more detail below.
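The patent does not define a concrete data structure for the parameters 500, but a minimal sketch of how the two groups of Figure 5 might be carried, with illustrative field names, is:

```python
from dataclasses import dataclass

@dataclass
class ImageProcessingParameters:
    """Hypothetical container mirroring Figure 5; field names are illustrative."""
    # sizing parameters (501), used by the pre-encoding processing means 203
    width: int         # extents of the image portion, in pixels
    height: int
    x: int             # position of the portion within the source frame
    y: int
    # image reproduction parameters (503), used by the encoding means 205
    frame_rate: float  # frames per second
    data_rate: int     # target data rate, bits per second
```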
Accordingly, at step 405, the pre-encoding processing means 203 retrieves image processing parameters corresponding to each image portion 301, 303, 305, replicates the resized image three times at step 407, and modifies each image (step 411) in accordance with the image processing parameters 500 retrieved at step 407. Assuming the sizing parameters received at step 407 to be 360x288; 260x120 at 100,120; and 100x80 at 50,50, the whole of the first replicated image is selected to form first image portion 301; a 260x120 section of the second replicated image located at 100,120 is selected to form second image portion 303; and a 100x80 section of the third replicated image located at 50,50 is selected to form third image portion 305. Each image portion is then cropped (step 409) to a frame size of 352x288. Figure 4 does not include details of the possible filtering operations that can be performed (as described above, these can include brightness adjustment, noise reduction, and contrast/saturation management), but the image portions could be so modified after replication (step 407) and prior to cropping (step 409). In addition, having filtered the image portions 301, 303, 305, they could be resized again; the skilled person will appreciate that the actual selection and implementation of these steps is a matter of design and likely to vary dependent on the type of image data.
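A sketch of the replicate-and-select step, using the worked numbers above and assuming the positions "100,120" and "50,50" denote (x, y) offsets of each portion's top-left corner:

```python
import numpy as np

def extract_portion(frame: np.ndarray, x: int, y: int, w: int, h: int) -> np.ndarray:
    """Select a w x h region of a (replicated) frame at top-left position (x, y)."""
    return frame[y:y + h, x:x + w].copy()

frame = np.zeros((288, 360, 3), dtype=np.uint8)         # resized frame from step 403
portion_1 = extract_portion(frame, 0, 0, 360, 288)      # whole image -> portion 301
portion_2 = extract_portion(frame, 100, 120, 260, 120)  # medium window -> portion 303
portion_3 = extract_portion(frame, 50, 50, 100, 80)     # small window -> portion 305
```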
Each image portion 301, 303, 305 is then copied by a respective loopback driver 311, 313, 315 and input to the encoding means 205 (a loopback driver is used in conjunction with the pre-encoding processing means 203 to ensure that the encoding means 205 receives an input from a video device driver; the skilled person will realize that this is a feature of the implementation of the encoding means 205 and not an essential feature of the invention).
In the arrangement shown in Figure 3, the encoding means 205 comprises as many encoders (preferably embodied as one, or a suite of, computer programs) as there are image portions; however, it will be appreciated that there could be one encoder, which is arranged to receive each image portion and encode them in turn. In the present example, since there are three image portions, there are three encoders. Referring again to Figure 4, the pre-encoding processing means 203 passes the image reproduction parameters 503 retrieved at step 407 (frame rate, data rate), together with any encoding-specific parameters, to a respective encoder for use in encoding a respective image portion at step 413. The image portions can be encoded in accordance with any codec, such as RealNetworks, Windows Media or QuickTime, each of which may comply with one or more of the codec technical standards (e.g. MPEG1, MPEG2, MPEG3, MPEG4, or H.264, incorporated into MPEG4 as MPEG4 Part 10). In a preferred arrangement the image portions 301, 303, 305 are encoded in accordance with the MPEG4 standard, using MP4LIVE, which is a Linux audio/video capture utility that can capture and encode audio and video in real-time (for further details the reader is referred to the documentation associated with MP4LIVE, which, at October 2003, is managed by Dave Mackie and Bill May).
Each encoder effectively carries out the same steps as the other encoders when encoding a respective image portion, but on the basis of a different set of image reproduction parameters. Continuing with the example above, and assuming image reproduction parameters 503 corresponding to the first, second and third portions 301, 303, 305 are respectively 1 fps and 10kb/s; 2 fps and 20kb/s; and 3 fps and 50kb/s, the encoding means 205 will generate (step 419) three MPEG video streams, having the following characteristics: A. a first stream of the whole captured image, of size 352x288, at 1 fps; B. a second stream, consisting of a medium size window, resolution 260x120 at position 100,120 relative to a full screen resized to 352x288, at 2 fps; and C. a third stream, consisting of a small window, resolution 100x80 at position 50,50 relative to a full screen resized to 352x288, at 3 fps.
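The patent uses MP4LIVE for this step; purely as a hedged illustration, the same three-streams-at-three-rates arrangement can be reproduced with the ffmpeg command-line encoder as a stand-in (the input and output file names are invented for the example):

```python
import subprocess

# One encoder invocation per image portion, mirroring streams A, B and C.
portions = [
    ("portion_a.avi", "stream_a.mp4", 1, "10k"),  # whole image, 1 fps, 10 kb/s
    ("portion_b.avi", "stream_b.mp4", 2, "20k"),  # medium window, 2 fps, 20 kb/s
    ("portion_c.avi", "stream_c.mp4", 3, "50k"),  # small window, 3 fps, 50 kb/s
]
for src, dst, fps, bitrate in portions:
    subprocess.run(
        ["ffmpeg", "-y",
         "-i", src,         # digitised image portion
         "-c:v", "mpeg4",   # MPEG-4 video encoder
         "-r", str(fps),    # per-portion frame rate
         "-b:v", bitrate,   # per-portion data rate
         dst],
        check=True)
```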
Thus the ADC 201, the pre-encoding processing means 203 and the encoding means 205 cooperate to receive, as input, an initial video image, digitize the same at 720x576 resolution and 25 fps, and generate, as output, encoded streams of images with different parameters. By choosing to encode data at different frame rates (in this example 1 fps for the large image, 2 fps for the medium and 3 fps for the small image), prior to transmission to a remote location, the bandwidth required for these images is far lower than would be the case if the images were not processed prior to transmission.
Having created multiple streams of images, each having different parameters associated therewith (and possibly having been filtered as described above), the streams are then transmitted to the second system 7 (step 415).
Referring to Figures 2 and 6, transmission of data streams between the first and second systems 1, 7 is controlled by stream manager 207, which is embodied as one or a suite of computer programs arranged to receive, from the encoding means 205, a plurality of unicast Real-time Transport Protocol (RTP) streams and to serve the streams to terminals in the second system 7. In this particular arrangement, therefore, the stream manager 207 essentially acts as a communications broker between the first and second systems 1, 7. RTP, which is both an IETF Proposed Standard (RFC 1889) and an International Telecommunications Union (ITU) Standard (H.225.0), is a packet format for multimedia data streams generally used in conjunction with the Real-Time Streaming Protocol (RTSP), which is a control protocol responsible for initiating and directing delivery of streaming multimedia from media servers such as the first and second systems 1, 7. In addition to standard RTP control data, RTSP is used to carry non-protocol information, such as the sizing parameters 501 (positional and size information) associated with the various image portions 301, 303, 305; this positional information is used by the second system 7 when creating a single image from the multiple received video streams, as will be described in detail below. Alternatively the Session Initiation Protocol (SIP), one of the ITU protocols, or a combination of RTSP and SIP could be used as the control protocol.
It should be noted that streams modified in accordance with embodiments of the invention require significantly less bandwidth than unmodified streams, whilst still providing users of the second system 7 with sufficiently detailed images at locations of interest. For certain image processing parameters 500, reductions in bandwidth usage of 66% or more have been achieved.
The stream manager 207 is configured such that data can be captured by the medical imaging facility 3, processed and passed to the stream manager 207 on a continuous basis and thus independently of any requests, or lack thereof, received from the second system 7. In effect, in the absence of requests for a stream from the second system 7, all external RTP transmission of streams is "muted" and no data are sent from the first system 1; this means that external bandwidth is only used when necessary. The skilled person will realize, however, that there are alternative ways of controlling the transmission of encoded data streams, for example, by selectively encoding (and immediately transmitting) the digitized data in response to a request from the second system 7.
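The "muting" behaviour can be sketched as a small broker that forwards encoded packets only while at least one terminal has requested the corresponding stream; RTP/RTSP session handling is deliberately omitted and all names are illustrative, not the patent's implementation:

```python
class StreamManager:
    """Sketch of the stream manager 207's muting behaviour (illustrative only)."""

    def __init__(self):
        self.subscribers = {}  # stream id -> set of client addresses

    def subscribe(self, stream_id, client):
        self.subscribers.setdefault(stream_id, set()).add(client)

    def unsubscribe(self, stream_id, client):
        self.subscribers.get(stream_id, set()).discard(client)

    def on_encoded_packet(self, stream_id, packet, send):
        # With no subscribers the stream is effectively muted: the packet is
        # dropped here and no external bandwidth is consumed.
        for client in self.subscribers.get(stream_id, ()):
            send(client, packet)
```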
Remote selection of the image processing parameters 500 at the second system 7 will now be described. Any set of image processing parameters 500 can conveniently be described as a mode and a plurality of mode options, so that, when a primary user, that is to say a clinician directing the diagnosis at terminal T1, wants to view a different region and/or at a different frame rate and/or resolution etc., he selects different mode options (to be described in more detail below). These mode options, in the form of image processing parameters, are propagated through the network 5 to the first system 1 and stored locally, for retrieval by the pre-encoding processing means 203 at step 407 and for transmission on to secondary clients T2, T3 (passively reviewing the medical images) requesting image data in the second system 7 at step 417. Alternatively the parameters 500 can be stored centrally (not shown) and accessed by the secondary clients of the second system 7 when submitting their request for imaging data to the first system 1. As a further alternative the parameters 500 could be transmitted to, and stored by, the secondary clients T2, T3 directly.
Preferably the RTSP control channel, SIP or one of the ITU protocols is used to send data indicative of a change in mode option.
Receipt and display of the image streams at a client terminal T1 of the second system 7 will now be described with reference to Figures 7, 8 and 9. In addition to conventional operating system software, I/O port, video card, hard drive, CPU and memory, the terminal T1 comprises a plurality of software components that cooperate to receive, decode and process data streams received from the first system 1 for display by terminal T1. As described above, the streams comprise RTP and RTSP (and/or SIP, and/or other ITU protocols) data, which are received by the terminal T1, in accordance with standard methods, and passed to decoding means 703, which decodes (step 801) the streams in accordance with whatever codec was used in step 413 to encode the image data (here MP4LIVE) and passes the decoded streams to a multiplexer 705. The multiplexer 705 is responsible for copying the decoded image frames from the input streams to form a single final image frame for display, using the data specifying the relative positions of the video images sent over the RTSP control channel. In a typical scenario, three streams are active: (i) a background stream, showing semi-static text around the screen; (ii) a large "window" in the lower half of the screen, showing a slow moving image; and (iii) a small "window" at the top right of the screen containing a fast moving image.
For clarity, and referring to Figure 8, the steps carried out by the multiplexer 705 will be described with reference to streams A, B, C. The multiplexer 705 reads in the relative positions of the streams A, B, C (e.g. from the RTSP data) at step 803, and selectively associates each stream with a location on a display area G (Figure 9). One suitable method involves use of a mask having a plurality of regions R1, R2, R3, each of which is associated with an input stream A, B, C, so that, at step 805, the multiplexer 705 copies the pixels from the input video streams into an output buffer based on the associated mask identity: for example, the region R1 having the lowest identifier is associated with the first stream (which in this example is stream A). Preferably, and as shown in Figure 9, each region R1, R2, R3 has a colour associated therewith, so that the region having the lowest RGB value (R1) is associated with stream A, the region having the next lowest RGB value (R2) is associated with stream B, etc. This is advantageous since, when the contents of the output buffer are displayed, the location of the image portion of an input stream relative to its intended location within the display area G can be easily identified (since, if an image were offset relative to its associated mask region R1, thus running the risk of effectively deleting some of the image, this would be immediately apparent). Accordingly the multiplexer 705 is arranged to apply an offset to each of the image portions on the basis of the position information received on the RTSP control channel (for example, changing the position of the image portion corresponding to stream B to 95,120) so that the image portions fit within their respective mask regions R1, R2, R3. Any additional colours on the mask are assigned black pixels so as to minimize edge effects around the images.
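A simplified sketch of the compositing performed at step 805, with the mask-colour bookkeeping reduced to an explicit (x, y) position per stream; the display-area size is illustrative:

```python
import numpy as np

def composite(frames, positions, out_size=(576, 720)):
    """Copy the latest decoded frame of each stream into a black output buffer
    at the position carried on the control channel; uncovered pixels stay
    black, much as unused mask colours are assigned black pixels above."""
    out = np.zeros(out_size + (3,), dtype=np.uint8)
    for frame, (x, y) in zip(frames, positions):
        h, w = frame.shape[:2]
        out[y:y + h, x:x + w] = frame
    return out
```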
Having copied the pixels from the input video streams into an output buffer based on the associated mask identity, the multiplexer 705 passes the output buffer to a renderer 707, which, in accordance with known methods, creates an image on the display 711 based on data in the output buffer (step 807). Conveniently the image rendered at step 807 is presented within a Graphical User Interface (GUI) comprising a plurality of selectable mode options, each corresponding to a different set of image processing parameters 500 (to be described in more detail below); preferably the options are pre-specified, but they could be user-configurable.
Before being sent to the renderer 707, the multiplexer 705 preferably links the output buffer (or more specifically the co-ordinates of pixels in the output buffer) to an overlay layer, which, when displayed by the GUI, is responsive to inputs from the user and is arranged to capture user-specified regions of interest and markers placed within the rendered images. The renderer 707 is arranged to receive data indicative of modified regions of interest (step 809), transform them into co-ordinates and/or resolution values, and transmit these to the first system 1 as modified image processing parameters (step 811).
Since, as described above, the image processing parameters 500 can be modified at any time (e.g. by a change of mode options: step 809), new data streams, corresponding to different image portions and/or different frame rates, data rates and resolutions, can be sent from the first system 1 to client terminal T1 at any time. The multiplexer 705 thus has to co-ordinate receipt and processing of incoming streams to ensure that a coherent image is presented on the display 711 (i.e. that the frame rendered in region R1 corresponds to those in regions R2 and R3). In one arrangement, the multiplexer 705 has access to a local store (e.g. memory 713) and a configurable timer, preferably arranged to run at 25 times a second; the skilled person will appreciate that frame rate is dependent upon the type of video format, so that, for example, an implementation of the system using an NTSC signal for its video source may have a different frame rate to that of one using a PAL signal source. The multiplexer 705 maintains copies of the most recently received video frame for each input stream, and, when the timer is triggered, combines the stored frames into the output buffer as described above (steps 803, 805). Alternatively steps 803 and 805 could be performed in response to receipt of input video frames, but this approach could lead to a less cohesive rendering of the display, since frames from different streams can be received at different times.
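The timer-driven arrangement might look as follows, reusing the composite() helper sketched above; the 25 Hz tick matches a PAL source, and all class and parameter names are invented for the illustration:

```python
import threading
import time

class FrameCombiner:
    """Keeps only the most recent frame per stream and composites on a fixed tick."""

    def __init__(self, positions, render, tick_hz=25.0):
        self.latest = {}            # stream id -> most recently received frame
        self.positions = positions  # stream id -> (x, y) placement
        self.render = render        # callback standing in for the renderer 707
        self.period = 1.0 / tick_hz
        self.lock = threading.Lock()

    def on_frame(self, stream_id, frame):
        with self.lock:
            self.latest[stream_id] = frame  # older frames are overwritten

    def run(self):
        while True:
            time.sleep(self.period)
            with self.lock:
                ids = sorted(self.latest)
                frames = [self.latest[s] for s in ids]
                positions = [self.positions[s] for s in ids]
            self.render(composite(frames, positions))  # composite() as sketched above
```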
The various modes of operation for a particular medical imaging device, namely an EchoCardioGraph unit, will now be described. Conventional ultrasound scanners create two-dimensional B-mode (brightness mode, also referred to as Echo mode) images of tissue in which the brightness of a pixel is based on the intensity of the echo return and/or, in a colour Doppler mode, the movement of fluid (e.g., blood) or tissue. Measurement of blood flow in the heart and vessels using the Doppler effect is well known: the phase shift of backscattered ultrasound waves may be used to measure the velocity of the tissue or blood from which the waves are reflected, and this Doppler shift can be displayed using different colours to represent speed and direction of flow.
Alternatively, in power Doppler imaging, the power contained in the returned Doppler signal is displayed.
When operating in accordance with embodiments of the invention in the B-mode, a Region of Interest, frame rate, resolution and data rate can be specified, via the plurality of selectable mode options described above, for different parts of a B-mode image. Referring back to Figures 3 and 9, in the B-mode, a first image portion 301 (i.e. contained within stream A and rendered in region R1) can be a background image containing data identifying the settings of the echocardiograph machine 3, which can be expected to remain static over time and thus be transmitted at a low frame rate and data rate; a second image portion 303 (i.e. contained within stream B and rendered in region R2) can be an image of the entire B-mode image area, which varies over time and needs to be captured and transmitted at a faster frame and data rate respectively than those corresponding to image portion 301; and a third image portion 305 (i.e. contained within stream C and rendered in region R3) can be a Region of Interest within second image portion 303, specified by a user connected to the second system 7 and at a faster frame and data rate still, since this is the region for which the user requires more detail. As an alternative to the arrangement shown in Figure 9, region R3 can be superimposed upon region R2.
The ECG unit can operate in a number of alternative modes, such as Continuous Wave (CW) mode, in which a graphical representation of cardiac activity is displayed, from a specific location in the heart, together with an image of the region of the heart at that location. Referring back to Figures 3 and 9, a first image portion 301 (i.e. contained within stream A and rendered in region R1) can be a background image containing data identifying settings of the echocardiograph machine 3, which can be expected to remain static over time and thus be transmitted at a low frame rate and data rate; a second image portion 303 (i.e. contained within stream B and rendered in region R2) can correspond to a graph describing the magnitude of received echoes, while a third image portion 305 (i.e. contained within stream C and rendered in region R3) contains the image of the region of the heart at which ultrasound echoes are being measured. In this mode of operation the second image portion 303 in region R2 is the focal point and the frame rate and data rate corresponding thereto are typically selected to be faster than those corresponding to the first and third image portions 301 and 305; these mode options can be selected via a plurality of selectable mode options on the GUI.
Additional details
In addition to streaming video data between the first and second systems 1, 7, terminals T1, T2 of the second system 7 can also include Voice over IP (VoIP) functionality and so be arranged to receive and transmit audio data. Audio input can be received using a standard microphone input associated with terminal T1 and encoded using a speech codec such as the Speex 11k wideband codec. The encoded speech data could then be transmitted, using RTP or an alternative protocol, to a central conference point (not shown), which comprises software arranged to decode and combine incoming audio streams, then re-encode the same and transmit to all clients receiving data from the first system 1. The conference point could be provided as part of the first system 1 or implemented in a server outside of both the first and second systems. Irrespective of where the conference point is actually running, the re-encoded streams could be passed within (or transmitted to) the first system 1 and multiplexed with the video streams in a transport stream, prior to transmission to the second system 7.
Alternatively, the re-encoded streams could be transmitted directly to all clients T1, T2 from the conference point.
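The combining step at the conference point amounts to mixing the decoded audio frames before re-encoding; a minimal sketch, abstracting away the Speex codec and RTP transport and assuming 16-bit PCM frames of equal length:

```python
import numpy as np

def mix_frames(frames):
    """Sum equal-length 16-bit PCM frames from several speakers into one mix,
    clipping to the 16-bit range before re-encoding and fan-out."""
    stacked = np.stack([f.astype(np.int32) for f in frames])
    mix = stacked.sum(axis=0)
    return np.clip(mix, -32768, 32767).astype(np.int16)
```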
When it arrives at clients T1, T2 of the second system 7, the (combined) audio stream is received within the client transport stack and passed to the decoding means 703, which additionally includes means for decoding audio data and, optionally, echo cancellation (echo cancellation may be required within the system dependent upon the overall delays, and upon whether the user (expert, operator) is using a headset or standard speakers for the audio output of the system).
Once decoded by the decoding means 703, the audio data can be output via a standard PC audio output device.
In addition to the video and audio, the environment 10 can be arranged to capture and display data from other medical equipment. This is likely to include (but not be limited to) Vital Signs Monitors, Pulse Oximeters and ElectroCardioGraph (ECG) waveforms, so that the first system 1 preferably includes at least some of the following interfaces: RS232, GPIB/HPIB or equivalent, raw waveforms (analogue), visual data, and digitisation of data. Any data obtained via RS232, GPIB/HPIB etc. will already be in digital form, and consequently only need processing to extract a required subset from the data stream, together with packetisation and/or handling of any header information supplied by the medical equipment. Raw waveforms required for transmission and output (visual, audio or otherwise) must be digitized, e.g. through a standard audio input if the bandwidth and sample rates allow, or alternatively through use of DSPs and sampling, and encoded using codecs to minimise bandwidth requirements. For data that are only available visually, i.e. as part of a video output (e.g. an ECG waveform on an ultrasound display), the first system 1 can include means arranged to extract a waveform from video data, for encoding and transmission for display or output.
Whilst in the above described embodiments the pre-encoding processing means 203 is described as being separate from the encoding means 205, they could be combined.
Whilst in the above described embodiments all of the images encoded by the encoding means 205 are sent to terminals T1, T2, T3 of the second system 7, alternatively the first system 1 could be arranged to stream only selected image portions 301, 303, 305 to the terminals T1, T2, T3. The stream manager 207 could, for example, be selectively responsive to requests from the terminals, in accordance with specified streaming requirements. Such streaming requirements could, for example, be transmitted from individual terminals T1, T2, T3 of the second system 7, so that individual terminals can select which streams they receive; alternatively the primary terminal T1 could relay streams transmitted thereto onto requesting secondary terminals T2, T3 in accordance with specified streaming requirements.
Whilst in the above described embodiments the stream manager 207 is arranged to transmit the image data as unicast streams using RTP and RTSP/SIP, the stream manager 207 could alternatively be arranged to transmit the image data as multicast packets, in which case nodes T1, T2, T3 of the second system 7 could be arranged to send "join" requests when requesting image data.
Whilst in the above embodiments the second system 7 is described as including a data storage system DB for storing the streamed video data, the first system 1 may additionally or alternatively store the video data, at high quality, for deferred playback.
The above embodiments are to be understood as illustrative examples of the invention. It is to be understood that any feature described in relation to any one embodiment may be used alone, or in combination with other features described, and may also be used in combination with one or more features of any other of the embodiments, or any combination of any other of the embodiments. Furthermore, equivalents and modifications not described above may also be employed without departing from the scope of the invention, which is defined in the accompanying claims.

Claims (32)

  1. A method of transmitting medical image data from a first system attached to a communications network to a second system receiving medical image data, the second system being remotely connected to the first system via the communications network, the method comprising the steps of: receiving first image data from a medical imaging device, the first image data having a video format; receiving data identifying one or more image processing parameters, from the second system; in response to receipt of said one or more image processing parameters, selectively processing the first image data in accordance with said one or more image processing parameters so as to create second image data at the first system; and transmitting the second image data to the second system.
  2. 2. A method according to claim 1, in which the first image data comprise analogue video format data, the method including converting the analogue video format data into initial digitised data, and selectively processing the initial digitised data so as to create said second image data.
  3. 3. A method according to claim 1 or claim 2, in which the image processing parameters include one or some of frame rate, resolution, and/or data rate.
  4. 4. A method according to any one of the preceding claims, in which the image processing parameters include data specifying extents of an area of interest within said first image data.
  5. 5. A method according to any one of the preceding claims, in which the second image data comprise a plurality of digitised images.
  6. 6. A method according to claim 5, in which each digitised image is processed in accordance with a different set of image processing parameters.
  7. 7. A method according to claim 6 when dependent on any one of claim 3 to claim 5, in which the frame rate associated with one of the sets of image processing parameters is faster than that associated with at least one other of the plurality.
8. A method according to claim 6 or claim 7 when dependent on any one of claim 3 to claim 5, in which the resolution associated with one of the sets of image processing parameters is finer than that associated with at least one other of the plurality.
9. A method according to any one of claim 6 to claim 8 when dependent on claim 4, in which a first area of interest associated with one of the sets of image processing parameters is different to a second area of interest associated with at least one other of the plurality.
10. A method according to claim 9, in which the first and second areas overlap.
11. A method according to claim 9 or claim 10, in which the first area is larger than said second area.
12. A method according to any one of the preceding claims, including receiving data identifying one or more image processing parameters from the first system.
13. A data processing system for transmitting medical image data through a communications network, the data processing system comprising: a first system attached to the communications network; and a second system remotely connected to the first system via the communications network; wherein the first system comprises: data receiving means arranged to receive first image data from a medical imaging device, and image processing data identifying one or more image processing parameters; data processing means arranged, in response to receipt of said one or more image processing parameters, to selectively process the received image data in accordance with one or more image processing parameters so as to create second image data; and data transmitting means arranged to transmit the second image data to the communications network, and wherein the second system is arranged to transmit said image processing parameters to the first system and receive said second image data transmitted by the first system.
14. A data processing system according to claim 13, wherein the data processing means is arranged to create a plurality of images corresponding to said first image data.
15. A data processing system according to claim 13 or claim 14, wherein the second system is arranged to transmit data identifying a plurality of sets of one or more image processing parameters specifying characteristics of a plurality of images.
16. A data processing system according to claim 15 when dependent on claim 14, wherein, for each created image, the data processing means is arranged to process said image in accordance with a different set of image processing parameters received from the second system, said second image data thereby comprising a plurality of differently processed images.
17. A data processing system according to any one of claim 13 to claim 16, wherein said second system comprises a plurality of nodes.
18. A data processing system according to claim 17, wherein a first node of the plurality is arranged to transmit the image processing parameters and a second node of the plurality is arranged to receive at least part of the second image data.
19. A data processing system according to claim 18, wherein said first node of the second system is arranged to transmit data corresponding to the image processing parameters to the second node, for use in processing said second image data when received by said second node.
20. A data processing system according to any one of claim 17 to claim 19, wherein the plurality of nodes in the second system includes a plurality of user terminals, each attached to a different part of the communications network.
21. A data processing system according to claim 20, wherein each of the plurality of user terminals is arranged to receive at least part of the second image data transmitted from the first system.
22. A data processing system according to claim 21, wherein the first system and second system are inter-connected via a Virtual Private Network.
23. A data processing system according to any one of claim 17 to claim 22, wherein the, or each, node in the second system is associated with a physician.
24. A data processing system for use in displaying medical images, the data processing system comprising: data receiving means arranged to receive a plurality of data streams and image processing parameters corresponding thereto, each data stream corresponding to a medical image; display means arranged to display a display mode selected from a plurality of display modes, the display modes comprising a display mode having at least two regions each corresponding to one of said received data streams, wherein the system is arranged to select a display mode in dependence on the image processing parameters.
25. A data processing system according to claim 24, wherein, for each data stream, the image processing parameters include location data specifying the extents and position of the medical image corresponding thereto, and the display means is arranged to select a display mode in dependence on the location data.
26. A data processing system according to claim 24 or claim 25, wherein, in at least one display mode, at least part of the regions overlap with one another.
27. A data processing system according to any one of claim 24 to claim 26, including at least one multimodal input device, and wherein the display means is arranged to receive data indicative of a portion of interest via the multimodal input device.
28. A data processing system according to claim 27, wherein the system is arranged to transmit said data indicative of a portion of interest to a terminal located remote from the data processing system, for use thereby in creating said medical images.
29. A medical imaging system for transmitting medical image data through a communications network, the medical imaging system comprising: a medical imaging device arranged to generate first image data in respect of an object being imaged; a first system attached to the communications network; and a second system remotely connected to the first system via the communications network; wherein the first system comprises: data receiving means arranged to receive said first image data from the medical imaging device, and image processing data identifying one or more image processing parameters; data processing means arranged, in response to receipt of said one or more image processing parameters, to selectively process the received image data in accordance with one or more image processing parameters so as to create second image data; and data transmitting means arranged to transmit the second image data to the communications network, and wherein the second system is arranged to transmit said image processing parameters to the first system and receive said second image data transmitted by the first system.
30. A data processing system for use in generating medical images for transmission to one or more terminals logically located remote therefrom, the data processing system comprising: receiving means arranged to receive data identifying one or more image processing parameters; image processing means arranged to modify first image data having a video format in accordance with said one or more image processing parameters so as to create second image data; and transmitting means arranged to transmit said second image data to said one or more terminals.
31. A data processing system according to claim 30, wherein the receiving means is arranged to receive said image processing parameters from said one or more terminals.
32. A method of transmitting image data from a first system attached to a communications network to a second system receiving image data, the second system being remotely connected to the first system via the communications network, the method comprising the steps of: receiving first image data having a video format; receiving data identifying one or more image processing parameters, from the second system; in response to receipt of said one or more image processing parameters, selectively processing the first image data in accordance with said one or more image processing parameters so as to create second image data at the first system; and selectively transmitting the second image data to the second system.
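
The claims above are legal definitions, but the core method of claims 1, 3 and 4 (receive frames, receive receiver-chosen parameters, selectively process, transmit) is easy to illustrate. The following is a minimal Python sketch, not part of the patent text; every name in it (ProcessingParameters, resize_nearest, apply_parameters) is hypothetical, and the nearest-neighbour scaler stands in for whatever resampler a real implementation would use.

    # Illustrative sketch only; not part of the patent. All names are hypothetical.
    from dataclasses import dataclass
    from typing import Optional, Tuple

    import numpy as np

    @dataclass
    class ProcessingParameters:
        frame_rate: Optional[float] = None                # frames per second (claim 3)
        resolution: Optional[Tuple[int, int]] = None      # (width, height) (claim 3)
        data_rate: Optional[int] = None                   # target bits per second (claim 3)
        roi: Optional[Tuple[int, int, int, int]] = None   # (x, y, w, h) area of interest (claim 4)

    def resize_nearest(frame: np.ndarray, size: Tuple[int, int]) -> np.ndarray:
        """Nearest-neighbour resample; a stand-in for a production scaler."""
        w, h = size
        rows = np.arange(h) * frame.shape[0] // h         # map output rows to source rows
        cols = np.arange(w) * frame.shape[1] // w         # map output columns to source columns
        return frame[rows][:, cols]

    def apply_parameters(frame: np.ndarray, params: ProcessingParameters) -> np.ndarray:
        """Selectively process one captured frame into 'second image data' (claim 1)."""
        if params.roi is not None:                        # crop to the requested area of interest
            x, y, w, h = params.roi
            frame = frame[y:y + h, x:x + w]
        if params.resolution is not None:                 # resample to the requested resolution
            frame = resize_nearest(frame, params.resolution)
        return frame

In this sketch the first system would run apply_parameters on each captured frame, dropping frames as needed to honour frame_rate and choosing codec settings to honour data_rate, before transmitting the result to the second system.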
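
Claims 5 to 11 (mirrored by claims 15 and 16 on the system side) cover producing a plurality of differently processed images from the same first image data. Continuing the sketch above, under the same assumptions, two hypothetical parameter sets could yield a coarse, frequently refreshed full-field overview alongside a finer crop of a smaller, overlapping area of interest:

    # Two hypothetical parameter sets over one 640x480 captured frame: the
    # overview covers the larger, first area (claim 11) at a higher frame rate
    # (claim 7), while the detail set crops a smaller region lying inside it
    # (claims 9 and 10), sampling that area more finely (claim 8).
    overview = ProcessingParameters(frame_rate=25.0, resolution=(320, 240),
                                    roi=(0, 0, 640, 480))
    detail = ProcessingParameters(frame_rate=12.5, resolution=(320, 240),
                                  roi=(200, 120, 160, 120))

    frame = np.zeros((480, 640), dtype=np.uint8)          # stand-in for a digitised video frame
    second_image_data = [apply_parameters(frame, p) for p in (overview, detail)]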
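
Claims 13 and 17 to 19 place the parameter source and the image consumers at separate network nodes, so the parameter sets have to cross the communications network in some agreed representation. The patent does not prescribe any encoding; one purely illustrative choice, continuing the sketch above, is a small JSON message sent from a node of the second system to the first system:

    # Hypothetical wire format for the image processing parameters of claim 13.
    import json

    def encode_parameters(params: ProcessingParameters) -> bytes:
        """Serialise one parameter set for transmission to the first system."""
        return json.dumps({
            "frame_rate": params.frame_rate,
            "resolution": params.resolution,
            "data_rate": params.data_rate,
            "roi": params.roi,
        }).encode("utf-8")

    def decode_parameters(payload: bytes) -> ProcessingParameters:
        """Rebuild the parameter set on arrival at the first system."""
        fields = json.loads(payload.decode("utf-8"))
        return ProcessingParameters(
            frame_rate=fields["frame_rate"],
            resolution=tuple(fields["resolution"]) if fields["resolution"] else None,
            data_rate=fields["data_rate"],
            roi=tuple(fields["roi"]) if fields["roi"] else None,
        )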
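
Claims 24 to 26 describe a receiving terminal that picks a display mode from the image processing parameters accompanying each stream, in particular the location data giving each image's extents and position. A minimal sketch under the same assumptions (hypothetical names again) might choose a picture-in-picture layout when the regions overlap and a side-by-side layout otherwise:

    # Hypothetical display-mode selection from per-stream location data (claim 25).
    def regions_overlap(a, b):
        """True if rectangles a and b, each given as (x, y, w, h), intersect."""
        ax, ay, aw, ah = a
        bx, by, bw, bh = b
        return ax < bx + bw and bx < ax + aw and ay < by + bh and by < ay + ah

    def select_display_mode(rois):
        """Pick a layout for two streams; overlapping regions (claim 26) nest one view in the other."""
        if len(rois) >= 2 and regions_overlap(rois[0], rois[1]):
            return "picture-in-picture"
        return "side-by-side"

    select_display_mode([(0, 0, 640, 480), (200, 120, 160, 120)])   # -> "picture-in-picture"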
GB0401299A 2004-01-21 2004-01-21 Transmitting image data processed in accordance with image processing parameters received from the receiving device Withdrawn GB2410390A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
GB0401299A GB2410390A (en) 2004-01-21 2004-01-21 Transmitting image data processed in accordance with image processing parameters received from the receiving device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GB0401299A GB2410390A (en) 2004-01-21 2004-01-21 Transmitting image data processed in accordance with image processing parameters received from the receiving device

Publications (2)

Publication Number Publication Date
GB0401299D0 GB0401299D0 (en) 2004-02-25
GB2410390A true GB2410390A (en) 2005-07-27

Family

ID=31971218

Family Applications (1)

Application Number Title Priority Date Filing Date
GB0401299A Withdrawn GB2410390A (en) 2004-01-21 2004-01-21 Transmitting image data processed in accordance with image processing parameters received from the receiving device

Country Status (1)

Country Link
GB (1) GB2410390A (en)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO1994003010A1 (en) * 1992-07-20 1994-02-03 Automated Medical Access Corporation Automated high definition/resolution image storage retrieval and transmission system
WO1996029818A1 (en) * 1995-03-17 1996-09-26 Imperial College Of Science, Technology & Medicine Progressive transmission of images
WO2000001151A1 (en) * 1998-06-26 2000-01-06 Sarnoff Corporation Apparatus and method for dynamically controlling the frame rate of video streams
EP1187418A2 (en) * 2000-08-10 2002-03-13 Nidek Co., Ltd. Image distribution apparatus

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2427783B (en) * 2004-01-22 2009-08-19 Hitachi Int Electric Inc Video distribution device
EP1920604A2 (en) * 2005-08-01 2008-05-14 Covi Technologies, Inc. Systems and methods for providing high-resolution regions-of-interest
EP1922878A2 (en) * 2005-08-01 2008-05-21 Covi Technologies, Inc. Systems and methods for video stream selection
EP1922878A4 (en) * 2005-08-01 2010-09-22 Covi Technologies Inc Systems and methods for video stream selection
EP1920604A4 (en) * 2005-08-01 2010-09-22 Covi Technologies Inc Systems and methods for providing high-resolution regions-of-interest
EP2067357A2 (en) * 2006-09-29 2009-06-10 Lucent Technologies Inc. Method and apparatus for a zooming feature for mobile video service
CN103533305A (en) * 2013-10-10 2014-01-22 国电南瑞科技股份有限公司 B/S framework plugin-free universal video monitoring system
CN103533305B (en) * 2013-10-10 2018-01-12 国电南瑞科技股份有限公司 B/S framework plugin-free universal video monitoring system
DE102015201354A1 (en) * 2015-01-27 2016-07-28 Siemens Healthcare Gmbh Audio and control data transmission over a common transmission channel in medical imaging systems
DE102015201354B4 (en) * 2015-01-27 2016-11-17 Siemens Healthcare Gmbh Audio and control data transmission over a common transmission channel in medical imaging systems

Also Published As

Publication number Publication date
GB0401299D0 (en) 2004-02-25

Similar Documents

Publication Publication Date Title
US20060122482A1 (en) Medical image acquisition system for receiving and transmitting medical images instantaneously and method of using the same
US5619995A (en) Motion video transformation system and method
US5949491A (en) Ultrasound image management system
EP2098995B1 (en) System for real-time volume rendering on thin clients via a render server.
US8924234B2 (en) Streaming video network system
Perlman et al. Real-time remote telefluoroscopic assessment of patients with dysphagia
US20110267418A1 (en) Telemedicine system
US20100202510A1 (en) Compact real-time video transmission module
US6771822B1 (en) Method and apparatus for storing image frame with user-selected compression
GB2410390A (en) Transmitting image data processed in accordance with image processing parameters received from the receiving device
CN108366225A (en) Method, apparatus, equipment and storage medium based on the acquisition of multichannel endoscopic images
Yoo et al. Performance of a web-based, realtime, tele-ultrasound consultation system over high-speed commercial telecommunication lines
EP3573069A1 (en) Ultrasound diagnostic system with multimedia information distribution system
EP1168964A1 (en) Medical imaging system
JP2000116606A (en) Medical image processing method, medical image transmission method, medical image processor, medical image transmitter and medical image transmission apparatus
Yoo et al. Design of a PC-based multimedia telemedicine system for brain function teleconsultation
Pedersen et al. Telemedicine applications of mobile ultrasound
JP2002282251A (en) Method and device for transmitting live streaming image from ultrasonic imaging system through network
CN114710635A (en) Method and system for recording images by medical image processing platform
El Jaouhari et al. Streaming dicom real-time video and metadata flows outside the operating room
JP3211786U (en) Interactive device using live video
Fogliardi et al. Telecardiology: results and perspectives of an operative experience
CN101887491A (en) Distributed PACS-based teleconsultation method
Barbier et al. Clinical Validation of Different Echocardiographic Motion Pictures Expert Group-4 Algorithms and Compression Levels for Telemedicine.
US11949927B2 (en) Methods and systems for hybrid and concurrent video distribution for healthcare campuses

Legal Events

Date Code Title Description
WAP Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1)