WO2005031650A1 - System and method for presentation of data streams - Google Patents

System and method for presentation of data streams

Info

Publication number
WO2005031650A1
WO2005031650A1 (PCT/IL2004/000906)
Authority
WO
WIPO (PCT)
Prior art keywords
data stream
presentation
stream
vivo
image
Prior art date
Application number
PCT/IL2004/000906
Other languages
French (fr)
Inventor
Eli Horn
Hagai Krupnik
Original Assignee
Given Imaging Ltd.
Priority date
Filing date
Publication date
Application filed by Given Imaging Ltd. filed Critical Given Imaging Ltd.
Priority to DE602004031443T (patent DE602004031443D1)
Priority to JP2006531011A (patent JP4027409B2)
Priority to AU2004277001A (patent AU2004277001B2)
Priority to AT04770577T (patent ATE498875T1)
Priority to EP04770577A (patent EP1676244B1)
Publication of WO2005031650A1

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 1/00 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B 1/04 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor combined with photographic or television appliances
    • A61B 1/041 Capsule endoscopes for imaging
    • A61B 1/00002 Operational features of endoscopes
    • A61B 1/00004 Operational features of endoscopes characterised by electronic signal processing
    • A61B 1/00009 Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope
    • A61B 1/000094 Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope extracting biological structures
    • A61B 1/00011 Operational features of endoscopes characterised by signal transmission
    • A61B 1/00016 Operational features of endoscopes characterised by signal transmission using wireless means
    • A61B 1/00043 Operational features of endoscopes provided with output arrangements
    • A61B 1/00045 Display arrangement
    • A61B 1/0005 Display arrangement combining images e.g. side-by-side, superimposed or tiled
    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B 5/06 Devices, other than using radiation, for detecting or locating foreign bodies; determining position of probes within or on the body of the patient
    • A61B 5/065 Determining position of the probe employing exclusively positioning means located on or in the probe, e.g. using position sensors arranged on the probe
    • A61B 5/07 Endoradiosondes
    • A61B 5/073 Intestinal transmitters
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63B APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B 43/00 Balls with special arrangements
    • A63B 43/02 Balls with special arrangements with a handle
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 11/00 2D [Two Dimensional] image generation
    • G06T 11/20 Drawing from basic elements, e.g. lines or circles
    • G06T 11/206 Drawing of charts or graphs
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 Image acquisition modality
    • G06T 2207/10016 Video; Image sequence
    • G06T 2207/10068 Endoscopic image
    • G06T 2207/30 Subject of image; Context of image processing
    • G06T 2207/30004 Biomedical image processing
    • G06T 2207/30028 Colon; Small intestine

Definitions

  • the present invention relates to presentations of data streams and to a system and method for presenting in-vivo data.
  • GI imaging devices include ingestible capsules that may capture images from the inside of the gastrointestinal (GI) tract. Captured images may be transmitted to an external source to be examined, for example, for pathology by a healthcare professional.
  • in-vivo devices may include various other sensors that may transmit data to an external source for monitoring and diagnosis.
  • An in-vivo device may collect data from different points along a body lumen, for example lumens of the GI tract, and transmit them externally for analysis and diagnosis.
  • the GI tract is a very long and curvy path such that it may be difficult to get a good indication of where along this tract each transmitted datum was obtained.
  • Time bars are known to be used when reviewing data, so as to indicate to the health professional how far along the image stream he/she may have advanced.
  • because the in-vivo device may stall or advance at different speeds through various sections of a body lumen, for example, the GI tract, it may not be possible in some cases to determine positively where, or at what distance along the GI tract, a particular datum, for example an image, was captured.
  • on the time bar there may be no indication as to when the device may have reached certain anatomical milestones, for example, the duodenum, the cecum, or other anatomical locations in the GI tract.
  • Localization methods have been applied. Some localization methods may indicate the spatial position of the device in space at any given time.
  • An in-vivo device may collect data from more than one sensor along the very long and curvy path resulting in multiple data streams captured by the in-vivo sensor.
  • Embodiments of the present invention may provide a system and method for generating and displaying a fixed graphical presentation of captured in-vivo data streams.
  • the fixed graphical presentation is a varying visual representation of a quantity or a dimension captured in an in-vivo data stream.
  • the graphical presentation is in the form of a color bar.
  • the fixed graphical presentation may be displayed alongside a streaming display of a data stream.
  • Figure 1 is a schematic illustration of an in-vivo imaging system in accordance with embodiments of the present invention
  • Figure 2 is a schematic illustration of a display of a color bar together with other data captured in-vivo in accordance with an embodiment of the present invention
  • Figure 3 is a schematic illustration of a color bar with an identified anatomical site in accordance with an embodiment of the current invention
  • Figures 4A and 4B are schematic illustrations of exemplary pH and blood detecting color bars respectively in accordance with embodiments of the present invention
  • Figure 5 is a display with more than one color bar that may be viewed substantially simultaneously according to an embodiment of the present invention
  • Figure 6 is a flow chart describing a method for presentation of in-vivo data
  • Embodiments of the present invention offer a device, system and method for generating a fixed graphical presentation of a captured data stream, for example image streams, other non-imaged data, or other data such as color coded, possibly imaged data (e.g., pH data, temperature data, etc.) that may have been collected in vivo, for example along the GI tract.
  • the summarized graphical presentation may include, for example, a varying visual representation, for example, a series of colors that may be at least partially representative of a quantity and/or data collected, e.g. a series of colors where each color presented on the bar may be representative of a value of a parameter.
  • the summarized graphical presentation may be a fixed display alongside a streaming display of the data stream.
  • the presentation may map out a varying quantity (e.g. a captured data stream) and may, for example, give indication of the relationship between the data stream captured and the anatomical origin or position relative to a start of the captured data stream, for example, the approximate or exact site, for example, in the GI tract from where various data captured may have originated.
  • the mapping may give, for example, an indication of an event (e.g. a physiological event) captured, measured, or otherwise obtained.
  • the mapping may give for example an indication of change of one or more parameters measured over time, for example, a change occurring due to pathology, a natural change in the local environment, or due to other relevant changes.
  • the location may be relative to other information, for example, anatomical attributes along for example the GI tract.
  • the location may in some embodiments be an absolute location, such as a location based on time or based on position of an in-vivo information capture device, based on an image frame in a sequence of images, etc.
  • Fig. 1 shows a schematic diagram of an in-vivo sensing system according to one embodiment of the present invention.
  • an image sensing system may include an in-vivo sensing device 40, for example an imaging device having an imager 46, for capturing images, an illumination source 42, for illuminating the body lumen, a power source 45 for powering device 40, and a transmitter 41 with antenna 47, for transmitting image and possibly other data to an external receiving device 12.
  • in-vivo device 40 may include one or more sensors 30 other than and/or in addition to image sensor 46, for example, temperature sensors, pH sensors, pressure sensors, blood sensors, etc.
  • device 40 may be an autonomous device, a capsule, or a swallowable capsule.
  • device 40 may not be autonomous, for example, device 40 may be an endoscope or other in-vivo imaging sensing device.
  • the in-vivo imaging device 40 may typically, according to embodiments of the present invention, transmit information (e.g., images or other data) to an external data receiver and/or recorder 12 possibly close to or worn on a subject.
  • the data receiver 12 may include an antenna or antenna array 15 and a data receiver storage unit 16.
  • the data receiver and/or recorder 12 may of course take other suitable configurations and may not include an antenna or antenna array.
  • the receiver may, for example, include processing power and/or an LCD display for displaying image data.
  • the data receiver and/or recorder 12 may, for example, transfer the received data to a larger computing device 14, such as a workstation or personal computer, where the data may be further analyzed, stored, and/or displayed to a user.
  • computing device 14 may include processing unit 13, data processor storage unit 19 and monitor 18.
  • Computing device 14 may typically be a personal computer or workstation, which includes standard components such as processing unit 13, a memory, for example storage or memory 19, a disk drive, a monitor 18, and input-output devices, although alternate configurations are possible.
  • Processing unit 13 typically, as part of its functionality, acts as a controller controlling the display of data for example, image data or other data.
  • Monitor 18 is typically a conventional video display, but may, in addition, be any other device capable of providing image or other data.
  • Instructions or software for carrying out a method according to an embodiment of the invention may be included as part of computing device 14, for example stored in memory 19.
  • each of the various components need not be required; for example, the in-vivo device 40 may transmit or otherwise transfer (e.g., by wire) data directly to a viewing or computing device 14
  • In-vivo imaging systems suitable for use with embodiments of the present invention may be similar to various embodiments described in US Patent No. 5,604,531, assigned to the common assignee of the present application and incorporated herein by reference, and/or Publication Number WO 01/65995, also assigned to the common assignee of the present application and incorporated herein by reference. Other in-vivo systems, having other configurations, may be used.
  • Embodiments of the present invention include a device, system, and method for generating a typically concise and/or summarized graphical presentation of parameters sensed through or over time in a body lumen, for example the GI tract or any other tract, through which a sensing device may be present and/or traveling.
  • Viewing a data stream captured by an in-vivo device, e.g., viewing an image stream transmitted by an ingestible imaging capsule, may be a prolonged procedure.
  • a summarized presentation of the data captured may, for example, provide a visual representation and/or map of the captured data and may help focus a health professional's attention on an area of interest and/or may promote a health professional's understanding of the origin and contents of the data being viewed.
  • One or more streams of data obtained from said sensing device may be processed to create one or more summarized presentations that may, for example, be displayed in a graphical user interface, for example a graphical user interface of analysis software.
  • presentation of a data stream may be with a bar, for example, a color bar that may be displayed, for example, on a monitor 18, perhaps through a graphical user interface, or, for example, in real time on an LCD display on a receiver 12 as a data stream is being captured.
  • the presentation may include a varying visual representation of a quantity or a dimension representing, for example, a varying quantity captured in a portion of (e.g., an image frame) an in-vivo data stream.
  • the dimension may be color.
  • the presentation may typically be an abstracted or summarized version of image or other data being presented, for example, streamed on a different portion of the display.
  • the presentation may typically include multiple image items or data items such as bars, stripes, pixels or other components, assembled in a continuing series, such as a bar including multiple strips, each strip corresponding to an image frame.
  • a portion of the presentation may represent a summary of the overall color scheme, brightness, pH level, temperature, pressure, or other quantity on a displayed frame or data item.
  • Fig. 2 shows a display and/or a graphical user interface 200 for displaying data captured in-vivo.
  • Display 200 may include a summarized graphical presentation 220 of an in-vivo data stream, for example, a color bar.
  • the graphical presentation 220 may be a fixed presentation displayed alongside a streaming display of a data stream 210, for example, an image stream in accordance with some embodiments of the present invention. In other embodiments of the present invention, graphical presentation 220 may be displayed separately.
  • the graphical presentation 220 may include a series of colors, a series of colored areas, or a series of patterns, image items, images or pixel groups (e.g., a series of stripes 222 or areas of color arranged to form a larger bar or rectangular area), where each, for example, color in the series 222 may be associated with and/or correspond to an element or a group of elements in the original data stream.
  • each colored stripe 222 may correspond to an image or a group of images from a data stream 210.
  • Image units other than stripes e.g., pixels, blocks, etc.
  • the image units may vary in a dimension other than color (e.g., pattern, size, width, brightness, animation, etc).
  • One image unit e.g., a stripe 222
  • the series of, for example, colors in the bar may be arranged in the same sequence or order in which the data stream, for example, the images or groups of images may typically be displayed.
  • pointing at a stripe in a graphical presentation 220 may advance the image stream to the frames corresponding to that stripe.
  • the color bar may be generated by, for example, assigning a color to each element (e.g., an image frame) or subgroup of elements in the data stream and then processing the series of colors, for example such that it may emphasize variations within the displayed properties. In one embodiment of the invention, it may be processed, for example to emphasize cue points in an accompanying video such that, for example, it may be used as an ancillary tool for indicating points of interest.
  • a stream of data 210 may be displayed alongside one or more bars and/or graphical presentations (220 and 230) described herein.
  • the data stream 210 may be for example data represented in the graphical presentation 220 (e.g. a captured in-vivo image stream) or other data obtained and/or sampled simultaneously or substantially simultaneously with the data represented in the graphical presentation 220.
  • a marker or indicator 250 may progress across or along the graphical presentation 220 as the substantially corresponding datum in data stream 210 (e.g., video) may be concurrently displayed to indicate the correspondence between the graphical presentation 220 and the data stream 210.
  • the presentation may be of a shape other than a bar, for example a circle, oval, square, etc. According to other embodiments, the presentation may be in the form of an audio track, graph, or other suitable graphic presentation.
  • An indicator 250 such as a cursor may advance along the time bar 230 and graphical presentation 220 as the image stream 210 is scrolled on the display 200.
  • control buttons 240 may be included in the display that may allow the user to, for example, fast-forward, rewind, stop play or reach the beginning or end of, for example, an image stream 210.
  • a user may control the display of a data stream 210, for example, by altering the start position of the streaming display, e.g. skipping to areas of interest, by moving the position of indicator 250, for example with a mouse or other pointing device.
  • a user and/or health professional may insert indications or markers such as thumbnails to mark location along the image stream for easy access to those locations in the future. For example, a health professional may mark these milestones on the graphical presentation 220 (e.g., using a pointing device such as a mouse, a keyboard, etc).
  • thumbnails or other markers may be defined based on an image of interest from the data stream 210, based on a location identified on the graphical presentation 220 or based on a time recorded on time bar 230.
  • Other suitable methods of defining thumbnails or other markers or notations may be used. For example, a computer algorithm may be used to identify thumbnails that may be of interest to, for example, the health professional.
  • Algorithm based thumbnails may also, for example, be based on an image of interest from the data stream 210, based on a location identified on the graphical presentation 220 or based on a time recorded on time bar 230, or other methods.
  • the graphical presentation 220 may be in itself a series of color thumbnails, so that a user may point or "click" on colors in the color bar to restart the display of the data stream from a different point in the stream.
  • Fig. 3 is a schematic illustration of a tissue color bar according to an embodiment of the present invention. Tissue graphical presentation 220 may have been obtained through image processing of a stream of images obtained, for example, from an imager 46 imaging the tissue of the GI tract.
  • the tissue graphical presentation 220 represents, for example, a compressed and perhaps smoothed version of the image stream captured, such that the top horizontal strip of color on the bar may represent a first image, a first representative image, or a first group of images captured and the bottom horizontal strip of color may represent the last image, the last representative image, or a final set of images captured; in alternate embodiments only a portion of the image stream and/or other data stream may be represented.
  • the color scheme of image frames taken of tissue over time may change, for example as an in-vivo imaging device 40 travels along the GI tract.
  • Changes in the color scheme of the images may be used to identify, for example, passage through a specific anatomical site, for example, the duodenum, cecum, or other sites, and/or may indicate pathology, for example bleeding or other pathology.
  • the changes in color streams may be readily identified.
  • passage into the cecum may be identified by a color that may be typical to the large intestine, for example, a color that may indicate content or a color typical of the tissue found in the large intestine.
  • Entrance into the duodenum may be identified by another color that may be typical of the tissue in the small intestine.
  • anatomical sites may be identified by observing color and/or changing color streams on a color bar, for example, a tissue color bar.
  • a pathological condition such as for example, the presence of polyps, bleeding, etc.
  • a specific area of interest such as pathology indicated by blood, may be directly identified through the tissue.
  • a health professional may first examine the tissue graphical presentation 220 and only afterwards decide what block of images to review.
  • an algorithm may be employed to identify anatomical sites, pathologies, or areas of interest using data from such a color bar and bring them to the attention of a health professional, by for example marking the area of interest along the displayed color bar.
  • a health professional may use the thumbnails or markings along a tissue color bar, for example, markings of the first gastric image 320, the first duodenum image 330 and the first cecum image 340, to locate where along the GI tract the data 210 (concurrently being displayed) may be originating from. Knowing the area at which an image was captured may help a health professional decide if an image viewed is representative of a healthy or pathological tissue, and may help a health professional to determine other conditions. According to some embodiments, different colors or other visual indications, shades, hues, sizes or widths, etc. may be artificially added to a processed data stream, for example, in order to accentuate changes along the data stream. Other processing methods may be used to enhance the information presented to the user.
  • smoothing may or may not be performed on selected pixels based on decision rules. For example in one embodiment of the invention smoothing may not be performed on dark pixels or on green pixels that may indicate content in the intestines.
  • Figs. 4A and 4B show examples of graphical presentations in the form of a bar other than tissue color bars.
  • Fig. 4A shows a schematic example of a pH color bar 225 that may map out pH measurements obtained, for example over time or alternatively along a path of a body lumen. Other measurements may be used, for example, temperature, blood sensor, and pressure measurements may be used.
  • Data obtained from an in-vivo pH sensor may be displayed with color, brightness, and/or patterns to map out the pH over time and/or over a path, for example a GI tract where different colors may represent, for example, different pH levels. In other examples, different colors may represent different levels of changes in pH levels. Other suitable presentations may be displayed. Changes in pH along a path may be due to pathology, entrance into or out of anatomical locations, etc. Observed changes in pH over time may, for example, classify physiological occurrences over time, for example a healing process, progression of a medical condition, etc.
  • Fig. 4B is a schematic illustration of blood detecting color bar 226. In one example, color stripes 222 along the bar may indicate a site where blood may have been detected.
  • color bar 226 may give indication of the presence of blood over a period of time.
  • International application published as WO 02/26103, entitled "An immobilizable in vivo sensing device", assigned to the assignee of the present invention and incorporated by reference herein in its entirety, includes, inter alia, descriptions of embodiments of devices, such as capsules, that may be anchored at post-surgical sites.
  • Embodiments described in International application WO 02/26103 may be used in conjunction with the system and methods described herein to capture and transmit data for an in-vivo site over time.
  • a presentation of the captured data may give indication of any changes occurring over time from a current static situation or may show an overview of how a tissue healed or changed over time without having to review the entire stream image by image.
  • FIG. 5 showing schematically a graphical user interface for viewing a streaming display of in-vivo data 210 along with multiple fixed summarized graphical presentations 220, 225, and 226 of a data stream.
  • a single scrolling indicator 250 may be used along with a time bar 230 to point to a position along the fixed presentation of the data streams (220, 225, and 226) so as to indicate where along the bars the data 210 presently being displayed originated.
  • the individual color bars may include for example, a tissue graphical presentation 220, a pH color bar 225, and a blood detector color bar 226.
  • Other numbers of graphical presentations, other suitable types of bars summarizing other data, and other suitable types of presentations may be used.
  • Multiple graphical presentations may be helpful in diagnosis of medical conditions as well as in locating, within a stream of data, sites of interest. Multiple graphical presentations may increase the parameters that are available to a health professional when reviewing, for example, an image stream and may give a better indication of the environmental condition that may exist at a point of observation. For example, in one embodiment, pH, temperature and tissue graphical presentations or other presentations may be displayed, possibly, side by side.
  • two or more streams of information may be displayed simultaneously and combined into a single graphical presentation using for example a unifying algorithm.
  • pH and temperature can be combined into a single color bar where, for example, red holds the temperature values and blue holds the pH values (other suitable colors may be used).
  • a physician may choose which parameters he/she is interested in viewing as a map or summary. Having more than one set of parameters available at one time may make it easier to find more anatomical sites and to identify areas that may, for example, contain pathologies. Numerous algorithms based on case studies or other suitable data may be applied to suggest to the physician alert sites or other information obtained from one or more color bars or from the combination of one or more color bars.
  • Non-limiting examples of different types of graphical presentations may include: • Tissue graphical presentation: brightness, pattern, or other visual representation of a tissue image stream; • Temperature graphical presentation: color, brightness, pattern, or other visual representation of sensed in-vivo temperature data over time and/or along a body lumen; • pH graphical presentation: color, brightness, pattern, or other visual representation of sensed in-vivo pH data over time and/or along a body lumen; • Oxygen saturation graphical presentation: color, brightness, pattern, or other visual representation of sensed oxygen saturation over time and/or along a body lumen; • Pressure graphical presentation: color, brightness, pattern, or other visual representation of sensed in-vivo pressure over time and/or along a body lumen; • Blood detection graphical presentation: color, brightness, pattern, or other visual representation of sensed presence of bleeding over time and/or along a body lumen; • Bio
  • FIG. 6 showing a flow chart of a method for presentation of an in-vivo data stream according to an embodiment of the present invention.
  • a fixed presentation of a data stream may be displayed, e.g.
  • a color bar, a series of strips of varying width or brightness, etc., summarizing, for example, an image stream, a pH data stream, a temperature data stream, etc.
  • a user may annotate portions of the fixed presentation (block 680) for example identified anatomical sites and/or physiological events. More than one fixed presentation may be displayed.
  • a time bar indicating the time that data from a displayed data stream may have been sampled and/or captured may be displayed. A time bar need not be used.
  • the data stream to be displayed may be initiated (block 630) so as, for example, to begin the streaming display.
  • streaming of the data stream may begin.
  • the displayed data stream may be other than the data stream represented in the fixed presentation.
  • an in-vivo device may capture images as well as sample, for example, temperature values, as it progresses through the body lumen.
  • a fixed presentation of temperature values may be displayed alongside a streaming display of image frames captured substantially simultaneously.
  • the fixed presentation as well as the streaming display may be of the captured image frame.
  • a cursor or other indicator may point to a position on the fixed presentation (as well as the time bar) that may correspond to the data (e.g., an image frame, a pH value) displayed in the displayed data stream.
  • a command may be received to stream the display from a different point in the data stream.
  • the user may drag the cursor along the fixed presentation to indicate the point at which the streaming should begin.
  • the user may annotate portions in the fixed presentation (block 680) and at some point click on the annotations to begin streaming the data stream at the corresponding point in the displayed streamed data stream.
  • Other suitable methods of receiving user inputs may be implemented and other suitable methods of annotations other than user input annotations may be implemented, for example as may have been described herein.
  • the start position of the streaming display may be defined by a user input and with that information a command to begin streaming from the defined point may be implemented.
  • Other operations or series of operations may be used.
  • Various suitable methods may be used to abstract data from the source data stream (e.g., an image stream).
  • a set (wherein a set may include one item) or series of data items, for example frames from an image stream, may be extracted. For example, every 10th frame from the image stream may be extracted and/or chosen to represent the image stream in a fixed presentation. In other embodiments, all the data items or frames may be included, or every 5th, 20th, or any other suitable number of frames may be used. In yet another embodiment of the present invention, an image representing an average of every two or more frames may be used.
  • a criterion may be defined by which to define one frame out of a block of frames (e.g. two or more frames) to be representative of that block.
  • a vector of average color or other values e.g., brightness values
  • the average color is calculated in a defined area in each frame, for example, a defined area that is smaller than the area of the image frame.
  • an average red, blue, and green value in a defined area of each frame in the series of frames chosen may be calculated to form 3 color vectors.
  • the defined area may be a centered circle, for example with a radius of 102 pixels taken from an image frame containing, for example 256 x 256 pixels.
  • a filter may be applied, for example a median filter, on the vector of average color values, for example, the three color vectors: red, green, and blue.
  • the pixel size of the resultant tissue color bar presentation may be set by decimating the vector of colors to a desired size, for example, decimating each color vector to the desired size by interpolation; a code sketch of these steps appears after this list.
  • tissue color bar or other data summary may be implemented.
  • a series of data items such as for example one or more individual images, may be converted to a data point, such as a color area or a color strip within a larger display area, such as a color bar.
  • An average brightness value for each image or set of images may be found, and a bar or assemblage of strips of widths, patterns, colors or brightnesses corresponding to the averaged values may be generated.
  • the values such as pH, pressure or temperature corresponding to each of an image or set of images may be found, and a bar or assemblage of strips or other image units of widths, colors or brightnesses corresponding to the averaged values may be generated.
  • One or more images may be converted or processed to a corresponding stripe of color.
  • Various data items may be combined together to individual data points using, for example, averaging, smoothing, etc.
  • the luminance of the images can be normalized and only normalized chromatic information of the data, for example the tissue's color, can be shown, eliminating, for example, the contribution of the light source.
  • Other color bars or other presentations of data obtained in-vivo other than imaged data may be generated.
  • Color bars and other representations of data may aid in reducing the viewing time necessary to review an image stream.
  • a health professional may, in one embodiment of the present invention, use a pointer, for example, a mouse to point at an area along the color bar that may be of interest.
  • the graphical user interface may in turn skip to the corresponding location on the data stream, so that a health professional may focus into the area of interest without having to review an entire image stream.
  • a health professional may for example, change the rate at which to view different portions defined by a tissue color bar.
  • a specific area of interest, such as pathology indicated by blood, may be directly identified through the tissue. As such, a health professional may first examine the tissue color bar and only afterwards decide which block of images he or she is interested in reviewing.
  • a summarized graphical presentation of a data stream may be generated in real time in for example a recorder 12, and displayed in real time on a display included in recorder 12.
  • a graphical presentation for example, color bar may be used for other purposes besides presentation of in-vivo data.
  • a color bar may be used as a summarizing presentation of any stream of frames, for example a video.
  • a summarized graphical presentation, for example a color bar as described herein, of a video may help a viewer to locate different scenes in a video and possibly fast forward, rewind or skip to that scene.
  • a scene in a movie that might have been filmed outdoors may for example have a different color scheme than a later or earlier scene that may have been filmed indoors.
  • the color bar may be analogous to a color table of contents.
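
The frame-extraction, color-averaging, median-filtering and decimation steps listed above can be pictured with a short sketch. The code below is a minimal illustration only, assuming the frames are available as RGB NumPy arrays and using SciPy's median filter and zoom for the filtering and decimation; the function names, the 9-tap kernel and the 512-row bar height are choices made for this example, while the every-10th-frame step and the 102-pixel circular radius follow the figures quoted in the text.

```python
# Minimal sketch of the tissue color bar construction described above:
# sub-sample the stream, take the mean R, G, B inside a centered circle in each
# chosen frame, median-filter the three color vectors, then decimate/interpolate
# to the desired bar height. Names and most parameters are illustrative.
import numpy as np
from scipy.signal import medfilt
from scipy.ndimage import zoom

def frame_mean_rgb(frame, radius=102):
    """Mean R, G, B inside a centered circle of the given radius."""
    h, w, _ = frame.shape
    yy, xx = np.mgrid[0:h, 0:w]
    mask = (yy - h / 2.0) ** 2 + (xx - w / 2.0) ** 2 <= radius ** 2
    return frame[mask].mean(axis=0)                    # shape (3,)

def build_tissue_color_bar(frames, step=10, bar_height=512, kernel=9):
    """frames: list of HxWx3 RGB images -> (bar_height, 3) array of stripe colors."""
    chosen = frames[::step]                            # every `step`-th frame
    colors = np.array([frame_mean_rgb(f) for f in chosen])       # (n, 3)
    for c in range(3):                                 # smooth each channel
        colors[:, c] = medfilt(colors[:, c], kernel_size=kernel)
    bar = zoom(colors, (bar_height / len(colors), 1), order=1)   # resize to bar
    return np.clip(bar, 0, 255).astype(np.uint8)       # row k = color of stripe k
```

Rendering the result is then a matter of drawing each row as one horizontal stripe, top row first, so that the top of the bar corresponds to the earliest frames and the bottom to the latest, as described for the tissue color bar above.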

Abstract

An in-vivo sensing system and a method for creating a summarized graphical presentation of a data stream captured in-vivo. The graphical presentation may be in the form of, for example, a color bar. The color bar may be a fixed display alongside a streaming display of the data stream. A cursor or other indicator may move along the fixed color bar as the data stream is displayed and/or streamed so as to indicate to a health professional what part of the data stream may be currently displayed. The color content in the color bar may map out the data stream and give indication of the location of anatomical sites as well as possible locations of pathology.

Description

SYSTEM AND METHOD FOR PRESENTATION OF DATA STREAMS
FIELD OF THE INVENTION
The present invention relates to presentations of data streams and to a system and method for presenting in-vivo data.
BACKGROUND OF THE INVENTION
Known in-vivo imaging devices include ingestible capsules that may capture images from the inside of the gastrointestinal (GI) tract. Captured images may be transmitted to an external source to be examined, for example, for pathology by a healthcare professional. In some embodiments, in-vivo devices may include various other sensors that may transmit data to an external source for monitoring and diagnosis. An in-vivo device may collect data from different points along a body lumen, for example lumens of the GI tract, and transmit them externally for analysis and diagnosis. The GI tract is a very long and curvy path such that it may be difficult to get a good indication of where along this tract each transmitted datum was obtained. Time bars are known to be used when reviewing data, so as to indicate to the health professional how far along the image stream he/she may have advanced. However, since the in-vivo device may stall or advance at different speeds through various sections of a body lumen, for example, the GI tract, it may not be positively determined in some cases where or at what distance along the GI tract was a particular datum, for example an image, captured. In addition, on the time bar there may be no indication as to when the device may have reached certain anatomical milestones, for example, the duodenum, the cecum, or other anatomical locations in the GI tract. Localization methods have been applied. Some localization methods may indicate the spatial position of the device in space at any given time. Although this information together with the time log may give the health professional a better indication of the rate at which the device has advanced it may still be difficult to correlate the spatial position of the device in space to the specific anatomy of, for example, the GI tract. An in-vivo device may collect data from more than one sensor along the very long and curvy path resulting in multiple data streams captured by the in-vivo sensor.
It may be time consuming and difficult to review multiple long streams of data. In addition, it may be difficult for a health professional to get an overall view of the contents of all the data obtained.
SUMMARY OF THE INVENTION
Embodiments of the present invention may provide a system and method for generating and displaying a fixed graphical presentation of captured in-vivo data streams. In one embodiment of the present invention, the fixed graphical presentation is a varying visual representation of a quantity or a dimension captured in an in-vivo data stream. In one example the graphical presentation is in the form of a color bar. In other embodiments of the present invention, the fixed graphical presentation may be displayed alongside a streaming display of a data stream.
BRIEF DESCRIPTION OF THE DRAWINGS
The subject matter regarded as the invention is particularly pointed out and distinctly claimed in the concluding portion of the specification. The invention, however, both as to organization and method of operation, together with objects, features and advantages thereof, may best be understood by reference to the following detailed description when read with the accompanying drawings in which: Figure 1 is a schematic illustration of an in-vivo imaging system in accordance with embodiments of the present invention; Figure 2 is a schematic illustration of a display of a color bar together with other data captured in-vivo in accordance with an embodiment of the present invention; Figure 3 is a schematic illustration of a color bar with an identified anatomical site in accordance with an embodiment of the current invention; Figures 4A and 4B are schematic illustrations of exemplary pH and blood detecting color bars respectively in accordance with embodiments of the present invention; Figure 5 is a display with more than one color bar that may be viewed substantially simultaneously according to an embodiment of the present invention; Figure 6 is a flow chart describing a method for presentation of in-vivo data according to an embodiment of the present invention; and Figure 7 is a flow chart describing a method for constructing a color bar from a stream of images in accordance with an embodiment of the present invention. It will be appreciated that for simplicity and clarity of illustration, elements shown in the figures have not necessarily been drawn to scale. For example, the dimensions of some of the elements may be exaggerated relative to other elements for clarity. Further, where considered appropriate, reference numerals may be repeated among the figures to indicate corresponding or analogous elements.
DETAILED DESCRIPTION OF THE INVENTION
The following description is presented to enable one of ordinary skill in the art to make and use the invention as provided in the context of a particular application and its requirements. Various modifications to the described embodiments will be apparent to those with skill in the art, and the general principles defined herein may be applied to other embodiments. Therefore, the present invention is not intended to be limited to the particular embodiments shown and described, but is to be accorded the widest scope consistent with the principles and novel features herein disclosed. In the following detailed description, numerous specific details are set forth in order to provide a thorough understanding of the present invention. However, it will be understood by those skilled in the art that the present invention may be practiced without these specific details. In other instances, well-known methods, procedures, and components have not been described in detail so as not to obscure the present invention. Embodiments of the present invention offer a device, system and method for generating a fixed graphical presentation of a captured data stream, for example image streams, other non-imaged data, or other data such as color coded, possibly imaged data (e.g., pH data, temperature data, etc.) that may have been collected in vivo, for example along the GI tract. The summarized graphical presentation may include, for example, a varying visual representation, for example, a series of colors that may be at least partially representative of a quantity and/or data collected, e.g. a series of colors where each color presented on the bar may be representative of a value of a parameter. Other suitable representations may be used, and other visual dimensions or qualities, such as brightness, size, width, pattern, etc. may be used. In some embodiments of the present invention, the summarized graphical presentation may be a fixed display alongside a streaming display of the data stream. In one embodiment of the invention, the presentation may map out a varying quantity (e.g. a captured data stream) and may, for example, give indication of the relationship between the data stream captured and the anatomical origin or position relative to a start of the captured data stream, for example, the approximate or exact site, for example, in the GI tract from where various data captured may have originated. In another embodiment of the invention, the mapping may give, for example, an indication of an event (e.g. a physiological event) captured, measured, or otherwise obtained. In yet another embodiment of the invention, the mapping may give for example an indication of change of one or more parameters measured over time, for example, a change occurring due to pathology, a natural change in the local environment, or due to other relevant changes. The location may be relative to other information, for example, anatomical attributes along for example the GI tract. The location may in some embodiments be an absolute location, such as a location based on time or based on position of an in-vivo information capture device, based on an image frame in a sequence of images, etc. Reference is made to Fig. 1, which shows a schematic diagram of an in-vivo sensing system according to one embodiment of the present invention.
Typically the in-vivo sensing system, for example, an image sensing system, may include an in-vivo sensing device 40, for example an imaging device having an imager 46, for capturing images, an illumination source 42, for illuminating the body lumen, a power source 45 for powering device 40, and a transmitter 41 with antenna 47, for transmitting image and possibly other data to an external receiving device 12. In some embodiments of the present invention, in-vivo device 40 may include one or more sensors 30 other than and/or in addition to image sensor 46, for example, temperature sensors, pH sensors, pressure sensors, blood sensors, etc. In some embodiments of the present invention, device 40 may be an autonomous device, a capsule, or a swallowable capsule. In other embodiments of the present invention, device 40 may not be autonomous, for example, device 40 may be an endoscope or other in-vivo imaging sensing device. The in-vivo imaging device 40 may typically, according to embodiments of the present invention, transmit information (e.g., images or other data) to an external data receiver and/or recorder 12 possibly close to or worn on a subject. Typically, the data receiver 12 may include an antenna or antenna array 15 and a data receiver storage unit 16. The data receiver and/or recorder 12 may of course take other suitable configurations and may not include an antenna or antenna array. In some embodiments of the present invention, the receiver may, for example, include processing power and/or an LCD display for displaying image data. The data receiver and/or recorder 12 may, for example, transfer the received data to a larger computing device 14, such as a workstation or personal computer, where the data may be further analyzed, stored, and/or displayed to a user. Typically, computing device 14 may include processing unit 13, data processor storage unit 19 and monitor 18. Computing device 14 may typically be a personal computer or workstation, which includes standard components such as processing unit 13, a memory, for example storage or memory 19, a disk drive, a monitor 18, and input-output devices, although alternate configurations are possible. Processing unit 13 typically, as part of its functionality, acts as a controller controlling the display of data, for example, image data or other data. Monitor 18 is typically a conventional video display, but may, in addition, be any other device capable of providing image or other data. Instructions or software for carrying out a method according to an embodiment of the invention may be included as part of computing device 14, for example stored in memory 19. In other embodiments, each of the various components need not be required; for example, the in-vivo device 40 may transmit or otherwise transfer (e.g., by wire) data directly to a viewing or computing device 14. In-vivo imaging systems suitable for use with embodiments of the present invention may be similar to various embodiments described in US Patent No. 5,604,531, assigned to the common assignee of the present application and incorporated herein by reference, and/or Publication Number WO 01/65995, also assigned to the common assignee of the present application and incorporated herein by reference. Other in-vivo systems, having other configurations, may be used. Of course, devices, systems, structures, functionalities and methods as described herein may have other configurations, sets of components, processes, etc.
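
As a rough picture of the data handled by such a system, the record below shows one way the receiver/recorder 12 might hold each transmission before it is transferred to workstation 14 for display; this is a sketch only, and the field names and types are assumptions of this example rather than anything specified by the patent.

```python
# Hypothetical per-transmission record as it might be stored by receiver 12 and
# later processed on workstation 14. Field names and types are illustrative only.
from dataclasses import dataclass
from typing import Optional
import numpy as np

@dataclass
class CapsuleSample:
    frame_index: int                       # position in the captured stream
    timestamp_ms: int                      # capture time, used for the time bar
    image: Optional[np.ndarray] = None     # HxWx3 RGB frame from imager 46
    ph: Optional[float] = None             # reading from a pH sensor, if present
    temperature_c: Optional[float] = None  # reading from a temperature sensor
```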
Embodiments of the present invention include a device, system, and method for generating a typically concise and/or summarized graphical presentation of parameters sensed through or over time in a body lumen, for example the GI tract or any other tract, through which a sensing device may be present and/or traveling. Viewing a data stream captured by an in-vivo device, e.g., viewing an image stream transmitted by an ingestible imaging capsule, may be a prolonged procedure. A summarized presentation of the data captured may, for example, provide a visual representation and/or map of the captured data and may help focus a health professional's attention on an area of interest and/or may promote a health professional's understanding of the origin and contents of the data being viewed. One or more streams of data obtained from said sensing device may be processed to create one or more summarized presentations that may, for example, be displayed in a graphical user interface, for example a graphical user interface of analysis software. According to one embodiment, presentation of a data stream (e.g., a stream or set of images, a sequence of pH data, etc.) may be with a bar, for example, a color bar that may be displayed, for example, on a monitor 18, perhaps through a graphical user interface, or, for example, in real time on an LCD display on a receiver 12 as a data stream is being captured. The presentation may include a varying visual representation of a quantity or a dimension representing, for example, a varying quantity captured in a portion of (e.g., an image frame) an in-vivo data stream. In one example the dimension may be color. The presentation may typically be an abstracted or summarized version of image or other data being presented, for example, streamed on a different portion of the display. The presentation may typically include multiple image items or data items such as bars, stripes, pixels or other components, assembled in a continuing series, such as a bar including multiple strips, each strip corresponding to an image frame. For example, a portion of the presentation may represent a summary of the overall color scheme, brightness, pH level, temperature, pressure, or other quantity on a displayed frame or data item. Reference is now made to Fig. 2 showing a display and/or a graphical user interface 200 for displaying data captured in-vivo. Display 200 may include a summarized graphical presentation 220 of an in-vivo data stream, for example, a color bar. Typically, the graphical presentation 220 may be a fixed presentation displayed alongside a streaming display of a data stream 210, for example, an image stream in accordance with some embodiments of the present invention. In other embodiments of the present invention, graphical presentation 220 may be displayed separately. The graphical presentation 220 may include a series of colors, a series of colored areas, or a series of patterns, image items, images or pixel groups (e.g., a series of stripes 222 or areas of color arranged to form a larger bar or rectangular area), where each, for example, color in the series 222 may be associated with and/or correspond to an element or a group of elements in the original data stream. For example, each colored stripe 222 may correspond to an image or a group of images from a data stream 210. Image units other than stripes (e.g., pixels, blocks, etc.)
may be used, and the image units may vary in a dimension other than color (e.g., pattern, size, width, brightness, animation, etc). One image unit (e.g., a stripe 222) may represent one or more units (e.g., image frames) in the original data stream. Typically, the series of, for example, colors in the bar may be arranged in the same sequence or order in which the data stream, for example, the images or groups of images may typically be displayed. In one embodiment of the present invention, pointing at a stripe in a graphical presentation 220 may advance the image stream to the frames corresponding to that stripe. The color bar may be generated by, for example, assigning a color to each element (e.g., an image frame) or subgroup of elements in the data stream and then processing the series of colors, for example such that it may emphasize variations within the displayed properties. In one embodiment of the invention, it may be processed, for example to emphasize cue points in an accompanying video such that, for example, it may be used as an ancillary tool for indicating points of interest. In one embodiment of the invention, a stream of data 210 may be displayed alongside one or more bars and/or graphical presentations (220 and 230) described herein. The data stream 210 may be for example data represented in the graphical presentation 220 (e.g. a captured in-vivo image stream) or other data obtained and/or sampled simultaneously or substantially simultaneously with the data represented in the graphical presentation 220. In one example, a marker or indicator 250 may progress across or along the graphical presentation 220 as the substantially corresponding datum in data stream 210 (e.g., video) may be concurrently displayed to indicate the correspondence between the graphical presentation 220 and the data stream 210. In other embodiments of the invention, the presentation may be of a shape other than a bar, for example a circle, oval, square, etc. According to other embodiments, the presentation may be in the form of an audio track, graph, or other suitable graphic presentation. An indicator 250 such as a cursor may advance along the time bar 230 and graphical presentation 220 as the image stream 210 is scrolled on the display 200. In one example, control buttons 240 may be included in the display that may allow the user to, for example, fast-forward, rewind, stop play or reach the beginning or end of, for example, an image stream 210. In other embodiments of the present invention, a user may control the display of a data stream 210, for example, by altering the start position of the streaming display, e.g. skipping to areas of interest, by moving the position of indicator 250, for example with a mouse or other pointing device. In other embodiments of the present invention, a user and/or health professional may insert indications or markers such as thumbnails to mark locations along the image stream for easy access to those locations in the future. For example, a health professional may mark these milestones on the graphical presentation 220 (e.g., using a pointing device such as a mouse, a keyboard, etc).
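
The correspondence between the bar and the stream described in this passage (pointing at a stripe jumps the stream, and indicator 250 tracks the frame being shown) reduces to a simple index mapping. The sketch below illustrates that bookkeeping under the assumption of a vertical bar of bar_height rows summarizing n_frames frames; the function names and the GUI wiring in the comments are illustrative, not taken from the patent.

```python
# Illustrative two-way mapping between a vertical color bar and the frame index
# of the displayed stream: a click on a bar row seeks the stream, and the moving
# indicator is drawn at the row matching the frame currently shown.

def row_to_frame(row, bar_height, n_frames):
    """Bar row under the pointer -> index of the frame to jump to."""
    row = max(0, min(bar_height - 1, row))
    return int(row * n_frames / bar_height)

def frame_to_row(frame_index, bar_height, n_frames):
    """Frame currently displayed -> bar row where the indicator is drawn."""
    frame_index = max(0, min(n_frames - 1, frame_index))
    return int(frame_index * bar_height / n_frames)

# A click handler in the viewer might do something like:
#   viewer.seek(row_to_frame(click_y - bar_top, bar_height, n_frames))
# and on every displayed frame the indicator could be redrawn at:
#   bar_top + frame_to_row(viewer.current_frame, bar_height, n_frames)
```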
Some embodiments described in Published United States patent application US-2002-0171669-A1, entitled "System and method for annotation on a moving image", published on November 21, 2002, assigned to the assignee of the present invention and incorporated by reference herein in its entirety, include methods and devices to mark or annotate portions of an image stream; such methods may be used in conjunction with embodiments of the present invention. Other suitable methods for marking or annotating a stream of data may be used. A user may then "click" on the thumbnails to advance to the site of the datum of interest, for example the image frame of interest, or alternatively click on the graphical presentation 220 to advance or go back to the image frame of interest and then, for example, continue or begin streaming and/or viewing the data stream from that desired point of interest. Thumbnails or other markers may be defined based on an image of interest from the data stream 210, based on a location identified on the graphical presentation 220, or based on a time recorded on time bar 230. Other suitable methods of defining thumbnails or other markers or notations may be used. For example, a computer algorithm may be used to identify thumbnails that may be of interest to, for example, the health professional. Algorithm-based thumbnails may also, for example, be based on an image of interest from the data stream 210, on a location identified on the graphical presentation 220, on a time recorded on time bar 230, or on other criteria. In other embodiments, the graphical presentation 220 may in itself be a series of color thumbnails, so that a user may point or "click" on colors in the color bar to restart the display of the data stream from a different point in the stream.

Fig. 3 is a schematic illustration of a tissue color bar according to an embodiment of the present invention. Tissue graphical presentation 220 may have been obtained through image processing of a stream of images obtained, for example, from an imager 46 imaging the tissue of the GI tract. Other lumens may be sensed, and other modalities (e.g., temperature) may be sensed. The tissue graphical presentation 220 represents, for example, a compressed and perhaps smoothed version of the captured image stream, such that the top horizontal stripe of color on the bar may represent a first image, a first representative image, or a first group of images captured, and the bottom horizontal stripe of color may represent the last image, the last representative image, or a final set of images captured; in alternate embodiments, only a portion of the image stream and/or other data stream may be represented. In one embodiment of the present invention, the color scheme of image frames taken of tissue over time may change, for example, as an in-vivo imaging device 40 travels along the GI tract. Changes in the color scheme of the images may be used to identify, for example, passage through a specific anatomical site, for example, the duodenum, cecum, or other sites, and/or may indicate pathology, for example bleeding or other pathology. When presenting an image stream of a tissue in a summarized, concise color bar, changes in the color stream may be readily identified. For example, passage into the cecum may be identified by a color that may be typical of the large intestine, for example, a color that may indicate content or a color typical of the tissue found in the large intestine.
Entrance into the duodenum may be identified by another color that may be typical of the tissue in the small intestine. Other anatomical sites may be identified by observing colors and/or changing color streams on a color bar, for example, a tissue color bar. In other embodiments, a pathological condition, such as, for example, the presence of polyps, bleeding, etc., may be identified by viewing, for example, a tissue graphical presentation 220. A specific area of interest, such as pathology indicated by blood, may be directly identified through the tissue graphical presentation 220. As such, a health professional may first examine the tissue graphical presentation 220 and only afterwards decide what block of images to review. In some embodiments of the present invention, an algorithm may be employed to identify anatomical sites, pathologies, or areas of interest using data from such a color bar and bring them to the attention of a health professional, for example, by marking the area of interest along the displayed color bar. A health professional may use the thumbnails or markings along a tissue color bar, for example, thumbnails and/or markings of the first gastric image 320, the first duodenum image 330, and the first cecum image 340, to locate where along the GI tract the data 210 (concurrently being displayed) may be originating. Knowing the area at which an image was captured may help a health professional decide if a viewed image is representative of healthy or pathological tissue, and may help a health professional to determine other conditions.

According to some embodiments, different colors or other visual indications, shades, hues, sizes or widths, etc. may be artificially added to a processed data stream, for example, in order to accentuate changes along the data stream. Other processing methods may be used to enhance the information presented to the user. In one embodiment of the invention, smoothing may or may not be performed on selected pixels based on decision rules. For example, in one embodiment of the invention, smoothing may not be performed on dark pixels or on green pixels that may indicate content in the intestines.

Reference is now made to Figs. 4A and 4B showing examples of graphical presentations in the form of bars other than tissue color bars. For example, Fig. 4A shows a schematic example of a pH color bar 225 that may map out pH measurements obtained, for example, over time or, alternatively, along a path of a body lumen. Other measurements, for example, temperature, blood, and pressure measurements, may be used. Data obtained from an in-vivo pH sensor may be displayed with color, brightness, and/or patterns to map out the pH over time and/or over a path, for example the GI tract, where different colors may represent, for example, different pH levels. In other examples, different colors may represent different levels of change in pH levels. Other suitable presentations may be displayed. Changes in pH along a path may be due to pathology, entrance into or out of anatomical locations, etc. Observed changes in pH over time may, for example, be used to classify physiological occurrences over time, for example a healing process, progression of a medical condition, etc.

Fig. 4B is a schematic illustration of a blood-detecting color bar 226. In one example, color stripes 222 along the bar may indicate a site where blood may have been detected. In one embodiment of the present invention, color bar 226 may give an indication of the presence of blood over a period of time.
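As an illustration of the kind of algorithm mentioned above for bringing candidate areas of interest to a health professional's attention, the following is a minimal sketch that scans the per-stripe average colors of a summary bar and flags stripes whose redness or rate of color change exceeds a threshold. The thresholds and the two heuristics are illustrative assumptions only, not the disclosed detection method.

```python
import numpy as np

# Minimal sketch (assumptions only): flag candidate areas of interest along a
# summary color bar, given per-stripe average R, G, B values in [0, 1].

def flag_areas_of_interest(stripe_rgb, redness_thresh=0.15, change_thresh=0.20):
    """stripe_rgb: (N, 3) array of average R, G, B values per stripe."""
    stripe_rgb = np.asarray(stripe_rgb, dtype=float)
    r, g, b = stripe_rgb[:, 0], stripe_rgb[:, 1], stripe_rgb[:, 2]

    # Stripes where red strongly dominates may hint at bleeding.
    redness = r - (g + b) / 2.0
    red_flags = np.where(redness > redness_thresh)[0]

    # Abrupt changes between consecutive stripes may hint at an anatomical transition.
    diffs = np.linalg.norm(np.diff(stripe_rgb, axis=0), axis=1)
    transition_flags = np.where(diffs > change_thresh)[0] + 1

    return red_flags, transition_flags
```

The flagged stripe indices could then, for example, be highlighted along the displayed color bar or converted to frame indices as in the earlier sketch.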
International application published as WO 02/26103, entitled "An immobilizable in vivo sensing device", assigned to the assignee of the present invention and incorporated by reference herein in its entirety, includes, inter alia, descriptions of embodiments of devices, such as capsules, that may be anchored at post-surgical sites. Embodiments described in International application WO 02/26103 may be used in conjunction with the system and methods described herein to capture and transmit data from an in-vivo site over time. A presentation of the captured data, for example a color bar, may give an indication of any changes occurring over time from a current static situation, or may show an overview of how a tissue healed or changed over time, without having to review the entire stream image by image.

Reference is now made to Fig. 5 showing schematically a graphical user interface for viewing a streaming display of in-vivo data 210 along with multiple fixed summarized graphical presentations 220, 225, and 226 of a data stream. A single scrolling indicator 250 may be used along with a time bar 230 to point to a position along the fixed presentations of the data streams (220, 225, and 226) so as to indicate where along the bars the data 210 presently being displayed originated. The individual color bars may include, for example, a tissue graphical presentation 220, a pH color bar 225, and a blood detector color bar 226. Other numbers of graphical presentations, other suitable types of bars summarizing other data, and other suitable types of presentations may be used. Multiple graphical presentations may be helpful in diagnosis of medical conditions as well as in locating sites of interest within a stream of data. Multiple graphical presentations may increase the number of parameters available to a health professional when reviewing, for example, an image stream, and may give a better indication of the environmental conditions that may exist at a point of observation. For example, in one embodiment, pH, temperature, and tissue graphical presentations or other presentations may be displayed, possibly side by side. In an alternate embodiment, two or more streams of information may be displayed simultaneously and combined into a single graphical presentation using, for example, a unifying algorithm. For example, pH and temperature can be combined into a single color bar where, for example, red holds the temperature values and blue holds the pH values (other suitable colors may be used). A physician may choose which parameters he/she is interested in viewing as a map or summary. Having more than one set of parameters available at one time may make it easier to find more anatomical sites and to identify areas that may, for example, contain pathologies. Numerous algorithms based on case studies or other suitable data may be applied to suggest to the physician alert sites or other information obtained from one or more color bars or from the combination of one or more color bars. Other suitable indicating maps, information summaries, or color bars may be used.
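To make the idea of a unifying algorithm concrete, the following is a minimal sketch of combining two substantially simultaneous data streams into a single color bar, with red carrying temperature and blue carrying pH as in the example above. The normalization ranges and the function name are illustrative assumptions, not a disclosed implementation.

```python
import numpy as np

# Minimal sketch (assumptions only): combine temperature and pH streams into a
# single color bar, red holding temperature and blue holding pH.

def combined_color_bar(temperature, ph, temp_range=(35.0, 40.0), ph_range=(1.0, 8.0)):
    """Return an (N, 3) array of RGB stripes in [0, 1], one per sample pair."""
    temperature = np.asarray(temperature, dtype=float)
    ph = np.asarray(ph, dtype=float)

    # Scale each stream to [0, 1] over its expected range.
    red = np.clip((temperature - temp_range[0]) / (temp_range[1] - temp_range[0]), 0.0, 1.0)
    blue = np.clip((ph - ph_range[0]) / (ph_range[1] - ph_range[0]), 0.0, 1.0)
    green = np.zeros_like(red)  # unused channel in this two-parameter example

    return np.stack([red, green, blue], axis=1)
```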
Non-limiting examples of different types of graphical presentations (e.g., color bars, series of brightness levels, etc.) may include:
• Tissue graphical presentation: brightness, pattern, or other visual representation of a tissue image stream;
• Temperature graphical presentation: color, brightness, pattern, or other visual representation of sensed in-vivo temperature data over time and/or along a body lumen;
• pH graphical presentation: color, brightness, pattern, or other visual representation of sensed in-vivo pH data over time and/or along a body lumen;
• Oxygen saturation graphical presentation: color, brightness, pattern, or other visual representation of sensed oxygen saturation over time and/or along a body lumen;
• Pressure graphical presentation: color, brightness, pattern, or other visual representation of sensed in-vivo pressure over time and/or along a body lumen;
• Blood detection graphical presentation: color, brightness, pattern, or other visual representation of the sensed presence of bleeding over time and/or along a body lumen;
• Biosensor graphical presentation: color, brightness, pattern, or other visual representation of results obtained from one or more in-vivo biosensors;
• Speed graphical presentation: color, brightness, pattern, or other visual representation of the speed of a moving in-vivo device;
• Spatial position graphical presentation: color, brightness, pattern, or other visual representation of the spatial position and/or orientation of an in-vivo device over time;
• Ultrasound graphical presentation: color, brightness, pattern, or other visual representation of data sensed from an in-vivo ultrasound probe; and
• Motility graphical presentation: color, brightness, pattern, or other visual representation of the sensed motility of a traveling in-vivo device.
International application published as WO 02/102223, entitled "Motility analysis within a gastrointestinal tract", assigned to the assignee of the present invention and incorporated by reference herein in its entirety, includes, inter alia, a device, system, and methods for determining in-vivo motility that may be used in conjunction with the device, system, and method described herein. Other suitable representations other than bars and other suitable types of data may be implemented using the device, system, and method described herein.

Reference is now made to Fig. 6 showing a flow chart of a method for presentation of an in-vivo data stream according to an embodiment of the present invention. In block 610, a fixed presentation of a data stream may be displayed, e.g., a color bar, a series of stripes of varying width or brightness, etc., summarizing, for example, an image stream, a pH data stream, a temperature data stream, etc. A user may annotate portions of the fixed presentation (block 680), for example, identified anatomical sites and/or physiological events. More than one fixed presentation may be displayed. In block 620, a time bar may be displayed indicating the time at which data from a displayed data stream may have been sampled and/or captured. A time bar need not be used. The data stream to be displayed may be initiated (block 630) so as, for example, to begin the streaming display. In block 640, streaming of the data stream may begin. The displayed data stream may be other than the data stream represented in the fixed presentation. For example, an in-vivo device may capture images as well as sample, for example, temperature values, as it progresses through the body lumen.
In one example, a fixed presentation of temperature values may be displayed alongside a streaming display of image frames captured substantially simultaneously. In other examples, the fixed presentation as well as the streaming display may be of the captured image frames. In block 650, as the data stream progresses, a cursor or other indicator may point to a position on the fixed presentation (as well as on the time bar) that may correspond to the data (e.g., an image frame, a pH value) displayed in the displayed data stream. In block 660, a command may be received to stream the display from a different point in the data stream. In one example, the user may drag the cursor along the fixed presentation to indicate the point at which the streaming should begin. In other examples, the user may annotate portions of the fixed presentation (block 680) and at some point click on the annotations to begin streaming the data stream at the corresponding point in the displayed streamed data stream. Other suitable methods of receiving user input may be implemented, and other suitable methods of annotation other than user-input annotations may be implemented, for example, as may have been described herein. In block 670, the start position of the streaming display may be defined by a user input, and with that information a command to begin streaming from the defined point may be implemented. Other operations or series of operations may be used. Various suitable methods may be used to abstract data from the source data stream (e.g., an image stream, a series of temperature data) into the fixed representation.

Reference is now made to Fig. 7 describing a method of generating a fixed summary of a data representation, for example a tissue color bar, according to an embodiment of the present invention. In an exemplary embodiment, in block 510, a set (where a set may include one item) or series of data items, for example, frames from an image stream, may be extracted. For example, every 10th frame from the image stream may be extracted and/or chosen to represent the image stream in a fixed presentation. In other embodiments, all the data items or frames may be included, or every 5th, 20th, or any other suitable number of frames may be used. In yet other embodiments of the present invention, an image representing an average of every two or more frames may be used. In one example, a criterion may be defined by which to select one frame out of a block of frames (e.g., two or more frames) to be representative of that block. In block 520, a vector of average color or other values (e.g., brightness values) may be calculated. In one embodiment of the present invention, the average color is calculated in a defined area in each frame, for example, a defined area that is smaller than the area of the image frame. For example, an average red, blue, and green value in a defined area of each frame in the series of frames chosen may be calculated to form three color vectors. In one example, the defined area may be a centered circle, for example, with a radius of 102 pixels, taken from an image frame containing, for example, 256 x 256 pixels. In other examples, only one or two colors may be used to generate a color bar. In block 530, a filter may be applied, for example a median filter, to the vector of average color values, for example, to the three color vectors: red, green, and blue. An exemplary filter may, for example, have a length defined by the following equation: length = 1 + 2·(alpha·N/Np), where alpha = 2.2, N is the original pixel size, and Np is the desired pixel size of the resultant tissue color bar presentation. Other equations or formulae may be used. In block 540, the pixel size of the resultant tissue color bar presentation may be set by decimating the vector of colors to a desired size, for example, decimating each color vector to the desired size by interpolation.
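For illustration, the following is a minimal sketch of blocks 510-540 as just described, assuming the image stream is available as a NumPy array of RGB frames. The subsampling step, circular sampling region, alpha value, bar size, and function names are taken from or modeled on the examples above and are assumptions rather than the disclosed implementation.

```python
import numpy as np
from scipy.ndimage import median_filter

# Minimal sketch (assumptions only) of blocks 510-540: extract frames, average
# R, G, B inside a centered circle, median-filter each color vector using the
# length formula above, and decimate to the desired bar size by interpolation.

def circular_mask(height, width, radius):
    """Boolean mask of a centered circle, e.g., radius 102 in a 256 x 256 frame."""
    yy, xx = np.ogrid[:height, :width]
    cy, cx = (height - 1) / 2.0, (width - 1) / 2.0
    return (yy - cy) ** 2 + (xx - cx) ** 2 <= radius ** 2

def tissue_color_bar(frames, step=10, radius=102, bar_size=500, alpha=2.2):
    """frames: array of shape (num_frames, H, W, 3); returns (bar_size, 3) colors."""
    chosen = frames[::step]                               # block 510
    mask = circular_mask(chosen.shape[1], chosen.shape[2], radius)
    vectors = chosen[:, mask, :].mean(axis=1)             # block 520, shape (N, 3)

    n, n_p = vectors.shape[0], bar_size                   # block 530
    length = int(1 + 2 * round(alpha * n / n_p))
    filtered = np.stack(
        [median_filter(vectors[:, c], size=length) for c in range(3)], axis=1)

    src = np.linspace(0.0, 1.0, n)                        # block 540
    dst = np.linspace(0.0, 1.0, n_p)
    return np.stack(
        [np.interp(dst, src, filtered[:, c]) for c in range(3)], axis=1)
```

Each row of the returned array could then be drawn as one horizontal stripe of the tissue graphical presentation 220.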
Other methods of generating a tissue color bar or other data summary may be implemented. In one embodiment, a series of data items, such as, for example, one or more individual images, may be converted to a data point, such as a color area or a color stripe within a larger display area, such as a color bar. An average brightness value for each image or set of images may be found, and a bar or assemblage of stripes of widths, patterns, colors, or brightnesses corresponding to the averaged values may be generated. Values such as pH, pressure, or temperature corresponding to each image or set of images (e.g., in a device collecting both image and other data) may be found, and a bar or assemblage of stripes or other image units of widths, colors, or brightnesses corresponding to the averaged values may be generated. One or more images may be converted or processed into a corresponding stripe of color. Various data items may be combined into individual data points using, for example, averaging, smoothing, etc. In one embodiment, the luminance of the images can be normalized and only the normalized chromatic information of the data, for example the tissue's color, can be shown, eliminating, for example, the contribution of the light source. Other color bars or other presentations of data obtained in-vivo other than imaged data may be generated.

Color bars and other representations of data may aid in reducing the viewing time necessary to review an image stream. A health professional may, in one embodiment of the present invention, use a pointer, for example, a mouse, to point at an area along the color bar that may be of interest. The graphical user interface may in turn skip to the corresponding location on the data stream, so that a health professional may focus on the area of interest without having to review an entire image stream. A health professional may, for example, change the rate at which different portions defined by a tissue color bar are viewed. A specific area of interest, such as pathology indicated by blood, may be directly identified through the tissue color bar. As such, a health professional may first examine the tissue color bar and only afterwards decide which block of images he or she is interested in reviewing. When screening patients, it may be possible to review only one or more data presentations, such as a tissue color bar. In other examples, a summarized graphical presentation of a data stream may be generated in real time in, for example, a receiver 12, and displayed in real time on a display included in receiver 12.

In other embodiments of the present invention, a graphical presentation, for example a color bar, may be used for other purposes besides presentation of in-vivo data. For example, a color bar may be used as a summarizing presentation of any stream of frames, for example a video. A summarized graphical presentation of a video, for example a color bar as described herein, may help a viewer to locate different scenes in the video and possibly fast-forward, rewind, or skip to a scene. For example, a scene in a movie that might have been filmed outdoors may, for example, have a different color scheme than a later or earlier scene that may have been filmed indoors. The color bar may be analogous to a color table of contents.
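One possible reading of the luminance normalization mentioned above is to divide each average color by its overall intensity so that only chromatic information remains. The following is a minimal sketch under that assumption; it is not the disclosed implementation.

```python
import numpy as np

# Minimal sketch (assumption only) of luminance normalization: divide each
# per-stripe average color by its overall intensity so that only the normalized
# chromatic information (e.g., the tissue's color) is shown, largely removing
# the contribution of the light source.

def normalize_chromaticity(stripe_rgb, eps=1e-6):
    """stripe_rgb: (N, 3) array of average R, G, B values per stripe."""
    stripe_rgb = np.asarray(stripe_rgb, dtype=float)
    intensity = stripe_rgb.sum(axis=1, keepdims=True) + eps
    return stripe_rgb / intensity  # rows sum to ~1, keeping only chromaticity
```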
The foregoing description of the embodiments of the invention has been presented for the purposes of illustration and description. It is not intended to be exhaustive or to limit the invention to the precise form disclosed. It should be appreciated by persons skilled in the art that many modifications, variations, substitutions, changes, and equivalents are possible in light of the above teaching. It is, therefore, to be understood that the appended claims are intended to cover all such modifications and changes as fall within the true spirit of the invention.

Claims

1. A method for presentation of a data stream, said method comprising: generating a fixed graphical presentation of the data stream wherein said presentation includes at least a varying visual representation, said visual representation varying in accordance with a varying quantity in the data stream; and displaying the fixed graphical presentation.
2. The method according to claim 1 comprising displaying the fixed presentation of the data stream alongside a streaming display of the data stream.
3. The method according to claim 2 comprising displaying a marker in a position along the fixed presentation, said position substantially corresponding to a datum from the data stream, said datum concurrently being displayed.
4. The method according to claim 2 comprising generating a fixed graphical presentation of a first data stream and displaying the fixed presentation alongside a streaming display of a second data stream, said second data stream captured substantially simultaneously with the first data stream.
5. The method according to claim 4 comprising displaying a marker in a position along the fixed presentation, said marker substantially corresponding to a datum from the second data stream.
6. The method according to claim 4 wherein the second data stream is an image stream.
7. The method according to claim 4 wherein the second data stream is a stream of measurements selected from a group consisting of: in-vivo pH measurements, temperature measurements, pressure measurements, blood measurements.
8. The method according to claim 1 wherein the data stream is an image stream.
9. The method according to claim 1 wherein the data stream is captured by an in-vivo device.
10. The method according to claim 1 wherein the data stream is a stream of in-vivo images of the GI tract.
11. The method according to claim 1 wherein the data stream is transmitted from an ingestible in-vivo device.
12. The method according to claim 1 wherein the fixed presentation is a color bar.
13. The method according to claim 4 comprising: providing a graphical user interface accepting a user indication to alter the start position of the streaming of the first data stream.
14. The method according to claim 1 comprising accepting a user annotation.
15. A method for generating a summarized presentation of an image stream, said method comprising: extracting a set of frames from the image stream; calculating a vector of average color values for each frame in the set of frames; and arranging the vector of average color values in a series.
16. The method according to claim 15, comprising arranging a series of image items on a display according to the vector.
17. The method according to claim 15 wherein the image stream is a stream of images captured in-vivo.
18. The method according to claim 15 wherein the image stream is an image stream transmitted by an in-vivo ingestible capsule.
19. The method according to claim 15 wherein the series of frames includes all the frames in the image stream.
20. The method according to claim 15 wherein the vector of average color values comprises a vector of average red color values.
21. The method according to claim 15 wherein the vector of average color values comprises a vector of average red, blue, and green color values.
22. The method according to claim 15 wherein the calculating is performed in a defined area in each of the frames in the series of frames.
23. The method according to claim 22 wherein the defined area is smaller than the area of the image frame.
24. The method according to claim 15 comprising applying a filter to the vector of average color values.
25. The method according to claim 24 wherein applying a filter comprises applying a median filter.
26. The method according to claim 15 comprising decimating the vector of colors.
27. A system for presentation of in-vivo data, the system comprising: an in-vivo sensing device; a processing unit to generate a summarized presentation of a data stream sampled by said in-vivo sensing device; and a display to display the summarized presentation.
28. The system according to claim 27 wherein the in-vivo sensing device comprises an imager.
29. The system according to claim 27 wherein the in-vivo sensing device is an ingestible capsule.
30. The system according to claim 27 wherein the data stream is an image stream.
31. The system according to claim 27 wherein the data stream comprises images of the GI tract.
32. The system according to claim 27 wherein the graphical presentation of a data stream is a color bar.
33. The system according to claim 32 wherein the color bar is generated from an image stream captured by said in-vivo sensing device.
34. A system for presentation of a data stream, said system comprising: a controller to generate a fixed graphical presentation of the data stream wherein said presentation includes at least a varying visual representation, said visual representation varying in accordance with a varying quantity in the data stream; and a display unit.
35. The system according to claim 34 wherein the display unit is to display the fixed graphical presentation of the data stream.
36. The system according to claim 34 wherein the display unit is to display the fixed graphical presentation of the data stream alongside a streaming display of the data stream.
37. The system according to claim 36 wherein the display unit is to display a marker in a position along the fixed presentation, said marker substantially corresponding to a datum from the data stream concurrently being displayed.
38. The system according to claim 35 comprising a graphical user interface.
39. The system according to claim 38 wherein the graphical user interface is to accept a user indication to alter the start position of the streaming of the first data stream.
40. The system according to claim 38 wherein the graphical user interface is to accept a user indication to annotate a portion ofthe data stream.
PCT/IL2004/000906 2003-10-02 2004-09-28 System and method for presentation of data streams WO2005031650A1 (en)

Priority Applications (5)

Application Number Priority Date Filing Date Title
DE602004031443T DE602004031443D1 (en) 2003-10-02 2004-09-28 SYSTEM AND METHOD FOR THE PRESENTATION OF DATA FLOWS
JP2006531011A JP4027409B2 (en) 2003-10-02 2004-09-28 System and method for displaying a data stream
AU2004277001A AU2004277001B2 (en) 2003-10-02 2004-09-28 System and method for presentation of data streams
AT04770577T ATE498875T1 (en) 2003-10-02 2004-09-28 SYSTEM AND METHOD FOR REPRESENTING DATA STREAMS
EP04770577A EP1676244B1 (en) 2003-10-02 2004-09-28 System and method for presentation of data streams

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US50750803P 2003-10-02 2003-10-02
US60/507,508 2003-10-02

Publications (1)

Publication Number Publication Date
WO2005031650A1 true WO2005031650A1 (en) 2005-04-07

Family

ID=34393240

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IL2004/000906 WO2005031650A1 (en) 2003-10-02 2004-09-28 System and method for presentation of data streams

Country Status (8)

Country Link
US (4) US7215338B2 (en)
EP (2) EP2290613B1 (en)
JP (3) JP4027409B2 (en)
AT (1) ATE498875T1 (en)
AU (1) AU2004277001B2 (en)
DE (1) DE602004031443D1 (en)
ES (1) ES2360701T3 (en)
WO (1) WO2005031650A1 (en)

Cited By (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1618832A1 (en) * 2003-04-25 2006-01-25 Olympus Corporation Image display unit, image display method and image display program
JP2006288976A (en) * 2005-04-14 2006-10-26 Olympus Medical Systems Corp Simplified image displaying device and receiving system
JP2006334297A (en) * 2005-06-06 2006-12-14 Olympus Medical Systems Corp Image display device
WO2007061008A1 (en) 2005-11-24 2007-05-31 Olympus Medical Systems Corp. Device for displaying in vivo image, receiving device, and image display system and method using them
DE102006008508A1 (en) * 2006-02-23 2007-09-13 Siemens Ag Medical visualization method for e.g. cardiological diagnosis of heart, involves storing recorded organ layer pictures with information about series of organ layer pictures and time period of recording in image data memory
JP2008036028A (en) * 2006-08-03 2008-02-21 Olympus Medical Systems Corp Picture display
JP2008061704A (en) * 2006-09-05 2008-03-21 Olympus Medical Systems Corp Image display device
EP1922979A1 (en) * 2005-09-09 2008-05-21 Olympus Medical Systems Corp. Image display device
EP1924193A2 (en) * 2005-09-15 2008-05-28 Given Imaging Ltd. System and method for presentation of data streams
EP2149332A1 (en) * 2007-05-17 2010-02-03 Olympus Medical Systems Corp. Image information display processing device and display processing method
EP2148611A1 (en) * 2007-04-20 2010-02-03 Sierra Scientific Instruments, Inc. Diagnostic system for display of high-resolution physiological data of multiple properties
JP2010046525A (en) * 2009-11-30 2010-03-04 Olympus Medical Systems Corp Image display device
US8169472B2 (en) 2005-08-22 2012-05-01 Olympus Corporation Image display apparatus with interactive database
US8406489B2 (en) 2005-09-09 2013-03-26 Olympus Medical Systems Corp Image display apparatus
US8617058B2 (en) 2008-07-09 2013-12-31 Innurvation, Inc. Displaying image data from a scanner capsule
US8900124B2 (en) 2006-08-03 2014-12-02 Olympus Medical Systems Corp. Image display device
US9017248B2 (en) 2007-11-08 2015-04-28 Olympus Medical Systems Corp. Capsule blood detection system and method
EP2996541A4 (en) * 2013-05-17 2017-03-22 EndoChoice, Inc. Interface unit in a multiple viewing elements endoscope system
US9900109B2 (en) 2006-09-06 2018-02-20 Innurvation, Inc. Methods and systems for acoustic data transmission
EP3538644A4 (en) * 2016-11-10 2020-07-08 Becton, Dickinson and Company Timeline system for monitoring a culture media protocol

Families Citing this family (66)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7474327B2 (en) 2002-02-12 2009-01-06 Given Imaging Ltd. System and method for displaying an image stream
CN100431475C (en) * 2003-04-25 2008-11-12 奥林巴斯株式会社 Device, method and program for image processing
WO2005000101A2 (en) * 2003-06-12 2005-01-06 University Of Utah Research Foundation Apparatus, systems and methods for diagnosing carpal tunnel syndrome
US20060287590A1 (en) * 2003-09-18 2006-12-21 Mceowen Edwin L Noninvasive vital sign measurement device
WO2005039399A1 (en) * 2003-10-27 2005-05-06 Olympus Corporation Image processing device, image processing method, and image processing program
JP4574983B2 (en) * 2003-11-04 2010-11-04 オリンパス株式会社 Image display apparatus, image display method, and image display program
US20050132414A1 (en) * 2003-12-02 2005-06-16 Connexed, Inc. Networked video surveillance system
EP2649931B1 (en) * 2003-12-05 2017-02-01 Olympus Corporation Display processing device
US7623690B2 (en) * 2004-03-30 2009-11-24 Carestream Health, Inc. System and method for classifying in vivo images according to anatomical structure
WO2005115227A1 (en) * 2004-05-17 2005-12-08 Sierra Scientific Instruments Manometry probe and data visualization
WO2005122906A1 (en) * 2004-06-18 2005-12-29 Hitachi Medical Corporation Ultrasonic diagnositic apparatus
US20060173318A1 (en) * 2004-07-20 2006-08-03 Scimed Life Systems Inc. Systems and methods for detecting and presenting textural information from medical images
US7578790B2 (en) * 2004-07-20 2009-08-25 Boston Scientific Scimed, Inc. Systems and methods for detecting and presenting textural information from medical images
US20060036147A1 (en) * 2004-07-20 2006-02-16 Scimed Life Systems, Inc. Systems and methods for detecting and presenting textural information from medical images
JP4537803B2 (en) * 2004-08-27 2010-09-08 オリンパス株式会社 Image display device
JP4615963B2 (en) * 2004-10-29 2011-01-19 オリンパス株式会社 Capsule endoscope device
JP2006141898A (en) * 2004-11-24 2006-06-08 Olympus Corp Image display system
JP2006345929A (en) * 2005-06-13 2006-12-28 Olympus Medical Systems Corp Image display device
IL177045A (en) 2005-07-25 2012-12-31 Daniel Gat Device, system and method of receiving and recording and displaying in-vivo data with user entered data
WO2007023771A1 (en) * 2005-08-22 2007-03-01 Olympus Corporation Image display device
JP4441464B2 (en) * 2005-09-09 2010-03-31 オリンパスメディカルシステムズ株式会社 Image display device
US7567692B2 (en) * 2005-09-30 2009-07-28 Given Imaging Ltd. System and method for detecting content in-vivo
US8423123B2 (en) 2005-09-30 2013-04-16 Given Imaging Ltd. System and method for in-vivo feature detection
US7577283B2 (en) * 2005-09-30 2009-08-18 Given Imaging Ltd. System and method for detecting content in-vivo
JP4855759B2 (en) * 2005-10-19 2012-01-18 オリンパス株式会社 Receiving apparatus and in-subject information acquisition system using the same
WO2007077554A2 (en) * 2005-12-30 2007-07-12 Given Imaging Ltd. System and method for displaying an image stream
AU2007239598B2 (en) * 2006-04-14 2010-09-16 Olympus Medical Systems Corp. Image display apparatus
JP2009022446A (en) * 2007-07-18 2009-02-05 Given Imaging Ltd System and method for combined display in medicine
US8204225B2 (en) * 2007-07-23 2012-06-19 Savi Technology, Inc. Method and apparatus for providing security in a radio frequency identification system
JP2009039449A (en) * 2007-08-10 2009-02-26 Olympus Corp Image processor
JP2009050321A (en) * 2007-08-23 2009-03-12 Olympus Corp Image processor
US20100329520A2 (en) * 2007-11-08 2010-12-30 Olympus Medical Systems Corp. Method and System for Correlating Image and Tissue Characteristic Data
US9131847B2 (en) * 2007-11-08 2015-09-15 Olympus Corporation Method and apparatus for detecting abnormal living tissue
JP2009226066A (en) * 2008-03-24 2009-10-08 Olympus Corp Capsule medical device
US10304126B2 (en) 2008-04-30 2019-05-28 Beyondvia Technologies Visual communication systems and methods designing and building entire experiences
US9310980B2 (en) * 2012-08-21 2016-04-12 Beyondvia Technologies Systems and methods for performance comparisons utilizing an infinite cylinder user interface
US9538937B2 (en) * 2008-06-18 2017-01-10 Covidien Lp System and method of evaluating a subject with an ingestible capsule
US8888680B2 (en) * 2008-07-07 2014-11-18 Olympus Medical Systems Corp. Method and apparatus for foreign matter detection for blood content sensors
CN101721199B (en) 2008-10-14 2012-08-22 奥林巴斯医疗株式会社 Image display device and image display method
JP2012509715A (en) * 2008-11-21 2012-04-26 メイヨ・ファウンデーション・フォー・メディカル・エデュケーション・アンド・リサーチ Colonoscopy tracking and evaluation system
EP2407082B1 (en) * 2009-03-11 2017-04-26 Olympus Corporation Image processing system, external device and image processing method
US20110084967A1 (en) * 2009-10-09 2011-04-14 International Business Machines Corporation Visualization of Datasets
US8446465B2 (en) 2010-01-05 2013-05-21 Given Imaging Ltd. System and method for displaying an image stream captured in-vivo
US8682142B1 (en) * 2010-03-18 2014-03-25 Given Imaging Ltd. System and method for editing an image stream captured in-vivo
US20110237906A1 (en) * 2010-03-26 2011-09-29 General Electric Company System and method for graphical display of medical information
CN106923779A (en) 2010-04-28 2017-07-07 基文影像公司 For the system and method for image section in display body
JP5460488B2 (en) * 2010-06-29 2014-04-02 富士フイルム株式会社 Electronic endoscope system, processor device for electronic endoscope, image retrieval system, and method of operating electronic endoscope system
US8922633B1 (en) 2010-09-27 2014-12-30 Given Imaging Ltd. Detection of gastrointestinal sections and transition of an in-vivo device there between
US8965079B1 (en) 2010-09-28 2015-02-24 Given Imaging Ltd. Real time detection of gastrointestinal sections and transitions of an in-vivo device therebetween
US8873816B1 (en) 2011-04-06 2014-10-28 Given Imaging Ltd. Method and system for identification of red colored pathologies in vivo
JP5242731B2 (en) * 2011-04-28 2013-07-24 オリンパス株式会社 Image display system and image display terminal device
WO2012165298A1 (en) * 2011-06-01 2012-12-06 オリンパスメディカルシステムズ株式会社 Receiving device and capsule-type endoscope system
EP2810216B1 (en) * 2012-01-31 2017-11-15 Given Imaging Ltd. System and method for displaying motility events in an in vivo image stream
JP6342390B2 (en) 2012-06-29 2018-06-13 ギブン イメージング リミテッドGiven Imaging Ltd. System and method for displaying an image stream
JP5980604B2 (en) * 2012-07-18 2016-08-31 オリンパス株式会社 Endoscope system
US20140028820A1 (en) * 2012-07-24 2014-01-30 Capso Vision, Inc. System and Method for Display of Capsule Images and Associated Information
EP2910173A4 (en) * 2012-10-18 2016-06-01 Olympus Corp Image processing device, and image processing method
US9324145B1 (en) 2013-08-08 2016-04-26 Given Imaging Ltd. System and method for detection of transitions in an image stream of the gastrointestinal tract
EP3033000A2 (en) * 2013-08-13 2016-06-22 Koninklijke Philips N.V. Method and display for long term physiological signal quality indication
CN105612554B (en) * 2013-10-11 2019-05-10 冒纳凯阿技术公司 Method for characterizing the image obtained by video-medical equipment
US20150127377A1 (en) * 2013-11-07 2015-05-07 A.T. Still University Color matching for health management
US10204411B2 (en) 2014-05-09 2019-02-12 Given Imaging Ltd. System and method for sequential image analysis of an in vivo image stream
US10649634B2 (en) * 2014-06-06 2020-05-12 International Business Machines Corporation Indexing and annotating a usability test recording
WO2017200885A1 (en) * 2016-05-18 2017-11-23 Stuart Bradley Systems and methods for observing and analyzing swallowing
WO2018017071A1 (en) * 2016-07-20 2018-01-25 Hitachi, Ltd. Data visualization device and method for big data analytics
DE102016121668A1 (en) * 2016-11-11 2018-05-17 Karl Storz Se & Co. Kg Automatic identification of medically relevant video elements


Family Cites Families (39)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3909792A (en) * 1973-02-26 1975-09-30 American Optical Corp Electrocardiographic review system
JPS6052344B2 (en) 1977-09-19 1985-11-19 芝浦メカトロニクス株式会社 foot valve
US4149184A (en) * 1977-12-02 1979-04-10 International Business Machines Corporation Multi-color video display systems using more than one signal source
JPS5519124A (en) * 1978-07-27 1980-02-09 Olympus Optical Co Camera system for medical treatment
US5993378A (en) * 1980-10-28 1999-11-30 Lemelson; Jerome H. Electro-optical instruments and methods for treating disease
US4936823A (en) * 1988-05-04 1990-06-26 Triangle Research And Development Corp. Transendoscopic implant capsule
JPH04109927A (en) 1990-08-31 1992-04-10 Toshiba Corp Electronic endoscope apparatus
JP2643596B2 (en) * 1990-11-29 1997-08-20 株式会社日立製作所 Display method of scalar quantity distribution
DE69222102T2 (en) * 1991-08-02 1998-03-26 Grass Valley Group Operator interface for video editing system for the display and interactive control of video material
JP3020376B2 (en) * 1993-03-26 2000-03-15 サージミヤワキ株式会社 Internal body identification device for animals
IL108352A (en) * 1994-01-17 2000-02-29 Given Imaging Ltd In vivo video camera system
US6222547B1 (en) * 1997-02-07 2001-04-24 California Institute Of Technology Monitoring and analysis of data in cyberspace
US6600496B1 (en) * 1997-09-26 2003-07-29 Sun Microsystems, Inc. Interactive graphical user interface for television set-top box
US6219837B1 (en) * 1997-10-23 2001-04-17 International Business Machines Corporation Summary frames in video
US6240312B1 (en) * 1997-10-23 2001-05-29 Robert R. Alfano Remote-controllable, micro-scale device for use in in vivo medical diagnosis and/or treatment
US6188403B1 (en) * 1997-11-21 2001-02-13 Portola Dimensional Systems, Inc. User-friendly graphics generator using direct manipulation
IL122602A0 (en) * 1997-12-15 1998-08-16 Tally Eitan Zeev Pearl And Co Energy management of a video capsule
US6097399A (en) * 1998-01-16 2000-08-01 Honeywell Inc. Display of visual data utilizing data aggregation
US8636648B2 (en) * 1999-03-01 2014-01-28 West View Research, Llc Endoscopic smart probe
US6614452B1 (en) * 1999-11-15 2003-09-02 Xenogen Corporation Graphical user interface for in-vivo imaging
IL134017A (en) * 2000-01-13 2008-04-13 Capsule View Inc Camera for viewing inside intestines
US7039453B2 (en) 2000-02-08 2006-05-02 Tarun Mullick Miniature ingestible capsule
KR100800040B1 (en) 2000-03-08 2008-01-31 기븐 이미징 리미티드 A capsule for in vivo imaging
US6709387B1 (en) * 2000-05-15 2004-03-23 Given Imaging Ltd. System and method for controlling in vivo camera capture and display rate
FR2812293B1 (en) 2000-07-28 2002-12-27 Rhodia Chimie Sa METHOD FOR SYNTHESIS OF BLOCK POLYMERS BY CONTROLLED RADICAL POLYMERIZATION
JP2004508757A (en) * 2000-09-08 2004-03-18 コーニンクレッカ フィリップス エレクトロニクス エヌ ヴィ A playback device that provides a color slider bar
KR100741999B1 (en) 2000-09-27 2007-07-23 기븐 이미징 리미티드 An immobilizable in vivo monitoring system and its method
JP2004519968A (en) * 2001-04-17 2004-07-02 コーニンクレッカ フィリップス エレクトロニクス エヌ ヴィ Method and system for selecting locations in an image sequence
US7119814B2 (en) 2001-05-18 2006-10-10 Given Imaging Ltd. System and method for annotation on a moving image
US7245746B2 (en) * 2001-06-12 2007-07-17 Ge Medical Systems Global Technology Company, Llc Ultrasound color characteristic mapping
AU2002304266A1 (en) 2001-06-20 2003-01-02 Given Imaging Ltd. Motility analysis within a gastrointestinal tract
US7219034B2 (en) * 2001-09-13 2007-05-15 Opnet Technologies, Inc. System and methods for display of time-series data distribution
US20040066398A1 (en) * 2002-10-03 2004-04-08 Koninklijke Philips Electronics N.V System and method for removing, trimming and bookmarking images of an ultrasound image sequence
US20040184639A1 (en) * 2003-02-19 2004-09-23 Linetech Industries, Inc. Method and apparatus for the automated inspection and grading of fabrics and fabric samples
US7557805B2 (en) * 2003-04-01 2009-07-07 Battelle Memorial Institute Dynamic visualization of data streams
JP4493386B2 (en) 2003-04-25 2010-06-30 オリンパス株式会社 Image display device, image display method, and image display program
CN100431475C (en) * 2003-04-25 2008-11-12 奥林巴斯株式会社 Device, method and program for image processing
JP3810381B2 (en) * 2003-04-25 2006-08-16 オリンパス株式会社 Image display device, image display method, and image display program
US7295346B2 (en) * 2003-06-23 2007-11-13 Xeorx Corporation Methods and apparatus for antialiasing using selective implementation of logical and averaging filter operations

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5392072A (en) * 1992-10-23 1995-02-21 International Business Machines Inc. Hybrid video compression system and method capable of software-only decompression in selected multimedia systems
US5970173A (en) * 1995-10-05 1999-10-19 Microsoft Corporation Image compression and affine transformation for image motion compensation
WO2000058967A1 (en) * 1999-03-30 2000-10-05 Tivo, Inc. Multimedia program bookmarking system
US20020193669A1 (en) * 2000-05-31 2002-12-19 Arkady Glukhovsky Method for measurement of electrical characteristics of tissue

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
ECONOMIDES M.J. ET AL.: "Advances in production engineering", WEB, 11 September 2003 (2003-09-11), pages 1 - 23, XP002985231, Retrieved from the Internet <URL:http://pumpjack.tamu.edu/~valko/CV/ValkoPDF/CanadianInvPaper.pdf> *
FROHLICH B. ET AL.: "Exploring geo-scientific data in virtual environments", ACM PROC. CONF. ON VIS., November 1999 (1999-11-01), pages 169 - 173, XP010365000 *
NUNTIUS ET AL.: "Multimedia technology, H.264 - A new technology for video compression", pages 1 - 4, XP002985232 *

Cited By (43)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1891886A3 (en) * 2003-04-25 2009-08-05 Olympus Corporation Image display unit, image display method, and image display program
US8620044B2 (en) 2003-04-25 2013-12-31 Olympus Corporation Image display apparatus, image display method, and computer program
EP1891887A3 (en) * 2003-04-25 2009-08-05 Olympus Corporation Image display unit, image display method and image display program
EP1891886A2 (en) * 2003-04-25 2008-02-27 Olympus Corporation Image display unit, image display method, and image display program
EP1891888A3 (en) * 2003-04-25 2009-08-05 Olympus Corporation Image display unit, image display method, and image display program
EP1618832A4 (en) * 2003-04-25 2009-08-05 Olympus Corp Image display unit, image display method and image display program
EP1618832A1 (en) * 2003-04-25 2006-01-25 Olympus Corporation Image display unit, image display method and image display program
JP2006288976A (en) * 2005-04-14 2006-10-26 Olympus Medical Systems Corp Simplified image displaying device and receiving system
US8144192B2 (en) 2005-04-14 2012-03-27 Olympus Medical Systems Corp. Simplified image display apparatus and receiving system
JP2006334297A (en) * 2005-06-06 2006-12-14 Olympus Medical Systems Corp Image display device
JP4716794B2 (en) * 2005-06-06 2011-07-06 オリンパスメディカルシステムズ株式会社 Image display device
US8169472B2 (en) 2005-08-22 2012-05-01 Olympus Corporation Image display apparatus with interactive database
US8406489B2 (en) 2005-09-09 2013-03-26 Olympus Medical Systems Corp Image display apparatus
EP1922979A1 (en) * 2005-09-09 2008-05-21 Olympus Medical Systems Corp. Image display device
EP1922979A4 (en) * 2005-09-09 2010-10-06 Olympus Medical Systems Corp Image display device
EP1924193A2 (en) * 2005-09-15 2008-05-28 Given Imaging Ltd. System and method for presentation of data streams
EP1924193A4 (en) * 2005-09-15 2009-12-02 Given Imaging Ltd System and method for presentation of data streams
EP1952751A1 (en) * 2005-11-24 2008-08-06 Olympus Medical Systems Corp. Device for displaying in vivo image, receiving device, and image display system and method using them
US8175347B2 (en) 2005-11-24 2012-05-08 Olympus Medical Systems Corp. In vivo image display apparatus, receiving apparatus, and image display system using same and image display method thereof
AU2006317028B2 (en) * 2005-11-24 2010-06-17 Olympus Medical Systems Corp. Device for displaying in vivo image, receiving device, and image display system and method using them
WO2007061008A1 (en) 2005-11-24 2007-05-31 Olympus Medical Systems Corp. Device for displaying in vivo image, receiving device, and image display system and method using them
EP1952751A4 (en) * 2005-11-24 2009-07-22 Olympus Medical Systems Corp Device for displaying in vivo image, receiving device, and image display system and method using them
DE102006008508A1 (en) * 2006-02-23 2007-09-13 Siemens Ag Medical visualization method for e.g. cardiological diagnosis of heart, involves storing recorded organ layer pictures with information about series of organ layer pictures and time period of recording in image data memory
US8224419B2 (en) 2006-02-23 2012-07-17 Siemens Aktiengesellschaft Medical visualization method, combined display/input device, and computer program product
US8900124B2 (en) 2006-08-03 2014-12-02 Olympus Medical Systems Corp. Image display device
JP2008036028A (en) * 2006-08-03 2008-02-21 Olympus Medical Systems Corp Picture display
JP2008061704A (en) * 2006-09-05 2008-03-21 Olympus Medical Systems Corp Image display device
US10320491B2 (en) 2006-09-06 2019-06-11 Innurvation Inc. Methods and systems for acoustic data transmission
US9900109B2 (en) 2006-09-06 2018-02-20 Innurvation, Inc. Methods and systems for acoustic data transmission
EP2148611A4 (en) * 2007-04-20 2013-03-27 Given Imaging Los Angeles Llc Diagnostic system for display of high-resolution physiological data of multiple properties
EP2148611A1 (en) * 2007-04-20 2010-02-03 Sierra Scientific Instruments, Inc. Diagnostic system for display of high-resolution physiological data of multiple properties
EP2149332A4 (en) * 2007-05-17 2010-11-03 Olympus Medical Systems Corp Image information display processing device and display processing method
EP2149332A1 (en) * 2007-05-17 2010-02-03 Olympus Medical Systems Corp. Image information display processing device and display processing method
US9017248B2 (en) 2007-11-08 2015-04-28 Olympus Medical Systems Corp. Capsule blood detection system and method
US8617058B2 (en) 2008-07-09 2013-12-31 Innurvation, Inc. Displaying image data from a scanner capsule
US9351632B2 (en) 2008-07-09 2016-05-31 Innurvation, Inc. Displaying image data from a scanner capsule
US9788708B2 (en) 2008-07-09 2017-10-17 Innurvation, Inc. Displaying image data from a scanner capsule
JP2010046525A (en) * 2009-11-30 2010-03-04 Olympus Medical Systems Corp Image display device
EP2996541A4 (en) * 2013-05-17 2017-03-22 EndoChoice, Inc. Interface unit in a multiple viewing elements endoscope system
EP3538644A4 (en) * 2016-11-10 2020-07-08 Becton, Dickinson and Company Timeline system for monitoring a culture media protocol
EP3979208A1 (en) * 2016-11-10 2022-04-06 Becton, Dickinson and Company Timeline system for monitoring a culture media protocol
US11694378B2 (en) 2016-11-10 2023-07-04 Becton, Dickinson And Company Timeline system for monitoring a culture media protocol
EP4235448A3 (en) * 2016-11-10 2023-10-11 Becton, Dickinson and Company Timeline system for monitoring a culture media protocol

Also Published As

Publication number Publication date
US7215338B2 (en) 2007-05-08
US20100053313A1 (en) 2010-03-04
EP2290613A1 (en) 2011-03-02
JP2008062070A (en) 2008-03-21
US20070159483A1 (en) 2007-07-12
US20050075551A1 (en) 2005-04-07
JP4604057B2 (en) 2010-12-22
DE602004031443D1 (en) 2011-03-31
JP2007507277A (en) 2007-03-29
EP1676244A4 (en) 2007-01-10
JP4027409B2 (en) 2007-12-26
EP2290613B1 (en) 2017-02-15
JP4740212B2 (en) 2011-08-03
US20120139936A1 (en) 2012-06-07
ES2360701T3 (en) 2011-06-08
US7636092B2 (en) 2009-12-22
US8228333B2 (en) 2012-07-24
AU2004277001B2 (en) 2010-08-19
EP1676244B1 (en) 2011-02-16
ATE498875T1 (en) 2011-03-15
JP2007222657A (en) 2007-09-06
US8144152B2 (en) 2012-03-27
AU2004277001A1 (en) 2005-04-07
EP1676244A1 (en) 2006-07-05

Similar Documents

Publication Publication Date Title
US8144152B2 (en) System and method for presentation of data streams
US20070060798A1 (en) System and method for presentation of data streams
US7567692B2 (en) System and method for detecting content in-vivo
US7577283B2 (en) System and method for detecting content in-vivo
US9514556B2 (en) System and method for displaying motility events in an in vivo image stream
US8423123B2 (en) System and method for in-vivo feature detection
EP2868100B1 (en) System and method for displaying an image stream
US20080039692A1 (en) Image display device
JP2015509026A5 (en)

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NA NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): BW GH GM KE LS MW MZ NA SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IT LU MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG

121 Ep: the epo has been informed by wipo that ep was designated in this application
WWE Wipo information: entry into national phase

Ref document number: 2006531011

Country of ref document: JP

WWE Wipo information: entry into national phase

Ref document number: 2004277001

Country of ref document: AU

WWE Wipo information: entry into national phase

Ref document number: 2004770577

Country of ref document: EP

ENP Entry into the national phase

Ref document number: 2004277001

Country of ref document: AU

Date of ref document: 20040928

Kind code of ref document: A

WWP Wipo information: published in national office

Ref document number: 2004277001

Country of ref document: AU

WWP Wipo information: published in national office

Ref document number: 2004770577

Country of ref document: EP