EP1044401A1 - Non-manual operation of a medical image display device - Google Patents

Non-manual operation of a medical image display device

Info

Publication number
EP1044401A1
Authority
EP
European Patent Office
Prior art keywords
images
digital display
commands
display system
manual
Prior art date
Legal status
Withdrawn
Application number
EP99971534A
Other languages
English (en)
French (fr)
Inventor
Daniel Jay Geisberg
David Irwin Lappen
William Ascher Lappen
Current Assignee
AMD Industries LLC
Original Assignee
AMD Industries LLC
Priority date
Filing date
Publication date
Priority claimed from US09/339,427 (US 6,462,868 B1)
Application filed by AMD Industries LLC
Priority to EP03011085A (EP 1335270 A1)
Publication of EP1044401A1
Withdrawn

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 1/00 Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
    • H04N 1/00127 Connection or combination of a still picture apparatus with another apparatus, e.g. for storage, processing or transmission of still picture signals or of information associated with a still picture
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H 30/00 ICT specially adapted for the handling or processing of medical images
    • G16H 30/20 ICT specially adapted for the handling or processing of medical images for handling medical images, e.g. DICOM, HL7 or PACS
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H 40/00 ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices
    • G16H 40/60 ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices
    • G16H 40/63 ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices for local operation
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 1/00 Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
    • H04N 1/00127 Connection or combination of a still picture apparatus with another apparatus, e.g. for storage, processing or transmission of still picture signals or of information associated with a still picture
    • H04N 1/00129 Connection or combination of a still picture apparatus with another apparatus, e.g. for storage, processing or transmission of still picture signals or of information associated with a still picture with a display device, e.g. CRT or LCD monitor
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 1/00 Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
    • H04N 1/00127 Connection or combination of a still picture apparatus with another apparatus, e.g. for storage, processing or transmission of still picture signals or of information associated with a still picture
    • H04N 1/00249 Connection or combination of a still picture apparatus with another apparatus, e.g. for storage, processing or transmission of still picture signals or of information associated with a still picture with a photographic apparatus, e.g. a photographic printer or a projector

Definitions

  • The present invention relates to medical image display systems, and more particularly, to a method and apparatus for using non-manual control, such as voice, to control a medical image display station.
  • A physician cannot readily view as many patient images on a workstation as may be required for proper diagnosis;
  • The workstation presents images in a manner that is counterintuitive and opposite to traditional work flow.
  • Such control may include image enhancement, image positioning and orientation, as well as control over patient records.
  • An extremely easy and intuitive interface to the digital viewing station is provided by allowing the viewing physician to command the digital viewing station using non-manual control features, including voice commands.
  • This invention eliminates or streamlines some tasks while providing a work environment with a minimum of changes from the manipulation of traditional "hardcopy" films, while still providing the advantages of digital imaging.
  • This invention allows the viewing physician to maintain eye focus and concentration while manipulating the viewed image for improved diagnosis.
  • Head and/or eye motion sensing hardware and software can be used by the view station to control the position of a cursor on the screen and/or command functions.
  • "Non-manual" indicates a system that does not use the limbs, either the hands or the feet, for system control.
  • The displayed information can include patient history data, test data and information in place of, or in addition to, the diagnostic images, which can include X-ray, CT scan, MRI scan and other types of images.
  • The invented method, apparatus and system include the following features:
  • An apparatus and system for using non-manual control, preferably voice control and/or gaze control, to change the display of medical images comprises a device for receiving medical images; a display for showing the received medical images; and a device for controlling the display of the medical images through, at least in part, non-manual, preferably, voice commands and/or gaze control.
  • The device for receiving medical images comprises at least one of a transmission medium, such as wire, cable, microwave, radio wave, light wave or sound wave, or physical media that hold the images, such as a floppy disk, removable hard disk, CD-ROM or tape.
  • The display for the received medical images comprises display monitors, projection systems, or printers connected to a computer system.
  • The device for controlling the display of medical images comprises at least one microphone connected to a computer capable of recognizing voice commands, and a gaze detection system.
  • The computer system may also be connected to a dictation system capable of recording the reading physician's diagnosis.
  • The computer system may also be capable of recognizing dictation and converting it to machine- and/or human-readable text.
  • The voice command feature may be deactivated at will and the system controlled by conventional means, namely manual devices including at least one of a keyboard, trackball, mouse, foot pedal, light pen, touch screen or other manual input devices. Alternatively, any of the listed manual control devices can be used along with voice command.
  • The information displayed may include reports or other non-image information, as well as audio playback or text of dictated remarks.
  • The apparatus for using head or eye control to point to a location on a viewing station comprises a device for receiving medical images; a display for received medical images; a gaze control device for indicating the current location of interest; and a device for controlling the movement of the current location of interest with changes in gaze, i.e., with changes in head or eye position.
  • The computer system is also capable of notifying the user of improper commands and instructing or suggesting other commands by visual and/or audio cues.
  • The users' impressions (oral and/or typed) or verified reports may be automatically e-mailed, together with a copy of the images analyzed, to referring physicians or wherever the user wishes.
  • The present invention can be incorporated into a medical viewing station to provide voice control as well as other non-manual system control.
  • FIG. 1 is a schematic of a viewing station according to one embodiment of the present invention.
  • FIG. 2 is a typical screen interface according to an embodiment of the present invention.
  • FIG. 3 is a typical flow diagram of the system according to an embodiment of the present invention.
  • FIG. 4 is a drawing illustrating a workflow on a preferred system according to an embodiment of the present invention.
  • The following preferred embodiment is an example of the application of the present invention, but is not intended to limit the invention in any way.
  • The present invention provides a system for using non-manual commands to control the actions of a medical image display system.
  • Like element numerals are used to describe like elements illustrated in one or more of the figures.
  • The workflow model envisioned in the software allows the workstations of the present invention to greatly increase productivity.
  • The workstation has been designed to fit into the existing workflow of a radiologist's practice. Many repetitive functions currently being performed have been automated and simplified to allow for time savings.
  • The layout and controls use the paradigm of hanging film. As such, a technician may preview and arrange studies (ordering the film hanging, prioritizing the patients). However, a technician is not required.
  • The software can group studies by patient, date, requested reading physician and other criteria for automatic display. In either case, the reading physician has full control over the images and their locations.
  • In a standard hospital-based practice, the radiologist is presented with studies of various patients, along with clinical observations for each patient. The clinical observations present the basis for the tests that have been performed and indicate the symptoms or suspected problems. If prior studies of the same patient are available, they should ideally be viewed adjacent to the current studies. With this information, the radiologist "reads" the images and dictates findings. In about 80% of the cases, the radiologist finds nothing worth commenting upon and merely indicates that he/she viewed the studies and that everything is normal. In the remainder of the cases, there will be substantive dictation required. The dictated report is typically routed to the hospital's dictation pool and then through the hospital to the referring physician or to other specialists. The radiologist may be paid on a "per read" basis or some other basis.
  • Digital medical images can be obtained from many sources.
  • The actual modality generating the image might be digital (MRI, CAT) and can directly feed its images to the inventive workstation.
  • Images may also be supplied by a PACS (picture archiving and communications system).
  • Non-digital images (film) must be scanned into a standard digital format.
  • Non-standard digital images may be either printed to film for scanning into a standard digital format or may be converted to feed the inventive workstation or PACS.
  • The images may be transmitted to the workstation via removable media (CD-ROM, diskette or tape) or via a network.
  • The pending worklist may be managed by a technician before the reading physician begins a reading session. This management may include prioritizing the worklist in a manner other than the default modified first-in, first-out manner.
  • A technician may also arrange the images in an order that the reading physician prefers. However, order preferences may also be entered into the workstation and will be automatically filled where appropriate information is available with the images.
  • The reading physician may view his/her entire worklist at any time so as to move to any patient, but the physician will probably just advance to the next patient after completing the present patient.
  • When viewing the worklist, the physician only sees his/her worklist plus any unassigned studies. He/she may disable the unassigned study display to concentrate on his/her cases. This allows other physicians to process the unassigned cases, while only the assigned physician may read the assigned studies.
  • The worklist continuously updates as new cases arrive. After all pending patients are analyzed, the worklist automatically appears. Once the studies have been read, the inventive workstation can easily and quickly capture the normal diagnosis (no acute abnormalities found). This represents approximately 80% of the standard hospital practice and can be accomplished with a single keystroke or command, completely eliminating dictation for most patient reports.
  • The information that can be captured (and printed or saved) includes the date of the reading, the physician who performed the reading, the studies examined, the patient name, the referring physician, the hospital where the studies were performed and other information that would normally take up to 60 seconds to dictate per patient. It can also include any additional information provided by the original data file, HIS, RIS or DICOM data file; for example, any information provided by patient data fields in DICOM (Digital Image Communication in Medicine, a file and communications standard developed by an international committee including physicians, hardware manufacturers and software developers). This feature will save radiologists, typists and referring physicians a significant amount of time.
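The single-command capture of a "normal" reading described above can be sketched as follows. This is a minimal illustrative sketch in Python; the record fields and the function name are assumptions for illustration, not the patent's actual data model or DICOM attribute names.

```python
from datetime import date

def capture_normal_reading(study, reading_physician):
    """Build a 'no acute abnormalities found' report from data that already
    accompanies the study, so that no dictation is needed.  All field names
    here are hypothetical."""
    return {
        "finding": "no acute abnormalities found",
        "read_date": date.today().isoformat(),
        "reading_physician": reading_physician,
        "patient": study["patient_name"],
        "referring_physician": study["referring_physician"],
        "hospital": study["hospital"],
        "studies": list(study["image_ids"]),
    }

# One command captures everything that would otherwise take up to a
# minute of dictation per patient.
study = {
    "patient_name": "DOE^JANE",
    "referring_physician": "DR SMITH",
    "hospital": "General Hospital",
    "image_ids": ["IMG001", "IMG002"],
}
report = capture_normal_reading(study, "DR JONES")
```

The point of the sketch is that every field of the normal report is copied from data already attached to the study, which is why a single keystroke or voice command suffices.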
  • The workstation may route this normal diagnosis, as well as other diagnoses, as required: back to the referring department, to a hospital information system, to paper, to the source of the image (either near the workstation or at some remote location), or anywhere else the radiologist desires. E-mail over the internet, including copies of images seen, may be sent automatically from the workstation. In the event that dictation of the diagnosis is necessary, the workstation can still capture information concerning which specific images have been read (as well as the rest of the standard information required) and make a notation to see the dictated report. As with the normal diagnosis, this information can be quickly routed to the appropriate destination. In addition, typed comments can be entered and routed to the appropriate destination.
  • An audio dictation file can be stored and routed. If required, the workstation can direct specific images to be printed, along with notes that the reading physician may wish to make. Like the other items of the diagnosis, this can be routed and/or e-mailed as desired.
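The automatic e-mailing of a report together with copies of the images read could be sketched with the Python standard library as below. This only assembles the message; actual transport, addresses and the compression scheme are assumptions, not details taken from the patent.

```python
import zlib
from email.message import EmailMessage

def build_diagnosis_mail(sender, recipient, report_text, images):
    """Assemble (but do not send) an e-mail carrying the report text and
    compressed copies of the images read.  `images` maps a name to raw
    pixel bytes; transport via SMTP is left to the site's infrastructure."""
    msg = EmailMessage()
    msg["From"] = sender
    msg["To"] = recipient
    msg["Subject"] = "Radiology report"
    msg.set_content(report_text)
    for name, pixel_bytes in images.items():
        # Compress each image to speed transmission, as the description notes.
        compressed = zlib.compress(pixel_bytes)
        msg.add_attachment(compressed,
                           maintype="application", subtype="octet-stream",
                           filename=name + ".z")
    return msg

mail = build_diagnosis_mail("reader@hospital.example",
                            "referrer@clinic.example",
                            "No acute abnormalities found.",
                            {"IMG001": b"\x00" * 1024})
```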
  • In FIG. 1, a schematic of a viewing station according to an embodiment of the present invention is provided.
  • The viewing station comprises one or more video monitors (or other display means such as LCDs, plasma panels, projection systems, etc.) 1 coupled via cables 2 to a computer system 3.
  • The video monitors 1 and the computer system 3 may alternatively be connected via any type of data communication pathway such as wires, microwave, light wave, sound wave or radio wave.
  • The image(s) to be viewed arrive at the computer controlling the viewing station either over a communication line 4 or on removable media such as a diskette or a removable hard drive.
  • The viewing station presents the first patient's studies.
  • Other manual input devices, such as keyboards, foot switches, and pointing devices (e.g., trackballs, trackpads, computer mice, drawing pads, touch screens, light pens and the like), are used for changing the system or user profiles, or system command and control protocols (software), and are represented by item 7 in FIG. 1, which represents all types of possible manual inputs.
  • These devices, as represented by item 7, are connected via a cable 10 to the computer system 3, which controls the viewing station.
  • These devices 7 may alternatively be connected to the computer system 3 via any type of data communication pathway such as microwave, sound wave (ultrasound), light wave or radio wave.
  • The following additional devices may also be connected to the computer 3, depending upon the configuration desired by the client: (a) a phone line (for internet access or remote diagnostics and/or control), (b) a network connection that may be different from 4, (c) speakers to provide audio instructions, (d) a video monitor calibration tool, (e) archival storage, (f) backup devices, (g) printers, etc.
  • Data input uses a keyboard or similar device to actually input "random" information, both software commands as well as actual data upon which the software operates.
  • The data input function has been simplified through the use of menus which list commands or even preselected data for automatic input.
  • A pointing device such as a mouse or trackball is used to select the desired menu item by "pointing" to it.
  • Control functions are redundant so that they can be entered in a number of alternate and convenient ways.
  • Control functions can usually also be selected by a "keyboard shortcut" (a special sequence of keystrokes) and, in some systems, the entire command can be typed in.
  • In image manipulation, such as with the workstation of the present invention, it is usual to apply software "tools" to all or portions of an image (for example, to alter contrast or rotate the image, etc.).
  • The "tools" can be considered commands and are selected by any of the usual methods of issuing commands.
  • The pointing device is then used to select an area of interest in the image, on which area the selected tool will operate.
  • Menus can be displayed at the cursor location, as opposed to at the edge of the screen as in many graphical user interfaces.
  • The menu can be displayed in a compressed format or an extended format, which contains more features. This mode of display can be preset as a physician preference.
  • The expanded menu might consist of the following commands, any of which can be selected with the pointing device:
  • The workstation's controls have been designed to be simple to use and remember. Physicians are not required to learn complicated keystroke sequences to perform the functions they desire. Generally, redundant methods of manual input are supplied. For example, a trackball (or mouse or similar pointing device) can allow the physician to move the cursor to any location on any monitor. Alternatively, touch screens or light pens can be provided to permit manual cursor control. Gaze control based on the physician's head or eye position can replace more traditional pointing devices. Once the pointing device has set the cursor location, commands may be issued that will affect that location. Commands can then be issued using the pointing device and menus, or by means of the keyboard, or by means of voice control, or any combination of these methods.
  • The workstation responds to spoken commands.
  • A microphone is connected to the system, which allows the voice command to be interpreted by a speech recognition software module and passed to the display and control modules as if invoked by the pointing device.
  • Only appropriate commands are included in the dictionary of words to be recognized. In this way, the system "concentrates" on matching only a small subset of possible voice commands. For example, when the system is in the Cine mode, only commands related to Cine are active.
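The mode-restricted recognition dictionary described above can be sketched as a simple lookup. The mode names and command words below are illustrative assumptions; the patent does not enumerate the full vocabulary.

```python
# Active vocabulary per mode: the recognizer matches an utterance only
# against the small subset that is currently valid.
VOCAB = {
    "cine":   {"play", "stop", "faster", "slower", "reverse", "exit"},
    "window": {"wider", "narrower", "brighter", "darker", "exit"},
    "idle":   {"next patient", "cine", "window", "zoom", "dictate"},
}

def interpret(mode, utterance):
    """Return the command if it is active in the current mode, else None,
    so the system can give 'improper command' feedback."""
    word = utterance.strip().lower()
    return word if word in VOCAB.get(mode, set()) else None
```

Restricting the match set this way is what lets the system handle a wide range of voices without per-user training, as the description notes later.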
  • The applicable voice commands and subcommands are:
  • The system will respond with audio feedback indicating the status of the system.
  • This audio feedback may be presented through speakers (with adjustable volume control) or through an earphone or headset and is available even when the physician does not use Voice Commands. If the audio feedback is presented through a headset, the same headset can conveniently be part of the gaze control system.
  • The audio feedback may indicate that an action has been taken ("patient studies read - no dictation", "skip studies", ...) or why an action can't be taken ("more images to read", "can't pan up", "no cine available").
  • Any type of alerts can conveniently be presented through the audio feedback system including information on the arrival of emergency, or "stat", cases that require immediate information. Again, the goal is to present vital information without forcing the physician to divert his or her eyes from the image being studied.
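The status-to-feedback mapping could be sketched as below, using the example phrases quoted in the description plus one assumed "stat" alert; the event keys and the `tts` stand-in are illustrative assumptions.

```python
# Phrases from the description, keyed by an assumed internal event name.
FEEDBACK = {
    "read_no_dictation": "patient studies read - no dictation",
    "skip":              "skip studies",
    "more_images":       "more images to read",
    "pan_limit":         "can't pan up",
    "no_cine":           "no cine available",
    "stat_arrival":      "stat case has arrived",   # assumed alert phrase
}

def speak(event, tts=print):
    """Route a status event to the speech output.  `tts` stands in for
    whatever text-to-speech engine or prerecorded-clip player the
    station actually uses."""
    phrase = FEEDBACK.get(event)
    if phrase is not None:
        tts(phrase)
    return phrase
```

The design goal mirrors the text: status reaches the physician through the ears, so the eyes never leave the image.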
  • A microphone 6 is connected via a cable 5 to the computer system 3, which controls the viewing station.
  • The microphone 6 may alternatively be connected to the computer system 3 via any type of data communication pathway such as microwave, sound wave (ultrasound), light wave or radio wave.
  • The physician issues a verbal command into the microphone 6, which transmits it to the computer controlling the viewing station for interpretation and action.
  • The computer employs voice recognition software to interpret a restricted list of words as commands.
  • By limiting the number of words allowable as commands, it is possible for the recognition system to recognize a wide range of voices and speech styles without requiring computer training for each user.
  • Once the voice input is translated to commands and the commands execute, the resulting image(s) are transmitted from the computer system 3 to the video monitor(s) 1.
  • A more natural way of handling the problem of image portion selection is to allow the physician's gaze to do the work. That is to say, a person will naturally gaze or stare at the important portion of an image. If the system is able to determine at which region the physician is gazing, that region can be automatically selected. At least two technologies are known for allowing the computer system to determine where the physician is gazing.
  • The more complex method relies on actually determining the focal point of the physician's gaze.
  • This is achieved through an image analysis system (e.g., a small video camera located with the video monitors to capture the physician's image, which is sent to the computer system for analysis) which constantly analyzes the physician's head position and the position of the pupils of the physician's eyes. From these computations the system is able to decide which portion of the image is in the center of the physician's gaze.
  • A simpler approach that yields the same information is to have the physician wear a headset or similar device that projects a beam of light (preferably invisible infrared light) towards the screen, where one or more light sensors or camera devices detect the striking point of the beam.
  • Other types of "gaze" determining devices are known in the art.
  • The system automatically selects an image region surrounding the center of the physician's gaze.
  • The deduced gaze center can be used to "steer" a cursor in a manner analogous to the use of a computer mouse or similar pointing device, but without any use of the hands.
  • The present invention is directed towards non-manual control of an image viewing station.
  • Voice command is a principal non-manual control method and is somewhat analogous to command and control by means of menus.
  • Voice control is not well adapted to "pointing" and, as explained above, image manipulation benefits from the use of a pointing device. Therefore, it is advantageous to allow the physician's head or eye movement to control a non-manual pointing device to direct the location of a cursor or otherwise delimit a region of interest on the screen of the viewing station.
  • This type of non-manual input is sometimes called "gaze control." Pointing by means of gaze control allows the reading physician to merely look at the portion of the image of interest and then issue the appropriate voice or other command to the viewing station, which responds by enlarging the pointed-to portion, or centering the pointed-to portion, etc. This avoids the tedious manual or voice task of identifying a particular image portion to manipulate.
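Selecting a region around the gaze point, as described above, reduces to simple clamped arithmetic. A minimal sketch, assuming pixel coordinates and an arbitrary default region size of a quarter of the screen:

```python
def region_around_gaze(gaze_x, gaze_y, screen_w, screen_h, frac=0.25):
    """Return (x, y, w, h) of a rectangular region of interest centred on
    the detected gaze point, clamped so it stays entirely on screen.
    `frac` (the region size as a fraction of the screen) is an assumed
    default, not a value from the patent."""
    w, h = int(screen_w * frac), int(screen_h * frac)
    x = min(max(gaze_x - w // 2, 0), screen_w - w)
    y = min(max(gaze_y - h // 2, 0), screen_h - h)
    return (x, y, w, h)
```

A subsequent voice command ("zoom", "center", etc.) would then operate on the returned rectangle, so the physician never has to describe the region manually.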
  • A gaze control system 8 is connected to the computer 3 through a data communication pathway 9.
  • The pathway 9 may be a simple cable, but any other type of data communication pathway, as mentioned above in conjunction with the cables 2 and 5, can also be used.
  • The primary function of the gaze control pointing device is to select images or image regions for the commands (voice or menu) to operate upon.
  • The gaze control pointing device can perform any of the traditional functions of a pointing device, e.g., selecting a command from a menu or selecting a button or switch in a dialog box, etc. Again, the system is extremely flexible because of the nature of the non-manual command inputs provided.
  • The voice control primarily issues commands, rather like a combination of a pointing device and a menu, or typing in a command.
  • The voice control can also invoke a menu or select a button, rather like a pointing device alone.
  • Gaze control can select an image or image region like a pointing device, and can issue commands by selecting them from a menu.
  • The non-manual functions are complementary and somewhat interchangeable. The physician is able to mix and match as he or she finds most convenient. Further, traditional manual input (keyboard and pointing devices) remains fully functional and can also be intermingled with the non-manual controls. Not only does this provide the utmost flexibility for the physician, it also provides great convenience for a disabled or handicapped physician. For example, a physician with limited hand or arm mobility can readily use voice and gaze control to rapidly process a large number of images.
  • The viewing station can display not only images, but also various reports on patient data. Such reports may come from a Hospital Information System or a Radiology Information System or other storage database, as well as from the referring physician. By being able to cross-check test and other data on the patient, the reviewing physician can often confirm or extend his or her diagnosis. With less versatile systems the physician would have to phone or otherwise contact a laboratory or Hospital Information System to confirm a possible diagnosis. This interrupts the work flow and considerably slows the whole process.
  • In FIG. 2, a typical screen interface according to an embodiment of the present invention is provided.
  • The figure shows one example of a typical user interface, but is not intended to limit the invention.
  • The viewing area, as represented by a monitor 11, is used to display an image 12 and basic information 13 about the patient, including the patient's case or institution identifier number, information on the most recent command that has been issued by the physician to the system, and the status of voice and gaze control and other available system control inputs.
  • The system is designed to train the user as he or she operates the system.
  • The viewing station notifies the user when an improper or illogical command has been detected and/or instructs the user on the proper use of a command.
  • Warnings can be given by visual and/or audio means, such as feedback through a headset or speaker system, or a (flashing) warning message or icon on the screen.
  • A detailed command menu 14 that lists other available commands can be made to appear (by voice or manual means) anywhere on the screen.
  • The overall command structure, whether through voice or menu, is contextual. That is, the commands available at a particular instant are related to the present status of the image and system. For example, when the Window/Level command has been given, only commands related to the Window/Level function are available. That means that Zoom, Rotate and other commands will not be active.
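The contextual command structure (Window/Level active, Zoom and Rotate inert) behaves like a small state machine. A sketch under the assumption of a two-level structure with a "done" command to leave a sub-mode; the command names are taken from the examples in the text, the rest is illustrative.

```python
class CommandContext:
    """Contextual command gating: the active command set depends on the
    function currently in use.  Structure and 'done' command are assumed."""
    TOP = {"window/level", "zoom", "rotate", "cine", "next patient"}
    SUB = {
        "window/level": {"wider", "narrower", "brighter", "darker", "done"},
        "zoom":         {"in", "out", "pan", "done"},
    }

    def __init__(self):
        self.mode = None              # None = top-level command set

    def active_commands(self):
        return self.TOP if self.mode is None else self.SUB[self.mode]

    def issue(self, cmd):
        """Accept the command only if it is active right now; this is the
        moment the station would give 'improper command' feedback."""
        if cmd not in self.active_commands():
            return False              # e.g. "zoom" while in Window/Level
        if self.mode is None and cmd in self.SUB:
            self.mode = cmd           # enter the sub-mode
        elif cmd == "done":
            self.mode = None          # return to the top level
        return True
```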
  • Other icons or word descriptions 15, 16 can be used to remind the user of the status of the overriding command structure being used (voice, manual control, gaze control or a combination of these), whether additional images are available or a variety of other status issues.
  • Other items can appear on the monitor that relate to the images. This could include, among other items, clinical reports that describe the reason for the study, reports from prior studies, and unverified reports that require verification.
  • In FIG. 3, a typical flow diagram of the system according to an embodiment of the present invention is provided. Images are received by the means described in relation to FIG. 1 and reproduced in FIG. 3 as item 4. In step 101, these images are processed through the command and control protocols, and in step 102, the images are temporarily stored in the storage unit. In step 103, information about each image is sent to the worklist and included on this list based on information sent with the image and/or protocols stored in the command and control protocols. The user controls the system via an input device (FIG. 3, items 6, 7, and/or 8).
  • The user log-in command is matched to that user's profile and to the appropriate worklist.
  • In step 104, the first set of images on the worklist is presented.
  • In step 105, using the image manipulation tools stored in the command and control protocols, the images are reviewed by commands issued by the user.
  • In step 106, the user, at some point, may elect to make or not to make a diagnosis. This decision sends the image being reviewed back to the storage unit and sends the work item either to the unread portion of the worklist (if no diagnosis is made) or to the finished work portion of the worklist (if a diagnosis is made).
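The step-106 branch can be sketched as a small routing function; the worklist representation as two lists is an assumption for illustration.

```python
def finish_review(worklist, item, diagnosis_made):
    """Route a reviewed work item: to the finished portion if a diagnosis
    was recorded, back to the unread portion otherwise.  (The image
    itself returns to the storage unit in either case.)"""
    if diagnosis_made:
        worklist["finished"].append(item)
    else:
        worklist["unread"].append(item)

worklist = {"unread": [], "finished": []}
finish_review(worklist, "study-001", diagnosis_made=True)
finish_review(worklist, "study-002", diagnosis_made=False)
```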
  • In step 108, the user may elect to have the system send the diagnosis and/or the analyzed image(s) to a designated recipient by e-mail.
  • The images are compressed to speed transmission.
  • The recipient may be the requesting physician, a colleague who is aiding with diagnosis or providing a second opinion, a portion of the Hospital Information System (HIS), or an insurance company or some other entity that requires direct information regarding the diagnosis and its basis.
  • In step 109, if a diagnosis is made, information regarding the type of diagnosis and, in a preferred embodiment of the current application, the diagnosis itself, in either audio or digital form, is recorded for later processing. This processing could be done within the system, or the recording could be sent to another system (e.g., part of the HIS). The information or the processed reports can be automatically forwarded by e-mail as mentioned above. Thereafter, in step 111, the image is either deleted from the system (typically to be archived long term outside the viewing station) or stored for a short period of time on the system. These choices are determined by the amount of storage available within the system and the workflow of the physicians using the system.
  • If the using physician or another physician is expected to reassess the image in the near future, it is preferable to keep the image in local storage on the workstation.
  • In FIG. 4, a drawing illustrating a workflow on a preferred system according to an embodiment of the present invention is provided, along with a further detailed description of the inventive workstation.
  • This workstation is derived from a commercial product (CATELLA, trademark of AMD Industries, LLC).
  • Images may be provided from a library as in step 402, or the images may be generated, as in step 404.
  • In step 406, it is determined whether the images provided are digital. If the images are not digital, the system proceeds to step 408, in which the images are scanned to generate digital images.
  • In step 410, the system determines whether to store the digital images. If so, the system proceeds to step 412 and the images are stored until they are read. The images may be sent to an archiving system or a film library for storage.
  • The inventive workstation stores the images until they are viewed by the physician. However, permanent storage (up to seven years, and in some cases for the life of the patient) is usually required for medical images. While the workstation can store images for a short period of time after reading, the primary long-term storage of the images generally occurs elsewhere.
  • The workstation can be connected to an archiving system or to a film library for such storage.
  • The capacity of the commercial workstation in its standard form is 200 2k x 2k images (a typical chest X-ray) or 8,000 512 x 512 images (a typical MRI).
  • The system may be expanded to hold over 10,000 2k x 2k images or over 400,000 512 x 512 images.
  • The system can be configured to automatically delete images based upon various criteria, including the date of last access.
  • The least recently used images are the first to be deleted when the system starts to fill up.
  • Other automatic deletion criteria can be set, including source of the image and confirmation that the image has been properly stored in a permanent storage location prior to deletion.
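The worklist routing described in steps 103–111 can be sketched in code. The following is a minimal illustration, not the patent's implementation; all class and method names (`Worklist`, `finish_review`, etc.) are hypothetical.

```python
# Hypothetical sketch of the worklist routing in steps 103-111.
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class WorkItem:
    image_id: str
    diagnosis: Optional[str] = None  # recorded report, if any (step 109)

@dataclass
class Worklist:
    unread: List[WorkItem] = field(default_factory=list)
    finished: List[WorkItem] = field(default_factory=list)

    def add(self, item: WorkItem) -> None:
        # Step 103: new images are placed on the unread worklist.
        self.unread.append(item)

    def next_item(self) -> Optional[WorkItem]:
        # Step 104: present the first set of images on the worklist.
        return self.unread[0] if self.unread else None

    def finish_review(self, item: WorkItem, diagnosis: Optional[str]) -> None:
        # Step 106: route the item based on whether a diagnosis was made.
        self.unread.remove(item)
        if diagnosis is None:
            self.unread.append(item)      # back to the unread portion
        else:
            item.diagnosis = diagnosis    # step 109: record the diagnosis
            self.finished.append(item)    # finished-work portion

wl = Worklist()
wl.add(WorkItem("chest-001"))
wl.add(WorkItem("mri-002"))
wl.finish_review(wl.next_item(), diagnosis="no acute findings")
wl.finish_review(wl.next_item(), diagnosis=None)  # deferred: stays unread
```

The split between an unread and a finished-work portion mirrors the decision point in step 106.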
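Step 108's e-mail forwarding, combined with compression to speed transmission, might look like the following sketch. The gzip choice and MIME layout are assumptions; the patent does not specify a compression scheme or message format.

```python
# Illustrative sketch: compress an image and attach it to a diagnosis e-mail.
import gzip
from email.message import EmailMessage

def build_diagnosis_mail(sender, recipient, diagnosis, image_bytes, image_name):
    msg = EmailMessage()
    msg["From"] = sender
    msg["To"] = recipient
    msg["Subject"] = f"Diagnosis for {image_name}"
    msg.set_content(diagnosis)
    # Compress the pixel data to speed transmission over the data feed.
    compressed = gzip.compress(image_bytes)
    msg.add_attachment(compressed,
                       maintype="application", subtype="gzip",
                       filename=image_name + ".gz")
    return msg

raw = bytes(range(256)) * 64  # stand-in for pixel data
mail = build_diagnosis_mail("reader@example.org", "referrer@example.org",
                            "No acute findings.", raw, "chest-001.dcm")
```

The recipient (requesting physician, consulting colleague, HIS, or insurer, as listed above) decompresses the attachment to recover the original image bytes.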
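The automatic-deletion policy described above (delete by date of last access, but only after confirming permanent storage elsewhere) is essentially a least-recently-used eviction with an archival guard. A minimal sketch, with illustrative names and thresholds:

```python
# Sketch of last-access-based deletion with an archival-confirmation guard.
from dataclasses import dataclass
from typing import List

@dataclass
class StoredImage:
    image_id: str
    last_access: float   # e.g. a UNIX timestamp
    archived: bool       # confirmed copy in permanent storage?

def select_for_deletion(images: List[StoredImage], capacity: int):
    """Return images to delete so that at most `capacity` remain locally."""
    if len(images) <= capacity:
        return []
    # Only images confirmed as archived are eligible for deletion;
    # the least recently accessed go first.
    deletable = sorted((img for img in images if img.archived),
                       key=lambda img: img.last_access)
    return deletable[:len(images) - capacity]

imgs = [
    StoredImage("a", last_access=100.0, archived=True),
    StoredImage("b", last_access=50.0,  archived=True),
    StoredImage("c", last_access=10.0,  archived=False),  # not yet archived
]
victims = select_for_deletion(imgs, capacity=2)
```

Note that image "c", though least recently accessed, is protected because its permanent-storage copy has not been confirmed, matching the criterion stated in the description.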

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Epidemiology (AREA)
  • General Health & Medical Sciences (AREA)
  • Primary Health Care (AREA)
  • Public Health (AREA)
  • Biomedical Technology (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Radiology & Medical Imaging (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Business, Economics & Management (AREA)
  • General Business, Economics & Management (AREA)
  • Medical Treatment And Welfare Office Work (AREA)
EP99971534A 1998-10-30 1999-10-29 Nicht-manuelle betätigung eines medizinischen bilddarstellungsgeräts Withdrawn EP1044401A1 (de)

Priority Applications (1)

Application Number Priority Date Filing Date Title
EP03011085A EP1335270A1 (de) 1998-10-30 1999-10-29 Nicht-manuelle Betätigung von einem medizinischen Bilddarstellungsgerät

Applications Claiming Priority (7)

Application Number Priority Date Filing Date Title
US339427 1994-11-14
US10639298P 1998-10-30 1998-10-30
US106392P 1998-10-30
US09/339,427 US6462868B1 (en) 1999-06-24 1999-06-24 Display device for both hardcopy and digital images
US43051499A 1999-10-29 1999-10-29
US430514 1999-10-29
PCT/US1999/025532 WO2000026758A1 (en) 1998-10-30 1999-10-29 Non-manual control of a medical image display station

Related Child Applications (1)

Application Number Title Priority Date Filing Date
EP03011085A Division EP1335270A1 (de) 1998-10-30 1999-10-29 Nicht-manuelle Betätigung von einem medizinischen Bilddarstellungsgerät

Publications (1)

Publication Number Publication Date
EP1044401A1 true EP1044401A1 (de) 2000-10-18

Family

ID=27380112

Family Applications (1)

Application Number Title Priority Date Filing Date
EP99971534A Withdrawn EP1044401A1 (de) 1998-10-30 1999-10-29 Nicht-manuelle betätigung eines medizinischen bilddarstellungsgeräts

Country Status (3)

Country Link
EP (1) EP1044401A1 (de)
AU (1) AU1459400A (de)
WO (1) WO2000026758A1 (de)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004073462A (ja) * 2002-08-16 2004-03-11 Konica Minolta Holdings Inc Medical image control apparatus, medical image control system, and program
CN109917553A (zh) * 2019-04-19 2019-06-21 张三妹 Intelligent image-reading machine

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
FR2696560B1 (fr) * 1992-10-02 1994-11-18 Sextant Avionique Self-adaptive virtual command execution device
US5546943A (en) * 1994-12-09 1996-08-20 Gould; Duncan K. Stimulating a beneficial human response by using visualization of medical scan data to achieve psychoneuroimmunological virtual reality
US5544654A (en) * 1995-06-06 1996-08-13 Acuson Corporation Voice control of a medical ultrasound scanning machine
JPH09114543A (ja) * 1995-10-02 1997-05-02 Xybernaut Corp Hands-free computer apparatus
JPH1039995A (ja) * 1996-07-19 1998-02-13 Nec Corp Gaze and voice input device
EP0862159A1 (de) * 1997-03-01 1998-09-02 Agfa-Gevaert N.V. Speech recognition system for a medical X-ray apparatus

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See references of WO0026758A1 *

Also Published As

Publication number Publication date
AU1459400A (en) 2000-05-22
WO2000026758A1 (en) 2000-05-11

Similar Documents

Publication Publication Date Title
US8036917B2 (en) Methods and systems for creation of hanging protocols using eye tracking and voice command and control
US7573439B2 (en) System and method for significant image selection using visual tracking
US7331929B2 (en) Method and apparatus for surgical operating room information display gaze detection and user prioritization for control
US7501995B2 (en) System and method for presentation of enterprise, clinical, and decision support information utilizing eye tracking navigation
US7576757B2 (en) System and method for generating most read images in a PACS workstation
US20050114140A1 (en) Method and apparatus for contextual voice cues
US6392633B1 (en) Apparatus for audio dictation and navigation of electronic images and documents
US6518952B1 (en) System for manipulation and display of medical images
US9841811B2 (en) Visually directed human-computer interaction for medical applications
US8423081B2 (en) System for portability of images using a high-quality display
US7694240B2 (en) Methods and systems for creation of hanging protocols using graffiti-enabled devices
US7834891B2 (en) System and method for perspective-based procedure analysis
US20070118400A1 (en) Method and system for gesture recognition to drive healthcare applications
US20090177477A1 (en) Voice-Controlled Clinical Information Dashboard
US20090182577A1 (en) Automated information management process
US20070197909A1 (en) System and method for displaying image studies using hanging protocols with perspectives/views
US20060235936A1 (en) System and method for PACS workstation conferencing
WO2006039687A2 (en) System and method for handling multiple radiology applications and workflows
US20070106501A1 (en) System and method for subvocal interactions in radiology dictation and UI commands
EP1335270A1 (de) Nicht-manuelle Betätigung von einem medizinischen Bilddarstellungsgerät
US20060111936A1 (en) Container system and method for hosting healthcare applications and componentized archiecture
JP6440136B1 (ja) Medical video switcher
EP1044401A1 (de) Nicht-manuelle betätigung eines medizinischen bilddarstellungsgeräts
WO2009107644A1 (ja) Medical image management apparatus, print screen display method, and program
JP5958320B2 (ja) Electronic medical record apparatus and program

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AT BE CH CY DE DK ES FI FR GB GR IE IT LI LU MC NL PT SE

17P Request for examination filed

Effective date: 20001108

17Q First examination report despatched

Effective date: 20020318

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION HAS BEEN WITHDRAWN

18W Application withdrawn

Effective date: 20050207