US20210210194A1 - Apparatus for displaying data - Google Patents

Apparatus for displaying data

Info

Publication number
US20210210194A1
Authority
US
United States
Prior art keywords
image data
user interface
graphical user
input unit
display
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US17/206,431
Inventor
Ronaldus Petrus Johannes Hermans
Adrie BASELMANS
Ivo Don Stuyfzand
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Koninklijke Philips NV
Original Assignee
Koninklijke Philips NV
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Koninklijke Philips NV filed Critical Koninklijke Philips NV
Priority to US17/206,431
Publication of US20210210194A1
Legal status: Abandoned

Classifications

    • G: PHYSICS
        • G06: COMPUTING; CALCULATING OR COUNTING
            • G06F: ELECTRIC DIGITAL DATA PROCESSING
                • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
                    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
                        • G06F3/048: Interaction techniques based on graphical user interfaces [GUI]
                            • G06F3/0481: Interaction techniques based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
                                • G06F3/0482: Interaction with lists of selectable items, e.g. menus
                            • G06F3/0487: Interaction techniques using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
                                • G06F3/0488: Interaction techniques using a touch-screen or digitiser, e.g. input of commands through traced gestures
                                    • G06F3/04886: Interaction techniques using a touch-screen or digitiser, by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
                                • G06F3/0489: Interaction techniques using dedicated keyboard keys or combinations thereof
                                    • G06F3/04895: Guidance during keyboard input operation, e.g. prompting
                    • G06F3/14: Digital output to display device; Cooperation and interconnection of the display device with other functional units
                        • G06F3/1454: Digital output to display device involving copying of the display data of a local workstation or window to a remote workstation or window so that an actual copy of the data is displayed simultaneously on two or more displays, e.g. teledisplay
                • G06F9/00: Arrangements for program control, e.g. control units
                    • G06F9/06: Arrangements for program control using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
                        • G06F9/44: Arrangements for executing specific programs
                            • G06F9/451: Execution arrangements for user interfaces
                                • G06F9/452: Remote windowing, e.g. X-Window System, desktop virtualisation
        • G09: EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
            • G09G: ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
                • G09G2354/00: Aspects of interface with display user
        • G16: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
            • G16H: HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
                • G16H15/00: ICT specially adapted for medical reports, e.g. generation or transmission thereof
                • G16H30/00: ICT specially adapted for the handling or processing of medical images
                    • G16H30/20: ICT specially adapted for handling medical images, e.g. DICOM, HL7 or PACS
                • G16H40/00: ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices
                    • G16H40/60: ICT specially adapted for the operation of medical equipment or devices
                        • G16H40/63: ICT specially adapted for the operation of medical equipment or devices for local operation
    • H: ELECTRICITY
        • H04: ELECTRIC COMMUNICATION TECHNIQUE
            • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
                • H04N21/00: Selective content distribution, e.g. interactive television or video on demand [VOD]
                    • H04N21/40: Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
                        • H04N21/41: Structure of client; Structure of client peripherals
                            • H04N21/422: Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
                                • H04N21/42204: User interfaces specially adapted for controlling a client device through a remote control device; Remote control devices therefor
                                    • H04N21/42206: User interfaces specially adapted for controlling a client device through a remote control device, characterized by hardware details
                                        • H04N21/42222: Additional components integrated in the remote control device, e.g. timer, speaker, sensors for detecting position, direction or movement of the remote control, microphone or battery charging device

Definitions

  • the present invention relates to an apparatus for displaying data, and to a method for displaying data, as well as to a computer program element and a computer readable medium.
  • GUI graphical user interface
  • US 2012/0182244 A1 relates to systems and methods for providing remote assistance with a medical procedure by a technician via a remote device such as a laptop or tablet.
  • Video output generated by medical devices, and video captured by a camera, may be transmitted via a network and rendered on the remote device.
  • an apparatus for displaying data is provided, comprising at least one display unit, at least one touch input unit with a graphical user interface comprising a plurality of zones, and a processing unit.
  • a user can interact with a touch input device and on a separate display device be provided with visual feedback about the area of the touch device user interface that the user is interacting with.
  • a user gets direct feedback on a display unit (such as a viewing device) of the part of a graphical user interface on an input unit (such as a touch device) they are interacting with.
  • blind touch screen control is provided, with the user receiving feedback about the area of the touch device user interface with which they are interacting.
  • improved interaction feedback is provided when interacting on an input unit, such as a touch device, while looking at a separate display unit for interaction feedback.
  • the second image data corresponds to the image data representing the graphical user interface for the active zone, that is the zone with which the user is interacting.
  • the contents of the active zone of the GUI are mirrored on the main display.
  • if an input unit such as a touch screen is showing imagery that is already being presented on the at least one display unit, the imagery is not duplicated on the display device.
  • the first data comprises a further graphical user interface and the second image data comprises a portion of the touch screen graphical user interface, and the second image data is rendered next to a portion of the further graphical user interface.
  • the second image data is blended with a portion of the further graphical user interface.
  • the second image data is rendered inside a portion of the first graphical user interface.
  • the processing unit is configured to determine the user interaction portion when a user moves a hand over at least one zone of the plurality of zones of the graphical user interface.
  • the processing unit is configured to determine the user interaction portion when a user touches at least one zone of the plurality of zones of the graphical user interface.
  • the processing unit is configured to determine the user interaction portion when a pointer moves over at least one zone of the plurality of zones of the graphical user interface.
  • the position of for example a cursor as shown on an image displayed on a touch device can now be presented on a separate viewing device which shows a portion of the image shown on the touch device along with the cursor position.
  • the user can keep track of where a cursor is located on a touch screen whilst looking at a separate viewing screen.
  • when an input unit (e.g. touch device) user interface or graphical user interface shows different content to that shown on at least one display unit (e.g. viewing device), providing a localized user interaction portion, e.g. a cursor position, as part of the second image data means that the user is better able to keep track of where the cursor is located on the touch device and can do so without having to look away from the viewing device.
  • display of the second image data on the at least one display unit is enabled or disabled as a function of input from the user.
  • a medical imaging system such as an interventional X-ray system, comprising an apparatus for displaying data in accordance with the invention.
  • the display unit is preferably provided as an overhead monitor, for example an exam room monitor, and the touch input device is preferably provided as a table side module (TSM), that is, a control device arranged adjacent to a patient table.
  • a method for displaying data is provided, comprising displaying first image data on at least one display unit, determining a user interaction portion, and displaying on the at least one display unit a portion of the first data simultaneously with second image data representative of the user interaction portion.
  • a computer program element for controlling an apparatus as previously described which, when the computer program element is executed by a processing unit, is adapted to perform the method steps as previously described.
  • a computer readable medium having stored the computer program element as previously described.
  • FIG. 1 shows an example of a method for displaying data
  • FIG. 2 shows a schematic set up of an example apparatus for displaying data
  • FIG. 3 shows an example of an input unit, and two display units.
  • FIG. 4 shows an example of data being displayed by an example apparatus for displaying data
  • FIG. 5 shows the same data as shown in FIG. 4 , represented in schematic form
  • FIG. 6 shows an example of data being displayed by an example apparatus for displaying data
  • FIG. 7 shows the same data as shown in FIG. 6 , represented in schematic form.
  • FIG. 8 shows an example of a technical realization of an example apparatus for displaying data with an example workflow.
  • FIG. 1 shows a method 10 for displaying data in its basic steps, the method 10 comprising:
  • in a first displaying step 20, first data is displayed on at least one display unit.
  • in a determining step 30, a user interaction portion of a user input area of at least one input unit is determined.
  • in a second displaying step 40, on the at least one display unit a portion of the first data is displayed simultaneously with second image data, wherein the second image data is representative of the user interaction portion of the user input area of the at least one input unit.
  • the first data comprises image data.
  • the first data comprises text data.
  • the first data comprises signals.
  • the first data comprises any combination of these data.
  • the first data comprises image and text data.
  • FIG. 2 shows an apparatus 50 for displaying data.
  • the apparatus 50 comprises at least one display unit 60 , at least one input unit 70 comprising a user input area, and a processing unit 80 .
  • the processing unit 80 is configured to display first image data on the at least one display unit 60 .
  • the processing unit 80 is also configured to determine a user interaction portion of the user input area of the at least one input unit 70 .
  • the processing unit 80 is also configured to display on the at least one display unit 60 a portion of the first image data simultaneously with second image data, wherein the second image data is representative of the user interaction portion of the user input area of the at least one input unit 70 .
  • the at least one display unit comprises a monitor configured to display medical image data, preferably an overhead monitor as may be arranged inside a hospital examination room.
  • the first data comprises interventional X-ray data.
  • the first data comprises CT X-ray image data, X-ray fluoroscopic image data, X-ray tomography image data, Magnetic Resonance (MR) image data, ultrasound image data or any combination of these image data.
  • the at least one touch input unit comprises a device with a screen that a user can touch, such as a smartphone or tablet PC, or in a preferred embodiment a table side module adjacent to a patient table in a hospital examination room.
  • a user may be interacting with the touch input device while looking at an overhead display preferably showing a graphical user interface of the touch device.
  • a tablet user interface or graphical user interface can be directly rendered on the screen.
  • the at least one input unit can be a touch device with its own graphical user interface, such as a touch device showing separate processing, acquisition, medical device configuration and measurement functionalities relating to medical imagery presented on a separate display unit, and the processing unit can determine the area of the touch device a user is interacting with such as that relating to measurement functionalities and present the measurement functionalities as shown on the touch pad as a separate, or integrated, region on the at least one display unit along with the medical image data.
  • even though the input unit can have a display screen, this does not necessarily mean that the second image data representative of the user interaction portion of the user input area of the input unit is mirroring image data presented on the input unit.
  • representative here means that, even if the input unit is showing imagery, the display unit could show schematic representations of that imagery.
  • however, representative also means that the display unit could mirror at least a portion of what is being shown on a display screen of an input unit.
  • the at least one input unit comprises a device configured to detect the proximity of a user's hand or pointer or stylus without the user having to touch the input unit with their hand or pointer or stylus.
  • the at least one input unit can utilize microwave or laser sensing to determine the proximity to and location of the user's hand with respect to a user input area of the input unit in order to determine the user interaction portion.
  • the at least one input unit detects the proximity and location of a user's hand with respect to the input unit using a camera viewing the user's hand.
  • the at least one input unit detects the proximity and location of a user's hand with respect to the input unit through magnetic field disturbance sensing of the user's hand.
  • the at least one input unit can be a touch device and/or a proximity device.
  • there can be multiple display units and/or multiple input units.
  • for example, with respect to multiple display units, there can be a display unit in a Control Room and a display unit in an Exam Room.
  • for example, with respect to multiple input units, there can be an input unit (such as a Table Side Module) for a physician, close to a patient, and there can be an input unit for a nurse.
  • the second image data is a direct reproduction of the image data representing at least an active portion of a graphical user interface that the user is interacting with.
  • the input unit can be a touch device with its own screen on which there is a first zone or area showing medical imagery and a second zone or area showing buttons relating to control or processing of that medical imagery.
  • the at least one display unit can be a medical exam room monitor displaying an enlargement of the imagery as shown on the at least one input unit, and when the user interacts with the input unit at the location of the buttons, imagery of the buttons as presented on the input unit can be mirrored on the at least one display unit.
  • for example, the user may move their finger towards the center right of the screen of the input unit (e.g. a touch device).
  • On the touch device there may be an image being presented on the center of the screen that is shown in expanded form on the viewing device, and a number of functional buttons may be presented on the far right of the touch screen.
  • as the user's finger approaches the touch screen, the user is presented on the viewing device with an image portion from the touch device in which parts of the buttons are shown.
  • as the finger moves further towards the buttons, the image on the viewing device shows less of the image as presented on the touch device and more of the buttons on the touch device.
  • in this way the user is presented on the viewing device with the information from the touch device that they require, such as the functional buttons, enabling them to press the button on the touch device that they wish to press. In this manner, they can interact with the imagery presented on the at least one display unit via a separate input unit without having to move their line of sight away from the at least one display unit.
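  • As a rough illustration of this behaviour, the following Python sketch (the screen dimensions, panel layout and function name are assumptions, not taken from the disclosure) shifts a fixed-width crop of the touch-device screen towards the button column as the sensed finger position moves to the right, so that the mirrored region on the viewing device gradually contains more of the buttons and less of the central image.

```python
# Hypothetical sketch: shift a crop window of the touch-device screen toward the
# button column as the user's finger moves right. Numbers are illustrative only.

TOUCH_WIDTH = 1280      # touch-device screen width in pixels (assumed)
BUTTON_COLUMN_X = 1000  # x position where the button panel starts (assumed layout)
CROP_WIDTH = 400        # width of the portion mirrored on the viewing device


def crop_window_for_finger(finger_x: float) -> tuple[int, int]:
    """Return (left, right) of the touch-screen region to mirror on the viewing device.

    The window is centred on the finger and clamped to the screen, so that as the
    finger approaches the button column the mirrored region contains progressively
    more of the buttons and less of the central image.
    """
    left = int(finger_x - CROP_WIDTH / 2)
    left = max(0, min(left, TOUCH_WIDTH - CROP_WIDTH))
    return left, left + CROP_WIDTH


if __name__ == "__main__":
    for x in (640, 850, 1100, 1250):  # finger moving toward the right-hand buttons
        left, right = crop_window_for_finger(x)
        overlap = max(0, right - BUTTON_COLUMN_X)
        print(f"finger at x={x}: mirror [{left}, {right}), {overlap}px of button panel visible")
```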
  • the at least one input unit 70 comprises a display screen 90 configured to display input unit image data, wherein the user input area of the at least one input unit 70 comprises at least a portion of the input unit image data displayed on the display screen, and wherein the second image data comprises at least a portion of the input unit image data.
  • the user input area of the at least one input unit comprising at least a portion of the input unit image data displayed on the display screen means that the user input area of the at least one input unit displays at least a portion of the input unit image data.
  • a part of a user interface or graphical user interface of an input unit is rendered on the at least one display unit (e.g., viewing device).
  • the part is an active zone, that is, one of a plurality of zones of the graphical user interface with which a user is interacting.
  • the input unit image data comprises interventional X-ray data.
  • the input unit image data comprises CT X-ray image data, X-ray fluoroscopic image data, X-ray tomography image data, Magnetic Resonance (MR) image data, ultrasound image data or any combination of these image data.
  • the input unit image data comprises imagery relating to button functionalities for the processing and/or acquisition of medical image data and/or control of equipment associated with the acquisition of image data, such as an X-ray unit or MRI unit.
  • the input unit image data can relate to buttons having functionalities including any combination of: acquisition; X-ray unit control and/or medical table control; X-ray beam collimation; processing; zoom & pan; contrast & brightness; subtraction; measurements; image overlays; reset; and the opening of files.
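  • Purely by way of illustration, such button functionalities could be grouped per zone of the touch-device GUI in a simple lookup structure; the zone names and the assignment of buttons to zones below are assumptions, not the layout used by the disclosure.

```python
# Hypothetical grouping of the button functionalities listed above into GUI zones.
# The zone names and assignments are illustrative only.

TSM_BUTTON_ZONES = {
    "acquisition": ["acquisition", "X-ray unit control", "medical table control",
                    "X-ray beam collimation"],
    "processing":  ["zoom & pan", "contrast & brightness", "subtraction"],
    "measurement": ["measurements", "image overlays"],
    "general":     ["reset", "open file"],
}


def buttons_in_zone(zone: str) -> list[str]:
    """Return the button functionalities rendered in a given zone of the touch GUI."""
    return TSM_BUTTON_ZONES.get(zone, [])


if __name__ == "__main__":
    print(buttons_in_zone("processing"))
```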
  • the at least one input unit comprises a touch screen and a portion of the graphical user interface or user interface that the user is touching or has touched is represented in schematic form on the at least one display unit.
  • the at least one input unit comprises a touch screen and a portion of the graphical user interface or user interface that the user is touching or has touched is mirrored on the at least one display unit.
  • a subset of the imagery being shown on the input device such as a touch screen device, can be rendered on the display device, or a schematic representation of that imagery on the input device can be presented on the display device.
  • the at least one input unit comprises a proximity device having a screen and a portion of the graphical user interface or user interface that the user is in proximity with or has been in proximity with is represented in schematic form on the at least one display unit.
  • the at least one input unit comprises a proximity device having a screen and a portion of the graphical user interface or user interface that the user is in proximity with or has been in proximity with is mirrored on the at least one display unit.
  • the processing unit is configured to display the second image data without displaying any first data, as displayed on the at least one input unit, as part of the second image data.
  • in other words, if an input unit such as a touch screen is showing imagery that is already being presented on the at least one display unit, the imagery is not duplicated on the display device.
  • image A can be presented on a display unit and image B can be shown on an input unit along with a number of functional buttons on the right hand side of the screen. If the user is interacting with the buttons at the top right of the screen then these buttons, along with the portion of image B near or around those buttons, can be displayed on the display unit along with image A. However, if the input unit was showing image A rather than image B, then only the buttons at the top right hand side of the input unit that the user is interacting with are displayed on the display unit. In this manner, redundant data is not presented to the user.
  • if an image of a vascular region of a patient is shown on the screen of an input unit (e.g. touch device) and the same image at the same level of magnification is being shown on the display unit (e.g. viewing device), then the vascular image data from the touch device is not reproduced on the viewing device; only the other image data that the user is interacting with, such as that relating to buttons with image manipulation functionalities, is reproduced.
  • if the same image data is shown on the at least one input unit as shown on the display unit, but at a different level of magnification, then it may be reproduced on the display unit if the user is interacting at that region of the input unit.
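  • One possible way to realise this rule (a sketch under assumed data structures, not the claimed implementation) is to tag each zone's content with an identifier and magnification, and to mirror only those active zones whose content is not already being shown on the display unit:

```python
# Hypothetical sketch of the "no duplication" rule described above. A zone of the
# touch GUI is mirrored only if its content is not already on the display unit at
# the same magnification. Data structures are illustrative.

from dataclasses import dataclass


@dataclass(frozen=True)
class ZoneContent:
    content_id: str       # identifies the underlying image or panel (e.g. "image_A", "buttons")
    magnification: float  # display scale of that content


def zones_to_mirror(active_zones: list[ZoneContent],
                    shown_on_display: set[ZoneContent]) -> list[ZoneContent]:
    """Return the active zones that should be added to the second image data."""
    return [z for z in active_zones if z not in shown_on_display]


if __name__ == "__main__":
    display = {ZoneContent("image_A", 2.0)}  # already shown on the viewing device
    active = [ZoneContent("image_A", 2.0), ZoneContent("buttons", 1.0)]
    print(zones_to_mirror(active, display))  # only the buttons zone is mirrored
    # The same image at a different magnification is not redundant, so it is mirrored:
    print(zones_to_mirror([ZoneContent("image_A", 1.0)], display))
```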
  • the user input area comprises a plurality of zones, and wherein the user interaction portion of the user input area comprises at least one zone.
  • a zone of the user interface or graphical user interface of the at least one input unit, such as a touch device or proximity device, that a user is interacting with, is rendered on the at least one display unit such as a viewing device.
  • a zone is equivalent to a panel on the input device.
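  • A minimal sketch of how the active zone could be determined, assuming the zones are simple rectangles of the user input area (the layout and names are illustrative only):

```python
# Hypothetical zone model: the user input area is partitioned into rectangular
# panels, and the panel containing the interaction point is the active zone.

from dataclasses import dataclass
from typing import Optional


@dataclass
class Zone:
    name: str
    x: int
    y: int
    width: int
    height: int

    def contains(self, px: int, py: int) -> bool:
        return self.x <= px < self.x + self.width and self.y <= py < self.y + self.height


# Illustrative layout of a touch-device GUI: an image panel plus a button panel.
ZONES = [
    Zone("image", 0, 0, 1000, 800),
    Zone("buttons", 1000, 0, 280, 800),
]


def active_zone(px: int, py: int) -> Optional[Zone]:
    """Return the zone the user is touching or hovering over, or None if outside all zones."""
    for zone in ZONES:
        if zone.contains(px, py):
            return zone
    return None


if __name__ == "__main__":
    print(active_zone(1100, 300))  # a point inside the button panel
```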
  • the first data comprises a first graphical user interface and the second image data comprises a portion of a second graphical user interface, and wherein the portion of the second graphical user interface is rendered next to a portion of the first graphical user interface.
  • the portion of imagery on the input device that the user is interacting with can be shown on the display device next to either all of what was being shown on the display device or next to most of what was being shown.
  • the original imagery can be resized in order to present the new imagery next to the old imagery, or the new imagery can be overlaid over one side of the old imagery.
  • the first data comprises a first graphical user interface and the second image data comprises a portion of a second graphical user interface, and wherein the portion of the second graphical user interface is blended with a portion of the first graphical user interface.
  • the amount of blending is dependent upon the distance of a pointing device to the input device. For example, as a user's finger approaches the input unit (such as a touch device) the region of the touch device centered around the location of the finger (above the screen) is presented on the at least one display unit (e.g. viewing device). In an example, as the user's finger approaches or recedes from the touch device screen, the image from the touch device presented on the viewing device becomes brighter and fades accordingly.
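  • The distance-dependent blending could, for instance, be realised as a simple mapping from the sensed finger-to-screen distance to an opacity value; the thresholds below are assumptions for illustration only.

```python
# Hypothetical mapping from finger-to-screen distance to the opacity with which the
# touch-device zone is blended into the viewing-device UI. Thresholds are illustrative.

FAR_MM = 80.0   # at or beyond this distance the overlay is fully faded out
NEAR_MM = 5.0   # at or below this distance the overlay is fully opaque


def blend_alpha(distance_mm: float) -> float:
    """Return an opacity in [0, 1]: brighter as the finger approaches, fading as it recedes."""
    if distance_mm <= NEAR_MM:
        return 1.0
    if distance_mm >= FAR_MM:
        return 0.0
    return (FAR_MM - distance_mm) / (FAR_MM - NEAR_MM)


if __name__ == "__main__":
    for d in (100, 60, 30, 10, 2):
        print(f"{d} mm above screen -> alpha {blend_alpha(d):.2f}")
```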
  • the first data comprises a first graphical user interface and the second image data comprises a portion of a second graphical user interface, and wherein the portion of the second graphical user interface is rendered inside a portion of the first graphical user interface.
  • the positioning of the second image data can be controlled or adjusted as required.
  • the processing unit is configured to determine the user interaction portion when a user moves a hand over at least one portion of the user input area of the at least one input unit.
  • the processing unit determines that the user has moved their hand over the right hand portion of the touch screen of an input unit, and the buttons and/or image data shown at the right hand portion of the touch screen is displayed on the at least one display unit along with at least a portion of the image data that was already being displayed on the at least one display unit.
  • a user can be simultaneously hovering their hand over multiple zones, e.g., the left hand area and the right hand areas of a touch/proximity device.
  • the processing unit is configured to determine the user interaction portion when a user touches a portion of the user input area of the at least one input unit.
  • the processing unit determines that the user has touched the right hand portion of the touch/proximity screen of an input unit, and the buttons and/or image data shown at the right hand portion of the touch screen is displayed on the at least one display unit along with at least a portion of the image data that was already being displayed on the at least one display unit.
  • the processing unit is configured to determine the user interaction portion when a pointer moves over a portion of the user input area of the at least one input unit.
  • the pointer comprises a cursor displayed on the user input area of the at least one input unit.
  • the processing unit determines that the user has moved the cursor over the right hand portion of the screen of an input unit, and the buttons and/or image data shown at the right hand portion of the touch screen is displayed on the at least one display unit along with at least a portion of the image data that was already being displayed on the at least one display unit.
  • the processing unit is configured to determine a localized user interaction position of the user interaction portion, and wherein the second image data comprises image data representative of the localized user interaction position.
  • a localized interaction position comprises a position of a cursor displayed on the user input area of the at least one input unit.
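  • As an illustrative sketch (the helper name and coordinates are invented), the cursor position on the touch device can be transformed into the coordinate frame of the mirrored region so that a cursor marker can be drawn at the corresponding point in the second image data:

```python
# Hypothetical mapping of a cursor position on the touch device into the coordinate
# frame of the mirrored region (the second image data) shown on the viewing device.

from typing import Optional, Tuple


def cursor_in_second_image(cursor_xy: Tuple[int, int],
                           crop_origin: Tuple[int, int],
                           crop_size: Tuple[int, int],
                           scale: float) -> Optional[Tuple[int, int]]:
    """Return where to draw the cursor marker inside the mirrored crop, scaled to the
    viewing device, or None if the cursor lies outside the mirrored region."""
    cx, cy = cursor_xy
    ox, oy = crop_origin
    w, h = crop_size
    local_x, local_y = cx - ox, cy - oy
    if not (0 <= local_x < w and 0 <= local_y < h):
        return None
    return round(local_x * scale), round(local_y * scale)


if __name__ == "__main__":
    # Cursor at (1100, 300) on the touch screen; the mirrored crop starts at (1000, 0),
    # is 280 x 800 pixels on the touch device and is enlarged 2x on the viewing device.
    print(cursor_in_second_image((1100, 300), (1000, 0), (280, 800), 2.0))
```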
  • display of the second image data on the at least one display unit is enabled or disabled as a function of input from the user.
  • the required functionality can be enabled/disabled as necessary, for example based on a user's hand approaching, touching or being removed from the input unit, such as a touch device or proximity device, or based on other input from the user.
  • the input from the user comprises a button or pedal being depressed.
  • the input from the user comprises a duration in time of input from the user. In other words, a threshold in time can apply and if the input from the user, such as the user's hand being in a particular position, exceeds a certain time then the display of second image data on the at least one display unit is enabled.
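  • The time threshold could be implemented as a small hover-debounce state machine; the threshold value and interface below are assumptions used only to illustrate the idea.

```python
# Hypothetical hover-debounce: enable the mirrored overlay only after the user's
# hand has been present for longer than a threshold; disable it when the hand leaves.

class OverlayGate:
    def __init__(self, threshold_s: float = 0.5):
        self.threshold_s = threshold_s
        self._hover_since = None  # time the hand first appeared, or None

    def update(self, hand_present: bool, now_s: float) -> bool:
        """Return True if the second image data should currently be displayed."""
        if not hand_present:
            self._hover_since = None
            return False
        if self._hover_since is None:
            self._hover_since = now_s
        return (now_s - self._hover_since) >= self.threshold_s


if __name__ == "__main__":
    gate = OverlayGate(threshold_s=0.5)
    for t, present in [(0.0, True), (0.3, True), (0.6, True), (0.9, False), (1.0, True)]:
        print(t, gate.update(present, t))
```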
  • FIG. 3 shows an input device in the form of a touch device, shown at the bottom and labelled as “Touch”, and two display devices in the form of a desktop and viewing device, shown at the middle and top and labelled as “Desktop” and “Viewing”.
  • the display devices are showing enlarged portions of what is being shown on the touch device.
  • This arrangement represents an approach in which one or more viewing devices are used for main visual feedback and a touch device is used for interaction purposes.
  • the touch device can have its own UI or GUI (e.g. a phone) or have no UI or GUI (e.g. a mouse pad or TV remote control). In this approach, while interacting on a touch device, the user gets no direct visual feedback on the viewing device as to where on the touch device they are interacting.
  • buttons are placed on the touch device at positions that are easily found by hand interaction, as represented on the screen of the touch device shown in FIG. 3 .
  • the user needs to look at the touch device to find the functionality of interest (e.g. a button) and, after pressing the button or positioning the cursor over the button and clicking, the user must then look back to the viewing device to judge the result.
  • the apparatus and method for displaying data of the present disclosure is applicable to a wide variety of device setups. For example:
  • Use case 1 is represented in FIG. 4 and FIG. 5 , which show an example of data being displayed by an example apparatus of the present disclosure for displaying data.
  • In FIG. 4 and FIG. 5 , an input unit in the form of a touch device having a UI is shown in the top image, and a display unit in the form of a viewing device having a UI is shown in the bottom image.
  • this part of the touch device UI, that is, the right panel or right zone, is blended into the viewing device UI.
  • the button over which the user's hand is hovering or the button being touched on the touch device UI is highlighted.
  • a rendering of the panel UI showing the buttons is displayed on the exam room monitor and optionally directly shows over which button the user is hovering (or touching).
  • when the user's hand is no longer interacting with the TSM, the panel UI rendering disappears from the exam room monitor, leaving a clean, non-distracting UI in the exam room.
  • Use case 2 is represented in FIG. 6 and FIG. 7 , which show an example of data being displayed by an example apparatus of the present disclosure for displaying data.
  • an input unit (not shown) in the form of the touch device having a UI is showing imagery as shown for the input unit of FIG. 4 and FIG. 5 .
  • a display unit in the form of a viewing device having a UI is shown.
  • the whole of the UI of the touch device is shown on top of the viewing device UI.
  • the subset of the touch device as discussed above with respect to FIG. 4 and FIG. 5 is shown on top of the viewing device UI, and again the specific button being interacted with by the user can be identified.
  • the UI of the TSM application renders the TSM UI fully on top of the X-ray viewer application on the exam room monitor.
  • the exam room monitor shows the X-ray viewer application.
  • the positioning of the UI of the touch device within the UI of the viewing device can be adjusted as follows.
  • the amount of blending can be dependent on the distance of the pointing device to the touch device.
  • the content of the UI of the touch device can be partially shown in the UI of the viewing device to reduce the amount of screen area needed for the feedback. This can depend on the position of the pointing device with respect to the touch device UI, e.g. show only buttons from the left panel when the pointing device is at the left, and from the right panel when the pointing device is at the right. See FIG. 4 and FIG. 5 .
  • the rendering of the touch device UI on the viewing device can be performed in any of the ways described above, for example rendered next to, blended with, or rendered inside the viewing device UI.
  • FIG. 8 shows an example of a technical realization of an example apparatus for displaying data with an example workflow. Referring to the physical units shown in FIG. 8 and with reference to the circled numbers 1 - 6 also as shown in FIG. 8 , the example workflow is now explained, where the numbering below relates to the circled numbers in FIG. 8 :
  • the UI fragment PC renders a UI fragment and designates certain logical areas in the UI fragment as either blend targets or blend sources.
  • a blend target is a (logical) part of the UI fragment in which another part of UI fragment can be blended.
  • a blend source is a (logical) part of the UI fragment that will be blended into a target.
  • Rendering can be performed using a rendering framework like OpenGL, DirectX, GDI, or developed in-house.
  • UI creation can be accomplished using a third-party UI framework, e.g. WPF, QT, or developed in-house.
  • the computers send the UI fragment via a stream (this can be a video stream like DVI or a network stream via UDP) to the Backend-for-Frontend (BFF) PC.
  • the UI compositor has information about blend target areas and blend source areas of the different UIs.
  • This interface can be based on several technologies: an RPC API, REST or HTTP, etc.
  • the UI compositor program running on BFF uses these interfaces to ‘link’ blend targets and sources.
  • the UI Compositor has information about the different screen devices.
  • a software program running on the BFF PC can specify how the different UI fragments should be composed and shown on screens.
  • the Interaction Orchestrator program running on the BFF ensures that all user interactions on the screen devices are captured.
  • the Interaction Orchestrator routes back the input events to the relevant UI fragment PC, which can then apply the required logic for interaction handling.
  • the information can be sent via a network stream or via simulated mouse/keyboard events.
  • there can be one UI fragment PC (or more than two), and instead of running on a separate BFF PC, the logic for UI composition and Interaction Orchestration could run on one of the UI fragment PCs (i.e. a single-PC solution can be provided).
  • a PC can be a computer unit or processing unit, and a UI compositor can be a UI compositor computer program for example.
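  • The roles described for FIG. 8 could be prototyped along the following lines. This is a speculative sketch, not the actual implementation: all class and method names are invented, and the DVI/UDP streaming is abstracted away. A compositor links blend sources of one UI fragment to blend targets of another, while an interaction orchestrator routes input events captured on a screen back to the PC that owns the touched fragment.

```python
# Speculative sketch of the FIG. 8 roles: UI fragment producers declare blend
# sources and targets, a compositor links them and composes screens, and an
# interaction orchestrator routes input events back to the owning fragment.

from dataclasses import dataclass, field


@dataclass
class UIFragment:
    owner_pc: str
    blend_sources: set[str] = field(default_factory=set)  # logical areas offered for blending
    blend_targets: set[str] = field(default_factory=set)  # logical areas that accept blends

    def handle_input(self, event: dict) -> None:
        # In the real system this would run the fragment's interaction-handling logic.
        print(f"{self.owner_pc} handles {event}")


class UICompositor:
    def __init__(self):
        self.links: list[tuple[UIFragment, str, UIFragment, str]] = []

    def link(self, src: UIFragment, src_area: str, dst: UIFragment, dst_area: str) -> None:
        """Link a blend source of one fragment into a blend target of another."""
        assert src_area in src.blend_sources and dst_area in dst.blend_targets
        self.links.append((src, src_area, dst, dst_area))


class InteractionOrchestrator:
    def __init__(self, screen_owner: dict[str, UIFragment]):
        self.screen_owner = screen_owner  # which fragment owns which screen region

    def route(self, screen_region: str, event: dict) -> None:
        """Send an input event captured on a screen back to the fragment that owns it."""
        self.screen_owner[screen_region].handle_input(event)


if __name__ == "__main__":
    tsm = UIFragment("TSM PC", blend_sources={"button_panel"})
    viewer = UIFragment("X-ray viewer PC", blend_targets={"overlay_area"})
    compositor = UICompositor()
    compositor.link(tsm, "button_panel", viewer, "overlay_area")
    orchestrator = InteractionOrchestrator({"exam_room_monitor.overlay_area": tsm})
    orchestrator.route("exam_room_monitor.overlay_area", {"type": "tap", "x": 40, "y": 12})
```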
  • a computer program or computer program element for controlling an appropriate system that is characterized by being configured to execute the method steps according to one of the preceding embodiments.
  • the computer program element might therefore be stored on a computer unit, which might also be part of an embodiment.
  • This computing unit may be configured to perform or induce performing of the steps of the method described above. Moreover, it may be configured to operate the components of the above described apparatus.
  • the computing unit can be configured to operate automatically and/or to execute the orders of a user.
  • a computer program may be loaded into a working memory of a data processor.
  • the data processor may thus be equipped to carry out the method according to one of the preceding embodiments.
  • This exemplary embodiment of the invention covers both a computer program that uses the invention right from the beginning and a computer program that, by means of an update, turns an existing program into a program that uses the invention.
  • the computer program element might be able to provide all necessary steps to fulfill the procedure of an exemplary embodiment of the method as described above.
  • a computer readable medium, such as a CD-ROM, is presented, wherein the computer readable medium has a computer program element stored on it, which computer program element is described in the preceding section.
  • a computer program may be stored and/or distributed on a suitable medium, such as an optical storage medium or a solid state medium supplied together with or as part of other hardware, but may also be distributed in other forms, such as via the internet or other wired or wireless telecommunication systems.
  • the computer program may also be presented over a network like the World Wide Web and can be downloaded into the working memory of a data processor from such a network.
  • a medium for making a computer program element available for downloading is provided, which computer program element is arranged to perform a method according to one of the previously described embodiments of the invention.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Health & Medical Sciences (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Medical Informatics (AREA)
  • Public Health (AREA)
  • Primary Health Care (AREA)
  • General Health & Medical Sciences (AREA)
  • Epidemiology (AREA)
  • Software Systems (AREA)
  • Biomedical Technology (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Radiology & Medical Imaging (AREA)
  • General Business, Economics & Management (AREA)
  • Business, Economics & Management (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • User Interface Of Digital Computer (AREA)
  • Medical Treatment And Welfare Office Work (AREA)

Abstract

An apparatus displays (20) first data on at least one display unit. A user interaction portion of a user input area of at least one input unit is determined (30); and on the at least one display unit a portion of the first data is displayed (40) simultaneously with second image data. The second image data is representative of the user interaction portion of the user input area of the at least one input unit.

Description

    CROSS-REFERENCE TO PRIOR APPLICATIONS
  • This application is a Continuation of U.S. application Ser. No. 15/764,356, filed Mar. 29, 2018, now U.S. Pat. No. 10,957,441, which is the U.S. National Phase Application under 35 U.S.C. § 371 of International Application No. PCT/EP2016/073378, filed Sep. 30, 2016, which claims the benefit of European Patent Application No. 15188101.8, filed Oct. 2, 2015. These applications are hereby incorporated by reference herein.
  • FIELD OF THE INVENTION
  • The present invention relates to an apparatus for displaying data, and to a method for displaying data, as well as to a computer program element and a computer readable medium.
  • BACKGROUND OF THE INVENTION
  • In certain medical applications, for example those involving a medical imaging system such as an interventional X-ray system, there is a need for a user to operate a graphical user interface (GUI) on an input device such as a touch device, while viewing a different screen of a display device. However, if the user needs to activate certain functionality through the GUI on the input device, they need to look away from the screen of the display device, activate a button corresponding to the required functionality on the GUI of the input device, and then look back at the display device to judge the result of the action. Similar issues apply when a user interacts with an input unit in non-medical applications, such as when using a TV remote control whilst watching the television.
  • US 2012/0182244 A1 relates to systems and methods for providing remote assistance with a medical procedure by a technician via a remote device such as a laptop or tablet. Video output generated by medical devices, and video captured by a camera, may be transmitted via a network and rendered on the remote device.
  • However, the user may still not be provided with the necessary feedback regarding how images or other data are being displayed.
  • SUMMARY OF THE INVENTION
  • It would be advantageous to have an improved technique for displaying data, in particular for a medical imaging system such as an interventional X-ray system.
  • The object of the present invention is solved with the subject matter of the independent claims, wherein further embodiments are incorporated in the dependent claims. It should be noted that the following described aspects of the invention apply also for the apparatus for displaying data, and to a method for displaying data, as well as to a computer program element and a computer readable medium.
  • According to a first aspect, there is provided an apparatus for displaying data, comprising:
    • at least one display unit;
    • at least one touch input unit comprising a display screen configured to display input unit image data representing a graphical user interface of a user input area, the graphical user interface comprising a plurality of zones; and
    • a processing unit;
      • wherein, the processing unit is configured to display first image data on the at least one display unit;
      • wherein, the processing unit is configured to determine, as a user interaction portion of the user interface, a zone of the plurality of zones with which the user is interacting, and
      • wherein, the processing unit is configured to display, on the at least one display unit, the first image data simultaneously with second image data being representative of the user interaction portion of the graphical user interface.
  • In other words, a user can interact with a touch input device and on a separate display device be provided with visual feedback about the area of the touch device user interface that the user is interacting with. To put it another way, a user gets direct feedback on a display unit (such as a viewing device) of the part of a graphical user interface on an input unit (such as a touch device) they are interacting with.
  • In this manner, a user does not need to look away from the display unit with respect to interacting with the input unit. In other words, in an example, blind touch screen control is provided, with the user receiving feedback about the area of the touch device user interface with which they are interacting. To put it another way, improved interaction feedback is provided when interacting on an input unit, such as a touch device, while looking at a separate display unit for interaction feedback.
  • Preferably, only a single zone or panel of the graphical user interface (GUI) on the input unit is represented on the at least one display unit. Thereby, the amount of screen area on the at least one display unit required to represent the touch device GUI is limited. Preferably, the second image data corresponds to the image data representing the graphical user interface for the active zone, that is the zone with which the user is interacting. Thus, the contents of the active zone of the GUI are mirrored on the main display.
  • Advantageously, only a limited amount of space is taken up on the screen of the at least one display unit, and the space that is taken up is directly related to what the user is doing on the touch device. In other words, space on the at least one display unit is efficiently and effectively being used. To put it another way, because second image data of a touch device that is representative of the user interaction portion of the user input area of the touch device is shown on a viewing device, the amount of screen area on the viewing device to be dedicated to mirroring, or showing a schematic representation of, the touch device user interface is limited.
  • In other words, if an input unit such as a touch screen is showing imagery that is already being presented on the at least one display unit, the imagery is not duplicated on the display device.
  • In an example, the first data comprises a further graphical user interface and the second image data comprises a portion of the touch screen graphical user interface, and the second image data is rendered next to a portion of the further graphical user interface.
  • Alternatively, the second image data is blended with a portion of the further graphical user interface. In a further alternative embodiment, the second image data is rendered inside a portion of the first graphical user interface.
  • In an example, the processing unit is configured to determine the user interaction portion when a user moves a hand over at least one zone of the plurality of zones of the graphical user interface.
  • Alternatively or in addition, the processing unit is configured to determine the user interaction portion when a user touches at least one zone of the plurality of zones of the graphical user interface.
  • Alternatively or in addition, the processing unit is configured to determine the user interaction portion when a pointer moves over at least one zone of the plurality of zones of the graphical user interface.
  • In this manner, the position of for example a cursor as shown on an image displayed on a touch device can now be presented on a separate viewing device which shows a portion of the image shown on the touch device along with the cursor position. In other words, the user can keep track of where a cursor is located on a touch screen whilst looking at a separate viewing screen. To put it another way, when an input unit (e.g. touch device) user interface or graphical user interface shows different content to that shown on at least one display unit (e.g. viewing device), it can be unclear where a pointing device is located on the touch screen user interface if a user had to look at the touch device and attempt to locate a cursor within a complex image background. However, by providing a localized user interaction portion, e.g. a cursor position, as part of the second image data presented on the viewing device, the user is better able to keep track of where the cursor is located on the touch device and can do so without having to look away from the viewing device.
  • In an example, display of the second image data on the at least one display unit is enabled or disabled as a function of input from the user.
  • According to a second aspect, there is provided a medical imaging system, such as an interventional X-ray system, comprising an apparatus for displaying data in accordance with the invention. The display unit is preferably provided as an overhead monitor, for example an exam room monitor, and the touch input device is preferably provided as a table side module (TSM), that is, a control device arranged adjacent to a patient table.
  • According to a third aspect, there is provided a method for displaying data, comprising:
    • a) displaying first image data on at least one display unit;
    • b) determining, as a user interaction portion of a graphical user interface, a zone of a plurality of zones of the graphical user interface with which the user is interacting; and
    • c) displaying, on the at least one display unit, a portion of the first image data simultaneously with second image data, the second image data being representative of the user interaction portion of the graphical user interface.
  • According to another aspect, there is provided a computer program element for controlling an apparatus as previously described which, when the computer program element is executed by a processing unit, is adapted to perform the method steps as previously described.
  • According to another aspect, there is provided a computer readable medium having stored the computer program element as previously described.
  • Advantageously, the benefits provided by any of the above aspects equally apply to all of the other aspects and vice versa.
  • The above aspects and examples will become apparent from and be elucidated with reference to the embodiments described hereinafter.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • Exemplary embodiments will be described in the following with reference to the following drawings:
  • FIG. 1 shows an example of a method for displaying data;
  • FIG. 2 shows a schematic set up of an example apparatus for displaying data;
  • FIG. 3 shows an example of an input unit, and two display units.
  • FIG. 4 shows an example of data being displayed by an example apparatus for displaying data;
  • FIG. 5 shows the same data as shown in FIG. 4, represented in schematic form;
  • FIG. 6 shows an example of data being displayed by an example apparatus for displaying data;
  • FIG. 7 shows the same data as shown in FIG. 6, represented in schematic form.
  • FIG. 8 shows an example of a technical realization of an example apparatus for displaying data with an example workflow.
  • DETAILED DESCRIPTION OF EMBODIMENTS
  • FIG. 1 shows a method 10 for displaying data in its basic steps, the method 10 comprising:
  • In a first displaying step 20, also referred to as step a), first data is displayed on at least one display unit.
  • In a determining step 30, also referred to as step b), a user interaction portion of a user input area of at least one input unit is determined.
  • In a second displaying step 40, also referred to as step c), on the at least one display unit a portion of the first data is displayed simultaneously with second image data, wherein the second image data is representative of the user interaction portion of the user input area of the at least one input unit.
  • In an example, the first data comprises image data. In an example, the first data comprises text data. In an example, the first data comprises signals. In an example, the first data comprises any combination of these data. For example, in an example the first data comprises image and text data.
  • FIG. 2 shows an apparatus 50 for displaying data. The apparatus 50 comprises at least one display unit 60, at least one input unit 70 comprising a user input area, and a processing unit 80. The processing unit 80 is configured to display first image data on the at least one display unit 60. The processing unit 80 is also configured to determine a user interaction portion of the user input area of the at least one input unit 70. The processing unit 80 is also configured to display on the at least one display unit 60 a portion of the first image data simultaneously with second image data, wherein the second image data is representative of the user interaction portion of the user input area of the at least one input unit 70.
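  • To make the interplay of the display unit 60, input unit 70 and processing unit 80 concrete, the following minimal sketch walks through steps a) to c); the interfaces are invented here for illustration and are not part of the disclosure.

```python
# Minimal sketch of method steps a) to c) as they might be orchestrated by the
# processing unit 80. The DisplayUnit and InputUnit interfaces are invented here.

class DisplayUnit:
    def show(self, first_image, second_image=None) -> None:
        label = f"{first_image}" + (f" + {second_image}" if second_image else "")
        print(f"display unit showing: {label}")


class InputUnit:
    def __init__(self, interaction_portion=None):
        self.interaction_portion = interaction_portion  # e.g. name of the active zone

    def user_interaction_portion(self):
        return self.interaction_portion


class ProcessingUnit:
    def __init__(self, display: DisplayUnit, input_unit: InputUnit):
        self.display, self.input_unit = display, input_unit

    def refresh(self, first_image) -> None:
        # step a): display first image data
        portion = self.input_unit.user_interaction_portion()    # step b): determine portion
        second = f"mirror of '{portion}'" if portion else None  # second image data, if any
        self.display.show(first_image, second)                  # step c): display both


if __name__ == "__main__":
    pu = ProcessingUnit(DisplayUnit(), InputUnit(interaction_portion="button panel"))
    pu.refresh("live X-ray image")
```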
  • In an example, the at least one display unit comprises a monitor configured to display medical image data, preferably an overhead monitor as may be arranged inside a hospital examination room. In an example, the first data comprises interventional X-ray data. In an example, the first data comprises CT X-ray image data, X-ray fluoroscopic image data, X-ray tomography image data, Magnetic Resonance (MR) image data, ultrasound image data or any combination of these image data.
  • In an example, the at least one touch input unit comprises a device with a screen that a user can touch, such as a smartphone or tablet PC, or in a preferred embodiment a table side module adjacent to a patient table in a hospital examination room. A user may be interacting with the touch input device while looking at an overhead display, preferably showing a graphical user interface of the touch device. For example, a tablet user interface or graphical user interface can be directly rendered on the overhead display.
  • In an example, the at least one input unit can be a touch device with its own graphical user interface, such as a touch device showing separate processing, acquisition, medical device configuration and measurement functionalities relating to medical imagery presented on a separate display unit. The processing unit can determine the area of the touch device that the user is interacting with, such as the area relating to measurement functionalities, and present those measurement functionalities, as shown on the touch device, as a separate or integrated region on the at least one display unit along with the medical image data. However, even though the input unit can have a display screen, this does not necessarily mean that the second image data representative of the user interaction portion of the user input area of the input unit mirrors image data presented on the input unit. Representative here means that, even if the input unit is showing imagery, the display unit could show schematic representations of that imagery. However, representative also means that the display unit could mirror at least a portion of what is being shown on a display screen of an input unit.
  • In an example, the at least one input unit comprises a device configured to detect the proximity of a user's hand or pointer or stylus without the user having to touch the input unit with their hand or pointer or stylus. For example, the at least one input unit can utilize microwave or laser sensing to determine the proximity to and location of the user's hand with respect to a user input area of the input unit in order to determine the user interaction portion. In an example, the at least one input unit detects the proximity and location of a user's hand with respect to the input unit using a camera viewing the user's hand. In an example, the at least one input unit detects the proximity and location of a user's hand with respect to the input unit through magnetic field disturbance sensing of the user's hand. In other words, the at least one input unit can be a touch device and/or a proximity device.
  • In other words, in an example there can be multiple display units and/or multiple input units. For example with respect to multiple display units, there can be a display unit in a Control Room and a display unit in an Exam Room. For example with respect to multiple input units, there can be an input unit (such as a Table Side Module) for a physician (close to a patient) and there can be an input unit for a nurse.
  • In an example, the second image data is a direct reproduction of the image data representing at least an active portion of a graphical user interface that the user is interacting with. For example, the input unit can be a touch device with its own screen on which there is a first zone or area showing medical imagery and a second zone or area showing buttons relating to control or processing of that medical imagery.
  • The at least one display unit can be a medical exam room monitor displaying an enlargement of the imagery as shown on the at least one input unit, and when the user interacts with the input unit at the location of the buttons, imagery of the buttons as presented on the input unit can be mirrored on the at least one display unit.
  • In an example, as a user's finger approaches the input unit (e.g. a touch device), the part of the image being presented on the touch device, centered around the location of the finger (which may not yet have touched the touch device), is presented on the at least one display unit (e.g. a viewing device). For example, the user may move their finger towards the center right of the screen of a touch device. On the touch device, an image may be presented at the center of the screen and shown in expanded form on the viewing device, while a number of functional buttons may be presented at the far right of the touch screen. In this case, as the user's finger approaches the touch screen, the viewing device presents an image portion from the touch device in which parts of the buttons are shown. As the finger moves to the right over the touch screen, the image on the viewing device shows less of the image as presented on the touch device and more of the buttons. Finally, the viewing device presents the relevant information from the touch device, such as the functional buttons, enabling the user to press the button on the touch device that they wish to press. In this manner, the user can interact with the imagery presented on the at least one display unit via a separate input unit, without having to move their line of sight away from the at least one display unit.
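  • Purely as an illustration of the behaviour described above, the following minimal Python sketch (with hypothetical function names and screen dimensions) selects the region of the touch-device UI to mirror by centering it on the detected finger position and clamping it to the screen bounds, so that a finger moving towards the right edge progressively reveals the right-hand buttons:

        # Illustrative sketch only; names and dimensions are assumptions.
        def region_around_finger(finger_x, finger_y, screen_w, screen_h,
                                 region_w=400, region_h=300):
            """Return (left, top, right, bottom) of the touch-UI region to mirror."""
            # Center the region on the finger, then clamp it to the screen edges so
            # that near the right-hand buttons the region shows more of the buttons.
            left = min(max(finger_x - region_w // 2, 0), screen_w - region_w)
            top = min(max(finger_y - region_h // 2, 0), screen_h - region_h)
            return left, top, left + region_w, top + region_h

        # Example: a finger approaching the far right of a 1024x768 touch screen.
        print(region_around_finger(1000, 400, 1024, 768))  # (624, 250, 1024, 550)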
  • According to an example, as shown in FIG. 2, the at least one input unit 70 comprises a display screen 90 configured to display input unit image data, wherein the user input area of the at least one input unit 70 comprises at least a portion of the input unit image data displayed on the display screen, and wherein the second image data comprises at least a portion of the input unit image data.
  • In an example, the user input area of the at least one input unit comprising at least a portion of the input unit image data displayed on the display screen means that the user input area of the at least one input unit displays at least a portion of the input unit image data.
  • In other words, in an example a part of a user interface or graphical user interface of an input unit (e.g. a touch device) is rendered on the at least one display unit (e.g. a viewing device). Preferably, the part is an active zone, that is, one of a plurality of zones of the graphical user interface with which a user is interacting.
  • In an example, the input unit image data comprises interventional X-ray data. In an example, the input unit image data comprises CT X-ray image data, X-ray fluoroscopic image data, X-ray tomography image data, Magnetic Resonance (MR) image data, ultrasound image data or any combination of these image data. In an example, the input unit image data comprises image data relating to button functionalities for the processing and/or acquisition of medical image data and/or control of equipment associated with the acquisition of image data, such as an X-ray unit or MRI unit. For example, the input unit image data can relate to buttons having functionalities including any combination of: acquisition; X-ray unit control and/or medical table control; X-ray beam collimation; processing; zoom & pan; contrast & brightness; subtraction; measurements; image overlays; reset; and the opening of files.
  • In an example, the at least one input unit comprises a touch screen and a portion of the graphical user interface or user interface that the user is touching or has touched is represented in schematic form on the at least one display unit. In an example, the at least one input unit comprises a touch screen and a portion of the graphical user interface or user interface that the user is touching or has touched is mirrored on the at least one display unit. In other words, a subset of the imagery being shown on the input device, such as a touch screen device, can be rendered on the display device, or a schematic representation of that imagery on the input device can be presented on the display device.
  • In an example, the at least one input unit comprises a proximity device having a screen and a portion of the graphical user interface or user interface that the user is in proximity with or has been in proximity with is represented in schematic form on the at least one display unit. In an example, the at least one input unit comprises a proximity device having a screen and a portion of the graphical user interface or user interface that the user is in proximity with or has been in proximity with is mirrored on the at least one display unit.
  • According to an example, if the input unit image data comprises at least a portion of the first data, the processing unit is configured to display the second image data without displaying any first data, as displayed on the at least one input unit, as part of the second image data.
  • In other words, if an input unit such as a touch screen is showing imagery that is already being presented on the at least one display unit, the imagery is not duplicated on the display unit. For example, image A can be presented on a display unit and image B can be shown on an input unit along with a number of functional buttons on the right hand side of the screen. If the user is interacting with the buttons at the top right of the screen, then these buttons, along with the portion of image B near or around those buttons, can be displayed on the display unit along with image A. However, if the input unit were showing image A rather than image B, then only the buttons at the top right hand side of the input unit that the user is interacting with are displayed on the display unit. In this manner, redundant data is not presented to the user. In other words, if an image of a vascular region of a patient is shown on the screen of an input unit (e.g. a touch device) and the same image at the same level of magnification is being shown on the display unit (e.g. a viewing device), then the vascular image data from the touch device is not reproduced on the viewing device; only the other image data that the user is interacting with, such as that relating to buttons with image manipulation functionalities, is reproduced. In an example, if the same image data is shown on the at least one input unit as on the display unit, but at a different level of magnification, then it may be reproduced on the display unit if the user is interacting with that region of the input unit.
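  • As a minimal sketch of this deduplication rule, assuming a hypothetical region-based data model (nothing in the sketch is mandated by the disclosure), the second image data can be composed by skipping any region of the input unit that shows the same image at the same magnification as the display unit:

        # Illustrative sketch only; the data model is an assumption.
        def compose_second_image_data(interaction_regions, display_content):
            """Keep only regions that add information over what the display already shows."""
            second_image_data = []
            for region in interaction_regions:
                same_image = region.get("image_id") == display_content.get("image_id")
                same_zoom = region.get("zoom") == display_content.get("zoom")
                if region["kind"] == "image" and same_image and same_zoom:
                    continue  # already visible on the display unit: do not mirror it
                second_image_data.append(region)
            return second_image_data

        display = {"image_id": "vascular_run_12", "zoom": 1.0}
        regions = [
            {"kind": "image", "image_id": "vascular_run_12", "zoom": 1.0},
            {"kind": "buttons", "image_id": None, "zoom": None},
        ]
        print(compose_second_image_data(regions, display))  # only the button region remains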
  • According to an example, the user input area comprises a plurality of zones, and wherein the user interaction portion of the user input area comprises at least one zone.
  • In an example, only a zone of the user interface or graphical user interface of the at least one input unit, such as a touch device or proximity device, that a user is interacting with, is rendered on the at least one display unit such as a viewing device. In an example, a zone is equivalent to a panel on the input device.
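  • A minimal sketch of such zone determination, assuming a hypothetical two-zone layout (an image panel and a right-hand button panel), is simple hit-testing of the interaction coordinates against the zone rectangles:

        # Illustrative sketch only; the zone layout is an assumption.
        ZONES = {
            "image_panel": (0, 0, 700, 768),      # (left, top, right, bottom)
            "right_panel": (700, 0, 1024, 768),   # buttons for processing/acquisition
        }

        def active_zones(x, y):
            """Return the names of the zones containing the interaction point (x, y)."""
            return [name for name, (l, t, r, b) in ZONES.items()
                    if l <= x < r and t <= y < b]

        print(active_zones(900, 200))  # ['right_panel']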
  • According to an example, the first data comprises a first graphical user interface and the second image data comprises a portion of a second graphical user interface, and wherein the portion of the second graphical user interface is rendered next to a portion of the first graphical user interface.
  • In other words, the portion of imagery on the input device that the user is interacting with can be shown on the display device next to either all of what was being shown on the display device or next to most of what was being shown. In other words, the original imagery can be resized in order to present the new imagery next to the old imagery, or the new imagery can be overlaid over one side of the old imagery.
  • According to an example, the first data comprises a first graphical user interface and the second image data comprises a portion of a second graphical user interface, and wherein the portion of the second graphical user interface is blended with a portion of the first graphical user interface.
  • In an example, the amount of blending is dependent upon the distance of a pointing device to the input device. For example, as a user's finger approaches the input unit (such as a touch device), the region of the touch device centered around the location of the finger (above the screen) is presented on the at least one display unit (e.g. viewing device). In an example, as the user's finger approaches the touch device screen the image from the touch device presented on the viewing device becomes brighter, and as the finger recedes the image fades accordingly.
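  • As a minimal sketch of distance-dependent blending (the 50 mm sensing range and the linear fall-off are assumed values, not taken from the disclosure), the opacity of the blended-in touch-device UI can decrease with the finger-to-screen distance:

        # Illustrative sketch only; range and linear fall-off are assumptions.
        def blend_alpha(distance_mm, max_distance_mm=50.0):
            """Return an opacity in [0, 1]: 1.0 when touching, 0.0 beyond sensing range."""
            if distance_mm <= 0:
                return 1.0
            if distance_mm >= max_distance_mm:
                return 0.0
            return 1.0 - distance_mm / max_distance_mm

        for d in (0, 10, 25, 50):
            print(d, round(blend_alpha(d), 2))  # 1.0, 0.8, 0.5, 0.0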
  • According to an example, the first data comprises a first graphical user interface and the second image data comprises a portion of a second graphical user interface, and wherein the portion of the second graphical user interface is rendered inside a portion of the first graphical user interface.
  • In other words, the positioning of the second image data, such as a portion of what is being presented on a touch screen and represents or mirrors the content of the touch screen the user is interacting with, can be controlled or adjusted as required.
  • According to an example, the processing unit is configured to determine the user interaction portion when a user moves a hand over at least one portion of the user input area of the at least one input unit.
  • For example, the processing unit determines that the user has moved their hand over the right hand portion of the touch screen of an input unit, and the buttons and/or image data shown at the right hand portion of the touch screen are displayed on the at least one display unit along with at least a portion of the image data that was already being displayed on the at least one display unit. In an example, a user can be simultaneously hovering their hand over multiple zones, e.g. the left hand and right hand areas of a touch/proximity device.
  • According to an example, the processing unit is configured to determine the user interaction portion when a user touches a portion of the user input area of the at least one input unit.
  • For example, the processing unit determines that the user has touched the right hand portion of the touch/proximity screen of an input unit, and the buttons and/or image data shown at the right hand portion of the touch screen is displayed on the at least one display unit along with at least a portion of the image data that was already being displayed on the at least one display unit.
  • According to an example, the processing unit is configured to determine the user interaction portion when a pointer moves over a portion of the user input area of the at least one input unit.
  • In an example, the pointer comprises a cursor displayed on the user input area of the at least one input unit. For example, the processing unit determines that the user has moved the cursor over the right hand portion of the screen of an input unit, and the buttons and/or image data shown at the right hand portion of the touch screen is displayed on the at least one display unit along with at least a portion of the image data that was already being displayed on the at least one display unit.
  • According to an example, the processing unit is configured to determine a localized user interaction position of the user interaction portion, and wherein the second image data comprises image data representative of the localized user interaction position.
  • In an example, a localized interaction position comprises a position of a cursor displayed on the user input area of the at least one input unit.
  • According to an example, display of the second image data on the at least one display unit is enabled or disabled as a function of input from the user.
  • In this manner, the required functionality can be enabled/disabled as necessary, for example based on a user's hand approaching, touching or being removed from the input unit, such as a touch device or proximity device, or based on other input from the user. In an example, the input from the user comprises a button or pedal being depressed. In an example, the input from the user comprises a duration in time of input from the user. In other words, a threshold in time can apply and if the input from the user, such as the user's hand being in a particular position, exceeds a certain time then the display of second image data on the at least one display unit is enabled.
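  • A minimal sketch of such duration-based enabling, assuming a hypothetical 0.5 second threshold, keeps track of when the user's hand first arrived over the input unit and only enables display of the second image data once the hand has stayed longer than the threshold:

        # Illustrative sketch only; the threshold value is an assumption.
        import time

        class SecondImageGate:
            def __init__(self, threshold_s=0.5):
                self.threshold_s = threshold_s
                self.hover_started = None

            def update(self, hand_detected, now=None):
                """Return True while the second image data should be displayed."""
                now = time.monotonic() if now is None else now
                if not hand_detected:
                    self.hover_started = None        # hand removed: disable again
                    return False
                if self.hover_started is None:
                    self.hover_started = now         # hand has just arrived
                return (now - self.hover_started) >= self.threshold_s

        gate = SecondImageGate()
        print(gate.update(True, now=0.0))    # False: hand just arrived
        print(gate.update(True, now=0.6))    # True: held longer than the threshold
        print(gate.update(False, now=0.7))   # False: hand removed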
  • In an example, it is possible to have more than one display unit. In an example, it is possible to have more than one input unit. In an example, it is possible to have more than one display unit and more than one input unit.
  • FIG. 3 shows an input device in the form of a touch device, shown at the bottom and labelled as “Touch”, and two display devices in the form of a desktop and viewing device, shown at the middle and top and labelled as “Desktop” and “Viewing”. The display devices are showing enlarged portions of what is being shown on the touch device. This arrangement represents an approach in which one or more viewing devices are used for main visual feedback and a touch device is used for interaction purposes. The touch device can have its own UI or GUI (e.g. a phone) or have no UI or GUI (e.g. a mouse pad or TV remote control). In this approach, while interacting on a touch device, the user gets no direct visual feedback on the viewing device as to where on the touch device they are interacting. For example, as shown in FIG. 3, when the touch device has a UI or GUI and that UI or GUI shows different content than the content being displayed on the UI or GUI of the viewing device, it may be unclear or difficult to determine where a pointing device, such as a cursor, is located on the UI or GUI of the touch device. In the current approach, large buttons are placed on the touch device at positions that are easily found by hand interaction, as represented on the screen of the touch device shown in FIG. 3. However, the user needs to look at the touch device to find the functionality of interest (e.g. a button) and, after pressing the button or positioning the cursor over the button and clicking, the user must then look back at the viewing device to judge the result.
  • The apparatus and method for displaying data of the present disclosure is applicable to a wide variety of device setups. For example:
    • Interacting on a mobile phone or tablet (touch device) while looking at a television (viewing device) that shows the UI of the touch device. For example a tablet UI that serves as a remote control UI can be directly rendered on the television.
    • Interacting on a Table Side Module (touch device) of an X-ray device while looking at the exam room screen (viewing device) that shows the X-ray images.
  • For example, the apparatus and method can be used in the following cases: Use case 1, which is represented in FIG. 4 and FIG. 5, which show an example of data being displayed by an example apparatus of the present disclosure for displaying data. In FIG. 4 and FIG. 5, an input unit in the form of a touch device having a UI is shown in the top image, and a display unit in the form of a viewing device having a UI is shown in the bottom image. In summary, when a user's hand comes over the right-hand side of the touch device UI, this part of the touch device UI, that is, the right panel or right zone, is blended into the viewing device UI. Furthermore, optionally, in the viewing device UI the button over which the user's hand is hovering, or the button being touched on the touch device UI, is highlighted. In more detail, for the situation where the input unit is a Table Side Module (TSM) and the display unit is an exam room monitor (a sketch of this interaction flow follows the steps below):
  • User moves his hand over the buttons in the right panel of the Table Side Module (TSM).
  • A rendering of the panel UI showing the buttons is displayed on the exam room monitor and optionally directly shows over which button the user is hovering (or touching).
  • User presses the button and can directly see the effect on the X-ray image on the exam room monitor.
  • User moves hand away from Table Side Module.
  • The panel UI rendering disappears from the exam room monitor, leaving a clean, non-distracting UI in the exam room.
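  • The use case above can be summarised, purely as an illustrative sketch with hypothetical event and zone names, as a small mapping from a TSM interaction event to a command for the exam room monitor:

        # Illustrative sketch only; event types, zone names and the command
        # dictionary format are assumptions.
        def monitor_command(event_type, zone, hovered_button=None):
            """Return what the exam room monitor should do for a TSM event."""
            if event_type == "hand_left":
                return {"action": "remove_overlay"}       # leave a clean, non-distracting UI
            if event_type in ("hover", "touch") and zone == "right_panel":
                return {"action": "show_overlay",
                        "source_zone": zone,              # render this panel's UI
                        "highlight": hovered_button}      # optionally highlight the button
            return {"action": "no_change"}

        print(monitor_command("hover", "right_panel", "Zoom"))
        print(monitor_command("hand_left", None))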
  • Use case 2, which is represented in FIG. 6 and FIG. 7, which show an example of data being displayed by an example apparatus of the present disclosure for displaying data. In FIG. 6 and FIG. 7, an input unit (not shown) in the form of the touch device having a UI is showing imagery as shown for the input unit of FIG. 4 and FIG. 5. In FIG. 6 and FIG. 7, a display unit in the form of a viewing device having a UI is shown. In summary, upon an appropriate indication from the user, the whole of the UI of the touch device is shown on top of the viewing device UI. In another arrangement, upon an indication from the user, the subset of the touch device UI as discussed above with respect to FIG. 4 and FIG. 5 is shown on top of the viewing device UI, and again the specific button being interacted with by the user can be identified. In more detail, for the situation where the input unit is a Table Side Module (TSM) and the display unit is an exam room monitor (a sketch of this flow follows the steps below):
  • User presses a pedal/button.
  • In the exam room, the TSM application renders the TSM UI fully on top of the X-ray viewer application on the exam room monitor.
  • The position of the finger is indicated with a clear pointer (see FIG. 6 and FIG. 7, where a circular shaded dot is presented over the Contrast/Brightness functional button).
  • User releases pedal.
  • The exam room monitor shows the X-ray viewer application.
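  • The pedal-driven behaviour of use case 2 can likewise be sketched, with hypothetical names and purely for illustration, as a small state holder that shows the full TSM UI on top of the X-ray viewer while the pedal is held and draws a pointer at the finger position:

        # Illustrative sketch only; names and the frame description are assumptions.
        class PedalOverlay:
            def __init__(self):
                self.pedal_down = False

            def on_pedal(self, pressed):
                self.pedal_down = pressed

            def frame(self, finger_pos=None):
                """Describe what the exam room monitor should show for this frame."""
                if not self.pedal_down:
                    return {"show": "xray_viewer"}
                return {"show": "xray_viewer",
                        "overlay": "full_tsm_ui",   # TSM UI rendered fully on top
                        "pointer": finger_pos}      # e.g. over the Contrast/Brightness button

        overlay = PedalOverlay()
        overlay.on_pedal(True)
        print(overlay.frame((850, 120)))   # overlay with pointer while the pedal is held
        overlay.on_pedal(False)
        print(overlay.frame())             # back to the plain X-ray viewer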
  • With respect to the apparatus and method for displaying data according to the present disclosure, multiple embodiments are possible:
  • The positioning of the UI of the touch device within the UI of the viewing device can be adjusted as follows. The UI of the touch device:
  • Can be rendered directly next to the UI of the viewing device.
  • Can be blended with the UI of the viewing device. The amount of blending can be dependent on the distance of the pointing device to the touch device.
  • Can be rendered inside the UI of the viewing device.
  • The content of the UI of the touch device can be partially rendered in the UI of the viewing device to reduce the amount of screen area needed for the feedback. This can depend on the position of the pointing device with respect to the touch device UI, e.g. show only buttons from the left panel when the pointing device is at the left, and from the right panel when the pointing device is at the right. See FIG. 4 and FIG. 5.
  • The rendering of the touch device UI on the viewing device can be:
  • Enabled/Disabled via a hardware or software switch that enables rendering of the touch device UI on the viewing device.
  • Enabled/Disabled by duration of proximity or contact of the pointing device with the touch device.
  • In general, the rendering of different UIs of a single application can be implemented by:
  • Running the logic and rendering of the UI in a computer program (process).
  • Streaming the rendered results to the different UI devices.
  • This allows for handling interaction in a central process and distributing the rendered results to different UI devices. In general, it is possible to have multiple display units (e.g. viewing devices) and multiple input units (e.g. touch devices).
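  • A minimal sketch of this central-process arrangement, using in-memory lists in place of real DVI/UDP streams and hypothetical device names, handles each interaction once and distributes the freshly rendered result to every registered display or input device:

        # Illustrative sketch only; real systems would stream video or network frames.
        class CentralUIProcess:
            def __init__(self):
                self.devices = {}            # device name -> frames "streamed" to it

            def register(self, name):
                self.devices[name] = []

            def handle_interaction(self, event):
                # Apply the interaction logic once, centrally, then re-render for all devices.
                frame = f"rendered UI after {event}"
                for frames in self.devices.values():
                    frames.append(frame)     # stand-in for a DVI/UDP or network stream

        central = CentralUIProcess()
        for device in ("exam_room_monitor", "control_room_monitor", "tsm_touch_device"):
            central.register(device)
        central.handle_interaction("button_press:Zoom")
        print(central.devices["exam_room_monitor"][-1])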
  • FIG. 8 shows an example of a technical realization of an example apparatus for displaying data with an example workflow. Referring to the physical units shown in FIG. 8 and with reference to the circled numbers 1-6 also as shown in FIG. 8, the example workflow is now explained, where the numbering below relates to the circled numbers in FIG. 8:
  • 1. The UI fragment PC renders a UI fragment and designates certain logical areas in the UI fragment as either blend targets or blend sources. A blend target is a (logical) part of the UI fragment in which another part of a UI fragment can be blended. A blend source is a (logical) part of the UI fragment that will be blended into a target. Rendering can be performed using a rendering framework like OpenGL, DirectX, GDI, or developed in-house. UI creation can be accomplished using a third-party UI framework, e.g. WPF, QT, or developed in-house.
  • 2. The computers send the UI fragment via a stream (this can be a video stream like DVI or a network stream via UDP) to the Backend-for-Frontend (BFF) PC. There the UI compositor composes the UI fragments.
  • 3. Via an interface, the UI compositor has information about blend target areas and blend source areas of the different UIs. This interface can be based on several technologies: an RPC API, REST or HTTP, etc. The UI compositor program running on the BFF PC uses these interfaces to ‘link’ blend targets and sources.
  • 4. The UI Compositor has information about the different screen devices. A software program running on the BFF PC can specify how the different UI fragments should be composed and shown on screens.
  • 5. The Interaction Orchestrator program running on the BFF ensures that all user interactions on the screen devices are captured.
  • 6. The Interaction Orchestrator routes the input events back to the relevant UI fragment PC, which can then apply the required logic for interaction handling. The information can be sent via a network stream or via simulated mouse/keyboard events.
  • In the above example, various arrangements are possible. For example, instead of 2 UI fragment PCs there can be 1 (or more than 2), and instead of running on a separate BFF PC, the logic for UI composition and Interaction Orchestration could run on one of the UI fragment PCs (i.e. a single PC solution can be provided). Additionally, in the above a PC can be a computer unit or processing unit, and a UI compositor can be a UI compositor computer program, for example.
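  • The blend-linking role of the UI compositor in the workflow above can be sketched, purely for illustration and with hypothetical fragment names and coordinates, as a set of declared blend sources and blend targets that are linked over the interface and composed per frame:

        # Illustrative sketch only; fragment names, rectangles and the link format are assumptions.
        fragments = {
            "xray_viewer": {"blend_targets": {"side_area": (1400, 100, 1900, 900)}},
            "tsm_ui":      {"blend_sources": {"right_panel": (700, 0, 1024, 768)}},
        }

        links = [  # established via the interface (e.g. an RPC or HTTP based API)
            {"source": ("tsm_ui", "right_panel"), "target": ("xray_viewer", "side_area")},
        ]

        def compose(active_sources):
            """Return the blend operations for the current frame."""
            ops = []
            for link in links:
                frag, src = link["source"]
                if (frag, src) in active_sources:        # only blend zones the user is active in
                    tgt_frag, tgt = link["target"]
                    ops.append({"copy_from": fragments[frag]["blend_sources"][src],
                                "into": fragments[tgt_frag]["blend_targets"][tgt]})
            return ops

        print(compose({("tsm_ui", "right_panel")}))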
  • In another exemplary embodiment, a computer program or computer program element is provided for controlling an appropriate system that is characterized by being configured to execute the method steps according to one of the preceding embodiments.
  • The computer program element might therefore be stored on a computer unit, which might also be part of an embodiment. This computing unit may be configured to perform or induce performing of the steps of the method described above. Moreover, it may be configured to operate the components of the above described apparatus. The computing unit can be configured to operate automatically and/or to execute the orders of a user. A computer program may be loaded into a working memory of a data processor. The data processor may thus be equipped to carry out the method according to one of the preceding embodiments.
  • This exemplary embodiment of the invention covers both a computer program that uses the invention right from the beginning and a computer program that, by means of an update, turns an existing program into a program that uses the invention.
  • Furthermore, the computer program element might be able to provide all necessary steps to fulfill the procedure of an exemplary embodiment of the method as described above.
  • According to a further exemplary embodiment of the present invention, a computer readable medium, such as a CD-ROM, is presented wherein the computer readable medium has a computer program element stored on it which computer program element is described by the preceding section.
  • A computer program may be stored and/or distributed on a suitable medium, such as an optical storage medium or a solid state medium supplied together with or as part of other hardware, but may also be distributed in other forms, such as via the internet or other wired or wireless telecommunication systems.
  • However, the computer program may also be presented over a network like the World Wide Web and can be downloaded into the working memory of a data processor from such a network. According to a further exemplary embodiment of the present invention, a medium for making a computer program element available for downloading is provided, which computer program element is arranged to perform a method according to one of the previously described embodiments of the invention.
  • It has to be noted that embodiments of the invention are described with reference to different subject matters. In particular, some embodiments are described with reference to method type claims whereas other embodiments are described with reference to the device type claims. However, a person skilled in the art will gather from the above and the following description that, unless otherwise notified, in addition to any combination of features belonging to one type of subject matter also any combination between features relating to different subject matters is considered to be disclosed with this application. However, all features can be combined providing synergetic effects that are more than the simple summation of the features.
  • While the invention has been illustrated and described in detail in the drawings and foregoing description, such illustration and description are to be considered illustrative or exemplary and not restrictive. The invention is not limited to the disclosed embodiments. Other variations to the disclosed embodiments can be understood and effected by those skilled in the art in practicing a claimed invention, from a study of the drawings, the disclosure, and the dependent claims.
  • In the claims, the word “comprising” does not exclude other elements or steps, and the indefinite article “a” or “an” does not exclude a plurality. A single processor or other unit may fulfill the functions of several items recited in the claims. The mere fact that certain measures are recited in mutually different dependent claims does not indicate that a combination of these measures cannot be used to advantage. Any reference signs in the claims should not be construed as limiting the scope.

Claims (20)

1. A medical imaging system for displaying data, the system comprising:
at least one display unit;
at least one touch input unit comprising a user input area having a display screen configured to display input unit image data representing a graphical user interface comprising a plurality of zones; and
a processor configured to:
determine a user interaction portion of the graphical user interface, the user interaction portion corresponding to an active zone of the plurality of zones; and
display, on the at least one display unit, first image data including medical image data simultaneously with second image data corresponding to the input unit image data for the active zone.
2. The system according to claim 1, wherein the first image data comprises a further graphical user interface, and wherein the second image data is rendered next to a portion of the further graphical user interface.
3. The system according to claim 1, wherein the first image data comprises a further graphical user interface, and wherein the second image data is blended with a portion of the further graphical user interface.
4. The system according to claim 1, wherein the first image data comprises a further graphical user interface, and wherein the second image data is rendered inside a portion of the further graphical user interface.
5. The system according to claim 1, wherein the processor is configured to determine the user interaction portion when a user's hand or a pointer moves over at least one zone of the plurality of zones of the graphical user interface.
6. The system according to claim 1, wherein the processor is configured to determine the user interaction portion when a user touches at least one zone of the plurality of zones of the graphical user interface.
7. The system according to claim 1, wherein display of the second image data on the at least one display unit is enabled or disabled as a function of input from the user.
8. The system according to claim 1, wherein the at least one display unit is configured as an overhead monitor and the at least one touch input unit is configured as a table side module.
9. The system according to claim 1, wherein the second image data comprises an indicator representing which user interface element of the at least one user interface element the user is interacting with.
10. A method for displaying data, the method comprising:
causing, at a display screen of a user input area of at least one touch input unit, display of input unit image data representing a graphical user interface comprising a plurality of zones;
determining a user interaction portion of the graphical user interface, the user interaction portion corresponding to an active zone of the plurality of zones; and
displaying, on at least one display unit, first image data including medical image data simultaneously with second image data corresponding to the input unit image data for the active zone.
11. The method according to claim 10, wherein the first image data comprises a further graphical user interface, and the method further comprising rendering the second image data next to a portion of the further graphical user interface.
12. The method according to claim 10, wherein the first image data comprises a further graphical user interface, and the method further comprising blending the second image data with a portion of the further graphical user interface.
13. The method according to claim 10, wherein the first image data comprises a further graphical user interface, and the method further comprising rendering the second image data inside a portion of the further graphical user interface.
14. The method according to claim 10, wherein the user interaction portion is determined when a user's hand or a pointer moves over at least one zone of the plurality of zones of the graphical user interface.
15. The method according to claim 10, wherein the user interaction portion is determined when a user touches at least one zone of the plurality of zones of the graphical user interface.
16. The method according to claim 10, wherein display of the second image data on the at least one display unit is enabled or disabled as a function of input from the user.
17. The method according to claim 10, wherein the at least one display unit is configured as an overhead monitor and the at least one touch input unit is configured as a table side module.
18. The method according to claim 10, wherein the second image data comprises an indicator representing which user interface element of the at least one user interface element the user is interacting with.
19. A non-transitory computer-readable storage medium having stored a computer program comprising instructions for displaying data, the instructions, when the computer program is executed by a computer, cause the computer to:
cause, at a display screen of a user input area of at least one touch input unit, display of input unit image data representing a graphical user interface comprising a plurality of zones;
determine a user interaction portion of the graphical user interface, the user interaction portion corresponding to an active zone of the plurality of zones; and
display, on at least one display unit, first image data including medical image data simultaneously with second image data corresponding to the input unit image data for the active zone.
20. The storage medium according to claim 19, wherein the at least one display unit is configured as an overhead monitor and the at least one touch input unit is configured as a table side module.
US17/206,431 2015-10-02 2021-03-19 Apparatus for displaying data Abandoned US20210210194A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US17/206,431 US20210210194A1 (en) 2015-10-02 2021-03-19 Apparatus for displaying data

Applications Claiming Priority (5)

Application Number Priority Date Filing Date Title
EP15188101 2015-10-02
EP15188101.8 2015-10-02
US15/764,356 US10957441B2 (en) 2015-10-02 2016-09-30 Apparatus for displaying image data on a display unit based on a touch input unit
PCT/EP2016/073378 WO2017055523A1 (en) 2015-10-02 2016-09-30 Apparatus for displaying data
US17/206,431 US20210210194A1 (en) 2015-10-02 2021-03-19 Apparatus for displaying data

Related Parent Applications (2)

Application Number Title Priority Date Filing Date
US15/764,356 Continuation US10957441B2 (en) 2015-10-02 2016-09-30 Apparatus for displaying image data on a display unit based on a touch input unit
PCT/EP2016/073378 Continuation WO2017055523A1 (en) 2015-10-02 2016-09-30 Apparatus for displaying data

Publications (1)

Publication Number Publication Date
US20210210194A1 true US20210210194A1 (en) 2021-07-08

Family

ID=54293065

Family Applications (2)

Application Number Title Priority Date Filing Date
US15/764,356 Active 2036-12-04 US10957441B2 (en) 2015-10-02 2016-09-30 Apparatus for displaying image data on a display unit based on a touch input unit
US17/206,431 Abandoned US20210210194A1 (en) 2015-10-02 2021-03-19 Apparatus for displaying data

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US15/764,356 Active 2036-12-04 US10957441B2 (en) 2015-10-02 2016-09-30 Apparatus for displaying image data on a display unit based on a touch input unit

Country Status (5)

Country Link
US (2) US10957441B2 (en)
EP (2) EP3936991A1 (en)
JP (2) JP7059178B6 (en)
CN (1) CN108292194A (en)
WO (1) WO2017055523A1 (en)

Families Citing this family (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10102664B1 (en) * 2014-12-03 2018-10-16 Charles Schwab & Co., Inc. System and method for causing graphical information to be rendered
CN117311543A (en) 2017-09-01 2023-12-29 平蛙实验室股份公司 Touch sensing device
US11040214B2 (en) 2018-03-01 2021-06-22 West Affum Holdings Corp. Wearable cardioverter defibrillator (WCD) system having main UI that conveys message and peripheral device that amplifies the message
CN112889016A (en) 2018-10-20 2021-06-01 平蛙实验室股份公司 Frame for touch sensitive device and tool therefor
WO2020153890A1 (en) 2019-01-25 2020-07-30 Flatfrog Laboratories Ab A videoconferencing terminal and method of operating the same
US11759110B2 (en) * 2019-11-18 2023-09-19 Koninklijke Philips N.V. Camera view and screen scraping for information extraction from imaging scanner consoles
CN114730228A (en) 2019-11-25 2022-07-08 平蛙实验室股份公司 Touch sensing equipment
US20230009306A1 (en) * 2019-12-06 2023-01-12 Flatfrog Laboratories Ab An interaction interface device, system and method for the same
JP2023512682A (en) 2020-02-10 2023-03-28 フラットフロッグ ラボラトリーズ アーベー Improved touch detector
USD986262S1 (en) * 2021-07-01 2023-05-16 Olympus Medical Systems Corporation Display screen with graphical user interface

Family Cites Families (28)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0541507U (en) * 1991-11-07 1993-06-08 横河メデイカルシステム株式会社 Image diagnostic equipment
JP2000242383A (en) 1999-02-19 2000-09-08 Nec Corp Screen display enlargement control unit
JP4429635B2 (en) * 2003-06-04 2010-03-10 株式会社日立メディコ Medical image diagnostic apparatus and operation information display method thereof
JP2006209381A (en) 2005-01-27 2006-08-10 Digital Electronics Corp Control display device, its program, and recording medium
JP2006285598A (en) 2005-03-31 2006-10-19 Fujitsu Ten Ltd Touch panel device, operation support method for touch panel device, and operation support program for touch panel device
US20070198141A1 (en) 2006-02-21 2007-08-23 Cmc Electronics Inc. Cockpit display system
JP2007233459A (en) 2006-02-27 2007-09-13 Mitsubishi Electric Corp Programmable display unit
DE102006053261B4 (en) * 2006-11-11 2015-04-16 Visus Technology Transfer Gmbh System for the reproduction of medical images
WO2010083820A1 (en) * 2009-01-26 2010-07-29 Alexander Gruber Method for executing an input using a virtual keyboard displayed on a screen
US8416193B2 (en) 2009-05-21 2013-04-09 Microsoft Corporation Method of visualizing an input location
CN202142005U (en) * 2009-07-22 2012-02-08 罗技欧洲公司 System for long-distance virtual screen input
JP2011072532A (en) * 2009-09-30 2011-04-14 Konica Minolta Medical & Graphic Inc Medical diagnostic imaging apparatus and ultrasonograph
JP2011120785A (en) * 2009-12-11 2011-06-23 Toshiba Corp Medical image diagnostic apparatus
US9665278B2 (en) * 2010-02-26 2017-05-30 Microsoft Technology Licensing, Llc Assisting input from a keyboard
US20120182244A1 (en) 2010-06-11 2012-07-19 Systemsone, Llc Integrated Multi-Display with Remote Programming and Viewing Capability
KR101166895B1 (en) 2011-01-10 2012-07-18 터치디스플레이 주식회사 Integrated control device for vehicle
JP5741821B2 (en) * 2011-03-24 2015-07-01 コニカミノルタ株式会社 Data processing transmission apparatus, data processing transmission program, and method
CN103957800B (en) * 2011-11-30 2016-04-06 富士胶片株式会社 Medical system
US20130194188A1 (en) * 2012-01-31 2013-08-01 Research In Motion Limited Apparatus and method of facilitating input at a second electronic device
EP2624653A1 (en) * 2012-01-31 2013-08-07 Research In Motion Limited Mobile wireless communications device with wireless local area network and cellular scheduling and related methods
EP3355561B1 (en) * 2012-09-10 2021-04-28 Samsung Electronics Co., Ltd. Method for connecting mobile terminal and external display and apparatus implementing the same
JPWO2014112095A1 (en) 2013-01-18 2017-01-19 三菱電機株式会社 Information display control device
CN105120765A (en) 2013-03-15 2015-12-02 火山公司 Universal patient interface module and associated devices, systems, and methods
JP2015014998A (en) 2013-07-08 2015-01-22 船井電機株式会社 Operation system
WO2015027108A2 (en) * 2013-08-22 2015-02-26 McCarthy Music Corp. Interactive piano training system
KR20150066132A (en) 2013-12-06 2015-06-16 삼성전자주식회사 Display apparatus, remote controller, display system, and display method
WO2016027959A1 (en) * 2014-08-22 2016-02-25 Samsung Medison Co., Ltd. Method, apparatus, and system for outputting medical image representing object and keyboard image
JP6791617B2 (en) * 2015-06-26 2020-11-25 ジーイー・メディカル・システムズ・グローバル・テクノロジー・カンパニー・エルエルシー Ultrasound image display device and program

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080267499A1 (en) * 2007-04-30 2008-10-30 General Electric Company Method and system for automatic detection of objects in an image
US20110302414A1 (en) * 2010-06-08 2011-12-08 Mark Logan Remote control of medical devices using instant messaging infrastructure
US20140111456A1 (en) * 2011-05-27 2014-04-24 Kyocera Corporation Electronic device

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11587667B2 (en) * 2018-08-31 2023-02-21 Ventana Medical Systems, Inc. Contextually adaptive digital pathology interface
US11804294B2 (en) 2018-08-31 2023-10-31 Ventana Medical Systems, Inc. Contextually adaptive digital pathology interface
US12057217B2 (en) 2018-08-31 2024-08-06 Ventana Medical Systems, Inc. Contextually adaptive digital pathology interface

Also Published As

Publication number Publication date
CN108292194A (en) 2018-07-17
EP3936991A1 (en) 2022-01-12
US20180275836A1 (en) 2018-09-27
JP2018534667A (en) 2018-11-22
JP2022078047A (en) 2022-05-24
WO2017055523A1 (en) 2017-04-06
US10957441B2 (en) 2021-03-23
JP7059178B6 (en) 2022-06-02
EP3356923A1 (en) 2018-08-08
JP7059178B2 (en) 2022-04-25

Legal Events

Date Code Title Description
STPP Information on status: patent application and granting procedure in general; Free format text: APPLICATION DISPATCHED FROM PREEXAM, NOT YET DOCKETED
STPP Information on status: patent application and granting procedure in general; Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION
STPP Information on status: patent application and granting procedure in general; Free format text: NON FINAL ACTION MAILED
STPP Information on status: patent application and granting procedure in general; Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER
STPP Information on status: patent application and granting procedure in general; Free format text: FINAL REJECTION MAILED
STPP Information on status: patent application and granting procedure in general; Free format text: ADVISORY ACTION MAILED
STPP Information on status: patent application and granting procedure in general; Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION
STPP Information on status: patent application and granting procedure in general; Free format text: NON FINAL ACTION MAILED
STPP Information on status: patent application and granting procedure in general; Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER
STPP Information on status: patent application and granting procedure in general; Free format text: FINAL REJECTION MAILED
STCB Information on status: application discontinuation; Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION