CN104007999B - Method for controlling an application and related system - Google Patents


Info

Publication number
CN104007999B
Authority
CN
China
Prior art keywords
virtual
input
mouse
key
case
Prior art date
Legal status
Active
Application number
CN201410050907.9A
Other languages
Chinese (zh)
Other versions
CN104007999A
Inventor
K. Dorn
Current Assignee
Siemens Healthineers AG
Original Assignee
Siemens AG
Priority date
Filing date
Publication date
Application filed by Siemens AG
Publication of CN104007999A
Application granted
Publication of CN104007999B

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B6/00 Apparatus or devices for radiation diagnosis; apparatus or devices for radiation diagnosis combined with radiation therapy equipment
    • A61B6/46 Arrangements for interfacing with the operator or the patient
    • A61B6/461 Displaying means of special interest
    • A61B6/466 Displaying means of special interest adapted to display 3D data
    • A61B6/467 Arrangements for interfacing with the operator or the patient characterised by special input means
    • A61B6/469 Arrangements for interfacing with the operator or the patient characterised by special input means for selecting a region of interest [ROI]
    • A61B8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B8/46 Ultrasonic, sonic or infrasonic diagnostic devices with special arrangements for interfacing with the operator or the patient
    • A61B8/461 Displaying means of special interest
    • A61B8/466 Displaying means of special interest adapted to display 3D data
    • A61B8/467 Ultrasonic, sonic or infrasonic diagnostic devices with special arrangements for interfacing with the operator or the patient characterised by special input means
    • A61B8/469 Ultrasonic, sonic or infrasonic diagnostic devices with special arrangements for interfacing with the operator or the patient characterised by special input means for selection of a region of interest
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; accessories therefor
    • G06F3/038 Control and interface arrangements therefor, e.g. drivers or device-embedded control circuitry
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04845 Interaction techniques based on graphical user interfaces [GUI] for image manipulation, e.g. dragging, rotation, expansion or change of colour
    • G06F3/0485 Scrolling or panning
    • G06F2203/0381 Multimodal input, i.e. interface arrangements enabling the user to issue commands by simultaneous use of input devices of different nature, e.g. voice plus gesture on digitizer
    • G06F2203/0382 Plural input, i.e. interface arrangements in which a plurality of input devices of the same type are in communication with a PC
    • G06F2203/04806 Zoom, i.e. interaction techniques or interactors for controlling the zooming operation
    • G06F2203/04808 Several contacts: gestures triggering a specific function, e.g. scrolling, zooming, right-click, when the user establishes several contacts with the surface simultaneously, e.g. using several fingers or a combination of fingers and pen

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Medical Informatics (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Biomedical Technology (AREA)
  • Molecular Biology (AREA)
  • Biophysics (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Pathology (AREA)
  • Radiology & Medical Imaging (AREA)
  • Veterinary Medicine (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Human Computer Interaction (AREA)
  • Theoretical Computer Science (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • General Physics & Mathematics (AREA)
  • High Energy & Nuclear Physics (AREA)
  • Optics & Photonics (AREA)
  • Computer Graphics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Position Input By Displaying (AREA)

Abstract

The invention relates to a method and a system for controlling an application (9), in particular for interactive image manipulation of a medical image data set (B), by means of a plurality of input devices (6, 14-17). A plurality of virtual switching elements (18) are defined and associated with the application (9) in such a way that activating each switching element (18) triggers a specific corresponding action (R, Z, P) of the application (9). The input signals (E1-E8) of the input devices (6, 14-17) are each associated with one of the virtual switching elements (18) for its activation according to a predetermined rule.

Description

Method for controlling an application and related system
Technical Field
The invention relates to a method for controlling a (software) application, i.e. an application program, by means of a plurality of input devices, in particular an application for the interactive image manipulation of medical image data. The invention also relates to an associated system. "Controlling" an application is understood here in particular as the transmission of inputs (commands) to the application, carried out by the user by means of the input devices.
Background
The term "interactive image manipulation" refers generally to operations by which a user interactively changes the image display. Examples are two- or three-dimensional rotation of the image information, a zoom operation (i.e. enlarging or reducing the displayed image section), or a pan operation (i.e. moving the displayed image section). Such operations are particularly time-critical, since a noticeably delayed response of the system to user interaction severely limits the comfort of use.
Software applications, in particular in the field of medical image processing, are generally programmed with a view to a specific input device by which the application is to be operated. The implementation of the application therefore depends strongly on the input devices chosen. To ensure the usability of applications in different hardware environments, applications must typically be designed for a large number of different input devices, such as a mouse, a (single- or multi-point) touchpad, a (single- or multi-point) touchscreen, a tablet, a keyboard, a joystick, speech- or gesture-recognition input devices, etc.
This results in considerable effort during application development, since the processing of the respective input signals must be programmed separately for each input device. Furthermore, existing applications can subsequently be adapted only with great effort to support newly developed types of input device (for example recent multi-touch screens, or speech and gesture control). Conventional applications are therefore almost inflexibly tied to a specific input device or a specific selection of input devices, and thus to a specific hardware environment.
A further problem with conventional application control, in particular in the interactive image manipulation of medical image data, is that different types of input device require the user to carry out different operating procedures to generate the same command (and thus to cause the same processing result in the application). For example, to zoom the image (i.e. to enlarge a displayed image section), the user usually has to operate the middle mouse button when controlling the application with a three-button mouse, but has to move thumb and index finger simultaneously, and moreover in a predetermined specific direction, on the touch-sensitive surface when controlling the application with a multi-point touchscreen. The user therefore experiences different operating characteristics (user experience) when operating the same application with different input devices, which in turn hampers an efficient workflow across different hardware environments.
Disclosure of Invention
The object of the present invention is to provide a method and a system for controlling applications by means of a plurality of input devices that ensure a simple implementation while allowing high flexibility with regard to the input devices used.
With regard to the method, this object is achieved by a solution having the technical features of the invention; with regard to the system, it is likewise achieved by a solution having the corresponding technical features. Preferred embodiments and developments of the invention are given in the dependent claims and the following description.
In the method according to the invention, a specific number of virtual switching elements is defined and associated with the application to be controlled in such a way that activating each virtual switching element triggers a specific action of the application. The virtual switching elements are, on the other hand, logically connected to at least one input device (but preferably to a plurality of input devices), so that the input signals of the input devices are each associated with a virtual switching element for its activation. The association of the input signals with the individual virtual switching elements is carried out according to a predetermined rule.
A virtual switching element here refers to a software component that emulates the properties of a real (physical) switching element, such as a key switch, in that it can be reversibly switched between an activated state and a deactivated state and provides an output signal reflecting the respective state. Each switching element is implemented in software, for example by means of a variable, in particular a so-called flag. Each virtual switching element can be reversibly activated and deactivated by one or more input signals.
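Such a flag-based switching element could be sketched as follows (an illustrative sketch, not the patent's implementation; the class and callback names are chosen for this example):

```python
# Illustrative sketch: a virtual switching element emulating a physical key
# switch via a boolean flag, with an output signal (here, a callback)
# reflecting every state change.
from typing import Callable, Optional


class VirtualSwitch:
    def __init__(self, name: str,
                 on_change: Optional[Callable[[str, bool], None]] = None):
        self.name = name
        self.active = False          # the "flag" holding the switch state
        self.on_change = on_change   # output signal: notified on state changes

    def activate(self) -> None:
        if not self.active:          # reversible: only transitions are reported
            self.active = True
            if self.on_change:
                self.on_change(self.name, True)

    def deactivate(self) -> None:
        if self.active:
            self.active = False
            if self.on_change:
                self.on_change(self.name, False)
```

Any number of input signals may call `activate()`/`deactivate()`; the application only observes the resulting state transitions.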
In principle, the virtual switching elements can be implemented as an integral part of the application within the scope of the invention. Preferably, however, they are implemented outside the application, for example as part of middleware (in particular a framework) provided between the application and the operating system. Optionally, the virtual switching elements can be displayed on the display screen of the respective computer system as part of the user interface of the application, for example in the manner of buttons; within the scope of the invention, however, this is not strictly necessary.
An "input signal" of an input device is generally understood to be a signal that the respective input device outputs when it is operated by a user. Input devices that can be operated in different ways output a specific input signal for each mode of operation, so that the respective input signal reflects the type of operation. For example, when a user presses a button of a three-button mouse, one of three different input signals is output, from which it can be determined whether the left, middle, or right mouse button was pressed. In this case, different input signals of the same input device, or different combinations of these input signals, are usually associated with different virtual switching elements.
Within the scope of the invention, different input devices can be used, in particular a single- or multi-button mouse, a trackball, a joystick, a tablet with an associated stylus (pen), a single- or multi-point touchpad, a single- or multi-point touchscreen, a speech-controlled input device, and/or a gesture-controlled input device. Here, a touch-sensitive switching surface on which a switching point can be set by touching it with a finger is referred to as a touchpad. A touchpad combined with a display screen is referred to below as a touchscreen. A touchpad or touchscreen that can capture only a single switching point at any point in time is called a single-point touchpad or single-point touchscreen. Correspondingly, a touchpad or touchscreen that can capture a plurality of simultaneously and spatially separately generated switching points is referred to as a multi-point touchpad or multi-point touchscreen.
When the method is carried out, at least one of the plurality of input devices is always connected to the virtual switching elements for data transmission. The or each further input device can be connected to the virtual switching elements alternatively to the first-mentioned input device. An instance of the application to be controlled running on a desktop PC can, for example, be operated with a conventional three-button mouse as input device, while another instance of the same application running on a tablet computer uses the tablet's integrated multi-point touchscreen as input device. Preferably, however, within the scope of the invention, a plurality of input devices are also associated with the virtual switching elements simultaneously and in parallel with one another, in which case the plurality of input devices can in particular also be used at the same time. Thus, in the course of the method according to the invention, an application running on a laptop can be controlled simultaneously by means of the laptop's integrated touchpad, the laptop keyboard, and a connected three-button mouse, with all three input devices communicating with the application exclusively via the virtual switching elements. The user can, for example, issue zoom commands with the three-button mouse and perform image rotations via the touchpad simultaneously, or directly before or after.
The virtual switching elements thus serve as the common (and preferably sole) interface between the application on one side and the alternatively or simultaneously used input devices on the other. This approach leads to a major simplification of application development, since applications can be programmed completely independently of the input devices to be used. In particular, adapting an application to support a new type of input device not originally provided for is substantially simplified, since only the association of the input signals of the new input device with the virtual switching elements has to be defined.
In a preferred embodiment of the invention, the rule on which the association of the input signals of the input devices with the virtual switching elements is based is predefined uniformly for all input devices. "Uniformly" means that the association rule is not specified separately and arbitrarily for each input device or each type of input device, but is determined by superordinate criteria. Such criteria are, in particular, the number of input signals generated by the respective input device and/or the (qualitative) information content of the input signals of the respective input device. With regard to the information content of the input signals, a distinction is made in particular between "dedicated input signals" and "non-dedicated input signals".
In the following, a "dedicated input signal" is understood to be an input signal, among the plurality of input signals of an input device, that can be unambiguously distinguished from every other input signal of the same input device and thus itself corresponds to a particular command (of a plurality of available commands). A dedicated input signal in this sense can be generated, for example, by operating a specific mouse button of a multi-button mouse, but also by a specific gesture or a spoken command.
A "non-dedicated input signal", by contrast, is an input signal that does not in itself correspond to a specific command. Such input signals are generated, in particular, by switching points (touch points) produced on a multi-point touchpad or multi-point touchscreen (outside of dedicated keys).
The uniform rule, on the one hand, further simplifies adapting the application to support new types of input devices. On the other hand, uniformly associating the input signals of the different input devices unifies the "user experience", i.e. the operating characteristics produced by the individual input devices, so that a particularly intuitive and effective operation of the application is achieved for the user.
In a preferred configuration variant of the method, which matches typical user habits particularly well, three virtual switching elements are defined that emulate the input characteristics of a conventional three-button mouse. The first virtual switching element corresponds to a virtual left mouse button, the second to a virtual middle mouse button, and the third to a virtual right mouse button. In this embodiment of the invention, all input devices are thus semantically standardized to a virtual three-button mouse, so that the user can fall back on familiar semantics when operating other types of input devices (for example a touchpad or speech control). From the user's point of view, this substantially simplifies operating the application by means of the different input devices.
In a preferred embodiment of the rule, it is provided that for at least one (preferably each) input device that only permits the generation of a single input signal, this input signal is always associated with the virtual left mouse button. Operating such an input device, for example a single-button mouse, a single-point touchpad, or a tablet, thus always activates the virtual left mouse button.
In addition or as an alternative, the rule preferably provides that, for at least one (preferably each) input device, in particular a multi-point touchpad or multi-point touchscreen,
three simultaneously generated input signals are always associated with the virtual right mouse button,
exactly two simultaneously generated input signals (in the absence of any other input signal) are always associated with the virtual middle mouse button, and
a single generated input signal (in the absence of any other input signal) is always associated with the virtual left mouse button.
That is, exactly one generated switching point (single-finger interaction) activates the virtual left mouse button, exactly two simultaneously generated switching points (two-finger interaction) activate the virtual middle mouse button, and three simultaneously generated switching points (three-finger interaction) activate the virtual right mouse button.
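The touch-count rule for non-dedicated input signals could be sketched as follows (an illustrative sketch, not the patent's implementation; the function name and button labels are chosen for this example):

```python
# Illustrative sketch of the rule for non-dedicated input signals on a
# multi-point touchpad or touchscreen: the number of simultaneously generated
# switching points selects the virtual mouse button to activate.
from typing import Optional


def button_for_touch_count(touch_count: int) -> Optional[str]:
    """Map simultaneously generated switching points to a virtual mouse button."""
    mapping = {
        1: "left",    # single-finger interaction
        2: "middle",  # two-finger interaction
        3: "right",   # three-finger interaction
    }
    return mapping.get(touch_count)  # None: no virtual button is activated
```

Any other touch count (zero, or more than three fingers) simply activates no virtual button under this rule.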
Additionally or alternatively, within the scope of the rule, for at least one (preferably each) input device that allows at least two different dedicated input signals to be generated, each input signal is preferably associated with a corresponding virtual mouse button, at least as long as no further input signal is generated at the same time. Examples of such input devices are, in particular, a two-button mouse or a three-button mouse, but also gesture-controlled or speech-controlled input devices. In the case of a speech-controlled input device, the spoken commands to be recognized are preferably adapted to the semantics of a virtual three-button mouse. In this sense, the input device generates, for example, an input signal associated with the virtual left mouse button when the spoken command "press left mouse button" is recognized, and so on. Likewise, in the case of a gesture-controlled input device, the gestures by which the input device is controlled are preferably adapted to the semantics of a virtual three-button mouse. For example, input signals associated with the virtual left, middle, or right mouse button can be generated by moving a finger to the left, downward, or to the right, respectively.
If the input device (as in the case of a two-button mouse) can generate only two different input signals, these two input signals are preferably associated with the virtual left and right mouse buttons, respectively.
If the input device (as in the case of a three-button mouse) can generate three different input signals, each of these three input signals is preferably always associated with one of the three virtual mouse buttons. The same applies to spoken commands or gestures recognized by speech-controlled or gesture-controlled input devices; in the latter case, a corresponding command or gesture is preferably specified for each virtual mouse button.
If an input device (for example a two-button mouse, a three-button mouse, or possibly a gesture-controlled input device) can generate a plurality of dedicated input signals simultaneously, each simultaneously generated input signal is preferably always associated with its corresponding virtual mouse button. In this method variant, simultaneously operating the right and left buttons of a three-button mouse thus activates the virtual right and virtual left mouse buttons at the same time.
As an alternative, however, the rule can also provide that, when exactly two or three dedicated input signals are generated simultaneously, the virtual mouse button to be activated is selected according to the number of simultaneously generated input signals: with exactly two simultaneously generated input signals, the virtual middle mouse button is activated; with three simultaneously generated input signals, the virtual right mouse button is activated. In this method variant, simultaneously operating the right and left buttons of a three-button mouse thus activates the virtual middle mouse button.
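The two variants for simultaneously generated dedicated input signals could be sketched as follows (an illustrative sketch under the assumptions above; function and parameter names are hypothetical):

```python
# Illustrative sketch of the two rule variants for dedicated input signals
# generated simultaneously (e.g. several buttons of a multi-button mouse):
#   count_based=False: each dedicated signal activates its own virtual button.
#   count_based=True:  two simultaneous signals activate the virtual middle
#                      button, three activate the virtual right button.
from typing import Set


def active_buttons(pressed: Set[str], count_based: bool = False) -> Set[str]:
    """Return the virtual mouse buttons to activate for a set of
    simultaneously generated dedicated input signals ('left', 'middle',
    'right')."""
    if not count_based or len(pressed) <= 1:
        return set(pressed)          # one-to-one mapping
    return {"middle"} if len(pressed) == 2 else {"right"}
```

Pressing the left and right buttons of a three-button mouse together thus yields both virtual buttons in the first variant, but only the virtual middle button in the count-based variant.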
The system according to the invention comprises an input relay module which interconnects, in terms of signaling, the application to be controlled and the at least one input device, and which thus transmits the input signals output by the or each input device to the application. The input relay module is configured, in terms of circuitry and/or software, to automatically carry out the method according to the invention, in particular in the implementation variants described above.
In a narrower sense, the system according to the invention is software, in particular a framework on which applications are built. In a broader sense, the system also comprises the hardware required for operating its software components (in particular the input relay module), namely a computer on which the application can be executed. The virtual switching elements described above, in particular the virtual mouse buttons, are implemented in particular within the input relay module.
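An input relay module of this kind, translating device-specific input signals into activations of the three virtual mouse buttons, could be sketched as follows (an illustrative sketch, not the patent's implementation; class, method names, and the button-to-action bindings are chosen for this example):

```python
# Illustrative sketch of an input relay module between input devices and an
# application: device-specific input signals are mapped onto the three
# virtual mouse buttons, whose activation triggers application actions
# (e.g. rotate R, zoom Z, pan P, as in the example of Fig. 1).
from typing import Callable, Dict


class InputRelayModule:
    def __init__(self) -> None:
        # one application action per virtual switching element
        self.actions: Dict[str, Callable[[], None]] = {}

    def bind(self, button: str, action: Callable[[], None]) -> None:
        self.actions[button] = action

    def on_touch(self, touch_count: int) -> None:
        # non-dedicated signals: 1/2/3 fingers -> left/middle/right
        button = {1: "left", 2: "middle", 3: "right"}.get(touch_count)
        if button:
            self._activate(button)

    def on_mouse_button(self, button: str) -> None:
        # dedicated signals of a three-button mouse map one-to-one
        self._activate(button)

    def _activate(self, button: str) -> None:
        action = self.actions.get(button)
        if action:
            action()
```

Because both the touchscreen handler and the mouse handler converge on the same virtual buttons, the application itself never needs to know which device produced the input.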
A further embodiment of the invention is a computer program product comprising a data carrier on which machine-readable instructions of a computer program are stored which, when executed on a computer, automatically carry out the method according to the invention, in particular in one of the implementation variants described above.
Drawings
Embodiments of the present invention will be further described with reference to the drawings. In the drawings:
fig. 1 schematically shows a system for controlling an application for interactive image manipulation of medical technical image data.
Detailed Description
The system 1 shown in fig. 1 comprises, as its core, a framework 2 for the development and execution of (software) applications for medical image processing. The framework 2 provides functions and services that are accessible to applications built on it. Optionally, the framework 2 also contains a runtime environment in which applications built on it can run platform-independently. Besides further components not described in more detail, the framework 2 comprises an input relay module 3.
In a broader sense, the system 1 comprises, in addition to the framework 2, an associated platform formed by (data processing) hardware 4 (in particular a PC or workstation) and an operating system 5 running on it.
A plurality of input devices 6 for receiving user inputs, at least one display 7 as an output device and at least one image data memory 8 are connected to the system 1 for signal transmission.
Fig. 1 also shows a (software) application 9 built on the framework 2. The application 9 is, for example, a so-called "reader", i.e. a program for displaying a (two-dimensional and/or three-dimensional) medical image data set B.
In the highly simplified example shown, the application 9 comprises an image preparation module 10 which prepares the image data set B supplied by the image data memory 8 for display on the display 7 and displays it there. A two-dimensional image data set B is prepared, for example, by format or resolution conversion, color matching, etc. In the case of a three-dimensional image data set B, the preparation decisively includes deriving a displayable two-dimensional view from the three-dimensional image information, for example by creating sectional images or by a three-dimensional visualization (volume rendering) of the scene.
As illustrated by way of example, the image preparation module 10 accesses a plurality of software modules, each of which performs a specific image manipulation: a rotation module 11 for calculating a rotation of the two-dimensional or three-dimensional image information, a zoom module 12 for calculating enlarged image information of a selected image section of the image data set B, and a pan module 13 for selecting the image information corresponding to a moved image section.
Within the scope of the (rotation) action R, the rotation module 11 calculates a rotated image from the original image data set B and feeds this rotated image back to the image preparation module 10 for display on the display 7. Within the scope of the (zoom) action Z, the zoom module 12 calculates the image information of an enlarged image section and feeds this enlarged image section back to the image preparation module 10 for display on the display 7. In the event of a movement of the image section to be displayed within the scope of the (pan) action P, the pan module 13 determines the image data corresponding to this moved image section and feeds the moved image back to the image preparation module 10 for display on the display 7.
The described actions R, Z and P can be triggered in a targeted manner by the user by corresponding user inputs on one or more of the input devices 6. The input devices 6 can in principle be connected to the system 1 alternatively (i.e. interchangeably with one another), in which case only a single input device 6 is available to the system user at any time. Preferably, however, as shown, a plurality of input devices 6 are connected to the system 1 simultaneously and in parallel for data transmission, for example a three-button mouse 14, a tablet 15 with an associated data pen, a single-point touchpad 16 and a multi-point touchscreen 17.
Furthermore, additional input devices 6 (not shown in detail) can be connected to the system 1 in addition to or as an alternative to the input devices 6 shown, in particular gesture-controlled (gesture-recognizing) or speech-controlled (speech-recognizing) input devices.
Depending on its particular design, each input device 6 outputs a different number of input signals E1-E8 in response to user interaction. The three-button mouse 14, for example, outputs three input signals E1 to E3, wherein input signal E1 reports a click of the left mouse button, input signal E2 a click of the middle mouse button and input signal E3 a click of the right mouse button. Since with the tablet 15 and its associated pen only a single switching point can be generated at any time on the two-dimensional switching surface of the tablet, the tablet 15 outputs only a single input signal E4. The same applies correspondingly to the single-point touchpad 16, which outputs an input signal E5 for the single switching point on its switching surface. With the multi-point touchscreen 17, up to three switching points can be generated simultaneously by touching the two-dimensional switching surface with one or more fingers. The multi-point touchscreen 17 can accordingly output three input signals E6, E7 and E8, wherein input signal E6 reports the first acquired switching point, input signal E7, if present, the second acquired switching point, and input signal E8, if present, the third acquired switching point.
The input signals E1-E8 of the input devices 6 are not transmitted directly to the application 9. Instead, the input signals E1-E8 of all input devices 6 are first transmitted within the framework 2 to the input relay module 3.
Three virtual switching elements 18 are defined in the input relay module 3, which represent a virtual left mouse button 19, a virtual middle mouse button 20 and a virtual right mouse button 21. Each input signal E1-E8 of the input devices 6 is associated with one of the virtual switching elements 18 by an association module 22, so that the respective virtual switching element 18 is activated by the respective input signal E1-E8.
On the output side, the virtual switching elements 18 are associated with the application 9 by each outputting a respective output signal A1, A2 or A3, such that activating any switching element 18 triggers a corresponding action of the application 9. Activating or deactivating any one of the switching elements reversibly changes the state of the corresponding output signal A1-A3. For example, the state "&lt;MouseButton&gt;Down" reports activation of the corresponding switching element 18, and the state "&lt;MouseButton&gt;Up" reports its deactivation. The placeholder "&lt;MouseButton&gt;" is replaced by an identifier of the respective switching element 18: "LeftMouseButton" in the case of the virtual left mouse button 19, "MiddleMouseButton" in the case of the virtual middle mouse button 20 and "RightMouseButton" in the case of the virtual right mouse button 21.
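The reversible Down/Up state signalling described above can be sketched as follows (a minimal Python sketch for illustration only; the class and callback names are assumptions and not the identifiers used in the framework 2):

```python
class VirtualMouseButton:
    """Virtual switching element that reports its state changes
    as '<MouseButton>Down' / '<MouseButton>Up' strings."""

    def __init__(self, name, on_state_change):
        self.name = name                        # e.g. "LeftMouseButton"
        self.active = False
        self.on_state_change = on_state_change  # receives the state string

    def activate(self):
        if not self.active:                     # only report actual changes
            self.active = True
            self.on_state_change(self.name + "Down")

    def deactivate(self):
        if self.active:
            self.active = False
            self.on_state_change(self.name + "Up")


events = []
left = VirtualMouseButton("LeftMouseButton", events.append)
left.activate()
left.activate()    # already active: no duplicate event
left.deactivate()
# events == ["LeftMouseButtonDown", "LeftMouseButtonUp"]
```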
The association of the virtual switching elements 18 with the functions of the application 9 follows the semantics usual in medical image manipulation, according to which an image rotation is triggered by pressing the left button of a real three-button mouse, an enlargement (zooming) of the displayed image section by pressing the middle button, and a movement (panning) of the displayed image section by pressing the right button. Accordingly, in the system 1 the virtual left mouse button 19 is associated with the rotation module 11, the virtual middle mouse button 20 with the zoom module 12 and the virtual right mouse button 21 with the pan module 13, and in the activated state triggers the respective action R, Z or P:
- LeftMouseButtonDown → StartRotateCommand (start rotation command)
- LeftMouseButtonUp → EndRotateCommand (end rotation command)
- MiddleMouseButtonDown → StartZoomCommand (start zoom command)
- MiddleMouseButtonUp → EndZoomCommand (end zoom command)
- RightMouseButtonDown → StartPanCommand (start pan command)
- RightMouseButtonUp → EndPanCommand (end pan command)
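The association of button states with application commands listed above can be represented, for example, as a simple lookup table (Python sketch; the command names follow the listing above, while the dispatch function itself is an illustrative assumption):

```python
# State string of a virtual mouse button -> command of the application
COMMAND_MAP = {
    "LeftMouseButtonDown":   "StartRotateCommand",
    "LeftMouseButtonUp":     "EndRotateCommand",
    "MiddleMouseButtonDown": "StartZoomCommand",
    "MiddleMouseButtonUp":   "EndZoomCommand",
    "RightMouseButtonDown":  "StartPanCommand",
    "RightMouseButtonUp":    "EndPanCommand",
}

def dispatch(state):
    """Translate a virtual mouse button state into the application command."""
    return COMMAND_MAP[state]

# dispatch("LeftMouseButtonDown") == "StartRotateCommand"
```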
The association of the input signals E1-E8 with the virtual switching elements 18 by the association module 22 follows a semantic rule that is uniform for all input devices 6, according to which
(1) in the case of each input device that allows the generation of at least two different dedicated input signals, each input signal is associated with a corresponding virtual mouse button 19, 20 or 21, so that the corresponding mouse button 19, 20 or 21 is activated upon receipt of the respective input signal, and
(2) in the case of each other input device, the switching element to be activated is determined as a function of the number Σ of simultaneously generated input signals, wherein
a) When Σ =3, the virtual right mouse button 21 is activated,
b) when Σ =2, the virtual middle mouse button 20 is activated, and
c) when Σ =1, the virtual left mouse button 19 is activated.
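Rules (1) and (2) above can be sketched as a single association function (a Python sketch under the assumption that dedicated signals carry a fixed target button, as with signals E1-E3 of the three-button mouse 14, while non-dedicated signals are merely counted; all names are illustrative):

```python
# Dedicated signals and their target buttons (rule 1), e.g. three-button mouse
DEDICATED = {
    "E1": "LeftMouseButton",
    "E2": "MiddleMouseButton",
    "E3": "RightMouseButton",
}

def buttons_to_activate(signals):
    """Return the set of virtual mouse buttons activated by the
    simultaneously received input signals."""
    dedicated = {DEDICATED[s] for s in signals if s in DEDICATED}
    if dedicated:                      # rule (1): dedicated signals decide
        return dedicated
    count = len(signals)               # rule (2): count the signals
    if count >= 3:
        return {"RightMouseButton"}
    if count == 2:
        return {"MiddleMouseButton"}
    if count == 1:
        return {"LeftMouseButton"}
    return set()

# two touch points on the multi-point touchscreen -> middle button:
#   buttons_to_activate(["E6", "E7"]) == {"MiddleMouseButton"}
# two mouse buttons pressed at once -> both virtual buttons:
#   buttons_to_activate(["E1", "E3"]) == {"LeftMouseButton", "RightMouseButton"}
```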
According to rule (1), the input signals E1-E3 of the three-key mouse 14 are processed in the example shown. Here, the input signal E1 corresponding to the left key of the three-key mouse 14 is associated with the left virtual mouse key 19 by the association module 22. The input signal E2 corresponding to the middle key of the three-key mouse 14 is associated with the middle virtual mouse key 20 by the association module 22. Finally, the input signal E3 corresponding to the right key of the three-key mouse 14 is associated with the virtual mouse key 21 on the right side by the association module 22. This association is maintained when at least two input signals E1-E3 are generated simultaneously by pressing a plurality of keys. That is, the respective virtual mouse keys 19-21 are simultaneously activated in the case where a plurality of keys of the three-key mouse 14 are simultaneously pressed.
The input signals of any speech- or gesture-controlled input devices are likewise associated by the association module 22 according to rule (1). In this case, whenever a specific speech command or a specific gesture is recognized by the respective input device, the switching element 18 corresponding to the recognized command or gesture is activated.
According to rule (2), the association module 22 processes, on the one hand, the multi-point touchscreen 17, with which one or more non-dedicated input signals E6-E8 can be generated (depending on the number of fingers touching the screen surface and the switching points thereby generated). The association module 22 activates the virtual left mouse button 19 if exactly one of the input signals E6-E8 is received (Σ = 1), the virtual middle mouse button 20 if exactly two of the three input signals E6-E8 are received (Σ = 2), and the virtual right mouse button 21 if all three input signals E6-E8 are received (Σ = 3).
Furthermore, rule (2) also applies to input devices that can generate only a single input signal. In the example shown, this concerns the input signal E4 of the tablet 15 and the input signal E5 of the single-point touchpad 16. For these input devices 6, the number of simultaneously generated input signals is always one (Σ = 1), so that the association module 22 always activates the virtual left mouse button 19 upon receipt of input signal E4 or E5.
In an exemplary method flow, upon receiving at least one input signal E1-E8 from one of the input devices 6, the association module 22 first checks, by comparison with a stored correspondence table, whether the or each received input signal E1-E8 is a dedicated input signal corresponding to a particular virtual mouse button 19-21. If so, the association module 22 activates the corresponding virtual mouse button 19-21.
Otherwise, the association module 22 determines the number Σ of received input signals E1-E8, and
- activates the virtual left mouse button 19 when Σ = 1,
- activates the virtual middle mouse button 20 when Σ = 2, and
- otherwise activates the virtual right mouse button 21.
The described method sequence is repeated periodically or whenever the input signals E1-E8 change, with one or more virtual mouse buttons 19-21 being activated or deactivated depending on the type of signal change.
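The re-evaluation on each signal change can be sketched as a simple edge detection over successive sets of active virtual buttons (Python sketch; the function name and event ordering are illustrative assumptions):

```python
def signal_change_events(previous, current):
    """Compare two successive sets of active virtual mouse buttons and
    emit Down/Up state events for the buttons whose state changed."""
    events = []
    for name in sorted(current - previous):   # newly activated buttons
        events.append(name + "Down")
    for name in sorted(previous - current):   # newly deactivated buttons
        events.append(name + "Up")
    return events

# one finger lifted from a two-finger touch: left button down, middle button up
# signal_change_events({"MiddleMouseButton"}, {"LeftMouseButton"})
#   == ["LeftMouseButtonDown", "MiddleMouseButtonUp"]
```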
Owing to the implicit mapping of the input signals E1-E8 onto a virtual three-button mouse and the uniform semantics of the input signals E1-E8 of the different input devices 6, the user can work intuitively and without difficulty even with completely different, possibly entirely new, input devices 6, since every input device reacts identically and predictably to user interactions. At the same time, the system 1 described above is computationally inexpensive and therefore allows the described image manipulations to be performed in real time.
Program code sections of an exemplary software implementation of the input relay module 3 are given in the appendices below.
Although the present invention is particularly clear from the embodiments described above, it is not limited thereto. In fact, numerous additional embodiments of the present invention may be derived from the above description by a person skilled in the art.
Appendix 1: program code segment for evaluating a multi-touch event in the case of a multi-touch device
[Program code listing reproduced as an image in the original publication.]
Appendix 2: code for evaluating mouse events in the case of a conventional computer, in particular a PC
[Program code listing reproduced as an image in the original publication.]
Appendix 3: generalized command interface for mapping different input devices to generalized virtual three-key mouse
Appendix 4: semantically mapping to virtual mouse commands
[Program code listing reproduced as an image in the original publication.]
Appendix 5: semantically mapping a multi-touch screen to virtual mouse commands
[Program code listing reproduced as an image in the original publication.]
List of reference numerals
1 System
2 framework
3-input relay module
4 (data processing) hardware
5 operating system
6 input device
7 display screen
8 image data memory
9 applications
10 image preparation module
11 rotating module
12 zoom module
13 pan module
14 three-button mouse
15 tablet (with data pen)
16 single point touchpad
17 multipoint touch screen
18 (virtual) switching element
19 (virtual) left mouse button
20 (virtual) middle mouse button
21 (virtual) right mouse button
22 correlation module
B image data set
R (rotational) motion
Z (zoom) action
P (pan) action
E1-E8 input signals
A1-A3 output signal

Claims (12)

1. A method for controlling an application (9), in particular for interactive image manipulation of a medical image data set (B), by means of a plurality of input devices (6, 14-17),
-wherein a plurality of virtual switching elements (18) are defined and associated with the application (9) in such a way that activating each switching element (18) triggers a specific corresponding action (R, Z, P) of the application (9),
-wherein the input signals (E1-E8) of the input devices (6, 14-17) are each associated with one of the virtual switching elements (18) for its activation according to a predetermined rule,
wherein the virtual switching elements are designed as software components which simulate the behavior of physical switching elements.
2. The method according to claim 1, wherein the rules are predefined according to a uniform standard for all input devices (6, 14-17).
3. Method according to claim 1, wherein three virtual switching elements (18) are defined, which correspond to
-a virtual left mouse button (19),
- a virtual middle mouse button (20),
-a virtual right mouse button (21).
4. The method according to claim 3, wherein, within the scope of the rule, in the case of at least one input device (6, 15, 16) which only allows the generation of a single input signal (E4, E5), in particular in the case of a single-button mouse, a single-point touchscreen, a single-point touchpad (16) or a tablet (15), the input signal (E4, E5) is always associated with the virtual left mouse button (19).
5. The method according to claim 3, wherein, within the scope of the rule, in the case of at least one input device (6, 17) which allows the simultaneous generation of a plurality of non-dedicated input signals (E6-E8), in particular in the case of a multi-touch screen (17),
-three simultaneously generated input signals (E6-E8) are always associated with the virtual right mouse button (21),
- exactly two simultaneously generated input signals (E6-E8) are always associated with the virtual middle mouse button (20), and
-a single generated input signal (E6-E8) is always associated with the virtual left mouse button (19).
6. The method according to claim 3, wherein, within the scope of the rule, in the case of at least one input device (14) which allows the generation of at least two different dedicated input signals (E1-E3), in particular in the case of a two-button mouse, a three-button mouse (14), a gesture-controlled input device or a speech-controlled input device,
-the first input signal (E1) generated in the absence of the or each other input signal (E2, E3) is always associated with the virtual left mouse button (19),
-the second input signal (E3) generated in the absence of the or each other input signal (E1, E2) is always associated with the virtual right mouse button (21).
7. The method according to claim 6, wherein, within the scope of the rule, in the case of at least one input device (14) which allows the generation of three different dedicated input signals (E1-E3), in particular in the case of a three-button mouse (14), a gesture-controlled input device or a speech-controlled input device,
- the third input signal (E2) generated in the absence of the or each other input signal (E1, E3) is always associated with the virtual middle mouse button (20).
8. The method according to claim 6 or 7, wherein, within the scope of the rule, in the case of at least one input device (14) which allows the simultaneous generation of at least two different dedicated input signals (E1-E3), in particular in the case of a two-button mouse or a three-button mouse (14),
- exactly two simultaneously generated input signals (E1-E3) are always associated with the virtual middle mouse button (20).
9. The method according to claim 8, wherein, within the scope of the rule, in the case of at least one input device (14) which allows the simultaneous generation of three different dedicated input signals (E1-E3), in particular in the case of a three-button mouse (14),
-three simultaneously generated input signals (E1-E3) are always associated with the virtual right mouse button (21).
10. The method according to claim 6 or 7, wherein, within the scope of the rule, in the case of at least one input device (14) which allows the simultaneous generation of at least two different dedicated input signals (E1-E3), in particular in the case of a two-button mouse or a three-button mouse (14),
- each of the simultaneously generated input signals (E1-E3) is always associated with the corresponding virtual mouse button (19-21).
11. A system (1) for controlling an application (9), in particular for interactive image manipulation of medical image data, having an input relay module (3) which interconnects, in terms of signalling, the application (9) to be controlled and at least one input device (6), wherein the input relay module (3) is configured to automatically perform a method according to any one of claims 1 to 10.
12. A computer program product comprising a data carrier on which machine-readable instructions of a computer program are stored which, when run on a computer, automatically execute a method according to any one of claims 1 to 10.
CN201410050907.9A 2013-02-21 2014-02-14 Method for controlling an application and related system Active CN104007999B (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DE102013202818.0A DE102013202818B4 (en) 2013-02-21 2013-02-21 Method for controlling an application and associated system
DE102013202818.0 2013-02-21

Publications (2)

Publication Number Publication Date
CN104007999A CN104007999A (en) 2014-08-27
CN104007999B true CN104007999B (en) 2020-01-10

Family

ID=51263917

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201410050907.9A Active CN104007999B (en) 2013-02-21 2014-02-14 Method for controlling an application and related system

Country Status (2)

Country Link
CN (1) CN104007999B (en)
DE (1) DE102013202818B4 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
FR3072803B1 (en) * 2017-10-19 2021-05-07 Immersion SYSTEM AND METHOD FOR THE SIMULTANEOUS MANAGEMENT OF A PLURALITY OF DESIGNATION DEVICES

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101339453A (en) * 2008-08-15 2009-01-07 广东威创视讯科技股份有限公司 Simulated mouse input method based on interactive input apparatus
CN102043484A (en) * 2009-10-26 2011-05-04 宏正自动科技股份有限公司 Tools with multiple contact points for use on touch panel
CN102323875A (en) * 2011-10-26 2012-01-18 中国人民解放军国防科学技术大学 Mouse event-based multi-point touch gesture interaction method and middleware

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7631124B2 (en) 2007-04-06 2009-12-08 Microsoft Corporation Application-specific mapping of input device elements
US8462134B2 (en) 2009-06-29 2013-06-11 Autodesk, Inc. Multi-finger mouse emulation
WO2012037417A1 (en) 2010-09-16 2012-03-22 Omnyx, LLC Control configuration for digital image system

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101339453A (en) * 2008-08-15 2009-01-07 广东威创视讯科技股份有限公司 Simulated mouse input method based on interactive input apparatus
CN102043484A (en) * 2009-10-26 2011-05-04 宏正自动科技股份有限公司 Tools with multiple contact points for use on touch panel
CN102323875A (en) * 2011-10-26 2012-01-18 中国人民解放军国防科学技术大学 Mouse event-based multi-point touch gesture interaction method and middleware

Also Published As

Publication number Publication date
DE102013202818A1 (en) 2014-08-21
CN104007999A (en) 2014-08-27
DE102013202818B4 (en) 2023-03-30

Similar Documents

Publication Publication Date Title
US8638315B2 (en) Virtual touch screen system
KR101930225B1 (en) Method and apparatus for controlling touch screen operation mode
US8065624B2 (en) Virtual keypad systems and methods
TWI553541B (en) Method and computing device for semantic zoom
US11073980B2 (en) User interfaces for bi-manual control
JP5270537B2 (en) Multi-touch usage, gestures and implementation
US6643721B1 (en) Input device-adaptive human-computer interface
US20120212420A1 (en) Multi-touch input control system
US20110018806A1 (en) Information processing apparatus, computer readable medium, and pointing method
MX2008014057A (en) Multi-function key with scrolling.
CN103842943A (en) System and method for application management on device having a touch screen display
WO2004010276A1 (en) Information display input device and information display input method, and information processing device
EP2972735A1 (en) User interface for toolbar navigation
US20150169122A1 (en) Method for operating a multi-touch-capable display and device having a multi-touch-capable display
KR20130137069A (en) Method of simulating the touch screen operation by means of a mouse
US11907741B2 (en) Virtual input device-based method and system for remotely controlling PC
JP5275429B2 (en) Information processing apparatus, program, and pointing method
US20140298275A1 (en) Method for recognizing input gestures
US20160328024A1 (en) Method and apparatus for input to electronic devices
Benko et al. Imprecision, inaccuracy, and frustration: The tale of touch input
Nakamura et al. Double-crossing: A new interaction technique for hand gesture interfaces
CN104007999B (en) Method for controlling an application and related system
CN111708475B (en) Virtual keyboard generation method and device
US20140085197A1 (en) Control and visualization for multi touch connected devices
CN104820489A (en) System and method in managing low-latency direct control feedback

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20220121

Address after: Erlangen

Patentee after: Siemens Healthineers AG

Address before: Munich, Germany

Patentee before: SIEMENS AG

TR01 Transfer of patent right