EP3400515A1 - User interface with multiple displays and method for positioning content on multiple displays - Google Patents

User interface with multiple displays and method for positioning content on multiple displays

Info

Publication number
EP3400515A1
Authority
EP
European Patent Office
Prior art keywords
display
user interface
user action
processor
graphical user
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
EP17705837.7A
Other languages
German (de)
English (en)
Inventor
Christian Butter
Eckhard Seibert
Silvio Wolf
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Siemens AG
Original Assignee
Siemens AG
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Siemens AG
Publication of EP3400515A1
Legal status: Ceased

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/04842 Selection of displayed objects or displayed text elements
    • G06F 3/0486 Drag-and-drop
    • G06F 3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/14 Digital output to display device; Cooperation and interconnection of the display device with other functional units
    • G06F 3/1423 Digital output to display device; Cooperation and interconnection of the display device with other functional units, controlling a plurality of local displays, e.g. CRT and flat panel display

Definitions

  • The invention relates to a user interface with several displays and a method for positioning content in working environments with multiple displays.
  • Multi-display environments, made up for example of monitors or large screens, present a variety of content and are often used in control rooms.
  • In their specific arrangement, the displays form a multi-monitor working environment. They can be any screens, large screens or projectors that receive image information from a computer and output it.
  • The object of the invention is to provide a user interface with multiple displays and a method for positioning content on multiple displays that offer an alternative to the prior art.
  • The user interface has an operating display and at least one processor which is programmed to output a graphical user interface on the operating display, the graphical user interface comprising a set of windows.
  • The user interface further has at least one further display connected to the processor, and is characterized in that the processor is programmed to select a window to be distributed from the set of windows in response to a first user action on the graphical user interface, to select one of the further displays in response to a second user action on the graphical user interface, and to output the selected window to be distributed on the selected display.
  • At least one processor outputs on an operating display a graphical user interface that represents a set of windows which can be positioned on at least one further display.
  • The processor selects a window to be distributed from the set of windows in the graphical user interface on the basis of a first user action. It then selects one of the further displays on the basis of a second user action on the graphical user interface, and finally outputs the selected window to be distributed on the selected display (a minimal code sketch of this flow is given at the end of this Definitions section).
  • The processor may execute an operating system that provides the graphical user interface.
  • The processor, which is for example a microprocessor of a computer, can be supported to any extent by other processors in carrying out the method.
  • The processor is part of a system-on-chip that includes at least one graphics processor.
  • The computer system in which the processor operates may have one or more discrete graphics cards.
  • The graphics processors or graphics cards can take on tasks in outputting the graphical user interface and driving the displays, thereby relieving the processor. The same applies to the embodiments and developments of the invention.
  • The advantages mentioned below need not necessarily be achieved by the subject-matter of the independent patent claims. Rather, these may also be advantages that are achieved only by individual embodiments, variants or developments.
  • The user interface and the method allow a user to select a window to be distributed with the first user action on the graphical user interface, and to choose with the second user action a target display, such as a monitor or a large screen, on which the window to be distributed is then to be shown.
  • The user interface and the method thus give a user the advantage of being able to position content from the operating display onto any display of a multi-monitor working environment, including existing large screens.
  • The user no longer has to attend to the act of moving itself, for example by holding a mouse button down over long distances to drag a window across multiple monitors, but can concentrate fully on positioning the window on the displays of the multi-monitor working environment.
  • The positioning is accurate and intuitive to use. It is possible even if the target display is not in the user's preferred field of view, or lies completely outside his field of view.
  • The positioning is ergonomic because the user can work on the operating display within his preferred field of view.
  • The user interface and the method allow, for the first time, such positioning by touch input, even though the target of the positioning lies outside the operating display.
  • Touchscreens do not provide an input device, such as a mouse, for moving content. Touch gestures therefore cannot be used to work or position content beyond the physical limits of the touchscreen.
  • The user interface and the method also enable positioning by touch input. Consequently, they are suitable for different input modalities, such as touch or mouse operation, and are independent of the specific input medium, which is why other input modalities, such as keyboard input, can also be used.
  • The at least one processor outputs on the graphical user interface a schematic representation of the spatial arrangement of the further displays, each of the further displays being selectable in the schematic representation by means of the second user action.
  • The schematic representation is a scaled-down copy or a schematic image of the multi-monitor working environment formed by the displays of the user interface.
  • The schematic representation here forms a control element of the graphical user interface which can be operated, for example, by mouse or by touch.
  • The first user action forms the beginning and the second user action the end of a drag-and-drop operation, which can in particular be performed with a mouse or carried out by touch, the operating display being a touchscreen.
  • The second user action forms a drag-and-drop operation which can only be executed after the first user action has been completed.
  • This variant makes it possible to drag the content to be distributed onto one of the displays in the schematic representation by drag and drop.
  • The processor allows a successive selection of several of the further displays during the second user action, the display selected last being the selected display.
  • The processor outputs visual feedback on the currently selected display, temporarily highlighting it during the second user action.
  • The user touches one of the further displays in the schematic representation on the operating display (which in this case is a touchscreen). While his finger is still touching the touchscreen, the currently selected display is visually highlighted, for example by a colored marking or by a change in brightness or contrast.
  • This visual feedback takes place both in the schematic representation and on the real selected display.
  • The user can move his fingertip on the operating display as part of the second user action until it comes to rest on another display in the schematic representation. The visual feedback is then output on the newly selected display.
  • The user could also move the mouse over the displays while holding down the mouse button.
  • The finally selected display is the display selected last when the user releases the mouse button or lifts his finger off the operating display.
  • The window to be distributed can also be dragged onto the schematic representation with a drag-and-drop operation, which here too can be implemented both by touch and by mouse.
  • The visual feedback can be output successively before the end of the drag-and-drop operation, i.e. before release.
  • The visual feedback occurs here without significant time delay and follows the movement of the second user action. It supports collaborative work because all users of the multi-monitor working environment can follow the second user action and comment on and support the selection. In addition, the visual feedback also allows the user to direct his gaze to the multi-monitor working environment, that is to say to the real further displays, during the second user action.
  • At least one of the further displays has a graphical user interface which is subdivided into a plurality of display areas. As part of the second user action, one of the display areas can be selected. The selected window to be distributed is then shown in the selected display area.
  • A control element is displayed on the graphical user interface for each window of the set of windows, each of the control elements being selectable by the first user action.
  • The schematic representation is displayed immediately after the first user action, overlapping or adjacent to the selected control element.
  • A distribution mode is activated beforehand after detection of an initial user action, the windows from the set of windows in the graphical user interface being visually marked as distributable only in the distribution mode.
  • The aforementioned control elements are displayed only in the distribution mode.
  • The aforementioned schematic representation is displayed only in the distribution mode.
  • The distribution mode allows the flexibly distributable windows to be visually highlighted and marked to the user as manipulable.
  • The windows distributable in the multi-monitor working environment are thus presented very clearly and in a way that does not additionally overload the complex graphical user interface with permanently visible control elements, because the distributable windows are visually highlighted only after activation of the distribution mode.
  • The readability of the graphical user interface is not impaired in a standard mode, since no additional control elements or icons are displayed, with the exception of a control element for changing modes.
  • A switch is displayed on the graphical user interface, the initial user action being detected upon actuation of the switch.
  • The user interface has an electrical switch which is arranged in the vicinity of the operating display.
  • The processor is programmed to detect the initial user action on the basis of an actuation of the switch.
  • The operating display is a touchscreen which is arranged to detect the first user action and the second user action as touch inputs.
  • The computer-readable medium stores a computer program which carries out the method when it is executed in the processor.
  • The computer program is executed in the processor and carries out the method.
  • FIG. 1 shows an operating display 1 with a graphical user interface 11 and two further displays 2, 3, as well as an enlarged representation of the graphical user interface 11,
  • FIG. 2 shows a first user action 41 which selects a window 15 to be distributed on the graphical user interface 11,
  • FIG. 3 shows a second user action 42 which selects a display area in a schematic representation 13,
  • FIG. 4 shows a third user action 43 which selects a third display 3,
  • FIG. 5 shows a representation of the window 15 to be distributed on the third display 3,
  • FIG. 6 shows a graphical user interface 11 in a standard mode,
  • FIG. 8 shows a selection of a window 15 to be distributed with a first user action 41,
  • FIG. 10 shows a representation of the window 15 to be distributed on the third display 3,
  • FIG. 11 shows a final user action 44 for deactivating the distribution mode on the graphical user interface 11,
  • FIG. 13 shows an architecture of a user interface with several displays.
  • Figure 1 shows an operating display 1 on which a graphical user interface 11 is output, as well as a second display 2 and a third display 3.
  • The displays are part of a user interface whose architecture is designed, for example, as shown in Figure 13, which will be explained further below.
  • Deviating from Figure 1, the user interface may also comprise only the operating display 1 and the second display 2. It is also possible for the user interface to comprise further displays in addition to the operating display 1, the second display 2 and the third display 3.
  • The user interface can thus have further displays, which are driven and selected in the same manner as described in the exemplary embodiments.
  • Any display can have multiple areas, such as quadrants, which in turn can be treated and selected as separate displays.
  • The second display 2 and the third display 3 can also be merely partial areas of one large screen.
  • The graphical user interface 11 is shown again, enlarged, on the left side for illustrative purposes. It contains windows 15, 16, 17, 18, which in principle can also be output on the second display 2 and the third display 3.
  • The graphical user interface 11 provides a control element 12 for each window 15, 16, 17, 18, by means of which the respective window can be selected for distribution to the second display 2 or the third display 3.
  • In their spatial arrangement, the operating display 1, the second display 2, the third display 3 and possibly further displays form a multi-monitor working environment.
  • The second display 2 is here divided, for example, into four quadrants, each of which forms a display area for one of the windows 15, 16, 17, 18 on the operating display 1, whereas each of these windows would be shown screen-filling on the third display 3.
  • Of course, for each of the displays 2, 3, either such a subdivision or a screen-filling representation can be chosen, depending on the requirements.
  • FIG. 2 shows the user interface from FIG. 1, where a user now touches the control element 12 for the window 15 by means of a first user action 41, since the window 15 is to be distributed to one of the further displays 2, 3. Subsequently, as shown in FIG. 3, a schematic representation 13 of all displays, that is to say of the entire multi-monitor working environment, is output on the graphical user interface 11 below the selected control element 12.
  • In it, the operating display 1, the second display 2 and the third display 3 are arranged schematically according to their real spatial arrangement. In one variant, the representation of the operating display 1 itself can be omitted here.
  • The user now uses a second user action 42 to select the lower right quadrant of the second display 2 as the display area for the window 15 to be distributed. A visual feedback 21 is then output on the selected display area.
  • All user actions can be carried out as touch inputs, provided that the operating display 1 is a touch screen.
  • The operating display 1 could be part of a tablet computer, for example.
  • The user actions may also be performed by mouse, keyboard or any other suitable input device that permits selection of windows of the graphical user interface 11.
  • Figure 4 shows the case in which the user selects the third display 3 with a third user action 43, whereupon a visual feedback 31 is output on it. The user can thus successively select several displays in the schematic representation 13 until the visual feedback is output on the display corresponding to his wishes.
  • The second user action 42 in Figure 3 could correspond to a simple mouse-over or to a slight pressure with the fingertip on the touchscreen.
  • Such a second user action 42 can be detected by the user interface and used to output the visual feedback 21.
  • A final selection, such as by the third user action 43 in Figure 4, is then detected, for example, by a completed mouse click or by a touch with greater pressure.
  • The second user action 42 may be implemented as a single mouse click and the third user action 43 as a double mouse click.
  • Another possibility is to combine the second user action 42 and the third user action 43 as part of a swipe movement.
  • The user presses a mouse button and then, keeping it held down, moves the mouse pointer over different display areas or displays until the visual feedback 31 is shown on the desired display 3.
  • The user then confirms the selection made. This procedure can also be chosen for touch input on a touchscreen, in which case the display that the user is touching before he lifts his finger from the touchscreen is selected.
  • A separate control element may also be provided to confirm the final selection.
  • The control element 12 of the window 15 to be distributed can also be dragged onto the desired display area or display as part of a drag-and-drop operation, where here too, in the course of the drag-and-drop movement, which can be carried out either by mouse or by touch, different displays or display areas can be swept over, a visual feedback being output in each case on the display area or display currently swept over.
  • The finally selected display area or display then results from the final position of the drag-and-drop operation.
  • FIG. 6 again shows a multi-monitor working environment which is formed, for example, from an operating display 1, a second display 2 and a third display 3.
  • A graphical user interface 11, which is output on the operating display 1, is again shown enlarged on the left side of Figure 6 for illustration.
  • The graphical user interface 11 is in a standard mode in which, with respect to the distribution of the windows 15, 16, 17, 18 of the graphical user interface 11 to the further displays 2, 3, only a single graphical element, here a switch 14, is displayed. This has the advantage that the user interface 11 is not overloaded in the standard mode by numerous graphical elements for the distribution functions.
  • With an initial user action 40, a user now actuates the switch 14, which activates a distribution mode.
  • FIG. 7 shows the graphical user interface 11 after the distribution mode has been activated.
  • The user interface 11 is shown inverted.
  • The inverted representation of the distribution mode visually highlights the windows 15, 16, 17, 18 and informs the user that they can be distributed to the further displays 2, 3.
  • In the distribution mode, a control element 12 is provided on the graphical user interface 11 for each of the windows 15, 16, 17, 18, with which the respective window can be selected by the user.
  • In addition, a schematic representation 13 is displayed in the distribution mode, as was already explained in the context of Figures 3 and 4.
  • The schematic representation 13 is now arranged at the upper edge of the graphical user interface 11.
  • FIG. 8 shows a first user action 41 with which the user activates the control element 12 of the window 15 to be distributed.
  • The window 15 is thereby selected for distribution to one of the displays 2 or 3.
  • With a second user action 42, the user touches the icon of the third display 3 in the schematic representation 13.
  • A visual feedback 31 is output on the selected third display 3.
  • The switch 14 is not necessarily an element of the graphical user interface 11. It may also be designed as a physical, electrical switch and realized, for example, by a special key on a keyboard.
  • FIG. 13 shows a possible architecture of the user interface.
  • A computer 9 contains a processor 5, which executes a computer program 7 that it has loaded from a secondary memory 8 into its main memory 6.
  • The user inputs discussed above can be detected via an input device 4, which is, for example, the touch-sensitive surface of a touchscreen, a computer mouse, a keyboard or any other input device suitable for operating a graphical user interface.
  • The graphical user interface is output on an operating display 1, which still belongs directly to the computer 9.
  • A second display 2 and a third display 3 are likewise supplied with image information by the computer 9, but these displays may also be arranged at a greater distance.
  • The third display 3 could, for example, be a large screen in a control room.
  • The second display 2 and the third display 3 are driven with image information directly via a graphics card.
  • This is, for example, an extended desktop whose different displays are managed and driven by the graphics card.
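
The following is a minimal, illustrative Python sketch of the interaction flow described in this section (distribution mode, first and second user actions, successive selection with visual feedback, output of the window on the display selected last). It is not part of the patent; names such as Window, Display, DistributionController, select_window, hover_target and release are hypothetical, and the print calls merely stand in for the visual feedback 21, 31 on the real displays.

    from dataclasses import dataclass, field
    from typing import Optional

    @dataclass
    class Window:
        title: str

    @dataclass
    class Display:
        name: str
        areas: int = 1                                 # e.g. 4 for a display split into quadrants
        content: dict = field(default_factory=dict)    # area index -> Window shown there

    class DistributionController:
        """Sketch: a distribution mode is toggled by a switch (initial user action),
        a window is picked on the operating display (first user action), candidate
        targets are highlighted while the user sweeps over the schematic
        representation (second user action), and the window is finally output on
        the display or display area selected last."""

        def __init__(self, displays):
            self.displays = displays
            self.distribution_mode = False
            self.pending_window: Optional[Window] = None
            self.highlighted: Optional[tuple] = None   # (Display, area index)

        def toggle_distribution_mode(self):
            # Initial user action: only in this mode are the distributable windows
            # and the schematic representation of the displays shown at all.
            self.distribution_mode = not self.distribution_mode

        def select_window(self, window: Window):
            # First user action on the graphical user interface.
            if self.distribution_mode:
                self.pending_window = window

        def hover_target(self, display: Display, area: int = 0):
            # Second (or third) user action: successive selection with visual
            # feedback; only the display/area swept over last stays highlighted.
            if self.pending_window is None:
                return
            self.highlighted = (display, area)
            print(f"visual feedback on {display.name}, area {area}")

        def release(self):
            # End of the drag-and-drop-like gesture (mouse button released or
            # finger lifted): output the window on the display selected last.
            if self.pending_window and self.highlighted:
                display, area = self.highlighted
                display.content[area] = self.pending_window
                print(f"'{self.pending_window.title}' shown on {display.name}, area {area}")
            self.pending_window = None
            self.highlighted = None

    if __name__ == "__main__":
        # Roughly the situation of Figures 1 to 5: a quadrant-divided second
        # display and a screen-filling third display.
        second, third = Display("display 2", areas=4), Display("display 3")
        ctrl = DistributionController([second, third])
        ctrl.toggle_distribution_mode()           # initial user action (switch 14)
        ctrl.select_window(Window("window 15"))   # first user action 41
        ctrl.hover_target(second, area=3)         # second user action 42
        ctrl.hover_target(third)                  # third user action 43
        ctrl.release()                            # window 15 ends up on display 3

Running the example reproduces the sequence of Figures 1 to 5: the window is first highlighted on a quadrant of the second display and finally shown on the third display after the last selection.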

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention relates to a user interface that allows a user to select a window (15) to be distributed on a graphical user interface (11) by a first user action (41), and to select by a third user action (43) a target display (3), for example a monitor or a large screen, on which the window (15) to be distributed is then shown. In one variant, each display (2, 3) can be selected in a schematic representation (13) by means of the third user action (43).
EP17705837.7A 2016-02-22 2017-02-14 Interface utilisateur pourvue de plusieurs écrans d'affichage et procédé de positionnement de contenus sur plusieurs écrans d'affichage Ceased EP3400515A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DE102016202694.1A DE102016202694A1 (de) 2016-02-22 2016-02-22 Benutzerschnittstelle mit mehreren Anzeigen und Verfahren zur Positionierung von Inhalten auf mehreren Anzeigen
PCT/EP2017/053194 WO2017144298A1 (fr) 2016-02-22 2017-02-14 Interface utilisateur pourvue de plusieurs écrans d'affichage et procédé de positionnement de contenus sur plusieurs écrans d'affichage

Publications (1)

Publication Number Publication Date
EP3400515A1 true EP3400515A1 (fr) 2018-11-14

Family

ID=58057118

Family Applications (1)

Application Number Title Priority Date Filing Date
EP17705837.7A Ceased EP3400515A1 (fr) 2016-02-22 2017-02-14 Interface utilisateur pourvue de plusieurs écrans d'affichage et procédé de positionnement de contenus sur plusieurs écrans d'affichage

Country Status (5)

Country Link
US (1) US20190065007A1 (fr)
EP (1) EP3400515A1 (fr)
CN (1) CN108700983A (fr)
DE (1) DE102016202694A1 (fr)
WO (1) WO2017144298A1 (fr)

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10372306B2 (en) * 2016-04-16 2019-08-06 Apple Inc. Organized timeline
CN108762604A (zh) * 2018-03-30 2018-11-06 联想(北京)有限公司 一种显示方法、装置及电子设备
CN108762701A (zh) * 2018-04-13 2018-11-06 广州讯立享智能科技有限公司 一种触摸一体机的投屏控制方法、用户终端及触摸一体机
EP3627298A1 (fr) * 2018-09-21 2020-03-25 Kistler Holding AG Procédé de mesure d'une valeur mesurée physique et système de mesure destiné à la mise en uvre dudit procédé
CN111510646A (zh) * 2020-04-29 2020-08-07 京东方科技集团股份有限公司 拼接屏的视频显示方法、显示装置、计算机设备和介质
CN111736789A (zh) * 2020-07-31 2020-10-02 成都依能科技股份有限公司 一种扩展显示屏控制器及扩展显示屏操控方法
CN112068749B (zh) * 2020-08-21 2021-09-21 易思维(杭州)科技有限公司 多屏幕单终端设备屏幕集中显示、控制的系统及方法
US11157160B1 (en) * 2020-11-09 2021-10-26 Dell Products, L.P. Graphical user interface (GUI) for controlling virtual workspaces produced across information handling systems (IHSs)

Family Cites Families (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5999177A (en) * 1997-07-07 1999-12-07 International Business Machines Corporation Method and system for controlling content on a display screen in a computer system
US20030107604A1 (en) * 2001-12-12 2003-06-12 Bas Ording Method and system for automatic window resizing in a graphical user interface
US20060248471A1 (en) * 2005-04-29 2006-11-02 Microsoft Corporation System and method for providing a window management mode
US20080117225A1 (en) * 2006-11-21 2008-05-22 Rainer Wegenkittl System and Method for Geometric Image Annotation
US20080295012A1 (en) * 2007-05-23 2008-11-27 Microsoft Corporation Drag-and-drop abstraction
US20110063191A1 (en) * 2008-01-07 2011-03-17 Smart Technologies Ulc Method of managing applications in a multi-monitor computer system and multi-monitor computer system employing the method
US8788967B2 (en) * 2008-04-10 2014-07-22 Perceptive Pixel, Inc. Methods of interfacing with multi-input devices and multi-input display systems employing interfacing techniques
US10860162B2 (en) * 2009-04-17 2020-12-08 Abb Schweiz Ag Supervisory control system for controlling a technical system, a method and computer program products
US8832585B2 (en) * 2009-09-25 2014-09-09 Apple Inc. Device, method, and graphical user interface for manipulating workspace views
US8966401B2 (en) * 2010-05-03 2015-02-24 Lg Electronics Inc. Electronic device and methods of sending information with the electronic device, controlling the electronic device, and transmitting and receiving information in an information system
US8878794B2 (en) * 2011-09-27 2014-11-04 Z124 State of screen info: easel
US9298361B2 (en) * 2013-03-15 2016-03-29 Apple Inc. Analyzing applications for different access modes
US20140351722A1 (en) * 2013-05-23 2014-11-27 Microsoft User interface elements for multiple displays
US9348495B2 (en) * 2014-03-07 2016-05-24 Sony Corporation Control of large screen display using wireless portable computer and facilitating selection of audio on a headphone
DE102014210602A1 (de) * 2014-06-04 2015-12-17 Siemens Aktiengesellschaft Computernetzwerk und Verfahren zum Verschieben eines Objektes innerhalb eines Computernetzwerkes

Also Published As

Publication number Publication date
US20190065007A1 (en) 2019-02-28
CN108700983A (zh) 2018-10-23
WO2017144298A1 (fr) 2017-08-31
DE102016202694A1 (de) 2017-08-24

Similar Documents

Publication Publication Date Title
WO2017144298A1 (fr) Interface utilisateur pourvue de plusieurs écrans d'affichage et procédé de positionnement de contenus sur plusieurs écrans d'affichage
DE69129265T2 (de) Bildverschiebungssteuerung und Verfahren
DE69130773T2 (de) Elektronische anzeige und datenverarbeitungsvorrichtung
DE69937592T2 (de) Verfahren und Vorrichtung zur Zeicheneingabe mit virtueller Tastatur
DE69032645T2 (de) Datenverarbeitungssystem mit auf Gesten basierenden Eingabedaten
DE102010036906A1 (de) Konfigurierbares Pie-Menü
EP2017756A1 (fr) Procédé destiné à l'affichage et/ou au traitement de données images d'origine médicale avec détection de mouvement
DE102012219119B4 (de) Intelligente Fenstererstellung in einer grafischen Benutzeroberfläche
DE69615470T2 (de) Darstellung von Beziehungen zwischen graphischen Objekten in einer Rechneranzeigevorrichtung
DE60024655T2 (de) Verfahren zur benutzung von mit einem anzeigegerät verbundenen tasten für den zugriff und die ausführung von damit verbundenen funktionen
DE69429237T2 (de) Benutzerschnittstelnlevorrichtung für Rechnersystem
EP1374027B1 (fr) Positionnement de secteurs affiches sur une interface utilisateur
DE19744861A1 (de) Verfahren zum Einsatz einer dreidimensionalen Mouse im WINDOWS-Betriebssystem
DE202009018283U1 (de) Karten-Metapher für Aktivitäten in einem Rechengerät
DE202009018653U1 (de) Berührungsereignismodell
EP3372435B1 (fr) Procédé et système de commande destinés à fournir une interface utilisateur
EP2795451B1 (fr) Procédé de fonctionnement d'un affichage tactile multipoints et dispositif avec affichage tactile multipoints
EP3270278B1 (fr) Procede de fonctionnement d'un systeme de commande et systeme de commande
DE102012220062A1 (de) Einstellung mehrerer benutzereingabeparameter
DE112013006066T5 (de) Die Druckempfindlichkeit auf Multi-Touch-Einheiten emulieren
DE102015218963A1 (de) Steuerverfahren, Steuervorrichtung und elektronische Vorrichtung
DE69026516T2 (de) Digitalisiertablett mit zweimodenläufer/maus
DE102013203918A1 (de) Verfahren zum Betreiben einer Vorrichtung in einer sterilen Umgebung
DE10084249T5 (de) Zusätzliches LCD-Feld mit Sensorbildschirm
DE102013109268A1 (de) Eingabevorrichtungen

Legal Events

Date Code Title Description
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: UNKNOWN

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE

PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

17P Request for examination filed

Effective date: 20180810

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

AX Request for extension of the european patent

Extension state: BA ME

DAV Request for validation of the european patent (deleted)
DAX Request for extension of the european patent (deleted)
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: EXAMINATION IS IN PROGRESS

17Q First examination report despatched

Effective date: 20190717

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: EXAMINATION IS IN PROGRESS

REG Reference to a national code

Ref country code: DE

Ref legal event code: R003

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION HAS BEEN REFUSED

18R Application refused

Effective date: 20220212