EP3241091A1 - Means of transportation, user interface and method for defining a tile on a display device - Google Patents
Means of transportation, user interface and method for defining a tile on a display device
- Publication number
- EP3241091A1 (application EP15826024.0A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- tile
- gesture
- user
- cells
- defining
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Links
- 238000000034 method Methods 0.000 title claims abstract description 30
- 230000004044 response Effects 0.000 claims abstract description 21
- 238000011156 evaluation Methods 0.000 claims description 11
- 238000001514 detection method Methods 0.000 claims description 5
- 238000004590 computer program Methods 0.000 claims description 3
- 230000006870 function Effects 0.000 description 13
- 230000003993 interaction Effects 0.000 description 12
- 230000033001 locomotion Effects 0.000 description 6
- 238000013459 approach Methods 0.000 description 3
- 238000005516 engineering process Methods 0.000 description 3
- 230000008859 change Effects 0.000 description 2
- 230000003287 optical effect Effects 0.000 description 2
- 238000004891 communication Methods 0.000 description 1
- 230000001419 dependent effect Effects 0.000 description 1
- 238000011161 development Methods 0.000 description 1
- 230000018109 developmental process Effects 0.000 description 1
- 230000010354 integration Effects 0.000 description 1
- 239000011159 matrix material Substances 0.000 description 1
- 238000012986 modification Methods 0.000 description 1
- 230000004048 modification Effects 0.000 description 1
- 230000008520 organization Effects 0.000 description 1
- 230000008569 process Effects 0.000 description 1
- 238000012216 screening Methods 0.000 description 1
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04886—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K35/00—Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K35/00—Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
- B60K35/10—Input arrangements, i.e. from user to vehicle, associated with vehicle functions or specially adapted therefor
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K35/00—Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
- B60K35/80—Arrangements for controlling instruments
- B60K35/81—Arrangements for controlling instruments for controlling displays
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/0482—Interaction with lists of selectable items, e.g. menus
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F9/00—Arrangements for program control, e.g. control units
- G06F9/06—Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
- G06F9/44—Arrangements for executing specific programs
- G06F9/451—Execution arrangements for user interfaces
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K2360/00—Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
- B60K2360/143—Touch sensitive instrument input devices
- B60K2360/1438—Touch screens
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K2360/00—Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
- B60K2360/143—Touch sensitive instrument input devices
- B60K2360/1438—Touch screens
- B60K2360/1442—Emulation of input devices
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K2360/00—Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
- B60K2360/146—Instrument input by gesture
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/048—Indexing scheme relating to G06F3/048
- G06F2203/04803—Split screen, i.e. subdividing the display area or the window area into separate subareas
Definitions
- The present invention relates to a means of transportation, a user interface, and a method for defining a tile on a display device of a user interface.
- The present invention enables intuitive and simple adjustment of the position and/or size and/or format of a tile.
- In modern means of transportation (hereinafter "vehicle(s)"), monitors are increasingly installed on which information and buttons for operating the vehicle can be displayed.
- Occasionally, the functions or functional scopes or information scopes available in a configuration mode differ from those of a display mode and an operating mode.
- In the present invention, a "display mode" is understood to mean a display modality in which the screen content is optimized for viewing.
- Functions assigned to the operating mode can also be called "primary functions"; in computer-based operating systems these are often accessed with a left-click or double-click, while the functions that can be operated in the configuration mode are invoked differently.
- According to the invention, the object identified above is achieved by a method for defining a tile on a display device of a user interface.
- The display device may be, for example, a matrix display on which optional content can be displayed.
- A "tile" is understood to mean a field or semantic unit of a graphical user interface on which functions and/or functional scopes and/or information scopes and/or operating options are combined into a functional family and are visually and logically demarcated from other information scopes on the screen.
- Tiles can be defined by the user himself.
- First, a tile grid comprising a plurality of cells is displayed on the display device.
- In the context of the present invention, a tile grid is an optical orientation aid for positioning and dimensioning the tiles to be defined.
- The tile grid is to be distinguished from the pixel grid of a display device, which provides no visual rastering into different tiles.
- The tile grid can specify matrix-type basic dimensions for the edge lengths of the tiles to be defined; accordingly, the smallest elements (edge lengths) of a tile grid are larger than the smallest elements of the pixel grid of the display device.
- The cells represent the smallest possible area units that can be used to define tiles within the tile grid.
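The distinction between the coarse tile grid and the fine pixel grid can be illustrated with a short Python sketch. The display resolution, grid dimensions, and function name below are invented example values, not taken from the patent:

```python
def cell_at(x_px, y_px, display_w=1280, display_h=540, cols=6, rows=2):
    """Map a pixel coordinate on the display to the (column, row) index
    of the tile-grid cell containing it. Each cell spans many pixels,
    so the tile grid is far coarser than the pixel grid."""
    # Integer arithmetic keeps the mapping exact at cell boundaries.
    return (x_px * cols) // display_w, (y_px * rows) // display_h
```

A touch anywhere inside a cell thus resolves to the same cell index, which is the granularity at which tiles are defined.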
- Then a user gesture in contact with a touch-sensitive surface is detected with respect to at least one cell.
- By means of the user gesture, the user defines one or more cells of the tile grid which he wants to convert into a single tile.
- The touch-sensitive surface may, for example, belong to the display device used.
- In response, a tile is defined at the positions of those cells which have been selected by means of the user gesture.
- A tile may be created on a single one of the swept cells.
- Preferably, a plurality of cells, in particular all cells selected in the course of the user gesture, are merged into one tile.
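The merging of the selected cells into a single tile can be sketched as follows. This is a minimal Python illustration; the cell-index representation and function name are assumptions made for the example:

```python
def tile_from_cells(cells):
    """Merge the grid cells selected by a user gesture into one tile,
    described as the enclosing cell rectangle (col0, row0, col1, row1).
    All selected cells are fused into a single rectangular tile."""
    cols = [c for c, r in cells]
    rows = [r for c, r in cells]
    return min(cols), min(rows), max(cols), max(rows)
```

For example, sweeping three adjacent cells in the top row yields one tile spanning those three columns of that row.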
- The user gesture proposed according to the invention can be designed, for example, as a swipe gesture.
- Alternatively, a so-called "multi-touch gesture" can define a multiplicity of cells, which are subsequently fused into one tile according to the invention.
- A respective user gesture will always define only a single tile, which extends on the display device as a function of the cells arranged farthest from one another.
- In this way, a particularly simple and intuitive way of defining a tile is proposed which, in particular while driving a means of transportation, distracts the user as little as possible from the traffic situation.
- In response to the definition of the tile, a selection menu for defining a content to be displayed in the tile can be displayed automatically.
- For example, a selection of available functional scopes may be displayed which the user may assign to the tile.
- The selection menu described above for defining the content to be displayed in the tile may be displayed as an overlay.
- The user may select an entry from the selection menu, in response to which the finished tile displays the assigned information scope and optionally provides buttons contained therein for accepting user input.
- The selection gesture for selecting an entry of the selection menu can be any gesture.
- For example, a swipe gesture with respect to the entry may have its destination within the area of the predefined tile.
- Alternatively, a drag-and-drop gesture may be executed which drops the entry on the defined tile.
- Entries of the selection menu may be "unfolded" in response to a tap gesture to display subordinate functions in the form of a selection submenu; a tap gesture on a respective entry of the selection submenu then defines a corresponding content for the tile.
- For the subsequent modification of a defined tile as well, the present invention proposes intuitive and easily performed user steps.
- The user may make a gesture with a minimum contact duration (a "long press" gesture) relative to the tile, causing the tile to detach from its previous position within the tile grid.
- This process can be graphically illustrated by the tile initially "sticking" to the user's finger, irrespective of the positions of the cells of the tile grid.
- Subsequently, a swipe gesture or a drag-and-drop gesture is detected with respect to the tile, and a new tile is defined as a function of the position of cells which are in a predetermined relationship to a target position of the swipe gesture.
- For example, all cells of the tile grid which are overlapped by the detached tile at the end of the user's contact with the display device can be set as the new position of the tile and determine its aspect ratio.
- Alternatively, a minimum degree of overlap between the detached tile and a cell of the tile grid may be specified, which determines whether the overlapped cell becomes part of the new tile.
- The degree of overlap can be described, for example, as a predetermined percentage or fraction of an overlapped edge length or of an overlapped area of the cell in question (for example one half, in particular one third, preferably one fourth, of an entire edge length of the cell) which must be exceeded for the cell to count toward the future position of the tile. In this way, accidentally produced smaller overlaps are prevented from defining the overlapped cell as the future target position of the tile.
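The overlap test described above can be sketched in Python. The rectangle representation, function name, and the one-fourth area threshold chosen here are assumptions for the sake of illustration, not taken verbatim from the patent:

```python
def cell_is_claimed(tile_rect, cell_rect, min_fraction=0.25):
    """Decide whether a grid cell becomes part of a repositioned tile.
    Rectangles are (x, y, w, h) in pixels. The cell is claimed only if
    the dropped tile covers at least min_fraction of the cell's area,
    so small accidental overlaps are ignored."""
    tx, ty, tw, th = tile_rect
    cx, cy, cw, ch = cell_rect
    # Width and height of the intersection (zero if disjoint).
    ox = max(0, min(tx + tw, cx + cw) - max(tx, cx))
    oy = max(0, min(ty + th, cy + ch) - max(ty, cy))
    return (ox * oy) / (cw * ch) >= min_fraction
```

The same idea could equally be applied to overlapped edge lengths instead of areas, as the text suggests.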
- The tile grid proposed according to the invention can be displayed in response to detection of a swipe gesture whose start position lies on an already defined tile.
- The swipe gesture is preferably not directed substantially horizontally, so that it differs from gestures calling for a change to another screen view (e.g., another home screen).
- In response, the tile grid can be displayed for better orientation of the user.
- Alternatively, the tile grid can be displayed as soon as an input means approaches the display means, to ensure the earliest possible orientation of the user with respect to possible target positions and sizes.
- According to a second aspect of the present invention, a user interface is proposed which includes a display device (e.g., a screen), a detection device (e.g., a touch-sensitive surface), and an evaluation unit (e.g., comprising a programmable processor, in particular in the form of a microcontroller).
- The evaluation unit can, for example, be designed as an electronic control unit.
- The user interface is set up to carry out a method according to the first aspect of the invention.
- According to a third aspect of the present invention, a computer program product (e.g., a data memory) is proposed on which instructions are stored which enable a programmable processor (e.g., of an evaluation unit) to perform the steps of a method according to the first aspect of the invention.
- The computer program product can be designed as a CD, DVD, Blu-ray disc, flash memory, hard disk, RAM/ROM, cache, etc.
- According to a fourth aspect, a signal sequence is proposed which represents instructions enabling a programmable processor (e.g., of an evaluation unit) to perform the steps of a method according to the first aspect of the invention.
- In this way, the provision of the instructions by information technology is also placed under protection in the case where the storage means required for this purpose are outside the scope of the appended claims.
- According to a fifth aspect of the present invention, a means of transportation (e.g., a car, a van, a truck, a motorcycle, an aircraft and/or a watercraft) is proposed which comprises a user interface according to the second aspect of the invention.
- The user interface can be provided in particular for the driver of the means of transportation, by means of which the driver can communicate with the means of transportation and its technical equipment while driving.
- Figure 1 is a schematic overview of components of an exemplary embodiment of a user interface according to the invention
- Figure 2 is a schematic screen view showing a tile grid
- FIGS. 3 to 5 are an illustration of a user interaction for defining a first tile
- FIGS. 5 and 6 show a representation of a second user interaction for defining a second tile
- Figures 6 and 7 are an illustration of a third user interaction for defining a third tile
- Figures 7 and 8 are an illustration of a fourth user interaction for defining a fourth tile
- Figs. 8 and 9 are illustrations of a fifth user interaction for defining a fifth tile
- Figures 9 and 10 are an illustration of a sixth user interaction for defining a sixth tile
- FIG. 10 shows a representation of a user interaction for assigning a content to a tile
- FIG. 11 is an illustration of a selection menu for defining a content of a tile
- FIG. 12 is an illustration of a selection submenu for defining a content of a tile
- Figure 13 shows the result of the user interaction shown in Figure 12
- Figure 14 is a perspective view of a user interface
- Figure 15 is a perspective view of a user interface with tiles in an operating mode
- FIG. 16 is a representation of a user input for repositioning and resizing an already defined tile
- Figure 17 is a flowchart illustrating steps of an exemplary embodiment of a method according to the invention
- FIG. 1 shows a passenger car 10 as a means of transportation, in which components of an exemplary embodiment of a user interface according to the invention are contained.
- Two screens 2, 2a are arranged one above the other in the dashboard of the car 10 as part of a display device and detection device. Between the screens 2, 2 a, an infrared LED strip 3 is arranged as a further component of the detection device and is set up to detect free user gestures (3D user inputs) executed in space.
- a data memory 5 is provided for the provision of instructions for executing a method according to the invention and for defining classes of predefined user inputs and, like the aforementioned components, is connected in terms of information technology to an electronic control unit 4 as an evaluation unit.
- The electronic control unit 4 is also connected to ambient light strips 7a, 7b, which are embedded in the dashboard and in the doors of the car 10 and can be controlled via the electronic control unit 4 to output an optional light color with an optional intensity.
- a driver's seat 8a and a passenger's seat 8b are provided for receiving a user of the user interface 1.
- FIG. 2 shows a possible screen view of a screen 2 of the user interface 1 according to the invention, in which substantially square cells 12 of a rectangular grid 11 are delimited from one another by dashed lines.
- The dashed lines give the user an orientation as to the positions of the screen view at which edge lines of definable tiles can be arranged.
- FIG. 3 shows a swipe gesture of a hand 9 of a user along an arrow P on a screen view corresponding to FIG. 2.
- Figure 4 shows the swipe gesture started in Figure 3 at an advanced stage.
- FIG. 5 shows the result of the swipe gesture illustrated in FIGS. 3 and 4: a first tile 13 has been defined.
- FIG. 5 shows a further user interaction along an arrow P, with which the user's hand 9 sweeps over the four cells 12 closest to the tile 13.
- FIG. 6 shows the result of the user interaction illustrated in FIG. 5 for defining a second tile 13, which has been merged from four cells 12.
- Figure 6 also shows a swipe gesture of a hand 9 of a user along an arrow P, which defines the remaining cells 12 in the upper row of the grid 11 as target positions of a new tile.
- FIG. 7 shows the result of the user input shown in FIG. 6, together with an additional swipe gesture of the user's hand 9 along an arrow P, by means of which the user defines the rightmost cell 12 in the lower row of the grid 11 as the position of a new tile.
- FIGS. 8 and 9 show corresponding gestures for defining two further tiles 13.
- FIG. 10 shows a touch gesture of a hand 9 of a user on a tile 13, which is configured entirely as a button for assigning a content to the tile 13.
- FIG. 11 shows the result of the user interaction shown in FIG. 10 in response to which a selection menu 14 comprising five entries 15 is displayed.
- The entries 15 represent, from top to bottom: a route display, on-board computer information, communication options, media playback, and a button for removing the created tile 13.
- FIG. 12 shows the result of a selection of the entry 15 in response to which a selection submenu 16 with entries 17 is displayed.
- Entries 17 of the selection submenu represent a display of images, a representation of weather information, and media playback.
- The user's hand 9 performs a tap gesture on the entry 17 for displaying the weather information.
- FIG. 13 shows the result of the selection illustrated in FIG. 12.
- The recently defined tile 13 in the center of the screen now shows weather information.
- Figure 14 shows a perspective view of a user interface 1 with a screen 2 which is embedded in the dashboard of a means of transportation.
- The screen contents show two already defined tiles 13.
- FIG. 15 shows an approach and a subsequent long-press gesture of a hand 9 of a user on the upper tile 13 on the screen 2.
- The approach causes a tile grid 11 to be displayed which, in contrast to FIG. 2, illustrates positions of cells 12 spaced apart from one another.
- The cells 12 of the tile grid 11 are not separated from each other by lines, but by strips of the background image.
- FIG. 16 shows the result of a swipe gesture with respect to the upper left tile 13 toward a position which is essentially closest to all hitherto unoccupied cells 12 of the tile grid. In this way, a change in the size and position of the tile 13 is effected such that the cells at the smallest distance from the target position shown represent the target positions of a newly defined tile. In other words, when the user interrupts contact with the surface of the screen 2 at the illustrated position, the previous positions of the shifted tile 13 are abandoned and the two middle stacked cells and the two rightmost stacked cells 12 are selected as the common target position of the redefined tile.
- FIG. 17 shows steps of an exemplary embodiment of a method according to the invention for defining a tile on a display device of a user interface.
- In step 200, a tile grid comprising a plurality of cells is displayed on the display device.
- Next, a user gesture in contact with a touch-sensitive surface is detected with respect to at least one cell.
- A tile is then defined at the positions of those cells that have been substantially swept over by the user gesture; minor sweeps below a predefined minimum are ignored.
- A tap gesture regarding the tile defined as described above is detected and, in response thereto, in step 600, a selection menu for defining a content to be displayed in the tile is displayed.
- A selection gesture relating to an entry of the selection menu of the defined tile is detected and, in response thereto, in step 800, a selection submenu is displayed comprising entries thematically assigned to the selected entry of the selection menu.
- A content associated with the entry of the selection submenu is then assigned to the tile.
- Subsequently, a gesture with a minimum contact duration is detected with respect to the tile.
- In response, a new rectangle is defined by those cells of the tile grid according to the invention whose edges form an envelope of the cells overlapped by the detached and moved tile.
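The envelope rule just described can be sketched in Python (the cell-index representation is assumed for illustration): the new tile occupies every cell of the smallest rectangle enclosing the overlapped cells, so the result stays rectangular even when the overlapped set is ragged.

```python
def envelope(overlapped_cells):
    """Return all (col, row) cells of the smallest axis-aligned
    rectangle of tile-grid cells enclosing the cells overlapped
    by the detached and moved tile."""
    cols = [c for c, _ in overlapped_cells]
    rows = [r for _, r in overlapped_cells]
    return [(c, r)
            for r in range(min(rows), max(rows) + 1)
            for c in range(min(cols), max(cols) + 1)]
```

For example, if only two diagonally opposite cells are overlapped, all four cells of the enclosing 2x2 block become the new tile.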
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Software Systems (AREA)
- Chemical & Material Sciences (AREA)
- Combustion & Propulsion (AREA)
- Transportation (AREA)
- Mechanical Engineering (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
Description
Claims
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP15150032.9A EP3040808B1 (de) | 2015-01-02 | 2015-01-02 | Fortbewegungsmittel, Anwenderschnittstelle und Verfahren zur Definition einer Kachel auf einer Anzeigeeinrichtung |
PCT/EP2015/080632 WO2016107771A1 (de) | 2015-01-02 | 2015-12-18 | Fortbewegungsmittel, anwenderschnittstelle und verfahren zur definition einer kachel auf einer anzeigeeinrichtung |
Publications (1)
Publication Number | Publication Date |
---|---|
EP3241091A1 true EP3241091A1 (de) | 2017-11-08 |
Family
ID=52292747
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP15150032.9A Active EP3040808B1 (de) | 2015-01-02 | 2015-01-02 | Fortbewegungsmittel, Anwenderschnittstelle und Verfahren zur Definition einer Kachel auf einer Anzeigeeinrichtung |
EP15826024.0A Ceased EP3241091A1 (de) | 2015-01-02 | 2015-12-18 | Fortbewegungsmittel, anwenderschnittstelle und verfahren zur definition einer kachel auf einer anzeigeeinrichtung |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP15150032.9A Active EP3040808B1 (de) | 2015-01-02 | 2015-01-02 | Fortbewegungsmittel, Anwenderschnittstelle und Verfahren zur Definition einer Kachel auf einer Anzeigeeinrichtung |
Country Status (5)
Country | Link |
---|---|
US (1) | US10782845B2 (de) |
EP (2) | EP3040808B1 (de) |
KR (2) | KR20170093228A (de) |
CN (1) | CN107111493B (de) |
WO (1) | WO2016107771A1 (de) |
Families Citing this family (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP3324280B1 (de) * | 2016-11-16 | 2020-07-22 | Seat, S.A. | Verfahren zum konfigurieren eines graphischen anzeigesystems |
DE102018210825A1 (de) * | 2018-07-02 | 2020-01-02 | Zf Friedrichshafen Ag | Steuervorrichtung zum Bedienen eines Fahrzeugs, Fahrzeugsystem für ein Fahrzeug und Verfahren zum Bedienen eines Fahrzeugs |
JP6778735B2 (ja) * | 2018-12-26 | 2020-11-04 | 本田技研工業株式会社 | 表示装置、表示方法、およびプログラム |
DE102021201376A1 (de) * | 2021-02-12 | 2022-08-18 | Volkswagen Aktiengesellschaft | Verfahren zur Steuerung von Fahrzeugfunktionen |
DE102021201375A1 (de) * | 2021-02-12 | 2022-08-18 | Volkswagen Aktiengesellschaft | Anzeige- und Bedienvorrichtung zur Steuerung von Fahrzeugfunktionen |
Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20120176382A1 (en) * | 2009-09-04 | 2012-07-12 | Sang-Gi Noh | Method for configuring user interface screen for electronic terminal, and electronic terminal for carrying out the same |
Family Cites Families (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7987431B2 (en) * | 1999-10-29 | 2011-07-26 | Surfcast, Inc. | System and method for simultaneous display of multiple information sources |
US7788590B2 (en) * | 2005-09-26 | 2010-08-31 | Microsoft Corporation | Lightweight reference user interface |
US9489216B2 (en) * | 2007-07-26 | 2016-11-08 | Sap Se | Active tiled user interface |
DE102008048825A1 (de) | 2008-09-22 | 2010-03-25 | Volkswagen Ag | Anzeige- und Bediensystem in einem Kraftfahrzeug mit nutzerbeeinflussbarer Darstellung von Anzeigeobjekten sowie Verfahren zum Betreiben eines solchen Anzeige- und Bediensystems |
EP2466456A1 (de) * | 2010-12-20 | 2012-06-20 | Clayster Asia Ltd. | Vorrichtungsunabhängiges Verfahren zum Definieren einer grafischen Benutzeroberfläche |
US9104290B2 (en) * | 2011-02-11 | 2015-08-11 | Samsung Electronics Co., Ltd. | Method for controlling screen of mobile terminal |
CN102830900B (zh) * | 2012-06-29 | 2016-10-19 | 华为终端有限公司 | 控件设置方法和终端设备 |
DE112012006661T5 (de) * | 2012-09-28 | 2015-04-02 | Hewlett Packard Development Company, L.P. | Überführung in einen Zwischenenergiezustand |
CA2838165A1 (en) * | 2012-12-31 | 2014-06-30 | Smart Technologies Ulc | Method for manipulating tables on an interactive input system and interactive input system executing the method |
KR102090964B1 (ko) * | 2013-02-22 | 2020-03-19 | 삼성전자주식회사 | 터치 스크린에 디스플레이되는 아이콘을 제어하는 휴대 단말 및 방법 |
EP2973033A1 (de) * | 2013-03-15 | 2016-01-20 | AOL Inc. | Systeme und verfahren zur informationssammlung und bereitstellung de zugangs zu mehreren webdiensten über eine interaktive benutzerschnittstelle |
DE102013002891A1 (de) | 2013-03-22 | 2014-09-25 | Volkswagen Aktiengesellschaft | Informationswiedergabesystem für ein Fahrzeug und Verfahren zum Bereitstellen von Informationen für den Benutzer eines Fahrzeugs |
US20150007078A1 (en) * | 2013-06-28 | 2015-01-01 | Sap Ag | Data Displays in a Tile-Based User Interface |
KR20150032093A (ko) * | 2013-09-17 | 2015-03-25 | 주식회사 팬택 | 사용자 인터페이스를 편집하는 단말기 및 방법 |
US9612720B2 (en) * | 2014-08-30 | 2017-04-04 | Apollo Education Group, Inc. | Automatic processing with multi-selection interface |
-
2015
- 2015-01-02 EP EP15150032.9A patent/EP3040808B1/de active Active
- 2015-12-18 EP EP15826024.0A patent/EP3241091A1/de not_active Ceased
- 2015-12-18 WO PCT/EP2015/080632 patent/WO2016107771A1/de active Application Filing
- 2015-12-18 CN CN201580071911.6A patent/CN107111493B/zh active Active
- 2015-12-18 KR KR1020177018894A patent/KR20170093228A/ko not_active Application Discontinuation
- 2015-12-18 KR KR1020197010502A patent/KR102377998B1/ko active IP Right Grant
- 2015-12-18 US US15/538,845 patent/US10782845B2/en active Active
Patent Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20120176382A1 (en) * | 2009-09-04 | 2012-07-12 | Sang-Gi Noh | Method for configuring user interface screen for electronic terminal, and electronic terminal for carrying out the same |
Non-Patent Citations (1)
Title |
---|
See also references of WO2016107771A1 * |
Also Published As
Publication number | Publication date |
---|---|
KR102377998B1 (ko) | 2022-03-24 |
CN107111493B (zh) | 2021-06-22 |
CN107111493A (zh) | 2017-08-29 |
EP3040808A1 (de) | 2016-07-06 |
EP3040808B1 (de) | 2019-11-20 |
KR20190041540A (ko) | 2019-04-22 |
WO2016107771A1 (de) | 2016-07-07 |
KR20170093228A (ko) | 2017-08-14 |
US10782845B2 (en) | 2020-09-22 |
US20190155455A1 (en) | 2019-05-23 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
EP2930049B1 (de) | User interface and method for adapting a view on a display unit | |
EP3113969B1 (de) | User interface and method for signaling a 3D position of an input means during gesture detection | |
EP3040808B1 (de) | Means of transportation, user interface and method for defining a tile on a display device | |
EP3373123A1 (de) | Method and device for presenting recommended operating actions of a suggestion system and for interacting with the suggestion system | |
EP3036126B1 (de) | Operating method for an operating and display device in a vehicle, and operating and display device in a vehicle | |
EP3108331B1 (de) | User interface and method for contactless operation of a hardware-implemented operating element in a 3D gesture mode | |
EP3040849B1 (de) | Means of transportation, user interface and method for displaying display content spanning two display devices | |
EP3097468A1 (de) | User interface and method for adapting a view on a display unit | |
EP3007050A1 (de) | User interface and method for adapting a menu bar on a user interface | |
EP2937771B1 (de) | User interface for an infotainment system of a means of transportation | |
EP3282352B1 (de) | Method and operating device for operating a device | |
WO2016107770A1 (de) | User interface and method for operating a user interface for a means of transportation | |
DE102017106578A1 (de) | Vehicle display device | |
DE102020121415B3 (de) | Projection system for generating a graphical user interface, graphical user interface and method for operating a projection system | |
EP2943866B1 (de) | Method and device for providing a user interface in a vehicle | |
EP3234749B1 (de) | User interface and method for personalizing display content in a means of transportation | |
WO2017140569A1 (de) | Motor vehicle operating device and method for operating an operating device in order to effect an interaction between a virtual presentation plane and a hand | |
EP3140147B1 (de) | User interface and method for switching between screen views of a user interface | |
EP3364283B1 (de) | Operating system, method for operating an operating system, and a vehicle with an operating system | |
EP2917062B1 (de) | Method for displaying information in a vehicle and device for controlling the display | |
DE102010009622A1 (de) | Method for operating a user interface and device therefor, in particular in a vehicle | |
DE102014202836A1 (de) | User interface and method for assisting a user in operating a user interface | |
EP2809541B1 (de) | Method for providing an operating device in a vehicle and operating device for a vehicle | |
EP3040836B1 (de) | Means of transportation, user interface and method for reducing a light emission of a display device of a means of transportation | |
DE102022133384A1 (de) | Method and system for detecting user inputs | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
STAA | Information on the status of an EP patent application or granted EP patent |
Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE |
|
PUAI | Public reference made under Article 153(3) EPC to a published international application that has entered the European phase |
Free format text: ORIGINAL CODE: 0009012 |
|
STAA | Information on the status of an EP patent application or granted EP patent |
Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE |
|
17P | Request for examination filed |
Effective date: 20170802 |
|
AK | Designated contracting states |
Kind code of ref document: A1 |
Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
|
AX | Request for extension of the European patent |
Extension state: BA ME |
|
DAV | Request for validation of the European patent (deleted) | ||
DAX | Request for extension of the European patent (deleted) | ||
STAA | Information on the status of an EP patent application or granted EP patent |
Free format text: STATUS: EXAMINATION IS IN PROGRESS |
|
17Q | First examination report despatched |
Effective date: 20190412 |
|
STAA | Information on the status of an EP patent application or granted EP patent |
Free format text: STATUS: EXAMINATION IS IN PROGRESS |
|
REG | Reference to a national code |
Ref country code: DE |
Ref legal event code: R003 |
|
STAA | Information on the status of an EP patent application or granted EP patent |
Free format text: STATUS: THE APPLICATION HAS BEEN REFUSED |
|
18R | Application refused |
Effective date: 20220212 |