US20170024119A1 - User interface and method for controlling a volume by means of a touch-sensitive display unit - Google Patents
- Publication number
- US20170024119A1 (application US 15/112,687)
- Authority
- US
- United States
- Prior art keywords
- volume
- display unit
- buttons
- user interface
- unit
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K35/00—Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
- B60K35/10—Input arrangements, i.e. from user to vehicle, associated with vehicle functions or specially adapted therefor
-
- B60K37/06—
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/0482—Interaction with lists of selectable items, e.g. menus
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/04847—Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04886—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/16—Sound input; Sound output
- G06F3/162—Interface to dedicated audio devices, e.g. audio drivers, interface to CODECs
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/16—Sound input; Sound output
- G06F3/165—Management of the audio stream, e.g. setting of volume, audio stream path
-
- B60K2350/1028—
-
- B60K2350/1052—
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K2360/00—Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
- B60K2360/11—Instrument graphical user interfaces or menu aspects
- B60K2360/117—Cursors
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K2360/00—Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
- B60K2360/143—Touch sensitive instrument input devices
- B60K2360/1438—Touch screens
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K2360/00—Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
- B60K2360/146—Instrument input by gesture
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K2360/00—Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
- B60K2360/146—Instrument input by gesture
- B60K2360/1468—Touch gesture
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/041—Indexing scheme relating to G06F3/041 - G06F3/045
- G06F2203/04108—Touchless 2D- digitiser, i.e. digitiser detecting the X/Y position of the input means, finger or stylus, also when it does not touch, but is proximate to the digitiser's interaction surface without distance measurement in the Z direction
Definitions
- the present disclosure relates to a user interface and methods for controlling a volume via a touch-sensitive display unit.
- the present disclosure relates to minimizing the display of unnecessary additional buttons and/or minimizing the operating steps for performing volume control.
- Various means of transport, such as vehicles, cars, trucks, and the like, are known to have different input means through which the volume of a sound playback device can be adjusted.
- in particular, a turning knob is known as an input means which, when rotated in a first direction, decreases the volume and, when rotated in a second direction, increases the volume.
- in the field of mobile devices (e.g., smart phones, tablet PCs), it is also known to display a sound controller on a screen with a touch-sensitive surface, the operating element of which can be moved backwards and forwards by a swipe gesture.
- DE 10 2007 039 445 A1 and DE 10 2009 008 041 A1 disclose user interfaces for vehicles, in which a proximity sensor system is used to switch the menu of a user interface from a display mode to an operating mode. It is proposed, inter alia, to use swipe gestures for influencing the playback volume, depending on the display elements shown.
- a volume may be controlled via a display unit, which, for example, may include a touch-sensitive surface. Such display units are commonly referred to as touch screens.
- a plurality of buttons may be displayed on the display unit.
- a “button” may be interpreted to mean an operating element displayed on the display unit which, upon tapping (“click gesture”) causes the execution of an associated function.
- a swipe gesture may be recognized in front of and/or on a button of the display unit.
- a swipe gesture in front of the display unit may be recognized, for example, via a proximity sensor system.
- a swipe gesture on (i.e., contacting) the display unit can be recognized via the touch-sensitive surface.
- a swipe gesture may substantially correspond to an essentially linear motion of an input means (e.g., a user's finger, stylus, or the like) carried out parallel to the display surface of the display unit.
- a swiping motion may start, for example, on a first button and extend over one or more other buttons.
- the volume may be controlled as a function of the swipe gesture. This can be done, for example, similar to a linearly configured volume control (“slider”) without the volume control means having been displayed on the display unit at the beginning of the process (“first contact”) of the swipe gesture.
- rather, the volume change function may be initiated only in response to the recognition of a swipe gesture on a button not primarily associated with swipe gestures. In this way, available space on the display unit can be advantageously used for other information, without having to dispense with an intuitive and quickly usable possibility of influencing the volume.
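The slider-less control described above can be sketched as follows. This is an illustrative reading of the mechanism, not the patent's actual implementation; the function name, the 0.0 to 1.0 volume range, and the full-width-swipe-equals-full-range scaling are assumptions:

```python
def volume_from_swipe(start_volume: float, start_x: float, current_x: float,
                      row_width: float) -> float:
    """Map horizontal swipe displacement across the button row to a volume.

    A full-width swipe spans the full 0.0..1.0 volume range, so the offset
    between the finger position and the volume position stays constant for
    the duration of the swipe (as described for FIGS. 6 to 8).
    """
    delta = (current_x - start_x) / row_width
    # Clamp to the valid volume range.
    return min(1.0, max(0.0, start_volume + delta))
```

A leftward swipe over half the row from a mid-level volume would, under this scaling, halve the remaining volume range rather than mute outright.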
- FIG. 1 shows a schematic view of a vehicle under an illustrative embodiment
- FIG. 2 shows a simulated screen view on a display unit and an operating step under an illustrative embodiment
- FIGS. 3 to 10 show operating steps in connection with a display unit under an illustrative embodiment.
- FIG. 11 is a flowchart showing steps of an illustrative embodiment of a method according to the present disclosure.
- the present disclosure is directed to recognizing a tap gesture in the area of a button among a plurality of buttons, and triggering a function associated with the button.
- the buttons may be assigned to a primary function which is not related to a change in volume. Accordingly, tapping a button (for example, one assigned to a pause function) may not correspond to changing the volume.
- a tap gesture on a button associated with a "next track" or "previous track" function may, however, naturally change or mute the volume.
- under the present disclosure, the volume function may be configured to change regardless of an interruption or alteration of the reproduced content.
- the “tap gesture” may include no movement or a negligible movement parallel to the surface of the display unit. Accordingly, the volume of a current playback may not change, and, instead, a function associated with the button may be triggered.
- interactions with a single button may launch different functionalities without any of them being specifically visualized on the button (e.g., by a symbol, “icon”).
- functions may be launched regardless of whether the tap gesture is performed in an area in front of the display unit or on (by contacting) the display unit itself.
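The tap/swipe distinction above amounts to a movement threshold on a single contact. A minimal discriminator is sketched below; the threshold value and names are assumptions for illustration, not values from the disclosure:

```python
TAP_MOVEMENT_THRESHOLD_PX = 10.0  # assumed "negligible movement" bound

def classify_gesture(x_down: float, y_down: float,
                     x_up: float, y_up: float) -> str:
    """Return 'tap' for no or negligible movement parallel to the display
    surface (triggering the button's primary function), else 'swipe'
    (initiating the volume change function)."""
    distance = ((x_up - x_down) ** 2 + (y_up - y_down) ** 2) ** 0.5
    return "tap" if distance <= TAP_MOVEMENT_THRESHOLD_PX else "swipe"
```

The same classifier can serve contacts on the surface and gestures detected in the approaching area, since only the parallel displacement matters here.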
- a plurality of buttons may be configured for allowing access to their respective primary functions via a tap gesture.
- primary functions include, but are not limited to, “Source selection”, “Track selection”, “Previous track”, “Pause/Play”, “Next track” and “Sound settings”.
- as music playback or, more generally, audio playback can take place regardless of the currently accessed menu item on the display unit, there may be situations involving displays on the display unit that have nothing to do with an audio playback.
- in these menus, configurations of the present disclosure may be used for rapid and intuitive changes in volume, so that changing to an audio playback menu for volume changes is not needed. In this way, unnecessary operating steps for changing the volume may be avoided.
- a volume control may be displayed in response to recognizing a swipe gesture. It may be displayed, for example, at a predefined location and/or below the input means used for carrying out the swipe gesture.
- the volume control therein is used to help the user orient himself with respect to the current relative volume range and allows additional input gestures for changing the volume.
- if the volume control has been faded in, the volume can be controlled as a function of the position of a tap gesture in the area of the volume control, in some illustrative embodiments.
- in other words, the volume setting may jump to a value linked with the position of the tap gesture, so that a further swipe gesture is no longer necessary for determining the volume setting.
- the aforementioned tap gesture for setting the volume can also be performed either in an approaching area or in contact with a surface of the display unit.
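The jump-to-position behaviour can be sketched as a direct mapping from the tap coordinate to a volume value. The linear left-to-right mapping and the 0.0 to 1.0 range are assumptions:

```python
def volume_from_tap(tap_x: float, control_left: float,
                    control_width: float) -> float:
    """Jump directly to the volume linked with the tap position on the
    displayed volume control: 0.0 at its left edge, 1.0 at its right edge,
    clamped for taps slightly outside the control area."""
    fraction = (tap_x - control_left) / control_width
    return min(1.0, max(0.0, fraction))
```

Unlike `volume_from_swipe`, no starting volume is needed: the tap position alone determines the new setting.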
- when the user has set a suitable volume, he may terminate the input by removing the input means from the approaching area. After a predefined period of time after leaving the approaching area, and/or after a final volume-controlling action, the volume control may be hidden or faded out, respectively. In instances where the display of the volume control has replaced the plurality of buttons (or a sub-plurality of buttons), the (sub-)plurality of buttons may be displayed again on the display unit. If the volume control was only superimposed over the plurality of buttons in a partially transparent view, the plurality of buttons may re-appear after the lapse of the predefined time period.
- in this way, a further operating step for fading out the volume control can be omitted, whereby the user can, on the one hand, dedicate himself entirely to the task of driving and, on the other hand, operate the plurality of buttons, or the functions associated with these buttons, again with relative ease.
- a double click on the volume control can set the volume to a minimum value ("mute").
- in response to a further input, the volume can return to a last set (e.g., non-minimum) value.
- this behavior can be made dependent on whether the minimum value was selected by a double click or set by a swipe gesture. Particularly in cases where a double click had caused the minimum value, a further double click for overriding the mute function can be very intuitive and quick.
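One way to realize the distinction above is to arm a restore value only when the mute was caused by a double click. This is a sketch of that design choice, with assumed names; the disclosure does not prescribe this exact logic:

```python
class MuteToggle:
    """Double click mutes; a further double click restores the last
    non-minimum volume, but only if the mute was caused by a double click.
    Swiping to the minimum disarms the restore value."""

    def __init__(self, volume: float = 0.5):
        self.volume = volume
        self._restore = None  # value to return to after an unmute, if armed

    def double_click(self) -> float:
        if self.volume > 0.0:
            # Mute and remember where we came from.
            self._restore, self.volume = self.volume, 0.0
        elif self._restore is not None:
            # Unmute back to the last set value.
            self.volume, self._restore = self._restore, None
        return self.volume

    def swipe_to(self, value: float) -> float:
        self.volume = value
        self._restore = None  # reaching minimum by swipe does not arm restore
        return self.volume
```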
- a user interface with a display unit for displaying a plurality of buttons may be utilized.
- the display unit may, for example, be permanently installed in a vehicle. Such display units are often referred to as central information displays (CIDs).
- an instrument cluster may serve as a display unit or be included in the display unit, respectively.
- the user interface may further include an operating unit for gesture recognition, wherein the gestures can take place either in an approaching area in front of the display unit and/or in contact with a surface of the display unit.
- a control unit may be provided in the user interface, which sets up the user interface to perform functions described in various illustrative embodiments.
- the operating unit may include a touch-sensitive surface of the display unit and/or an LED-based proximity sensor system arranged in front of the display unit.
- the LED-based proximity sensor system may include infrared LEDs to avoid blinding the user, yet be able to perform reliable gesture recognition. If the user interface is an electronic, portable end device, the display unit, the operating unit and the control unit may be housed in a common housing.
- the plurality of buttons may have at least one function that is not associated with a music playback control.
- the user interface can also have other functional scopes to which menus or views displayable on the display unit are assigned.
- the views not assigned to music playback may also have pluralities of buttons which will perform the method according to the present disclosure upon detecting a swipe gesture input.
- a computer program product comprising instructions which, when executed by a programmable processor (e.g., a user interface), may cause the processor to perform the steps of a method according to the present disclosure.
- a vehicle comprising a user interface, as described herein, is disclosed.
- FIG. 1 shows a vehicle 10 as a means of transport, in the dashboard of which there is arranged a display 1 as a display unit under an illustrative embodiment.
- the display 1 is operatively connected by information technology means to an electronic controller 3 as a control unit, which is also operatively connected by information technology means with an infrared LED strip 2 as operating unit.
- FIG. 2 shows an illustration of a display unit 1 , in the upper part of which a main view 5 of a music playback function can be recognized.
- an operating bar 4 in the form of a plurality of buttons 41 , 42 , 43 , 44 , 45 , 46 is arranged below the main view 5 .
- the buttons may be assigned to a source selection, a track selection, a return skip function, a pause/playback function, a “next track” function, as well as a set-up function.
- the hand 6 of a user performs a tap gesture T with respect to the sixth button 46 and thus launches the display of settings.
- the setting function may be a primary function that is assigned to the sixth button 46 .
- FIG. 3 shows the view shown in FIG. 2 under an illustrative embodiment, wherein the buttons 41 , 42 , 43 , 44 , 45 , 46 of FIG. 2 are displayed in a reduced level of detail depth in order to save space on the display. Accordingly, in this example, only the icons are displayed on the buttons 41 ′, 42 ′, 43 ′, 44 ′, 45 ′, 46 ′, as long as the user does not hold any input means in the approaching area of the operating unit.
- the user has moved his hand 6 to the approaching area of the operating unit, in response to which, the extended display of the buttons 41 , 42 , 43 , 44 , 45 , 46 , as introduced in FIG. 2 , is used again.
- the hand 6 of the user starts a swipe gesture oriented in the direction of the arrow P, beginning from the sixth button 46 in the direction of buttons 41 , 42 , 43 , 44 , 45 of lower order numbers.
- a volume control 7 is displayed, as shown in FIG. 6 .
- the user has moved the hand 6 , contacting the display 1 , to the left to decrease the volume.
- the volume control 7 is displayed instead of buttons 41 , 42 , 43 , 44 , 45 , 46 .
- the current volume 8 is illustrated by a jump in the contrast of the bar. This arises from the fact that, at the start of the swipe gesture, hand 6 of the user has not selected the position on the display 1 correlating with the current volume 8 .
- the user has set the desired current volume 8 .
- the offset between the position of his hand 6 and the current position 8 has remained constant in this case. Now, the user lifts his hand 6 away from the surface of the display 1 .
- the user has expressed his desire for an increased playback volume by operating the surface of the display 1 in the area of the volume control 7 through a tap gesture T; in response, an operating element 9 is displayed at the location of the current volume 8 , and both the operating element 9 and the current volume 8 are displayed according to the current position of the tap gesture T.
- the user agrees with the currently set volume 8 and, after the operating situation shown in FIG. 8 , lifts his hand 6 away from the contact and approaching area of the user interface according to the present disclosure.
- FIG. 9 shows the configuration illustrated in FIG. 8 after removal of the hand 6 of the user from the contact and approaching area of the user interface according to the present disclosure is complete.
- a timer (not shown) may determine the time which has passed since the hand has left the contact and approaching area.
- FIG. 10 shows the display illustrated in FIG. 9 after expiry of the timer.
- reduced buttons 41 ′, 42 ′, 43 ′, 44 ′, 45 ′, 46 ′ are again displayed on the operating bar 4 .
- a tap gesture in this area would again start primary functions of buttons 41 ′, 42 ′, 43 ′, 44 ′, 45 ′, 46 ′, rather than incrementally increasing the current volume value.
- FIG. 11 shows a flow chart illustrating steps of an exemplary embodiment of the present disclosure.
- Step 100 displays a plurality of buttons on the display unit of a user interface, while, in step 200 , a tap gesture is recognized in the area of a button of the plurality of buttons.
- a function associated with the button, which has been addressed by the tap gesture is triggered in step 300 .
- in step 400 , a swipe gesture in front of and/or on one of the buttons displayed is recognized.
- the volume of a current audio playback is controlled as a function of the recognized swipe gesture.
- a volume control is displayed on the display unit in step 600 .
- in step 700 , the expiry of the "inactivity timer" is recognized, in response to which, in step 800 , the volume control is again hidden. Hiding the volume control is accompanied by re-displaying (or fully displaying) the plurality of buttons on the display unit in step 900 .
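The numbered steps of FIG. 11 can be sketched as a single event handler; the step numbers appear as comments. Event names and the state layout are illustrative assumptions, not terms from the disclosure:

```python
def handle_event(state: dict, event: str, value: float = 0.0) -> dict:
    """Dispatch one recognized gesture or timer event against the UI state.

    State keys: 'volume' (0.0..1.0), 'slider_visible', 'last_action'.
    """
    state = dict(state)  # work on a copy; step 100 has displayed the buttons
    if event == "tap_on_button":
        # Steps 200-300: tap recognized, trigger the button's primary function.
        state["last_action"] = "primary_function"
    elif event == "swipe_on_buttons":
        # Steps 400-600: swipe recognized, control volume, show the control.
        state["volume"] = value
        state["slider_visible"] = True
        state["last_action"] = "volume_change"
    elif event == "inactivity_timeout":
        # Steps 700-900: timer expired, hide control, re-display buttons.
        state["slider_visible"] = False
        state["last_action"] = "buttons_redisplayed"
    return state
```

Keeping the handler pure (returning a new state dict) makes each step of the flowchart easy to test in isolation.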
Description
- The present application claims priority under 35 U.S.C. §371 to International PCT Application No. PCT/EP2014/051056 to Holger Wild et al., titled “User Interface and Method for Controlling a Volume by Means of a Touch-Sensitive Display Unit” filed Jan. 20, 2014, which is incorporated by reference in its entirety herein.
- The limited surface area of typical display elements requires an intelligent selection of the information and buttons associated with various functions. The solutions known in the prior art always require a volume control to be displayed before it can be operated.
- According to various illustrative embodiments, apparatus, systems, and methods are disclosed for controlling a volume, together with a user interface configured for carrying out the related functions.
- Various illustrative embodiments are described in detail below with reference to the accompanying drawings. In the drawings:
-
FIG. 1 shows a schematic view of a vehicle under an illustrative embodiment; -
FIG. 2 shows a simulated screen view on a display unit and an operating step under an illustrative embodiment; -
FIGS. 3 to 10 are operating steps in connection with a display unit under an illustrative embodiment; and -
FIG. 11 is a flowchart showing steps of an illustrative embodiment of a method according to the present disclosure. - The present disclosure is directed to recognizing a tap gesture in the area of a button among a plurality of buttons, and triggering a function associated with the button. As described herein, the buttons may be assigned to a primary function which is not related to the change in volume. Accordingly, a tapping of a button (for example, assigned to a pause function) may not correspond to the changing of the volume. However, a tap gesture on a function associated with a “next track” or “previous track” may naturally change or mute the volume. Under the present disclosure however, the volume function may be configured to change regardless of an interruption or alteration of the reproduced content.
- According to some illustrative embodiments, the “tap gesture” may include no movement or a negligible movement parallel to the surface of the display unit. Accordingly, the volume of a current playback may not change, and, instead, a function associated with the button may be triggered. In this example, interactions with a single button may launch different functionalities without any of them being specifically visualized on the button (e.g., by a symbol, “icon”). In this context, functions may be launched regardless of whether the tap gesture is performed in an area in front of the display unit or on (by contacting) the display unit itself.
- In some illustrative embodiments, a plurality of buttons may be configured to allow access to their respective primary functions via a tap gesture. Examples of primary functions include, but are not limited to, “Source selection”, “Track selection”, “Previous track”, “Pause/Play”, “Next track” and “Sound settings”. However, as music playback or, more generally, audio playback can take place regardless of the currently accessed menu item on the display unit, there may be situations involving displays on the display unit that have nothing to do with audio playback. In these menus, configurations of the present disclosure may be used for rapid and intuitive changes in volume, so that changing to an audio playback menu for volume changes is not needed. In this way, unnecessary operating steps for changing the volume may be avoided.
- In some illustrative embodiments, a volume control may be displayed in response to recognizing a swipe gesture. It may be displayed, for example, at a predefined location and/or below the input means used for carrying out the swipe gesture. The volume control therein is used to help the user orient himself with respect to the current relative volume range and allows additional input gestures for changing the volume.
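As one hedged sketch of displaying the volume control "below the input means", the slider could be centered under the finger position and clamped to the display edges; all geometry values here are assumptions for illustration:

```python
# Hypothetical placement of the faded-in volume control beneath the finger.
# Display and slider dimensions are illustrative assumptions.

DISPLAY_WIDTH = 800
SLIDER_WIDTH = 300

def slider_origin(finger_x):
    """Center the slider under the finger, clamped to the display edges."""
    left = finger_x - SLIDER_WIDTH // 2
    return max(0, min(left, DISPLAY_WIDTH - SLIDER_WIDTH))
```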
- If the volume control has been faded in, a tap gesture in the area of the volume control can, in some illustrative embodiments, control the volume as a function of the position of the tap gesture on the volume control. In other words, the volume setting may jump to a value linked with the position of the tap gesture. In this way, a further swipe gesture is no longer necessary for determining the volume setting. The aforementioned tap gesture for setting the volume can also be performed either in the approaching area or in contact with a surface of the display unit.
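The tap-to-jump behaviour amounts to a linear mapping from the tap's x-position inside the slider onto the volume range. A minimal sketch, with assumed geometry:

```python
# Map a tap position inside the (hypothetical) slider linearly onto the
# volume range; taps beyond the slider edges are clamped.

def volume_from_tap(tap_x, slider_left, slider_width, v_min=0.0, v_max=1.0):
    """Jump the volume to the value linked with the tap position."""
    frac = (tap_x - slider_left) / slider_width
    frac = max(0.0, min(1.0, frac))  # clamp taps at or beyond the edges
    return v_min + frac * (v_max - v_min)
```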
- In some illustrative embodiments, once the user has set a suitable volume, the user may terminate the input by removing the input means from the approaching area. After a predefined period of time after leaving the approaching area, and/or after a final volume-controlling action, the volume control may be hidden or faded out, respectively. In instances where the display of the volume control has replaced the plurality of buttons (or a sub-plurality of buttons), the (sub-)plurality of buttons may be displayed again on the display unit. If the volume control was only superimposed on the plurality of buttons in a partially transparent view, the plurality of buttons may re-appear after the lapse of the predefined time period. In this way, a further operating step for fading out the volume control can be omitted, whereby the user can, on the one hand, dedicate himself entirely to the task of driving while, on the other hand, the plurality of buttons, or the functions associated with them, can again be operated by the user with relative ease.
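The automatic fade-out can be modeled as a small state machine driven by an inactivity timer. The following is a sketch under assumed names and an assumed timeout; `time.monotonic()` merely stands in for whatever clock the platform provides:

```python
import time

# Hypothetical inactivity-timer logic: the volume control appears on a
# swipe and is hidden a predefined period after the input means has left
# the approaching area (at which point the buttons would be re-displayed).

class VolumeControlFade:
    TIMEOUT_S = 2.0  # predefined period; an illustrative assumption

    def __init__(self):
        self.visible = False
        self._left_at = None  # when the input means left the approaching area

    def on_swipe(self):
        self.visible = True
        self._left_at = None

    def on_leave_approach_area(self, now=None):
        self._left_at = time.monotonic() if now is None else now

    def tick(self, now=None):
        now = time.monotonic() if now is None else now
        if self.visible and self._left_at is not None \
                and now - self._left_at >= self.TIMEOUT_S:
            self.visible = False  # hide control; buttons re-appear here
        return self.visible
```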
- If the volume control has been faded in, a double click on the volume control can control the volume to a minimum value (“mute”). In this way, a situation-dependent suppression of audio playback can be achieved in a way that is fast, intuitive and easy. If, upon recognizing the double click, the current volume is already set to a minimum value, in response to recognizing the double-click in front of or on the display unit, the volume can return to a last set (e.g., non-minimum) value. In some illustrative embodiments, this process can be made dependent on whether the minimum value was selected by double-click or whether a swipe gesture occurred to set the minimum value. Particularly in cases when a double click had caused the minimum value, a further double click for overriding the mute function can be very intuitive and quick.
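The double-click mute behaviour, including the condition that un-muting is only offered when the minimum was reached via double-click rather than via a swipe, can be sketched as follows; all names are illustrative assumptions:

```python
# Hypothetical mute toggle: a double-click mutes and remembers the last
# volume; a second double-click restores it, but only if the minimum was
# reached by double-click (swiping to zero does not arm the restore).

class MuteToggle:
    def __init__(self, volume=0.5):
        self.volume = volume
        self._restore = None  # value to return to after un-muting

    def on_swipe_to(self, value):
        self.volume = value
        self._restore = None  # swipe-set minimum: no restore value kept

    def on_double_click(self):
        if self.volume > 0.0:
            self._restore = self.volume
            self.volume = 0.0             # mute
        elif self._restore is not None:
            self.volume = self._restore   # return to last set value
            self._restore = None
        return self.volume
```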
- Under some illustrative embodiments, a user interface with a display unit for displaying a plurality of buttons may be utilized. The display unit may, for example, be permanently installed in a vehicle. Such display units are often referred to as central information displays (CID). Alternatively, an instrument cluster may serve as the display unit or be included in the display unit, respectively. Of course, those skilled in the art will recognize that the present disclosure may be used independently of automotive applications. The user interface may further include an operating unit for gesture recognition, wherein the gestures can take place either in an approaching area in front of the display unit and/or in contact with a surface of the display unit. In some examples, it is only important that a gesture can be recognized as such and evaluated as to whether it has been carried out above the plurality of buttons or can be assigned to the plurality of buttons.
- Further, a control unit may be provided in the user interface, which sets up the user interface to perform functions described in various illustrative embodiments. The display unit may include a touch-sensitive surface in front of the display unit and/or an LED-based proximity sensor system. In particular, the LED-based proximity sensor system may include infrared LEDs to avoid blinding the user, yet be able to perform reliable gesture recognition. If the user interface is an electronic, portable end device, the display unit, the operating unit and the control unit may be housed in a common housing.
- In some illustrative embodiments, the plurality of buttons may have at least one function that is not associated with a music playback control. In other words, in addition to music playback, the user interface can also have other functional scopes to which menus or views displayable on the display unit are assigned. The views not assigned to music playback may also have pluralities of buttons which will perform the method according to the present disclosure upon detecting a swipe gesture input according to the invention.
- In some illustrative embodiments, a computer program product comprising instructions is proposed which, when executed by a programmable processor (e.g., of a user interface), may cause the processor to perform the steps of a method according to the present disclosure. In some illustrative embodiments, a vehicle comprising a user interface as described herein is disclosed.
-
FIG. 1 shows a vehicle 10 as a means of transport, in the dashboard of which there is arranged a display 1 as a display unit under an illustrative embodiment. In this example, the display 1 is operatively connected by information technology means to an electronic controller 3 as a control unit, which is also operatively connected by information technology means with an infrared LED strip 2 as operating unit. The operation according to the present disclosure is explained in conjunction with the following figures. -
FIG. 2 shows an illustration of a display unit 1, in the upper part of which a main view 5 of a music playback function can be recognized. In this example, an operating bar 4, in the form of a plurality of buttons 41, 42, 43, 44, 45, 46, is displayed below the main view 5. In ascending order, the buttons may be assigned to a source selection, a track selection, a return skip function, a pause/playback function, a “next track” function, as well as a set-up function. During use, the hand 6 of a user performs a tap gesture T with respect to the sixth button 46 and thus launches the display of settings. The setting function may be a primary function that is assigned to the sixth button 46. An operation of the operating bar 4 according to the invention will be explained in connection with the figures below. -
FIG. 3 shows the view shown in FIG. 2 under an illustrative embodiment, wherein the buttons of FIG. 2 are displayed in a reduced level of detail in order to save space on the display. Accordingly, in this example, only the icons are displayed on the buttons 41′, 42′, 43′, 44′, 45′, 46′, as long as the user does not hold any input means in the approaching area of the operating unit. - In the example of
FIG. 4, the user has moved his hand 6 into the approaching area of the operating unit, in response to which the extended display of the buttons, as shown in FIG. 2, is used again. - In the example of
FIG. 5, the hand 6 of the user starts a swipe gesture oriented in the direction of the arrow P, beginning from the sixth button 46 in the direction of the other buttons. In response, a volume control 7 is displayed, as shown in FIG. 6. - In the example of
FIG. 6, the user has moved the hand 6, contacting the display 1, to the left to decrease the volume. In response to the recognized swipe gesture P, the volume control 7 is displayed instead of the buttons. On the volume control 7, the current volume 8 is illustrated by a jump in the contrast of the bar. This arises from the fact that, at the start of the swipe gesture, the hand 6 of the user has not selected the position on the display 1 correlating with the current volume 8. - In the example of
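The behaviour seen in FIGS. 6 and 7, where the volume marker does not snap to the finger but instead keeps the offset captured at first contact, can be sketched as a relative adjustment; the names and coordinates are illustrative assumptions:

```python
# Hypothetical relative slider: the offset between the finger and the
# current-volume marker is captured at first contact and held constant
# while the finger moves, so the marker never jumps to the finger.

class RelativeSlider:
    def __init__(self, volume_x):
        self.volume_x = volume_x  # marker position of the current volume
        self._offset = None

    def on_contact(self, finger_x):
        # Remember the initial offset instead of snapping to the finger.
        self._offset = self.volume_x - finger_x

    def on_move(self, finger_x):
        self.volume_x = finger_x + self._offset
        return self.volume_x
```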
FIG. 7, the user has set the desired current volume 8. The offset between the position of his hand 6 and the current position 8 has remained constant in this case. Now, the user lifts his hand 6 away from the surface of the display 1. - In the example of
FIG. 8, the user has expressed his desire for an increased playback volume by operating the surface of the display 1 in the area of the volume control 7 with a tap gesture T; in response, an operating element 9 is displayed at the location of the current volume 8, and both the operating element 9 and the current volume 8 are displayed according to the current position of the tap gesture T. The user agrees with the currently set volume 8 and, after the operating situation shown in FIG. 8, lifts his hand 6 away from the contact and approaching area of the user interface according to the invention. - The example of
FIG. 9 shows the configuration illustrated in FIG. 8 after removal of the hand 6 of the user from the contact and approaching area of the user interface according to the present disclosure. A timer (not shown) may determine the time which has passed since the hand left the contact and approaching area. - The example of
FIG. 10 shows the display illustrated in FIG. 9 after expiry of the timer. Instead of the volume control 7, the reduced buttons 41′, 42′, 43′, 44′, 45′, 46′ are again displayed on the operating bar 4. A tap gesture in this area would again start the primary functions of the buttons 41′, 42′, 43′, 44′, 45′, 46′, rather than incrementally increasing the current volume value. -
FIG. 11 shows a flow chart illustrating steps of an exemplary embodiment of the present disclosure. In step 100, a plurality of buttons is displayed on the display unit of a user interface, while, in step 200, a tap gesture is recognized in the area of a button of the plurality of buttons. In response, in step 300, a function associated with the button addressed by the tap gesture is triggered. Then, in step 400, a swipe gesture in front of and/or on one of the displayed buttons is recognized. In response, in step 500, the volume of a current audio playback is controlled as a function of the recognized swipe gesture. Also in response to recognizing the swipe gesture (step 400), a volume control is displayed on the display unit in step 600. Subsequently, the user removes the input means from the approaching area of the user interface, which starts a so-called “inactivity timer”. In step 700, the expiry of the “inactivity timer” is recognized, in response to which, in step 800, the volume control is again hidden. Hiding the volume control is accompanied by re-displaying (or fully displaying) the plurality of buttons on the display unit in step 900. - Although aspects of the invention and advantageous embodiments have been described in detail by way of the exemplary embodiments with reference to the accompanying figures and drawings, modifications and combinations of features of the illustrated exemplary embodiments will be apparent to persons skilled in the art without departing from the scope of the present invention, which is defined by the appended claims.
-
- 1 display
- 2 infrared LED strip
- 3 electronic controller
- 4 operating bar
- 5 main view
- 6 the user's hand
- 7 volume control
- 8 current volume
- 9 operating element of the volume control
- 10 vehicle
- 41, 42, 43,
- 44, 45, 46 buttons in an extended display
- 41′, 42′, 43′,
- 44′, 45′, 46′ buttons in reduced display
- 100 to 900 steps of the method
- P swipe gesture
- T tap gesture
Claims (19)
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/EP2014/051056 WO2015106830A1 (en) | 2014-01-20 | 2014-01-20 | User interface and method for controlling a volume by means of a touch-sensitive display unit |
Publications (1)
Publication Number | Publication Date |
---|---|
US20170024119A1 true US20170024119A1 (en) | 2017-01-26 |
Family
ID=49998299
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/112,687 Pending US20170024119A1 (en) | 2014-01-20 | 2014-01-20 | User interface and method for controlling a volume by means of a touch-sensitive display unit |
Country Status (4)
Country | Link |
---|---|
US (1) | US20170024119A1 (en) |
EP (1) | EP3096969B1 (en) |
CN (1) | CN105916720B (en) |
WO (1) | WO2015106830A1 (en) |
Cited By (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10430063B2 (en) | 2017-09-27 | 2019-10-01 | Hyundai Motor Company | Input apparatus for vehicle having metal buttons and control method of the input apparatus |
CN113573936A (en) * | 2019-03-25 | 2021-10-29 | 大众汽车股份公司 | Method and device for detecting a parameter value in a vehicle |
EP4066069A4 (en) * | 2020-01-02 | 2023-02-01 | Universal Electronics Inc. | Universal voice assistant |
US11693531B2 (en) * | 2018-11-29 | 2023-07-04 | Beijing Bytedance Network Technology Co., Ltd. | Page display position jump method and apparatus, terminal device, and storage medium |
DE102022101807A1 (en) | 2022-01-26 | 2023-07-27 | Bayerische Motoren Werke Aktiengesellschaft | Method for adjusting an audio output in a vehicle |
US11756412B2 (en) | 2011-10-28 | 2023-09-12 | Universal Electronics Inc. | Systems and methods for associating services and/or devices with a voice assistant |
US11792185B2 (en) | 2019-01-08 | 2023-10-17 | Universal Electronics Inc. | Systems and methods for associating services and/or devices with a voice assistant |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112124231A (en) * | 2020-09-29 | 2020-12-25 | 广州小鹏汽车科技有限公司 | Vehicle interaction method and device |
Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20080084400A1 (en) * | 2006-10-10 | 2008-04-10 | Outland Research, Llc | Touch-gesture control of video media play on handheld media players |
US20080163053A1 (en) * | 2006-12-28 | 2008-07-03 | Samsung Electronics Co., Ltd. | Method to provide menu, using menu set and multimedia device using the same |
US20100127847A1 (en) * | 2008-10-07 | 2010-05-27 | Cisco Technology, Inc. | Virtual dashboard |
US20100328224A1 (en) * | 2009-06-25 | 2010-12-30 | Apple Inc. | Playback control using a touch interface |
US20110082619A1 (en) * | 2009-10-05 | 2011-04-07 | Tesla Motors, Inc. | Adaptive Soft Buttons for a Vehicle User Interface |
US20120032899A1 (en) * | 2009-02-09 | 2012-02-09 | Volkswagen Ag | Method for operating a motor vehicle having a touch screen |
US20120274550A1 (en) * | 2010-03-24 | 2012-11-01 | Robert Campbell | Gesture mapping for display device |
US20120306773A1 (en) * | 2011-05-31 | 2012-12-06 | Acer Incorporated | Touch control method and electronic apparatus |
Family Cites Families (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP1969452A2 (en) * | 2005-12-30 | 2008-09-17 | Apple Inc. | Portable electronic device with multi-touch input |
CN101529874A (en) * | 2006-09-06 | 2009-09-09 | 苹果公司 | Incoming telephone call management for a portable multifunction device with touch screen display |
US8564544B2 (en) * | 2006-09-06 | 2013-10-22 | Apple Inc. | Touch screen device, method, and graphical user interface for customizing display of content category icons |
DE102007039445A1 (en) | 2007-08-21 | 2009-02-26 | Volkswagen Ag | Method for displaying information in a motor vehicle for an operating state and a display state and display device |
US20120110517A1 (en) * | 2010-10-29 | 2012-05-03 | Honeywell International Inc. | Method and apparatus for gesture recognition |
US10198097B2 (en) * | 2011-04-26 | 2019-02-05 | Sentons Inc. | Detecting touch input force |
DE102011112565A1 (en) * | 2011-09-08 | 2013-03-14 | Daimler Ag | Method for operating control device of motor car, involves selecting operating mode by providing another operating mode of operating device such that contents are displayed on element corresponding to selected operating mode of device |
CN102841757B (en) * | 2012-08-31 | 2015-04-08 | 深圳雷柏科技股份有限公司 | Intelligent terminal based interactive interface system and implementation method thereof |
-
2014
- 2014-01-20 EP EP14700933.6A patent/EP3096969B1/en active Active
- 2014-01-20 US US15/112,687 patent/US20170024119A1/en active Pending
- 2014-01-20 CN CN201480073503.XA patent/CN105916720B/en active Active
- 2014-01-20 WO PCT/EP2014/051056 patent/WO2015106830A1/en active Application Filing
Patent Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20080084400A1 (en) * | 2006-10-10 | 2008-04-10 | Outland Research, Llc | Touch-gesture control of video media play on handheld media players |
US20080163053A1 (en) * | 2006-12-28 | 2008-07-03 | Samsung Electronics Co., Ltd. | Method to provide menu, using menu set and multimedia device using the same |
US20100127847A1 (en) * | 2008-10-07 | 2010-05-27 | Cisco Technology, Inc. | Virtual dashboard |
US20120032899A1 (en) * | 2009-02-09 | 2012-02-09 | Volkswagen Ag | Method for operating a motor vehicle having a touch screen |
US20100328224A1 (en) * | 2009-06-25 | 2010-12-30 | Apple Inc. | Playback control using a touch interface |
US20110082619A1 (en) * | 2009-10-05 | 2011-04-07 | Tesla Motors, Inc. | Adaptive Soft Buttons for a Vehicle User Interface |
US20120274550A1 (en) * | 2010-03-24 | 2012-11-01 | Robert Campbell | Gesture mapping for display device |
US20120306773A1 (en) * | 2011-05-31 | 2012-12-06 | Acer Incorporated | Touch control method and electronic apparatus |
Cited By (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11756412B2 (en) | 2011-10-28 | 2023-09-12 | Universal Electronics Inc. | Systems and methods for associating services and/or devices with a voice assistant |
US10430063B2 (en) | 2017-09-27 | 2019-10-01 | Hyundai Motor Company | Input apparatus for vehicle having metal buttons and control method of the input apparatus |
US11693531B2 (en) * | 2018-11-29 | 2023-07-04 | Beijing Bytedance Network Technology Co., Ltd. | Page display position jump method and apparatus, terminal device, and storage medium |
US11792185B2 (en) | 2019-01-08 | 2023-10-17 | Universal Electronics Inc. | Systems and methods for associating services and/or devices with a voice assistant |
CN113573936A (en) * | 2019-03-25 | 2021-10-29 | 大众汽车股份公司 | Method and device for detecting a parameter value in a vehicle |
EP4066069A4 (en) * | 2020-01-02 | 2023-02-01 | Universal Electronics Inc. | Universal voice assistant |
DE102022101807A1 (en) | 2022-01-26 | 2023-07-27 | Bayerische Motoren Werke Aktiengesellschaft | Method for adjusting an audio output in a vehicle |
Also Published As
Publication number | Publication date |
---|---|
CN105916720B (en) | 2019-06-14 |
EP3096969A1 (en) | 2016-11-30 |
CN105916720A (en) | 2016-08-31 |
WO2015106830A1 (en) | 2015-07-23 |
EP3096969B1 (en) | 2022-05-11 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20170024119A1 (en) | User interface and method for controlling a volume by means of a touch-sensitive display unit | |
US10642432B2 (en) | Information processing apparatus, information processing method, and program | |
US10366602B2 (en) | Interactive multi-touch remote control | |
KR101450231B1 (en) | Touch gestures for remote control operations | |
US8836648B2 (en) | Touch pull-in gesture | |
US9891782B2 (en) | Method and electronic device for providing user interface | |
EP2494697B1 (en) | Mobile device and method for providing user interface (ui) thereof | |
US9354780B2 (en) | Gesture-based selection and movement of objects | |
US20110134032A1 (en) | Method for controlling touch control module and electronic device thereof | |
US20120144299A1 (en) | Blind Navigation for Touch Interfaces | |
US11132119B2 (en) | User interface and method for adapting a view of a display unit | |
US20130227464A1 (en) | Screen change method of touch screen portable terminal and apparatus therefor | |
EP1969450A1 (en) | Mobile device and operation method control available for using touch and drag | |
JP2009536385A (en) | Multi-function key with scroll | |
US10146432B2 (en) | Method for operating an operator control device of a motor vehicle in different operator control modes, operator control device and motor vehicle | |
WO2015099731A1 (en) | Remote multi-touch control | |
US20150261432A1 (en) | Display control apparatus and method | |
KR101154137B1 (en) | User interface for controlling media using one finger gesture on touch pad | |
WO2018037738A1 (en) | Information processing device, program, and information processing system | |
KR20160018265A (en) | Method and apparatus of controlling display, and computer program for executing the method | |
JP5782821B2 (en) | Touch panel device and control method of touch panel device | |
KR20120004569A (en) | Apparatus and method of interface for mobile device, and recording medium for the same | |
US10437376B2 (en) | User interface and method for assisting a user in the operation of an operator control unit | |
EP2815295B1 (en) | Display and method in an electric device | |
US20140085540A1 (en) | Television and control device and method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: VOLKSWAGEN AKTIENGESELLSCHAFT, GERMANY Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:WILD, HOLGER;CZELNIK, MARK PETER;SIGNING DATES FROM 20161010 TO 20161012;REEL/FRAME:040081/0012 |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: FINAL REJECTION MAILED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: ADVISORY ACTION MAILED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: FINAL REJECTION MAILED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: RESPONSE AFTER FINAL ACTION FORWARDED TO EXAMINER |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: ADVISORY ACTION MAILED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: FINAL REJECTION MAILED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: RESPONSE AFTER FINAL ACTION FORWARDED TO EXAMINER |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: ADVISORY ACTION MAILED |
|
STCV | Information on status: appeal procedure |
Free format text: NOTICE OF APPEAL FILED |
|
STCV | Information on status: appeal procedure |
Free format text: APPEAL BRIEF (OR SUPPLEMENTAL BRIEF) ENTERED AND FORWARDED TO EXAMINER |
|
STCV | Information on status: appeal procedure |
Free format text: EXAMINER'S ANSWER TO APPEAL BRIEF MAILED |
|
STCV | Information on status: appeal procedure |
Free format text: ON APPEAL -- AWAITING DECISION BY THE BOARD OF APPEALS |
|
STCV | Information on status: appeal procedure |
Free format text: BOARD OF APPEALS DECISION RENDERED |
|
STCV | Information on status: appeal procedure |
Free format text: REQUEST RECONSIDERATION AFTER BOARD OF APPEALS DECISION |
|
STCV | Information on status: appeal procedure |
Free format text: BOARD OF APPEALS DECISION RENDERED AFTER REQUEST FOR RECONSIDERATION |