WO2020144138A1 - Interaction element, control element and motor vehicle - Google Patents


Info

Publication number
WO2020144138A1
WO2020144138A1 (PCT/EP2020/050140)
Authority
WO
WIPO (PCT)
Prior art keywords
interaction
electrode
surface layer
touch
input area
Prior art date
Application number
PCT/EP2020/050140
Other languages
French (fr)
Inventor
Mukesh Patel
Thomas Agung Nugraha
Original Assignee
Motherson Innovations Company Ltd.
Priority date
Filing date
Publication date
Application filed by Motherson Innovations Company Ltd. filed Critical Motherson Innovations Company Ltd.
Priority to US17/420,897 priority Critical patent/US11822723B2/en
Priority to EP20700190.0A priority patent/EP3908909A1/en
Publication of WO2020144138A1 publication Critical patent/WO2020144138A1/en

Classifications

    • G PHYSICS
      • G06 COMPUTING; CALCULATING OR COUNTING
        • G06F ELECTRIC DIGITAL DATA PROCESSING
          • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
            • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
              • G06F3/016 Input arrangements with force or tactile feedback as computer generated output to the user
              • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
                • G06F3/033 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
                  • G06F3/0354 … with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
                    • G06F3/03547 Touch pads, in which fingers can move on a surface
                • G06F3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
                  • G06F3/0412 Digitisers structurally integrated in a display
                  • G06F3/044 … by capacitive means
                    • G06F3/0445 … using two or more layers of sensing electrodes, e.g. using two layers of electrodes separated by a dielectric layer
              • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
                • G06F3/0487 … using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
                  • G06F3/0488 … using a touch-screen or digitiser, e.g. input of commands through traced gestures
                    • G06F3/04883 … for inputting data by handwriting, e.g. gesture or text
          • G06F2203/00 Indexing scheme relating to G06F3/00 - G06F3/048
            • G06F2203/033 Indexing scheme relating to G06F3/033
              • G06F2203/0339 Touch strips, e.g. orthogonal touch strips to control cursor movement or scrolling; single touch strip to adjust parameter or to implement a row of soft keys
    • B PERFORMING OPERATIONS; TRANSPORTING
      • B60 VEHICLES IN GENERAL
        • B60K ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
          • B60K35/00 Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
            • B60K35/10 Input arrangements, i.e. from user to vehicle, associated with vehicle functions or specially adapted therefor
            • B60K35/20 Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor
              • B60K35/25 … using haptic output
          • B60K2360/00 Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
            • B60K2360/143 Touch sensitive instrument input devices
              • B60K2360/1434 Touch panels
            • B60K2360/20 Optical features of instruments
              • B60K2360/33 Illumination features
                • B60K2360/336 Light guides
    • H ELECTRICITY
      • H03 ELECTRONIC CIRCUITRY
        • H03K PULSE TECHNIQUE
          • H03K17/00 Electronic switching or gating, i.e. not by contact-making and -breaking
            • H03K17/94 … characterised by the way in which the control signals are generated
              • H03K17/96 Touch switches
                • H03K17/962 Capacitive touch switches
                  • H03K17/9622 … using a plurality of detectors, e.g. keyboard
          • H03K2217/00 Indexing scheme related to electronic switching or gating, i.e. not by contact-making or -breaking covered by H03K17/00
            • H03K2217/94 … characterised by the way in which the control signal is generated
              • H03K2217/96 Touch switches
                • H03K2217/96062 Touch switches with tactile or haptic feedback
                • H03K2217/9607 Capacitive touch switches
                  • H03K2217/960785 Capacitive touch switches with illumination
                    • H03K2217/960795 … using organic light emitting devices, e.g. light emitting polymer [OEP] or OLED

Definitions

  • The present invention relates to an interaction element for receiving touch and/or gesture event inputs and/or providing tactile feedback outputs.
  • the invention also relates to a control element, for controlling the operation of at least one function of at least one device, comprising at least one or more such interaction elements.
  • the invention also relates to a motor vehicle comprising at least one or more such interaction elements and/or at least one or more such control elements.
  • Figure 1a shows a state-of-the-art panel 1 which might be used, for example, at a door panel armrest of a motor vehicle for window glass management.
  • the panel 1 has an array of buttons 3 which are examples of such multiple level input buttons and a button 5 which is a press lock button.
  • Each button 3 in Figure 1a usually provides four levels of input, i.e. pull up (e.g. for window roll up), pull up two stages (e.g. for window automatic roll up), push down (e.g. for window roll down) and push down two stages (e.g. for window automatic roll down).
  • An illustration of an exemplary state-of-the-art multiple level input button is shown in Figure 1b in isolation.
  • These kinds of buttons are susceptible to wear, and the user experience could also be better. Further, they have dedicated functions and cannot flexibly be used for other ones.
  • DE 10 2014 008 040 A1 discloses an interaction element for receiving touch and/or gesture event inputs and/or providing visual feedback outputs or tactile feedback outputs by vibration.
  • the interaction element comprises at least one OLED-Display, having at least one input area element adapted to receive touch and/or gesture event inputs issued by at least one finger when interacting with at least one part of at least one surface of the OLED-Display. Further, a depression is adapted to provide a visual feedback output.
  • Also known is a display device for receiving touch and/or gesture event inputs and/or providing tactile feedback outputs.
  • The display device comprises a touch surface element having an input area element to receive touch event inputs. Further, at least one output area element provides a tactile feedback output.
  • The invention aims at providing user input devices which allow user inputs to be received at multiple levels and which are at the same time easy and intuitive to operate, thereby increasing the safety of the driver and providing a better user experience.
  • high flexibility with respect to the assignment of functions controlled by the user input devices is demanded.
  • An interaction element for receiving touch and/or gesture event inputs and/or providing tactile feedback outputs, comprising at least one touch surface element, the touch surface element having (a) at least one input area element at least adapted to receive touch and/or gesture event inputs issued by at least one finger when interacting with at least one part of at least one surface of the input area element and/or the touch surface element; and (b) at least one output area element at least adapted to provide a tactile feedback output to the finger when interacting with at least one part of at least one surface of the output area element and/or the touch surface element; wherein the at least one input area element and the at least one output area element are formed as one.
  • the resulting interaction element therefore has at least one input area element and at least one output area element designed as one common element at least in certain areas and/or in sections.
  • At least one, preferably all, input area element(s) can, especially by means of the touch controller, interchangeably and/or simultaneously also be adapted to provide a tactile feedback to the finger when interacting with at least one part of at least one surface of the input area element and/or the touch surface element, especially by interchanging use of the first electrode(s) as second electrode(s).
  • the interaction element comprises a first plurality of input area elements, especially at least two, three, four, five, six, seven, eight, nine and/or ten input area elements, and a second plurality of output area elements, especially at least two, three, four, five, six, seven, eight, nine and/or ten output area elements, wherein preferably at least two neighboring input area elements, especially all of each two neighboring input area elements, are at least in certain areas and/or at least partly separated by each other by at least one of the second plurality of output area elements.
  • The input area element comprises at least one first surface layer element, especially comprising and/or representing at least one insulator layer, at least one first electrode and/or at least one first substrate element, preferably comprising at least one glass substrate. Especially, the first electrode is arranged directly or indirectly below the first surface layer element and/or the first substrate element is arranged directly or indirectly below the first electrode; especially the respective elements are arranged sandwich-like, with the first electrode being arranged at least in certain areas and/or at least partly between the first surface layer element and the first substrate element.
  • The output area element comprises at least one second surface layer element, especially comprising and/or representing at least one insulator layer, at least one second electrode and/or at least one second substrate element; especially the respective elements are arranged sandwich-like, with the second electrode being arranged at least in certain areas and/or at least partly between the second surface layer element and the second substrate element, i.e. the second electrode is arranged directly or indirectly below the second surface layer element and/or the second substrate element is arranged directly or indirectly below the second electrode.
  • the second substrate element may comprise at least one glass substrate.
  • the tactile feedback (a) represents a haptic pattern dependent on the gesture/touch event input or (b) comprises increasing the surface friction of at least one portion of at least one surface of the output area element, especially of the second surface layer element, when applying a voltage to at least one part of the output area element, especially to the second electrode.
  • the output area element comprises at least one ultrasonic actuator, especially a plurality of ultrasonic actuators, coupled to or being in operative connection to the second substrate element, the second electrode or the second surface layer element for building an air bearing layer adjacent to at least one portion of at least one surface of the output area element, especially of the second substrate element and/or second surface layer element, when the one or more ultrasonic actuator(s) is/are activated.
  • the first and/or second surface layer element comprises at least one conductor, preferably a plurality of conductors, preferably for capacitive coupling, especially the one or more conductor(s) are arranged at least in sections and/or at least partly (a) within the second surface layer element, (b) parallel to at least one surface, especially the top and/or bottom surface, of the second surface layer element and/or (c) parallel to each other.
  • At least some of the first and/or second surface layer elements, preferably all of the first and/or second surface layer elements, are formed as one common surface layer element at least partly and/or at least in certain areas;
  • at least some of the first and/or second substrate elements, preferably all of the first and/or second substrate elements, are formed as one common substrate element at least partly and/or at least in certain areas; and
  • at least some of the first and/or second electrodes, preferably all of the first and/or second electrodes are formed as one common electrode, especially as at least one segmented electrode, at least partly and/or at least in certain areas.
  • the output area element, especially the second surface layer element comprises and/or represents at least one edge, at least one bump, at least one protrusion, at least one recess and/or at least one detent, whereby especially the output area element, especially the second surface layer element, can be manufactured using a printing process, an injection molding process, a heat forming process and/or a grinding process. This mechanical haptics approach is taken if no surface haptics as specified before is used.
  • At least one, preferably all, output area element(s) can, especially by means of at least one touch controller, interchangeably and/or simultaneously also be adapted to receive touch and/or gesture events issued by at least one finger when interacting with at least one part of at least one surface of the output area element and/or the touch surface element, especially by interchanging use of the second electrode(s) as first electrode(s).
  • the interaction element further comprises at least one light guide, preferably for illuminating through the input area element(s) and/or output area element(s) from beneath, especially the light guide being arranged and/or extending at least in sections and/or in certain areas directly or indirectly below and/or parallel to the first and/or second electrode(s) and/or below and/or parallel to the first and/or second substrate element(s).
  • the interaction element further comprises (a) at least one light source, especially adapted for coupling light into the light guide, (b) at least one printed wire board, preferably the light source and/or at least one element of the group comprising first/second surface layer element, first/second electrode and first/second substrate element being mounted at least partly on the printed wire board, and/or (c) at least one tactile feedback device for generating a tactile feedback to the user interacting with the interaction element.
  • the interaction element is designed as a free form element, especially having at least in certain areas a curved surface and/or having at least one first area extending in at least one first plane and at least one second area extending in at least one second plane, preferably the second plane being angled with respect to the first plane.
  • The input area element is designed such that a touch state can be detected, especially such that at least a finger-on state can be distinguished from a finger-push state, preferably by means of evaluation of the capacitive change value of the input area element selected by a user during interaction of the user's finger and said input area element and/or by means of evaluation of the capacitive change value of input area elements adjacent and/or neighboring to the selected input area element during interaction of the user's finger and the selected input area element and the input area elements adjacent and/or neighboring to said input area element.
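  • The finger-on vs. finger-push evaluation described above can be sketched as follows. This is a minimal illustrative sketch, not the patent's implementation; all threshold values and function names are assumptions.

```python
# Hypothetical sketch of the touch-state evaluation: a light "finger-on"
# and a firmer "finger-push" are distinguished by the capacitive change of
# the selected input area element and of its neighboring elements.
# All thresholds are illustrative assumptions, not values from the patent.

FINGER_ON_THRESHOLD = 10      # minimum capacitance change for any touch (assumed)
FINGER_PUSH_THRESHOLD = 40    # larger change of the selected element on a push (assumed)
NEIGHBOR_PUSH_THRESHOLD = 15  # a push also spreads to adjacent elements (assumed)

def classify_touch_state(selected_delta, neighbor_deltas):
    """Return 'none', 'finger_on' or 'finger_push' from capacitance deltas."""
    if selected_delta < FINGER_ON_THRESHOLD:
        return "none"
    # A push flattens the fingertip: the selected element sees a larger
    # change AND the neighboring elements also register a noticeable change.
    if (selected_delta >= FINGER_PUSH_THRESHOLD
            and max(neighbor_deltas, default=0) >= NEIGHBOR_PUSH_THRESHOLD):
        return "finger_push"
    return "finger_on"

print(classify_touch_state(5, [1, 2]))     # below threshold -> none
print(classify_touch_state(20, [3, 4]))    # resting finger -> finger_on
print(classify_touch_state(55, [18, 20]))  # firm press -> finger_push
```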
  • the invention solves the problem according to a second aspect by a control element, for controlling the operation of at least one function of at least one device, comprising at least one or more, especially different, interaction element(s) of the first aspect of the invention.
  • the invention solves the problem according to a third aspect by a motor vehicle comprising at least one or more, especially different, interaction element(s) of the first aspect of the invention and/or at least one or more, especially different, control element(s) of the second aspect of the invention.
  • By providing an interaction element comprising at least one touch surface element, it becomes possible to overcome the restrictions that come along with the design of a conventional physical input device such as a button and to allow multiple levels of input, such as single tap, multiple tap, finger rest, short touch, long touch, swipe and the like, as well as multiple methods which are controlled by the interaction element using, respectively, the same area of touch of the touch surface element (i.e. using the same touch surface element).
  • Controlling different methods is eased by the interaction element, since there is no longer the need to hard-wire the interaction element to a certain functionality; the functionality can be changed even on the fly.
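  • The multiple input levels named above (single tap, multiple tap, short touch, long touch, swipe) could, for instance, be derived from basic touch measurements. A minimal sketch under assumed thresholds and an assumed event model, not the patent's method:

```python
# Hypothetical derivation of input levels from one touch episode.
# Thresholds and function names are illustrative assumptions.

SWIPE_MIN_TRAVEL_MM = 8.0  # assumed minimum finger travel for a swipe
LONG_TOUCH_MIN_S = 0.8     # assumed minimum duration of a long touch
TAP_MAX_S = 0.25           # assumed maximum duration of a tap

def classify_gesture(duration_s, travel_mm, tap_count=1):
    """Map one touch episode (duration, travel, tap count) to an input level."""
    if travel_mm >= SWIPE_MIN_TRAVEL_MM:
        return "swipe"
    if duration_s <= TAP_MAX_S:
        return "multiple_tap" if tap_count > 1 else "single_tap"
    if duration_s >= LONG_TOUCH_MIN_S:
        return "long_touch"
    return "short_touch"

print(classify_gesture(0.1, 0.5))               # single_tap
print(classify_gesture(0.1, 0.5, tap_count=2))  # multiple_tap
print(classify_gesture(1.2, 1.0))               # long_touch
print(classify_gesture(0.3, 12.0))              # swipe
```

  Because the classification is pure software, the same touch surface can be remapped to different functions without hardware changes, which is the flexibility the invention aims at.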
  • The touch surface element allows providing both user input ability and tactile feedback output ability.
  • Certain areas (especially other than that of the touch input) of the touch surface element can provide a tactile feedback output to the finger when interacting with at least one part of at least one surface of the output area element and/or the touch surface element. Therefore, respective input area elements and output area elements are comprised by the touch surface element.
  • the touch surface element which is presented to the user may have one or more input areas and/or one or more output areas, the respective areas belonging to the respective input area elements and output area elements.
  • For example, two neighboring input area elements are separated from each other by one output area element. This allows the user to "feel" with his finger a borderline between the input area elements when interacting with the interaction element.
  • The inventors have found that a user input can be received by using an input area element which basically evaluates the change of capacitance by means of a first electrode which is provided below the first surface layer element of the input area element.
  • the input area element might also comprise further elements such as a first substrate element.
  • A tactile feedback can preferably be produced using the coulomb force, i.e. an electrostatic approach, also known as electro-adhesion. Therefore the output area element might function according to a first surface haptic approach as illustrated in Figure 2a.
  • This approach uses a surface layer element 7 (being especially an insulator layer) and an electrode 9, especially the surface layer element 7 being arranged above the electrode 9.
  • Preferably the surface layer element (i.e. the insulator) is thin, especially less than 3 micrometers; this is regularly preferred in order to have a sufficient coulomb force. If a user now touches the surface layer element 7 (i.e. the insulator), the finger 11 of the user becomes attracted due to the coulomb force once an electrostatic field is applied, e.g. by applying a voltage to the electrode 9.
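  • The preference for a very thin insulator can be motivated by the parallel-plate estimate of the electrostatic attraction between finger and electrode. This is a textbook approximation, not a formula from the patent:

```latex
% Parallel-plate estimate of the coulomb (electro-adhesive) force on the
% fingertip: contact area A, applied voltage V, insulator thickness d,
% permittivity \varepsilon_0 \varepsilon_r of the surface layer element.
F \approx \frac{\varepsilon_0 \, \varepsilon_r \, A \, V^2}{2\, d^2}
```

  Since the force scales with $1/d^2$, halving the insulator thickness roughly quadruples the attainable attraction for the same voltage, which is consistent with preferring a surface layer element thinner than 3 micrometers.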
  • The electrostatic induction is effective only very close to the (outer) surface, while the (outer) surface is of a certain thickness.
  • Therefore, preferably one or more conductors for capacitive coupling are integrated within the surface layer (e.g. the surface layer element), so that electrostatic coupling across multiple conductive layers is present. This allows the thickness of the outer layer to be increased without adversely affecting the feedback mechanism. This will be described in more detail later with reference to Figure 8 below.
  • The inventors have also found that a tactile feedback can be produced using ultrasonics, also known as electro-vibration.
  • the output area element might function alternatively and/or in addition according to a second surface haptic approach as illustrated in Figure 2b.
  • the output area element comprises at least one (preferably more than one) ultrasonic actuator (not shown in Figure 2b), coupled to or being in operative connection to a substrate element 15 and/or a surface layer element.
  • the actuators are placed, e.g.
  • Forming some or all elements of the input area elements and/or the output area elements as one common element (which might or might not be identical to the touch surface element) leads to an improved design since stability is increased, manufacturing costs are reduced and a clean design is achieved.
  • The roles of an area element (i.e. input area element and output area element) can be interchanged electronically, i.e. without physical reconfiguration. This allows for an improved utilization of the interaction element, since input and output areas can be arranged as required in the respective situation.
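  • The electronic interchange of input and output roles can be sketched as follows. The class and method names are assumptions for illustration only; the patent does not specify an API.

```python
# Illustrative sketch (not from the patent) of interchanging an area element
# between input (sensing) and output (haptic) use via the touch controller:
# the same electrode is either read for capacitance changes or driven with
# a haptic voltage pattern, with no physical reconfiguration.

class AreaElement:
    def __init__(self, name):
        self.name = name
        self.role = "input"  # every element starts as a sensing element

class TouchController:
    def __init__(self, elements):
        self.elements = {e.name: e for e in elements}

    def set_role(self, name, role):
        # Purely electronic reconfiguration of the electrode's use.
        assert role in ("input", "output")
        self.elements[name].role = role

    def service(self, name):
        e = self.elements[name]
        if e.role == "input":
            return f"sense capacitance on electrode of {e.name}"
        return f"drive haptic voltage pattern on electrode of {e.name}"

ctrl = TouchController([AreaElement("pad_1"), AreaElement("pad_2")])
print(ctrl.service("pad_1"))      # sensing by default
ctrl.set_role("pad_1", "output")  # no physical change required
print(ctrl.service("pad_1"))      # now provides tactile feedback
```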
  • Providing preferably a light guide and/or a light source allows illuminating the input/output areas from behind, improving security and convenience.
  • Separating at least two input area elements from each other by means of at least one output area element is preferred, since the user can operate without looking at the input area elements directly. This increases safety. It is also possible to provide a permanent tactile feedback mechanism, such as by means of bumps, protrusions, recesses, detents and the like. These mechanisms do not depend on energization and are hence even more fail-safe. It has further been found that the components used for inputs and outputs, such as the touch surface element, can be used in conjunction with integrated displays or even on other surfaces such as wood, plastic, leather, chrome and the like, i.e. basically every surface can be prepared for being used as an input and/or output device.
  • touch sensitive areas can be placed in three dimensions around several sides of a device or a button, which makes all of those surfaces ready to be used for user inputs and/or outputs.
  • the interaction element especially the touch surface element, can be either placed as a separate layer underneath any surface or directly on-molded in the actual surface of an input device like a button.
  • The outer contour of the interaction element's and/or the control element's surface might have textures or defined patterns for a natural 3D feeling.
  • This feedback can be provided in several forms: surface vibration, taptic/haptic feedback to the contact surface, e.g. from an integrated feedback mechanism, acoustic confirmation, visual confirmation, e.g. with graphics or lighting, and the like.
  • the interaction element could be designed as being invisible to the user upon activation.
  • The outer surface of the interaction element can perform a "hidden till lit" function until required to show the presence of a user input area (i.e. a button) or a confirmation of an event. This way the interaction element can be hidden in a non-intrusive and aesthetic manner in a control element, hence in a motor vehicle.
  • the interaction element does not require any mechanical movement of any external components for receiving inputs and providing outputs, i.e. feedbacks.
  • The moving parts are away from the user. This leads to a quite clean design and, hence, provides convenient operation.
  • the design of such an interaction element allows the integration of the input and output functionality (along with the touch surface element) in one single element, especially in one single free form element. Consequently, the control element, too, can be designed as a free form element. In both cases, the creation of any design shape as well as multidirectional cutting and in-molding manufacturing processes for circuit elements are preferably possible.
  • the proposed scheme makes it possible to provide multiple levels of input when the input components are arranged in the required 3D positions. There is no need for mechanical movement of any external part to accomplish this task.
  • the same input area can be used for different functions altogether, also with multiple levels of input. This creates the opportunity to remove several exclusively dedicated buttons and to bring their functionalities to a single button or an array of buttons, resulting in cost savings as well as reduced effort in production and integration.
  • Figure 1a shows an array of buttons according to the state of the art
  • Figure 1b shows a button according to the state of the art
  • Figure 2a shows an illustration of the first surface haptic approach
  • Figure 2b shows an illustration of the second surface haptic approach
  • Figure 3 shows an illustration of an interaction element according to the first aspect of the invention in a first embodiment
  • Figure 4a shows an illustration of a first gesture/touch event as input to the interaction element of Figure 3;
  • Figure 4b shows an illustration of a first instant of the gesture/touch event of Figure 4a;
  • Figure 4c shows an illustration of a second instant of the gesture/touch event of Figure 4a along with a first haptic pattern;
  • Figure 4d shows an illustration of a third instant of the gesture/touch event of Figure 4a;
  • Figure 5a shows an illustration of a second gesture/touch event as input to the interaction element of Figure 3;
  • Figure 5b shows an illustration of a first instant of the gesture/touch event of Figure 5a;
  • Figure 5c shows an illustration of a second instant of the gesture/touch event of Figure 5a along with a second haptic pattern;
  • Figure 5d shows an illustration of a third instant of the gesture/touch event of Figure 5a;
  • Figure 6a shows an illustration of an interaction element according to the first aspect of the invention in a second embodiment;
  • Figure 6b shows an illustration of an interaction element according to the first aspect of the invention in a third embodiment;
  • Figure 6c shows an illustration of an interaction element according to the first aspect of the invention in a fourth embodiment;
  • Figure 7a shows an illustration of an interaction element according to the first aspect of the invention in a fifth embodiment;
  • Figure 7b shows an illustration of the different effective areas of the interaction element of Figure 7a;
  • Figure 8 shows an illustration of an interaction element according to the first aspect of the invention in a sixth embodiment;
  • Figure 9a shows an illustration of an interaction element according to the first aspect of the invention in a seventh embodiment;
  • Figure 9b shows an illustration of the interaction element of Figure 9a in a first operation state;
  • Figure 9c shows an illustration of the different effective areas of the interaction element of Figure 9b;
  • Figure 9d shows an illustration of the interaction element of Figure 9a in a second operation state;
  • Figure 9e shows an illustration of the different effective areas of the interaction element of Figure 9d;
  • Figure 10 shows an illustration of an interaction element according to the first aspect of the invention in an eighth embodiment;
  • Figures 11a-c show illustrations of further realizations of an interaction element according to the first aspect of the invention;
  • Figures 12a-c show illustrations of a sensor matrix and detection states;
  • Figures 13a-f show illustrations of detection states;
  • Figure 14a shows an illustration of an interaction element according to the first aspect of the invention in a ninth embodiment in a first variant;
  • Figure 14b shows an illustration of an interaction element according to the first aspect of the invention in a ninth embodiment in a second variant;
  • Figure 14c shows an illustration of an interaction element according to the first aspect of the invention in a tenth embodiment;
  • Figure 15a shows an illustration of a control element according to the second aspect of the invention in a first embodiment;
  • Figure 15b shows an illustration of a control element according to the second aspect of the invention in a second embodiment.
  • Figure 3 shows an illustration of an interaction element 101 according to the first aspect of the invention in a first embodiment.
  • the interaction element 101 comprises a touch surface element 103.
  • the touch surface element 103 has two input area elements 105 which are, respectively, adapted to receive touch and/or gesture events inputs issued by a finger when interacting with at least one part of at least one surface of the input area element 105.
  • the touch surface element 103 has further one output area element 107 which is adapted to provide a tactile feedback output to the finger when interacting with at least one part of at least one surface of the output area element 107.
  • the input area elements 105 and output area element 107 are formed as one at least in certain areas, so that the haptic feedback area (i.e. output area element 107) is directly integrated in the touch surface element 103 along with the input area element 105.
  • the interaction element 101 provides edge detection for rolling over multiple surfaces by using a touch sensor (i.e. interaction element 101) with a haptic/tactile feedback.
  • the output area element 107 is an extended area as indicated by the dark area 107.
  • Alternatively, the output area element might be established only by the edge between both input area elements 105, i.e. the two input area elements 105 are separated from each other by the edge acting as the output area element 107.
  • Figure 4a shows an illustration of a first gesture/touch event as input to the interaction element 101 of Figure 3, the gesture/touch event being a "slide up".
  • the finger swipes across the input area element 105 (as shown in Figure 4b which shows an illustration of a first instant of the gesture/touch event of Figure 4a), the output area element 107 (as shown in Figure 4c which shows an illustration of a second instant of the gesture/touch event of Figure 4a) and the other input area element 105 (as shown in Figure 4d which shows an illustration of a third instant of the gesture/touch event of Figure 4a).
  • the finger 109 is pushed as illustrated by the arrow indicated with a "2 in a circle" in Figure 4a.
  • a first haptic pattern 111 is shown which illustrates the haptic/tactile effect which the finger 109 experiences while moving along the edge designed as output area element 107. It feels like many closely spaced successive peaks.
  • the haptic pattern 111 is only for the purpose of illustration and other haptic patterns might be possible as well.
  • In the first instance of the gesture/touch event it might be possible to locate the area of first touch and/or to measure the distance to the haptic feedback area (i.e. the output area element 107).
  • In the second instance of the gesture/touch event it might be possible to trace the direction of the slide and/or to enable the feedback pattern when the finger crosses the haptic feedback area (i.e. the output area element 107).
  • In addition, lighting might be activated as a secondary confirmation.
  • In the third instance of the gesture/touch event it might be possible to perform a function based on the gesture, especially once the finger 109 rests in the final area (i.e. the input area element 105).
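The three-instance evaluation of such a swipe (locate the first touch, trace the slide and fire the feedback when the edge is crossed, then derive the command once the finger comes to rest) can be sketched as follows. This is only an illustrative Python sketch: the function and variable names, the coordinate convention (y decreasing towards the top of the surface) and the returned structure are assumptions, not part of the embodiment.

```python
# Hypothetical sketch of the three-instance gesture evaluation; names and
# conventions are illustrative assumptions only.

def process_touch_trace(trace, output_area):
    """Evaluate a swipe across two input areas separated by an output area.

    trace       -- ordered list of (x, y) touch coordinates
    output_area -- predicate telling whether a coordinate lies on the
                   haptic feedback area (the output area element 107)
    """
    if not trace:
        return None

    # First instance: locate the area of first touch.
    start = trace[0]

    # Second instance: follow the slide and enable the haptic feedback
    # pattern when the finger crosses the feedback area.
    feedback_fired = False
    for point in trace[1:]:
        if not feedback_fired and output_area(point):
            feedback_fired = True  # e.g. drive the second electrode here

    # Third instance: derive the command once the finger rests in the
    # final input area (here simply from the overall slide direction,
    # assuming y decreases towards the top of the surface).
    end = trace[-1]
    direction = "up" if end[1] < start[1] else "down"
    return {"direction": direction, "crossed_feedback_area": feedback_fired}
```

For a "slide up" trace crossing the edge, the sketch reports the direction and that the feedback area was crossed; a real touch controller would additionally debounce and time-stamp the samples.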
  • Figure 5a shows an illustration of a second gesture/touch event as input to the interaction element 101 of Figure 3, the gesture/touch event being a "slide down".
  • the finger 109 swipes down for the purpose of issuing a "roll down" command of a window controller, as indicated by the dotted-dashed line in Figure 5a which is indicated with a "1 in a circle".
  • the finger swipes across the input area element 105 (as shown in Figure 5b which shows an illustration of a first instant of the gesture/touch event of Figure 5a), the output area element 107 (as shown in Figure 5c which shows an illustration of a second instant of the gesture/touch event of Figure 5a) and the other input area element 105 (as shown in Figure 5d which shows an illustration of a third instant of the gesture/touch event of Figure 5a).
  • a second haptic pattern 113 is shown which illustrates the haptic/tactile effect which the finger 109 experiences while moving along the edge designed as output area element 107. It feels like many successive peaks which are more widely spaced (compared to the first haptic pattern 111).
  • the second haptic pattern 113 is only for the purpose of illustration and other haptic patterns might be possible as well.
  • In the first instance of the gesture/touch event it might be possible to locate the area of first touch and/or to measure the distance to the haptic feedback area (i.e. the output area element 107).
  • In the second instance of the gesture/touch event it might be possible to trace the direction of the slide and/or to enable the feedback pattern when the finger crosses the haptic feedback area (i.e. the output area element 107).
  • In addition, lighting might be activated as a secondary confirmation.
  • In the third instance of the gesture/touch event it might be possible to perform a function based on the gesture, especially once the finger 109 rests in the final area (i.e. the input area element 105).
  • the input area element(s) and the output area element(s) are formed as one in the same touch surface element, e.g. the same touch sensor layer.
  • Figure 6a shows an illustration of an interaction element according to the first aspect of the invention in a second embodiment.
  • Features which functionally correspond as far as possible to those of the first embodiment of interaction element 101 are provided with the same reference signs, however, single dashed. Since the functionality of the second embodiment of interaction element 101' largely corresponds to the first embodiment of the interaction element 101, only differences between the first and second embodiments are discussed below. And besides, the explanations given above apply for the second embodiment and the respective Figure accordingly.
  • Interaction element 101' has a flat design in contrast to the curved design of interaction element 101.
  • the edge detection method is equally applicable on 3D as well as flat surfaces. There is no additional change in the method due to the change in geometry.
  • Figure 6b shows an illustration of an interaction element according to the first aspect of the invention in a third embodiment.
  • Features which functionally correspond as far as possible to those of the first embodiment of interaction element 101 and second embodiment of interaction element 101' are provided with the same reference signs, however, doubled dashed. Since the functionality of the third embodiment of interaction element 101'' largely corresponds to the first embodiment of the interaction element 101 and the second embodiment of the interaction element 101', only differences between the third and first and second embodiments are discussed below. And besides, the explanations given above apply for the third embodiment and the respective Figure accordingly.
  • Interaction element 101'' has a free form design which is a more advanced design than that of interaction element 101 and interaction element 101'. Here, a natural feeling of edge detection is possible.
  • Figure 6c shows an illustration of an interaction element according to the first aspect of the invention in a fourth embodiment.
  • Features which functionally correspond as far as possible to those of the first embodiment of interaction element 101, the second embodiment of interaction element 101' and the third embodiment of interaction element 101'' are provided with the same reference signs, however, triple dashed. Since the functionality of the fourth embodiment of interaction element 101''' largely corresponds to the first embodiment of the interaction element 101, the second embodiment of the interaction element 101' and the third embodiment of interaction element 101'', only differences between the fourth and first, second and third embodiments are discussed below. And besides, the explanations given above apply for the fourth embodiment and the respective Figure accordingly.
  • Interaction element 101''' is especially similar to interaction element 101.
  • Interaction element 101''' further comprises a light guide 115''' for illuminating through the input area elements 105''' and the output area element 107''' from beneath.
  • The interaction element 101''' further comprises a light source 117''' adapted for coupling light into the light guide 115''', optionally at least one printed wire board 119''', the light source 117''' being mounted at least partly on the printed wire board 119''', and optionally a tactile feedback device 121''' for generating a tactile feedback to the user interacting with the interaction element 101'''.
  • Hence, a further feedback mechanism is provided in addition to the output area element 107'''.
  • The tactile feedback device 121''' might, for example, provide feedback prompting the success or failure of some operational command after completion of the touch/gesture input of the user.
  • Figure 7a shows an illustration of an interaction element 201 according to the first aspect of the invention in a fifth embodiment.
  • In this respect, interaction element 201 has static borderlines, i.e. the positions of the borderlines are fixed and do not move.
  • Interaction element 201 comprises a touch surface element having a first plurality (i.e. six) of input area elements 203 and a second plurality (i.e. five) of output area elements 205.
  • Each input area element 203 comprises a first electrode 207 and each output area element 205 comprises a second electrode 209.
  • the first electrodes 207 are for sensing (i.e. receiving touch/gesture inputs) and the second electrodes 209 are for providing tactile feedback outputs.
  • each input area element 203 and each output area element 205 comprise, respectively, a first and a second surface layer element. All surface layer elements are formed as one common surface layer element 211.
  • Figure 7a shows more details of an interaction element than the previous Figures.
  • that the input area elements 203 and the output area elements 205 are formed as one in certain areas is to be understood here with respect to the first and second surface layer elements, which are presented as the common surface layer element 211.
  • Figure 7b shows an illustration of the different effective areas of the interaction element 201 of Figure 7a, i.e. the presentation of the interaction element 201 from a user's view.
  • each touch button 213 has a dedicated electrode 207 for touch sensing and each virtual borderline 215 (i.e. where the finger feels the borderline) has a dedicated electrode 209 to induce the electrostatic field.
  • Figure 8 shows an illustration of an interaction element 217 according to the first aspect of the invention in a sixth embodiment. Essentially it is based on the interaction element 201 and develops it further, hence, features which functionally correspond as far as possible to those of the fifth embodiment of interaction element 201 are provided with the same reference signs.
  • the second surface layer element comprises a plurality of conductors 219 for capacitive coupling.
  • the conductors 219 are arranged within the second surface layer element 211, parallel to the top surface of the second surface layer element 211 and parallel to each other.
  • this solution approach relies on electrostatic coupling via the multiple conductive layers 219.
  • Figure 9a shows an illustration of an interaction element 301 according to the first aspect of the invention in a seventh embodiment.
  • In contrast, interaction element 301 provides dynamic borderlines, where the positions of the borderlines may change depending on external and/or internal conditions such as display content, physical location of the interaction element, history of touch/gesture events and/or history of commands issued by user inputs, and the like.
  • Interaction element 301 comprises a segmented electrode 302a where input and output areas can be interchangeably controlled, especially by means of a touch controller.
  • each electrode can be used as both, first electrode and second electrode, dependent on the configuration, hence, the distribution of input area elements and output area elements can be chosen nearly arbitrarily in interaction element 301.
  • a common surface layer element 302b is present (see also description with respect to Figure 7a and element 211 above for details which apply here mutatis mutandis, too), especially as part of the touch surface element.
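The interchangeable use of the segments of electrode 302a as first (sensing) or second (feedback) electrodes might be sketched as follows. The class and method names are purely illustrative assumptions, not part of the embodiment; the example configuration mirrors the first operation state described below (six input areas of three segments each, separated by five single-segment borderlines).

```python
# Illustrative sketch (names assumed) of how a touch controller might
# reassign the roles of a segmented electrode such as 302a.

class SegmentedElectrode:
    def __init__(self, n_segments):
        # By default every segment works as a first electrode (sensing).
        self.roles = ["input"] * n_segments

    def configure(self, output_segments):
        """Mark the given segment indices as feedback (second) electrodes;
        all remaining segments keep working as sensing (first) electrodes."""
        chosen = set(output_segments)
        self.roles = [
            "output" if i in chosen else "input"
            for i in range(len(self.roles))
        ]

    def borderlines(self):
        """Indices currently acting as virtual borderlines (outputs)."""
        return [i for i, r in enumerate(self.roles) if r == "output"]

# Example configuration: 23 segments, every fourth one a borderline,
# giving six input areas of three segments and five output areas.
electrode = SegmentedElectrode(23)
electrode.configure(range(3, 23, 4))
```

Switching to a different operation state is then merely a matter of calling `configure` with another index set, with no mechanical change to the electrode itself.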
  • Figure 9b shows an illustration of the interaction element 301 of Figure 9a in a first operation state.
  • interaction element 301 comprises a touch surface element having a first plurality (i.e. six) of input area elements 303 and a second plurality (i.e. five) of output area elements 305.
  • Each input area element 303 comprises three electrodes working as first electrodes and each output area element 305 comprises one electrode working as second electrode.
  • Figure 9c shows an illustration of the different effective areas of the interaction element 301 of Figure 9b, i.e. the presentation of the interaction element 301 from a user's view.
  • interaction element 201 with static borderlines is quite similar to the interaction element 301 in the first operation state.
  • the number of electrodes may vary. Nevertheless, to a wide extent the explanations given above with respect to interaction element 201 apply mutatis mutandis also to the interaction element 301 in the first operation state.
  • Figure 9d shows an illustration of the interaction element 301 of Figure 9a in a second operation state.
  • interaction element 301 comprises a first plurality (i.e. five) of input area elements 303 and a second plurality (i.e. four) of output area elements 305.
  • Each input area element 303 comprises four or three electrodes working as first electrodes and each output area element 305 comprises one electrode working as second electrode.
  • Figure 9e shows an illustration of the different effective areas of the interaction element 301 of Figure 9d, i.e. the presentation of the interaction element 301 from a user's view.
  • the explanations given above with respect to Figure 9c apply here mutatis mutandis, too, and therefore need not be repeated here.
  • Figure 10 shows an illustration of an interaction element 401 according to the first aspect of the invention in an eighth embodiment.
  • the interaction element 401 comprises a touch surface element having a common surface layer element 403 and a plurality of first and second electrodes 405 arranged below the surface layer element 403, hence, realizing respective input area elements and output area elements.
  • the electrodes are realized as transparent conductors.
  • interaction element 401 comprises a light guide 407 and a light source 409 which couples light into the light guide 407.
  • the lighting segment approach may be taken to improve sensory feedback on top of the haptic feedback of the respective button (i.e. output area element).
  • Figures 11a-c show further different realizations of an interaction element according to the first aspect of the invention. In particular, it can be taken from these examples that a lighting segment approach may be used to improve sensory feedback on top of the haptic feedback of the respective button (i.e. output area element).
  • the output area elements are highlighted by rectangles.
  • Figures 11b and 11c illustrate by way of example the possibility of designing the input/output elements free-form. All realizations have in common that they are designed as one with respect to a common surface, i.e. for example with respect to the first and/or second surface layer elements.
  • Figure 12a shows an illustration of a sensor matrix.
  • Figure 12a shows a segmented representation of the surface of the interaction element and/or the touch surface element of Figures 3, 4a-d, 5a-d and 6c.
  • the principle applies mutatis mutandis to any other surface geometry.
  • the surface is segmented into 18 rectangles addressed by columns y1 through y3 and rows x1 through x6. The segmentation might be achieved, for example, by a respective arrangement of first electrodes (for input) or second electrodes (for output).
  • a finger is touching the surface within the rectangle with address (x2; y2), see Figure 12a.
  • the touch controller needs to distinguish at least, for example, between the "finger-on" state and the "finger-push" state. This might be possible by evaluating the capacitive change ("DELTA C") value monitored during "finger-on" to detect "finger-push".
  • Figure 12c shows respective detection states. The content of this Figure might especially be read in conjunction with the gesture/touch event described with respect to Figures 4a-d and Figures 5a-d above.
  • detection and evaluation can also incorporate both the capacitive change ("DELTA C") on the target sensor and the relative sensor value change on neighboring sensors.
  • DELTA C: capacitive change
  • Figures 13a-f show different plots of detection states of the segmented matrix of Figures 12a-b for "no touch" (i.e. baseline; Figures 13a-b), for "finger-on" (Figures 13c-d) and for "finger-push" (Figures 13e-f) in, respectively, a perspective view and a top view. It can be taken from these plots (see Figures 13c-f) that there is a change in both the peak value and the surrounding sensor values corresponding to a "finger-on" event and a "finger-push" event. The content of these plots might especially be read in conjunction with the gesture/touch events described with respect to Figures 4a-d and Figures 5a-d above.
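A minimal sketch of such a combined evaluation of the DELTA C value on the target sensor and on its neighbors might look as follows. All threshold values and names are illustrative placeholders; a real touch controller would calibrate them against the baseline of the specific sensor matrix.

```python
# Hedged sketch of the "finger-on" vs "finger-push" discrimination using
# DELTA C on the target cell plus the relative change on adjacent cells.
# Thresholds are placeholders, not values from the embodiment.

FINGER_ON_THRESHOLD = 10.0     # DELTA C above baseline for a light touch
FINGER_PUSH_THRESHOLD = 25.0   # larger DELTA C when the finger is pushed
NEIGHBOR_PUSH_THRESHOLD = 8.0  # pushing also spreads to adjacent sensors

def classify(delta_c_target, delta_c_neighbors):
    """Classify one reading as 'no touch', 'finger-on' or 'finger-push'.

    delta_c_target    -- DELTA C of the touched cell, e.g. (x2; y2)
    delta_c_neighbors -- DELTA C values of the adjacent cells
    """
    neighbor_mean = (
        sum(delta_c_neighbors) / len(delta_c_neighbors)
        if delta_c_neighbors else 0.0
    )
    # A push raises both the peak value and the surrounding sensor values.
    if (delta_c_target >= FINGER_PUSH_THRESHOLD
            and neighbor_mean >= NEIGHBOR_PUSH_THRESHOLD):
        return "finger-push"
    if delta_c_target >= FINGER_ON_THRESHOLD:
        return "finger-on"
    return "no touch"
```

This mirrors the observation from the plots that a push changes both the peak and the surrounding sensor values, whereas a light touch mainly changes the peak.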
  • Figure 14a shows an illustration of an interaction element 501 according to the first aspect of the invention in a ninth embodiment in a first variant.
  • the entire interaction element 501 is shown in a perspective view.
  • a part of the interaction element 501 is shown in a frontal view.
  • Interaction element 501 comprises a touch surface element having seven input area elements 503 and six output area elements 505.
  • the output area elements 505 in turn each comprise a protrusion 507 as a tactile/haptic feedback element.
  • Figure 14b shows an illustration of an interaction element 601 according to the first aspect of the invention in the ninth embodiment in a second variant.
  • the entire interaction element 601 is shown in a perspective view.
  • a part of the interaction element 601 is shown in a frontal view.
  • Interaction element 601 comprises seven input area elements 603 and six output area elements 605.
  • the output area elements 605 in turn each comprise a recess 607 as a tactile/haptic feedback element.
  • output area elements such as 505 and 605 are preferably suitable for static situations, i.e. where the output area elements do not move dynamically.
  • both solutions may additionally use lighting segments for visual feedback, and they may also be combined with each other.
  • Figure 14c shows an illustration of an interaction element according to the first aspect of the invention in a tenth embodiment.
  • the interaction element shown in Figure 14c is quite similar to the interaction element shown in Figure 6c and described in more detail above. Therefore, features of interaction element of Figure 14c which functionally correspond as far as possible to those of interaction element of Figure 6c are provided with the same reference signs.
  • the interaction element 101''' of Figure 14c comprises an output area element 107''' which represents an edge.
  • the edge which separates the two input area elements 105''' from each other represents the output area element 107'''.
  • the output area elements provide tactile feedback either by their mechanical shape, as shown in Figures 14a, 14b and 14c, or by using active surface haptics such as surface electrostatics, as shown in Figures 7, 9 and 10, or ultrasonics, as shown in Figure 2.
  • Figure 15a shows an illustration of a control element 701 according to the second aspect of the invention in a first embodiment.
  • This is an example of a two-dies populated control element 701 with interaction elements 703.
  • All interaction elements 703 are non-moving (solid state) and have touch-sensitive surfaces as described above.
  • These interaction elements 703 can be used for the functions roll-up (swipe-bottom + back action), automatic roll-up (swipe-bottom + back and push), roll-down (swipe-rear + down action) and automatic roll-down (swipe-rear + down and push action).
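The mapping of the gestures named above to window control commands could be sketched as a simple lookup; the gesture identifiers and function names below are assumed for illustration and are not part of the embodiment.

```python
# Illustrative mapping (names assumed) of combined swipe/push gestures to
# the window control functions of control element 701.

WINDOW_COMMANDS = {
    ("swipe-bottom+back", False): "roll-up",
    ("swipe-bottom+back", True): "automatic roll-up",
    ("swipe-rear+down", False): "roll-down",
    ("swipe-rear+down", True): "automatic roll-down",
}

def window_command(gesture, pushed):
    """Resolve a recognized gesture (plus optional push) to a command,
    returning None for gestures with no assigned function."""
    return WINDOW_COMMANDS.get((gesture, pushed))
```

Because the table is data rather than wiring, the same interaction elements could be reassigned to other device functions (ventilation, seats, and so on) by swapping the lookup table.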
  • Figure 15b shows an illustration of a control element 801 according to the second aspect of the invention in a second embodiment.
  • control element 801 is a multi-sensor type which can be regarded as a multifunction touch button object and which is grasped by a user's hand 803. In other words, control element 801 is a "central touch thing". This "thing" has the following optional features and advantages:
  • Multifunctional: same module for multiple applications (window glass movement, ventilation control, seat control and so on), with selection based on touch methods.
  • Multilevel: several touch patterns.
  • Blind operation: natural 3D shape, surface texture, feedback mechanism.
  • Placement: door panel, seat panel, dashboard, center console, hand-held remote.
  • the interaction elements are especially for receiving touch and/or gesture event inputs and/or providing tactile feedback outputs; the input area elements are adapted to receive touch and/or gesture event inputs issued by a finger when interacting with at least one part of at least one surface of the respective input area element, and the output area elements are adapted to provide a tactile feedback output to the finger when interacting with at least one part of at least one surface of the respective output area element. It is further clear that especially all control elements are for controlling the operation of at least one function of at least one device.
  • the features disclosed in the claims, the specification and the drawings may be essential for different embodiments of the claimed invention, both separately and in any combination with each other.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Chemical & Material Sciences (AREA)
  • Combustion & Propulsion (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • User Interface Of Digital Computer (AREA)
  • Position Input By Displaying (AREA)

Abstract

The present invention relates to an interaction element, for receiving touch and/or gesture events inputs and/or providing tactile feedback outputs. The invention also relates to a control element, for controlling the operation of at least one function of at least one device, comprising at least one or more such interaction elements. Further, the invention also relates to a motor vehicle comprising at least one or more such interaction elements and/or at least one or more such control elements.

Description

Interaction element, control element and motor vehicle
Description
The present invention relates to an interaction element, for receiving touch and/or gesture events inputs and/or providing tactile feedback outputs. The invention also relates to a control element, for controlling the operation of at least one function of at least one device, comprising at least one or more such interaction elements. Further, the invention also relates to a motor vehicle comprising at least one or more such interaction elements and/or at least one or more such control elements.
In today's vehicles a large number of different user input devices such as buttons are present for receiving inputs for controlling different functions of different devices of the vehicle. However, in the current design of such buttons the number of input levels which one button can provide is limited, and in addition mechanical movement of the button is required in case it needs to provide multiple levels of input. Figure 1a shows a state of the art panel 1 which might be used, for example, at a door panel arm rest of a motor vehicle for window glass management. The panel 1 has an array of buttons 3, which are examples of such multiple level input buttons, and a button 5, which is a press lock button. Each button 3 in Figure 1a usually provides four levels of input, i.e. pull up (e.g. for window roll up), pull up two stages (e.g. for window automatic roll up), push down (e.g. for window roll down) and push down two stages (e.g. for window automatic roll down). An illustration of an exemplary state of the art multiple level input button is shown in Figure 1b in isolation. However, these kinds of buttons are susceptible to wear, and the user experience could also be better. Further, they have dedicated functions and cannot be flexibly used for other ones.
DE 10 2014 008 040 A1 discloses an interaction element for receiving touch and/or gesture events inputs and/or providing visual feedback outputs or tactile feedback outputs by vibration. The interaction element comprises at least one OLED-Display, having at least one input area element adapted to receive touch and/or gesture event inputs issued by at least one finger when interacting with at least one part of at least one surface of the OLED-Display. Further, a depression is adapted to provide a visual feedback output.
DE 20 2017 101 606 Ul discloses a display device for receiving touch and/or gesture event inputs and/or providing tactile feedback outputs. The display device comprises a touch surface element having an input area element to receive touch event inputs. Further, at least one output area element provides a tactile feedback output.
Therefore, the invention aims at providing user input devices which allow user inputs to be received at multiple levels and which are at the same time easy and intuitive to operate, hence increasing the safety of the driver and providing a better user experience. In addition, high flexibility with respect to the assignment of functions controlled by the user input devices is demanded.
The invention solves the problem according to a first aspect by an interaction element, for receiving touch and/or gesture events inputs and/or providing tactile feedback outputs, comprising at least one touch surface element, the touch surface element having (a) at least one input area element at least adapted to receive touch and/or gesture events inputs issued by at least one finger when interacting with at least one part of at least one surface of the input area element and/or the touch surface element; and (b) at least one output area element at least adapted to provide a tactile feedback output to the finger when interacting with at least one part of at least one surface of the output area element and/or the touch surface element; wherein the at least one input area element and the at least one output area element are formed as one. The resulting interaction element therefore has at least one input area element and at least one output area element designed as one common element at least in certain areas and/or in sections.
At least one, preferably all, input area element(s) can, especially by means of the touch controller, interchangeably and/or simultaneously also be adapted to provide a tactile feedback to the finger when interacting with at least one part of at least one surface of the input area element and/or the touch surface element, especially by interchanging use of the first electrode(s) as second electrode(s). In one embodiment the interaction element comprises a first plurality of input area elements, especially at least two, three, four, five, six, seven, eight, nine and/or ten input area elements, and a second plurality of output area elements, especially at least two, three, four, five, six, seven, eight, nine and/or ten output area elements, wherein preferably at least two neighboring input area elements, especially all of each two neighboring input area elements, are at least in certain areas and/or at least partly separated from each other by at least one of the second plurality of output area elements.
In one embodiment the input area element comprises at least one first surface layer element, especially the first surface layer element comprises and/or represents at least one insulator layer, at least one first electrode and/or at least one first substrate element, preferably comprising at least one glass substrate. Especially the respective elements are sandwich-like arranged, especially with the first electrode being arranged at least in certain areas and/or at least partly between the first surface layer element and the first substrate element; especially the first electrode is arranged directly or indirectly below the first surface layer element and/or the first substrate element is arranged directly or indirectly below the first electrode element.
In one embodiment the output area element comprises at least one second surface layer element, especially the second surface layer element comprises and/or represents at least one insulator layer, at least one second electrode and/or at least one second substrate element. Especially the respective elements are sandwich-like arranged, especially with the second electrode being arranged at least in certain areas and/or at least partly between the second surface layer element and the second substrate element; especially the second electrode is arranged directly or indirectly below the second surface layer element and/or the second substrate element is arranged directly or indirectly below the second electrode element. The second substrate element may comprise at least one glass substrate.
In one embodiment the tactile feedback (a) represents a haptic pattern dependent on the gesture/touch event input or (b) comprises increasing the surface friction of at least one portion of at least one surface of the output area element, especially of the second surface layer element, when applying a voltage to at least one part of the output area element, especially to the second electrode.
In one embodiment the output area element comprises at least one ultrasonic actuator, especially a plurality of ultrasonic actuators, coupled to or being in operative connection to the second substrate element, the second electrode or the second surface layer element for building an air bearing layer adjacent to at least one portion of at least one surface of the output area element, especially of the second substrate element and/or second surface layer element, when the one or more ultrasonic actuator(s) is/are activated.
In one embodiment the first and/or second surface layer element comprises at least one conductor, preferably a plurality of conductors, preferably for capacitive coupling, especially the one or more conductor(s) are arranged at least in sections and/or at least partly (a) within the second surface layer element, (b) parallel to at least one surface, especially the top and/or bottom surface, of the second surface layer element and/or (c) parallel to each other.
In one embodiment (a) at least some of the first and/or second surface layer elements, preferably all of the first and/or second surface layer elements, are formed as one common surface layer element at least partly and/or at least in certain areas, (b) at least some of the first and/or second substrate elements, preferably all of the first and/or second substrate elements, are formed as one common substrate element at least partly and/or at least in certain areas and/or (c) at least some of the first and/or second electrodes, preferably all of the first and/or second electrodes, are formed as one common electrode, especially as at least one segmented electrode, at least partly and/or at least in certain areas.
In one embodiment the output area element, especially the second surface layer element, comprises and/or represents at least one edge, at least one bump, at least one protrusion, at least one recess and/or at least one detent, whereby especially the output area element, especially the second surface layer element, can be manufactured using a printing process, an injection molding process, a heat forming process and/or a grinding process. This mechanical haptics approach may be taken if none of the surface haptics approaches specified before is used.
In one embodiment at least one, preferably all, output area element(s) can, especially by means of at least one touch controller, interchangeably and/or simultaneously also be adapted to receive touch and/or gesture events issued by at least one finger when interacting with at least one part of at least one surface of the output area element and/or the touch surface element, especially by interchanging use of the second electrode(s) as first electrode(s).
In one embodiment the interaction element further comprises at least one light guide, preferably for illuminating through the input area element(s) and/or output area element(s) from beneath, especially the light guide being arranged and/or extending at least in sections and/or in certain areas directly or indirectly below and/or parallel to the first and/or second electrode(s) and/or below and/or parallel to the first and/or second substrate element(s).
In one embodiment the interaction element further comprises (a) at least one light source, especially adapted for coupling light into the light guide, (b) at least one printed wire board, preferably the light source and/or at least one element of the group comprising first/second surface layer element, first/second electrode and first/second substrate element being mounted at least partly on the printed wire board, and/or (c) at least one tactile feedback device for generating a tactile feedback to the user interacting with the interaction element.
In one embodiment the interaction element, especially the touch surface element, is designed as a free form element, especially having at least in certain areas a curved surface and/or having at least one first area extending in at least one first plane and at least one second area extending in at least one second plane, preferably the second plane being angled with respect to the first plane.
In one embodiment the input area element is designed such that a touch state can be detected, especially such that at least a finger-on state can be distinguished from a finger-push state, preferably by means of evaluating the capacitive change value of the input area element selected by a user during interaction of the user's finger with said input area element and/or by means of evaluating the capacitive change value of input area elements adjacent and/or neighboring to the selected input area element during said interaction.

The invention solves the problem according to a second aspect by a control element, for controlling the operation of at least one function of at least one device, comprising at least one or more, especially different, interaction element(s) of the first aspect of the invention.
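The finger-on/finger-push distinction described above can be sketched as follows. This is a minimal illustration only; the threshold values and the classification rule are assumptions, not taken from the patent, which merely specifies that the capacitive change value of the selected element and/or of its neighbors is evaluated (a firm push flattens the fingertip, so neighboring elements also register a change).

```python
# Hypothetical thresholds (raw sensor counts) -- illustrative assumptions.
ON_THRESHOLD = 30        # assumed minimum change for "finger-on"
PUSH_THRESHOLD = 80      # assumed change on the selected element for a push
NEIGHBOR_THRESHOLD = 15  # assumed spill-over onto adjacent elements

def touch_state(selected_change, neighbor_changes):
    """Classify the touch state of the selected input area element
    from its own capacitive change value and those of its neighbors."""
    if selected_change < ON_THRESHOLD:
        return "no-touch"
    push_by_level = selected_change >= PUSH_THRESHOLD
    push_by_neighbors = any(c >= NEIGHBOR_THRESHOLD for c in neighbor_changes)
    return "finger-push" if (push_by_level or push_by_neighbors) else "finger-on"
```

For example, `touch_state(40, [2, 3])` yields `"finger-on"`, while either a large change on the selected element (`touch_state(90, [2, 3])`) or a noticeable change on a neighbor (`touch_state(40, [20, 3])`) yields `"finger-push"`.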
The invention solves the problem according to a third aspect by a motor vehicle comprising at least one or more, especially different, interaction element(s) of the first aspect of the invention and/or at least one or more, especially different, control element(s) of the second aspect of the invention.
It has, thus, been surprisingly found that by providing an interaction element comprising at least one touch surface element it becomes possible to overcome the restrictions coming along with the design of a conventional physical input device such as a button and to allow multiple levels of input, such as single tap, multiple tap, finger rest, short touch, long touch, swipe and the like, as well as multiple methods which are controlled by the interaction element using, respectively, the same area of touch of the touch surface element (i.e. using the same touch surface element). Especially, controlling different methods can be eased by the interaction element since there is no longer the need to hard wire the interaction element with a certain functionality but the functionality can be changed even on the fly.
The inventors have particularly found that the touch surface element allows providing both user input ability and tactile feedback output ability. In this respect it is possible to adapt certain areas of the touch surface element to receive touch and/or gesture event inputs issued by at least one finger when interacting with at least one part of at least one surface of the input area element and/or the touch surface element, and further to adapt certain areas (especially other than those of the touch input) of the touch surface element to provide a tactile feedback output to the finger when interacting with at least one part of at least one surface of the output area element and/or the touch surface element. Therefore, respective input area elements and output area elements are comprised by the touch surface element.
Hence, the touch surface element which is presented to the user (directly or indirectly, i.e. there can also be one or more additional layers on top of the touch surface element, if desired) may have one or more input areas and/or one or more output areas, the respective areas belonging to the respective input area elements and output area elements. For example, two neighboring input area elements are separated from each other by one output area element. This allows the user to "feel" with his finger a borderline between the input area elements when interacting with the interaction element.
The inventors have found that a user input can be received by using an input area element which basically evaluates the change of capacitance by means of a first electrode provided below the first surface layer element of the input area element. The input area element might also comprise further elements such as a first substrate element.
The inventors have found that a tactile feedback can preferably be produced using the coulomb force, i.e. an electrostatic approach, also known as electro-adhesion. Therefore the output area element might function according to a first surface haptic approach as illustrated in Figure 2a. This approach uses a surface layer element 7 (being especially an insulator layer) and an electrode 9, especially the surface layer element 7 being arranged above the electrode 9. Preferably the surface layer element (i.e. the insulator) is thin, especially less than 3 micrometers; this is regularly preferred in order to obtain sufficient coulomb force. If now a user touches the surface layer element 7 (i.e. the insulator), the finger 11 of the user becomes attracted due to the coulomb force once an electrostatic field is applied, e.g. by powering the electrode 9. In other words, the surface friction is increased due to the coulomb force when an electrostatic field is applied. This gives the effect of "stickiness" to the surface; a substrate element 13, e.g. a glass substrate, might be provided below the electrode 9. All three elements might be designed as layers and arranged sandwich-like, with the electrode 9 arranged in the middle.
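Why the thin insulator matters can be made quantitative with a simple parallel-plate model: treating the finger and the electrode 9 as capacitor plates separated by the insulating surface layer element 7, the electrostatic attraction per unit area is p = ε₀·εᵣ·V²/(2·d²), so it grows with the inverse square of the insulator thickness d. The numeric values below (voltage, thicknesses, relative permittivity) are illustrative assumptions, not figures from the patent.

```python
# Parallel-plate sketch of the coulomb force behind electro-adhesion.
EPS0 = 8.854e-12  # vacuum permittivity, F/m

def electrostatic_pressure(voltage, thickness_m, eps_r=3.0):
    """Attraction pressure (Pa) between finger and electrode across
    an insulator of given thickness and relative permittivity."""
    return EPS0 * eps_r * voltage**2 / (2 * thickness_m**2)

p_thin = electrostatic_pressure(100.0, 2e-6)    # 2 um insulator
p_thick = electrostatic_pressure(100.0, 10e-6)  # 10 um insulator
# The 2 um layer yields (10/2)^2 = 25x the attraction of the 10 um layer,
# illustrating why a sub-3-micrometer insulator is preferred.
```

This inverse-square scaling is the reason the document later proposes intermediate conductive layers for capacitive coupling when a thick outer layer is unavoidable.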
Often it is required that the electrostatic induction is very close to the (outer) surface. However, sometimes it is also required that the (outer) surface is of a certain thickness. In case the surface layer (e.g. the surface layer element) is thick, a solution has been found in that electrostatic coupling through multiple conductive layers is present. In other words, one or more conductors for capacitive coupling are integrated. This allows increasing the thickness of the outer layer without adversely affecting the feedback mechanism. This will be described in more detail later with reference to Figure 8 below.

The inventors have also found that a tactile feedback can be produced using ultrasonics, also known as electro-vibration. Therefore the output area element might alternatively and/or additionally function according to a second surface haptic approach as illustrated in Figure 2b. For this approach the output area element comprises at least one (preferably more than one) ultrasonic actuator (not shown in Figure 2b), coupled to or being in operative connection with a substrate element 15 and/or a surface layer element. Of course, only either the substrate element or the surface layer element is needed, but both elements might also be present. And no electrode is needed in order to function, although one might be advantageous in order to use the output element as input element as well. However, when the actuators, e.g. placed around the substrate element 15 / surface layer element, are activated, the surface friction is reduced due to an "air bearing" layer 17 that is built up by the ultrasonic vibration. This gives the effect of "slipperiness" of the surface for a finger 19 touching the substrate element 15. For example, a 7-inch surface may need twenty actuators (hap2u).
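The two surface haptic approaches are complementary: electro-adhesion raises friction ("stickiness") by powering the electrode, while ultrasonic vibration lowers it ("slipperiness") through the air bearing layer. A controller combining both in one output area element might look like the following sketch; the class and method names are illustrative assumptions, not an interface from the patent.

```python
# Sketch: one output area element offering both surface haptic approaches.
class SurfaceFrictionController:
    def __init__(self):
        self.electrode_on = False   # electrostatic field (coulomb force)
        self.actuators_on = False   # ultrasonic actuators (air bearing)

    def sticky(self):
        """Increase friction: energize the electrode, stop the actuators."""
        self.electrode_on, self.actuators_on = True, False

    def slippery(self):
        """Decrease friction: run the actuators, de-energize the electrode."""
        self.electrode_on, self.actuators_on = False, True

    def neutral(self):
        """Return to the baseline friction of the bare surface."""
        self.electrode_on = self.actuators_on = False

    @property
    def friction(self):
        if self.electrode_on:
            return "increased"
        if self.actuators_on:
            return "reduced"
        return "baseline"
```

Keeping the two mechanisms mutually exclusive, as sketched here, reflects that one pulls the finger onto the surface while the other floats it off; a real controller would also manage drive voltages and actuator frequencies, which are omitted.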
Forming some or all elements of the input area elements and/or the output area elements as one common element (which might or might not be identical to the touch surface element) leads to an improved design since stability is increased, manufacturing costs are reduced and a clean design is achieved.
It is preferred that the type of an area element, i.e. input area element and output area element, can be interchanged electronically, i.e. without physical reconfiguration. This allows for an improved utilization of the interaction element since input and output areas can be arranged as required in the respective situation.
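Such electronic interchange of area types can be sketched as follows: each electrode is driven by the touch controller either as a first electrode (capacitive sensing, input area) or as a second electrode (electrostatic feedback, output area), so a borderline can be moved at runtime without physical reconfiguration. The names and data structures are illustrative assumptions only.

```python
# Sketch: reassigning area element roles purely in software.
SENSE, HAPTIC = "input", "output"

class AreaElement:
    def __init__(self, role=SENSE):
        self.role = role

    def swap(self):
        """Use the element's electrode for the opposite purpose."""
        self.role = HAPTIC if self.role == SENSE else SENSE

# Example: shift a borderline one element to the left by swapping roles.
elements = [AreaElement(SENSE), AreaElement(HAPTIC), AreaElement(SENSE)]
elements[0].swap()
elements[1].swap()
# roles are now ["output", "input", "input"]
```

The point of the sketch is that input and output areas are a matter of electrode driving, not of construction, which is what allows the arrangement to be adapted to the respective situation.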
Providing preferably a light guide and/or a light source allows to illuminate the input/output areas from behind for improving security and convenience.
Separating at least two input area elements from each other by means of at least one output area element is preferred since the user can operate without looking at the input area elements directly. This increases safety. It is also possible to provide a permanent tactile feedback mechanism such as by means of bumps, protrusions, recesses, detents and the like. These mechanisms do not depend on energization and, hence, are even more fail-safe.

It has been further found that the components used for inputs and outputs, such as the touch surface element, can be used in conjunction with integrated displays or even on other surfaces such as wood, plastic, leather, chrome and the like, i.e. basically every surface can be prepared for being used as an input and/or output device. Particularly, such touch sensitive areas can be placed in three dimensions around several sides of a device or a button, which makes all of those surfaces ready to be used for user inputs and/or outputs. It has been found particularly advantageous that the interaction element, especially the touch surface element, can either be placed as a separate layer underneath any surface or be directly on-molded into the actual surface of an input device like a button. The outer contour of the interaction element's and/or the control element's surface might have textures or defined patterns for a natural 3D feeling.
The inventors have also realized that there are many ways for providing confirmation of a user input event to the user in form of a feedback. This feedback can be provided in several forms: surface vibration, taptic/haptic feedback to the contact surface, e.g. from an integrated feedback mechanism, acoustic confirmation, visual confirmation, e.g. with graphics or lighting, and the like.
Especially the interaction element could be designed as being invisible to the user until activation. For example the outer surface of the interaction element can perform a "hidden till lit" function until required to show the presence of a user input area (i.e. a button) or a confirmation of an event. This way the interaction element can be hidden in a non-intrusive and aesthetic manner in a control element, hence in a motor vehicle.
It is particularly noted that the interaction element, and so the control element, does not require any mechanical movement of any external components for receiving inputs and providing outputs, i.e. feedbacks. In case there are any movements at all, especially due to one or more ultrasonic actuator(s), the moving parts are away from the user. This leads to a quite clean design and, hence, provides convenient operation.
Indeed, the design of such an interaction element allows the integration of the input and output functionality (along with the touch surface element) in one single element, especially in one single free form element. Consequently also the control element can be designed as a free form element. In both cases, preferably the creation of any design shape as well as multi-directional cutting and in-molding manufacturing processes of circuit elements are possible.
Alternatively or in addition it can be stated that the proposed scheme makes it possible to provide multiple levels of inputs when the input components are arranged in the required 3D positions. There is no need for mechanical movement of any external part to accomplish this task. Given a suitable selection method, the same input area can be used for different functions altogether, also with multiple levels of input possibilities. This creates the opportunity to remove several exclusively dedicated buttons and bring their functionalities to a single button or an array of buttons, resulting in cost savings as well as reduced effort in production and integration.
The following drawings show aspects of the invention for improving the understanding of the invention in connection with some exemplary illustrations, wherein
Figure 1a shows an array of buttons according to the state of the art;

Figure 1b shows a button according to the state of the art;

Figure 2a shows an illustration of the first surface haptic approach;

Figure 2b shows an illustration of the second surface haptic approach;

Figure 3 shows an illustration of an interaction element according to the first aspect of the invention in a first embodiment;

Figure 4a shows an illustration of a first gesture/touch event as input to the interaction element of Figure 3;

Figure 4b shows an illustration of a first instant of the gesture/touch event of Figure 4a;

Figure 4c shows an illustration of a second instant of the gesture/touch event of Figure 4a along with a first haptic pattern;

Figure 4d shows an illustration of a third instant of the gesture/touch event of Figure 4a;

Figure 5a shows an illustration of a second gesture/touch event as input to the interaction element of Figure 3;

Figure 5b shows an illustration of a first instant of the gesture/touch event of Figure 5a;

Figure 5c shows an illustration of a second instant of the gesture/touch event of Figure 5a along with a second haptic pattern;

Figure 5d shows an illustration of a third instant of the gesture/touch event of Figure 5a;

Figure 6a shows an illustration of an interaction element according to the first aspect of the invention in a second embodiment;

Figure 6b shows an illustration of an interaction element according to the first aspect of the invention in a third embodiment;

Figure 6c shows an illustration of an interaction element according to the first aspect of the invention in a fourth embodiment;

Figure 7a shows an illustration of an interaction element according to the first aspect of the invention in a fifth embodiment;

Figure 7b shows an illustration of the different effective areas of the interaction element of Figure 7a;

Figure 8 shows an illustration of an interaction element according to the first aspect of the invention in a sixth embodiment;

Figure 9a shows an illustration of an interaction element according to the first aspect of the invention in a seventh embodiment;

Figure 9b shows an illustration of the interaction element of Figure 9a in a first operation state;

Figure 9c shows an illustration of the different effective areas of the interaction element of Figure 9b;

Figure 9d shows an illustration of the interaction element of Figure 9a in a second operation state;

Figure 9e shows an illustration of the different effective areas of the interaction element of Figure 9d;

Figure 10 shows an illustration of an interaction element according to the first aspect of the invention in an eighth embodiment;

Figures 11a-c show different realizations of an interaction element according to the first aspect of the invention;

Figures 12a-c show an illustration of a sensor matrix and detection states;

Figures 13a-f show an illustration of detection states;

Figure 14a shows an illustration of an interaction element according to the first aspect of the invention in a ninth embodiment in a first variant;

Figure 14b shows an illustration of an interaction element according to the first aspect of the invention in a ninth embodiment in a second variant;

Figure 14c shows an illustration of an interaction element according to the first aspect of the invention in a tenth embodiment;

Figure 15a shows an illustration of a control element according to the second aspect of the invention in a first embodiment; and

Figure 15b shows an illustration of a control element according to the second aspect of the invention in a second embodiment.
Figure 3 shows an illustration of an interaction element 101 according to the first aspect of the invention in a first embodiment. The interaction element 101 comprises a touch surface element 103. The touch surface element 103 has two input area elements 105 which are, respectively, adapted to receive touch and/or gesture event inputs issued by a finger when interacting with at least one part of at least one surface of the input area element 105. The touch surface element 103 further has one output area element 107 which is adapted to provide a tactile feedback output to the finger when interacting with at least one part of at least one surface of the output area element 107.
The input area elements 105 and the output area element 107 are formed as one at least in certain areas, so that the haptic feedback area (i.e. output area element 107) is directly integrated in the touch surface element 103 along with the input area elements 105. In other words, the interaction element 101 provides for an edge detection for rolling over multiple surfaces by using a touch sensor (i.e. interaction element 101) with a haptic/tactile feedback. In this embodiment the output area element 107 is an extended area as indicated by the dark area 107. In other embodiments the output area element 107 might be established only by the edge separating the two input area elements 105 from each other.
Of course, the shape of the interaction element 101 is just one example; every other shape might be possible as well. Figure 4a shows an illustration of a first gesture/touch event as input to the interaction element 101 of Figure 3, the gesture/touch event being a "slide up". A finger 109 swipes up for the purpose of issuing a "roll up" command of a window controller, as indicated by the dashed line in Figure 4a which is indicated with a "1 in a circle". I.e. the finger swipes across the input area element 105 (as shown in Figure 4b which shows an illustration of a first instant of the gesture/touch event of Figure 4a), the output area element 107 (as shown in Figure 4c which shows an illustration of a second instant of the gesture/touch event of Figure 4a) and the other input area element 105 (as shown in Figure 4d which shows an illustration of a third instant of the gesture/touch event of Figure 4a). For the purpose of issuing an "automatic roll up" command of the window controller, the finger 109 is pushed firmly after swiping, as illustrated by the arrow indicated with a "2 in a circle" in Figure 4a.
In Figure 4c a first haptic pattern 111 is additionally shown which illustrates the haptic/tactile effect which the finger 109 experiences while moving along the edge designed as output area element 107. It feels like many, closely successive peaks. Of course, the haptic pattern 111 is only for the purpose of illustration and other haptic patterns might be possible as well.
Additionally or alternatively the following is preferably possible: In the course of the first instance of the gesture/touch event, the area of first touch might be located and/or the distance to the haptic feedback area (i.e. the output area element 107) might be measured. In the course of the second instance, the direction of the slide might be traced and/or the feedback pattern might be enabled when the finger crosses the haptic feedback area (i.e. the output area element 107); in addition, lighting might be activated as a secondary confirmation. In the course of the third instance, a function might be performed based on the gesture, especially once the finger 109 rests in the final area (i.e. the input area element 105).
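The three instances of such a slide gesture (locate the first touch, trace the slide and fire the haptic pattern on crossing the feedback area, then execute a command when the finger is pushed) can be sketched as a small state machine. The geometry is an assumption for illustration: the borderline (output area element 107) is placed at y = 0, with the two input area elements 105 above and below it; the event and command names are likewise hypothetical.

```python
# Sketch: tracking a slide gesture across the borderline of element 101.
BORDERLINE_Y = 0.0  # assumed position of the output area element 107

def process_slide(samples):
    """samples: chronological list of (y_position, pushed) finger readings.
    Returns the events raised while tracking the gesture."""
    events = []
    start_y = samples[0][0]
    events.append(("locate", start_y))                  # first instance
    crossed = False
    for (prev, _), (cur, pushed) in zip(samples, samples[1:]):
        # Sign change of (y - borderline) means the finger crossed the edge.
        if not crossed and (prev - BORDERLINE_Y) * (cur - BORDERLINE_Y) < 0:
            crossed = True                               # second instance
            events.append(("haptic_pattern", "up" if cur > prev else "down"))
        if pushed:                                       # third instance
            cmd = "auto_roll_up" if cur > start_y else "auto_roll_down"
            events.append(("execute", cmd))
            break
    return events
```

For an upward swipe ending in a firm push, `process_slide([(-2, False), (-1, False), (1, False), (2, True)])` returns `[("locate", -2), ("haptic_pattern", "up"), ("execute", "auto_roll_up")]`. A sample landing exactly on the borderline is not handled by this simplified sign test.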
Figure 5a shows an illustration of a second gesture/touch event as input to the interaction element 101 of Figure 3, the gesture/touch event being a "slide down". The finger 109 swipes down for the purpose of issuing a "roll down" command of a window controller, as indicated by the dotted-dashed line in Figure 5a which is indicated with a "1 in a circle". I.e. the finger swipes across the input area element 105 (as shown in Figure 5b which shows an illustration of a first instant of the gesture/touch event of Figure 5a), the output area element 107 (as shown in Figure 5c which shows an illustration of a second instant of the gesture/touch event of Figure 5a) and the other input area element 105 (as shown in Figure 5d which shows an illustration of a third instant of the gesture/touch event of Figure 5a). For the purpose of issuing an "automatic roll down" command of the window controller, the finger 109 is pushed firmly after swiping, as illustrated by the arrow indicated with a "2 in a circle" in Figure 5a.
In Figure 5c a second haptic pattern 113 is additionally shown which illustrates the haptic/tactile effect which the finger 109 experiences while moving along the edge designed as output area element 107. It feels like many, more widely spaced (compared to the first haptic pattern 111) successive peaks. Of course, the second haptic pattern 113 is only for the purpose of illustration and other haptic patterns might be possible as well.
Additionally or alternatively the following is preferably possible: In the course of the first instance of the gesture/touch event, the area of first touch might be located and/or the distance to the haptic feedback area (i.e. the output area element 107) might be measured. In the course of the second instance, the direction of the slide might be traced and/or the feedback pattern might be enabled when the finger crosses the haptic feedback area (i.e. the output area element 107); in addition, lighting might be activated as a secondary confirmation. In the course of the third instance, a function might be performed based on the gesture, especially once the finger 109 rests in the final area (i.e. the input area element 105).
The examples described by way of Figures 4a-d and Figures 5a-d demonstrate that different haptic patterns (e.g. first and second haptic patterns 111 and 113) can be used for different slide directions.
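The direction-dependent patterns could be encoded as pulse trains with different peak spacings: pattern 111 (slide up) with closely spaced peaks, pattern 113 (slide down) with wider spacing. The timing values in this sketch are assumptions for illustration; the patent only specifies that the patterns differ perceptibly.

```python
# Sketch: direction-dependent haptic pulse trains driving the second electrode.
def haptic_pattern(direction, duration_ms=120):
    """Return a list of (on_ms, off_ms) pairs for the feedback electrode.
    "up" mimics pattern 111 (close peaks), "down" pattern 113 (wider peaks)."""
    period = 10 if direction == "up" else 30  # assumed peak spacing in ms
    pulses = duration_ms // period
    return [(period // 2, period - period // 2)] * pulses
```

With these assumed timings, `haptic_pattern("up")` yields twelve short 5 ms/5 ms pulses, while `haptic_pattern("down")` yields four longer 15 ms/15 ms pulses over the same total duration, so the finger feels clearly distinct textures for the two slide directions.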
Especially it is emphasized that preferably no additional layer is required for the haptic feedback; rather, both the input area element(s) and the output area element(s) are formed as one. In other words, the same touch surface element (e.g. touch sensor layer) can be used for input as well as output.
Figure 6a shows an illustration of an interaction element according to the first aspect of the invention in a second embodiment. Features which functionally correspond as far as possible to those of the first embodiment of interaction element 101 are provided with the same reference signs, however, single dashed. Since the functionality of the second embodiment of interaction element 101' largely corresponds to the first embodiment of the interaction element 101, only differences between the first and second embodiments are discussed below. And besides, the explanations given above apply for the second embodiment and the respective Figure accordingly.
Interaction element 101' has a flat design in contrast to the curved design of interaction element 101. In other words, the edge detection method is equally applicable on 3D as well as flat surfaces. There is no additional change in method due to the change in geometry.
Figure 6b shows an illustration of an interaction element according to the first aspect of the invention in a third embodiment. Features which functionally correspond as far as possible to those of the first embodiment of interaction element 101 and the second embodiment of interaction element 101' are provided with the same reference signs, however, double dashed. Since the functionality of the third embodiment of interaction element 101'' largely corresponds to the first embodiment of the interaction element 101 and the second embodiment of the interaction element 101', only differences between the third and the first and second embodiments are discussed below. And besides, the explanations given above apply for the third embodiment and the respective Figure accordingly.
Interaction element 101'' has a free form design which is a more advanced design than that of interaction element 101 and interaction element 101'. Here, a natural feeling of edge detection is possible.
Figure 6c shows an illustration of an interaction element according to the first aspect of the invention in a fourth embodiment. Features which functionally correspond as far as possible to those of the first embodiment of interaction element 101, the second embodiment of interaction element 101' and the third embodiment of interaction element 101'' are provided with the same reference signs, however, triple dashed. Since the functionality of the fourth embodiment of interaction element 101''' largely corresponds to the first embodiment of the interaction element 101, the second embodiment of the interaction element 101' and the third embodiment of interaction element 101'', only differences between the fourth and first, second and third embodiments are discussed below. And besides, the explanations given above apply for the fourth embodiment and the respective Figure accordingly.
Interaction element 101''' is especially similar to interaction element 101. However, interaction element 101''' further comprises a light guide 115''' for illuminating through the input area elements 105''' and the output area element 107''' from beneath. The interaction element 101''' further comprises a light source 117''' adapted for coupling light into the light guide 115''', optionally at least one printed wire board 119''', the light source 117''' being mounted at least partly on the printed wire board 119''', and optionally a tactile feedback device 121''' for generating a tactile feedback to the user interacting with the interaction element 101'''. By means of the tactile feedback device 121''' a further feedback mechanism is provided in addition to the output area element 107'''. The tactile feedback device 121''' might for example provide feedback prompting the success or failure of some operational command after completion of the touch/gesture input of the user.
Figure 7a shows an illustration of an interaction element 201 according to the first aspect of the invention in a fifth embodiment. Insofar interaction element 201 has static borderlines, where the positions of the borderlines are fixed and do not move. Interaction element 201 comprises a touch surface element having a first plurality (i.e. six) of input area elements 203 and a second plurality (i.e. five) of output area elements 205. Each input area element 203 comprises a first electrode 207 and each output area element 205 comprises a second electrode 209. The first electrodes 207 are for sensing (i.e. receiving touch/gesture inputs) and the second electrodes 209 are for providing tactile feedback outputs. Further, each input area element 203 and each output area element 205 comprise, respectively, a first and a second surface layer element. All surface layer elements are formed as one common surface layer element 211. Hence, Figure 7a shows more details of an interaction element than the previous Figures. In particular, it becomes even clearer that the statement that the input area elements 203 and the output area elements 205 are formed as one in certain areas is to be understood here with respect to the first and second surface layer elements, which are presented as the common surface layer element 211.
Figure 7b shows an illustration of the different effective areas of the interaction element 201 of Figure 7a, i.e. the presentation of the interaction element 201 from a user's view. There are areas 213 working as touch buttons and there are areas 215 working as virtual borderlines, i.e. providing tactile feedback. Areas 213 and areas 215 alternate from left to right. In other words, each touch button 213 has a dedicated electrode 207 for touch sensing and each virtual borderline 215 (i.e. where the finger feels the borderline) has a dedicated electrode 209 to induce the electrostatic field.
However, since electrostatic induction needs to act very close to the surface, it might be advantageous to use a more advanced approach in case the surface layer element 211 is thick (e.g. more than three microns).
Figure 8 shows an illustration of an interaction element 217 according to the first aspect of the invention in a sixth embodiment. Essentially it is based on the interaction element 201 and develops it further, hence, features which functionally correspond as far as possible to those of the fifth embodiment of interaction element 201 are provided with the same reference signs.
However, in interaction element 217 the second surface layer element comprises a plurality of conductors 219 for capacitive coupling. The conductors 219 are arranged within the second surface layer element 211, parallel to the top surface of the second surface layer element 211 and parallel to each other. In other words, the approach relies on electrostatic coupling through multiple conductive layers 219.
Figure 9a shows an illustration of an interaction element 301 according to the first aspect of the invention in a seventh embodiment. Insofar interaction element 301 represents dynamic borderlines, where the positions of the borderlines may change depending on external and/or internal conditions such as display content, physical location of the interaction element, history of touch/gesture events and/or history of commands issued by user inputs, and the like. Interaction element 301 comprises a segmented electrode 302a where input and output areas can be interchangeably controlled, especially by means of a touch controller. In other words, each electrode can be used as both first electrode and second electrode, depending on the configuration; hence, the distribution of input area elements and output area elements can be chosen nearly arbitrarily in interaction element 301. Also a common surface layer element 302b is present (see also the description with respect to Figure 7a and element 211 above for details, which apply here mutatis mutandis, too), especially as part of the touch surface element.
Figure 9b shows an illustration of the interaction element 301 of Figure 9a in a first operation state. Here, interaction element 301 comprises a touch surface element having a first plurality (i.e. six) of input area elements 303 and a second plurality (i.e. five) of output area elements 305. Each input area element 303 comprises three electrodes working as first electrodes and each output area element 305 comprises one electrode working as second electrode.
Figure 9c shows an illustration of the different effective areas of the interaction element 301 of Figure 9b, i.e. the presentation of the interaction element 301 from a user's view.
There are areas 307 working as touch buttons and there are areas 309 working as virtual borderlines, i.e. providing tactile feedback. Areas 307 and areas 309 alternate from left to right.
Consequently, during final operation the interaction element 201 with static borderlines is quite similar to the interaction element 301 in the first operation state. However, the number of electrodes may vary. Nevertheless, to a wide extent the explanations given above with respect to interaction element 201 apply mutatis mutandis also to the interaction element 301 in the first operation state.
Figure 9d shows an illustration of the interaction element 301 of Figure 9a in a second operation state. Here, interaction element 301 comprises a first plurality (i.e. five) of input area elements 303 and a second plurality (i.e. four) of output area elements 305. Each input area element 303 comprises four or three electrodes working as first electrodes and each output area element 305 comprises one electrode working as second electrode.
Figure 9e shows an illustration of the different effective areas of the interaction element 301 of Figure 9d, i.e. the presentation of the interaction element 301 from a user's view. The explanations given above with respect to Figure 9c apply here mutatis mutandis, too, and need not be repeated here.
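The interchangeable electrode roles of Figures 9a-e might be sketched as follows. This is a purely hypothetical illustration; the class and method names are not taken from the document, and a real touch controller would configure its front-end hardware rather than a Python list:

```python
# Hypothetical sketch of a touch controller managing a segmented electrode
# (names are illustrative, not taken from the patent).

class SegmentedElectrode:
    def __init__(self, num_segments):
        # Every segment can act as a first electrode ("input", touch sensing)
        # or as a second electrode ("output", electrostatic tactile feedback).
        self.roles = ["input"] * num_segments

    def apply_layout(self, border_positions):
        """Mark the given segment indices as virtual borderlines (outputs);
        all remaining segments stay touch-sensitive inputs."""
        self.roles = ["input"] * len(self.roles)
        for i in border_positions:
            self.roles[i] = "output"

    def layout(self):
        return list(self.roles)

# First operation state (Figure 9b): 23 segments, every fourth one a
# borderline, giving six three-segment buttons separated by five
# single-segment borderlines.
electrode = SegmentedElectrode(23)
electrode.apply_layout(range(3, 23, 4))
print(electrode.layout().count("input"))   # 18 sensing segments
print(electrode.layout().count("output"))  # 5 borderline segments
```

Switching to the second operation state (Figure 9d) is then merely a call to `apply_layout` with different border positions, which is why the borderlines are called dynamic.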
Consequently, the number of buttons and borderlines as well as their locations and extensions might be controlled in a quite efficient and easy way.

Figure 10 shows an illustration of an interaction element 401 according to the first aspect of the invention in an eighth embodiment. The interaction element 401 comprises a touch surface element having a common surface layer element 403 and a plurality of first and second electrodes 405 arranged below the surface layer element 403, hence realizing respective input area elements and output area elements. Especially, the electrodes are realized as transparent conductors. In addition, interaction element 401 comprises a light guide 407 and a light source 409 which couples light into the light guide 407. The lighting segment approach may be taken to improve sensory feedback on top of the haptic feedback of the respective button (i.e. output area element).
Figures 11a-c show further different realizations of an interaction element according to the first aspect of the invention. Especially, it can be taken from the examples that a lighting segment approach may be taken to improve sensory feedback on top of the haptic feedback of the respective button (i.e. output area element). In Figure 11a the output area elements are highlighted by rectangles. Figures 11b and 11c illustrate by way of example the possibility to design the input/output elements in a free-form manner. All the realizations have in common that they are designed as one with respect to a common surface, i.e. for example with respect to the first and/or second surface layer elements.
Figure 12a shows an illustration of a sensor matrix. In other words, Figure 12a shows a segmented representation of the surface of the interaction element and/or the touch surface element of Figures 3, 4a-d, 5a-d and 6c. Of course, the principle applies to any other surface geometry mutatis mutandis. According to Figure 12a, the surface is segmented into 18 rectangles addressed by columns y1 through y3 and rows x1 through x6. The segmentation might be achieved for example by a respective arrangement of first electrodes (for input) or second electrodes (for output). In Figure 12b a finger is touching the surface within the rectangle with address (x2; y2), see Figure 12a.
To realize the application of the interaction elements as described above, the touch controller needs to distinguish at least between, for example, the "finger-on" state and the "finger-push" state. This might be possible by evaluating the capacitive change ("DELTA C") value monitored during "finger-on" to detect "finger-push". Figure 12c shows the respective detection states. The content of this Figure might especially be read in conjunction with the gesture/touch event described with respect to Figures 4a-d and Figures 5a-d above.
This detection and evaluation preferably requires only a minimal touch sensor matrix (see Figures 12a-b) and allows a simple implementation, since e.g. a simple self-capacitive type sensor might be used.
Furthermore, detection and evaluation can also incorporate both the capacitive change ("DELTA C") on the target sensor and the relative sensor value change on neighboring sensors. In other words, if the user touches a certain area of a certain input/output area element, adjacent and/or neighboring input/output area elements also experience a capacitive change or a touch/gesture event.
Figures 13a-f show different plots of detection states of the segmented matrix of Figures 12a-b for "no touch" (i.e. baseline; Figures 13a-b), for "finger on" (Figures 13c-d) and for "finger push" (Figures 13e-f) in, respectively, a perspective view and a top view. It can be taken from these plots (see Figures 13c-f) that there is a change in both the peak value and the surrounding sensor values corresponding to a "finger-on" event and a "finger-push" event. The content of these plots might especially be read in conjunction with the gesture/touch event described with respect to Figures 4a-d and Figures 5a-d above.
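The detection logic described above — evaluating DELTA C on the target sensor, optionally corroborated by the relative change on neighboring sensors — might be sketched as follows. All threshold values and names are illustrative assumptions and are not specified in the document:

```python
# Illustrative classifier for touch states from capacitive change ("DELTA C")
# values of a self-capacitive sensor matrix. All thresholds are assumptions.

FINGER_ON_THRESHOLD = 10.0     # DELTA C above baseline -> at least "finger-on"
FINGER_PUSH_THRESHOLD = 25.0   # larger DELTA C on the target -> "finger-push"
NEIGHBOR_PUSH_THRESHOLD = 5.0  # a pressed finger also raises neighbor values

def classify_touch(delta_c_target, delta_c_neighbors):
    """Return 'no-touch', 'finger-on' or 'finger-push' for one target sensor.

    delta_c_target: capacitive change on the selected sensor.
    delta_c_neighbors: capacitive changes on adjacent sensors, used to
    corroborate a push (a pushing finger flattens and couples to neighbors).
    """
    if delta_c_target < FINGER_ON_THRESHOLD:
        return "no-touch"
    neighbor_mean = (sum(delta_c_neighbors) / len(delta_c_neighbors)
                     if delta_c_neighbors else 0.0)
    if (delta_c_target >= FINGER_PUSH_THRESHOLD
            and neighbor_mean >= NEIGHBOR_PUSH_THRESHOLD):
        return "finger-push"
    return "finger-on"

print(classify_touch(2.0, [0.1, 0.2]))   # no-touch (baseline, Figures 13a-b)
print(classify_touch(15.0, [1.0, 1.5]))  # finger-on (Figures 13c-d)
print(classify_touch(30.0, [6.0, 7.0]))  # finger-push (Figures 13e-f)
```

In practice the thresholds would be calibrated against the baseline of the concrete sensor matrix; the sketch only shows why evaluating both the peak and the surrounding values makes the two states separable.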
Figure 14a shows an illustration of an interaction element 501 according to the first aspect of the invention in a ninth embodiment in a first variant. On the left side of Figure 14a the entire interaction element 501 is shown in a perspective view. On the right side of Figure 14a a part of the interaction element 501 is shown in a frontal view.
Interaction element 501 comprises a touch surface element having seven input area elements 503 and six output area elements 505. The output area elements 505 in turn each comprise a protrusion 507 as tactile/haptic feedback element.
Figure 14b shows an illustration of an interaction element 601 according to the first aspect of the invention in the ninth embodiment in a second variant. On the left side of Figure 14b the entire interaction element 601 is shown in a perspective view. On the right side of Figure 14b a part of the interaction element 601 is shown in a frontal view. Interaction element 601 comprises seven input area elements 603 and six output area elements 605. The output area elements 605 in turn each comprise a recess 607 as tactile/haptic feedback element.
Of course, output area elements such as 505 and 605 are preferably suitable for static situations, i.e. where the output area elements do not move dynamically. Independent of the approach used for segmented tactile feedback to the finger (i.e. a mechanical approach or a surface haptic approach via electrostatic or ultrasonic means), both solutions may additionally use lighting segments for visual feedback, or the approaches may be used together.
Figure 14c shows an illustration of an interaction element according to the first aspect of the invention in a tenth embodiment. The interaction element shown in Figure 14c is quite similar to the interaction element shown in Figure 6c and described in more detail above. Therefore, features of interaction element of Figure 14c which functionally correspond as far as possible to those of interaction element of Figure 6c are provided with the same reference signs.
Further, due to the similarity, only differences between the two interaction elements need to be discussed. Insofar, the interaction element 101"' of Figure 14c comprises an output area element 107"' which represents an edge. In other words, the edge which separates the two input area elements 105'" from each other represents the output area element 107'".
The output area elements provide tactile feedback either by their mechanical shape, as shown in Figures 14a, 14b and 14c, or using active surface haptics such as surface electrostatics, as shown in Figures 7, 9 and 10, or ultrasonics, as shown in Figure 2.
Figure 15a shows an illustration of a control element 701 according to the second aspect of the invention in a first embodiment. This is an example of a control element 701 populated with two dies of interaction elements 703. All interaction elements 703 are non-moving (solid state) and have touch-sensitive surfaces as described above. These interaction elements 703 can be used for the functions roll-up (swipe-bottom + back action), automatic roll-up (swipe-bottom + back and push), roll-down (swipe rear + down action) and automatic roll-down (swipe rear + down and push action). In addition, a long touch to lock is proposed for the lock window operation.

Figure 15b shows an illustration of a control element 801 according to the second aspect of the invention in a second embodiment. The control element 801 is a multi-sensor type which can be regarded as a multifunction touch button object and which is grasped by a user's hand 803. In other words, control element 801 is a "central touch thing". This "thing" has the following optional features and advantages:
Multifunctional: same module for multiple applications (window glass movement, ventilation control, seat control and so on...) - selection based on touch methods.
Multilevel: several touch patterns.
Blind operation: natural 3D shape, surface texture, feedback mechanism.
Feedback: vibration, tactile & haptic, audio & visual confirmation, lighting.
No mechanical movement.
Physical: any shape, on-surface / in-mold touch & lighting, any surface (hidden till lit) - (same surface as other interior).
Placement: Door panel, Seat panel, Dashboard, Center console, Hand-held remote.
Multiple variant: with/without display/lighting.
These advantages also apply mutatis mutandis optionally also to all other control elements and interaction elements described above, where appropriate.
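The gesture-to-function mapping described above for control element 701 — a swipe plus a direction, optionally combined with a push or a long touch — could, purely as an illustration, be dispatched as in the following sketch. The gesture encodings and command names are assumptions introduced here, not definitions from the document:

```python
# Hypothetical dispatch table for the window functions of control element 701.
# Gesture tuples and command strings are illustrative only.

WINDOW_COMMANDS = {
    ("swipe-bottom", "back"): "roll-up",
    ("swipe-bottom", "back+push"): "automatic roll-up",
    ("swipe-rear", "down"): "roll-down",
    ("swipe-rear", "down+push"): "automatic roll-down",
    ("long-touch", None): "lock window",
}

def dispatch(gesture, action=None):
    """Map a recognized gesture/action pair to a window command.
    Unknown combinations are ignored rather than raising an error."""
    return WINDOW_COMMANDS.get((gesture, action), "ignored")

print(dispatch("swipe-bottom", "back"))        # roll-up
print(dispatch("swipe-rear", "down+push"))     # automatic roll-down
print(dispatch("long-touch"))                  # lock window
print(dispatch("tap"))                         # ignored
```

A table-driven dispatch of this kind matches the "multifunctional" idea above: the same module can serve ventilation or seat control simply by swapping the table.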
Even if it may not be stated every time explicitly in the explanations above, it is clear that the interaction elements are especially for receiving touch and/or gesture events inputs and/or providing tactile feedback outputs, that the input area elements are adapted to receive touch and/or gesture events inputs issued by a finger when interacting with at least one part of at least one surface of the respective input area element and that the output area elements are adapted to provide a tactile feedback output to the finger when interacting with at least one part of at least one surface of the respective output area element. It is further clear that especially all control elements are for controlling the operation of at least one function of at least one device. The features disclosed in the claims, the specification, and the drawings may be essential for different embodiments of the claimed invention, both separately and in any combination with each other.
Reference Signs
Panel
Button
Button
Surface layer element
Electrode
Finger
Substrate element
Substrate element
Layer
Finger
101, 101', 101'', 101''' Interaction element
103, 103', 103'', 103''' Touch surface element
105, 105', 105'', 105''' Input area element
107, 107', 107'', 107''' Output area element
109, 109', 109''' Finger
111 Haptic pattern
113 Haptic pattern
115''' Light guide
117''' Light source
119''' Printed wire board
121''' Tactile feedback device
201 Interaction element
203 Input area element
205 Output area element
207 Electrode
209 Electrode
211 Common surface layer element
213 Area
215 Area
217 Interaction element
219 Conductors
301 Interaction element
302a Electrode
302b Common surface layer element
303 Input area element
305 Output area element
307 Area
309 Area
401 Interaction element
403 Common surface layer element
405 Electrodes
407 Light guide
409 Light source
501 Interaction element
503 Input area element
505 Output area element
507 Protrusion
601 Interaction element
603 Input area element
605 Output area element
607 Recess
701 Control element
703 Interaction element
801 Control element
803 Hand

Claims
1. An interaction element (101, 101', 101”, 101'”, 201, 301, 401, 501, 601, 703), for receiving touch and/or gesture events inputs and/or providing tactile feedback outputs, comprising at least one touch surface element (103, 103', 103”, 103'”), the touch surface element (103, 103', 103”, 103'”) having
(a) at least one input area element (105, 105', 105”, 105'”, 203, 303, 503, 603) at least adapted to receive touch and/or gesture events inputs issued by at least one finger (109, 109', 109'”) when interacting with at least one part of at least one surface of the input area element (105, 105', 105”, 105'”, 203, 303, 503, 603) and/or the touch surface element (103, 103', 103”, 103'”); and
(b) at least one output area element (107, 107', 107”, 107'”, 205, 305, 505, 605) at least adapted to provide a tactile feedback output to the finger (109, 109', 109'”) when interacting with at least one part of at least one surface of the output area element (107, 107', 107”, 107”', 205, 305, 505, 605) and/or the touch surface element (103, 103', 103”, 103'”); wherein the input area element (105, 105', 105”, 105'”, 203, 303, 503, 603) and the output area element (107, 107', 107”, 107'”, 205, 305, 505, 605) are formed as one, characterized in that at least one, preferably all, input area element(s) can, especially by means of the touch controller, interchangeably and/or simultaneously also be adapted to provide a tactile feedback to the finger (109, 109', 109'”) when interacting with at least one part of at least one surface of the input area element (105, 105', 105”, 105”', 203, 303, 503, 603) and/or the touch surface element (103, 103', 103”, 103'”), especially by interchanging use of the first electrode(s) (207) as second electrode(s) (209).
2. The interaction element (101, 101', 101”, 101'”, 201, 301, 401, 501, 601, 703) according to claim 1, characterized in that the interaction element (101, 101', 101”, 101'”, 201, 301, 401, 501, 601, 703) comprises a first plurality of input area elements (105, 105', 105”, 105'”, 203, 303, 503, 603) and a second plurality of output area elements (107, 107', 107”, 107”', 205, 305, 505, 605), wherein at least two neighboring input area elements (105, 105', 105”, 105'”, 203, 303, 503, 603), especially all of each two neighboring input area elements (105, 105', 105”, 105'”, 203, 303, 503, 603), are at least in certain areas and/or at least partly separated from each other by at least one of the second plurality of output area elements (107, 107', 107”, 107'”, 205, 305, 505, 605).
3. The interaction element (101, 101', 101”, 101'”, 201, 301, 401, 501, 601, 703) according to any of the preceding claims characterized in that the input area element (105, 105', 105”, 105'”, 203, 303, 503, 603) comprises
at least one first surface layer element, especially the first surface layer element comprises and/or represents at least one insulator layer,
at least one first electrode (207) and/or
at least one first substrate element, preferably comprising at least one glass substrate,
wherein the first electrode (207) is arranged directly or indirectly below the first surface layer element and/or the first substrate element is arranged directly or indirectly below the first electrode (207),
especially the respective elements are sandwich-like arranged, especially with the first electrode (207) being arranged at least in certain areas and/or at least partly between the first surface layer element and the first substrate element.
4. The interaction element (101, 101', 101”, 101'”, 201, 301, 401, 501, 601, 703) according to any of the preceding claims characterized in that the output area element (107, 107', 107”, 107”', 205, 305, 505, 605) comprises
at least one second surface layer element, especially the second surface layer element comprises and/or represents at least one insulator layer,
at least one second electrode (209) and/or at least one second substrate element, preferably comprising at least one glass substrate,
wherein the second electrode (209) is arranged directly or indirectly below the second surface layer element and/or the second substrate element is arranged directly or indirectly below the second electrode (209),
especially the respective elements are sandwich-like arranged, especially with the second electrode (209) being arranged at least in certain areas and/or at least partly between the second surface layer element and the second substrate element.
5. The interaction element (101, 101', 101”, 101'”, 201, 301, 401, 501, 601, 703) according to any of the preceding claims characterized in that the tactile feedback
(a) represents a haptic pattern (111, 113) dependent on the gesture/touch event input or
(b) comprises increasing the surface friction of at least one portion of at least one surface of the output area element, especially of the second surface layer element, when applying a voltage to at least one part of the output area element, especially to the second electrode (209).
6. The interaction element (101, 101', 101”, 101'”, 201, 301, 401, 501, 601, 703) according to any of the preceding claims characterized in that the output area element (107, 107', 107”, 107'”, 205, 305, 505, 605) comprises at least one ultrasonic actuator, especially a plurality of ultrasonic actuators, coupled to or being in operative connection to the second substrate element, the second electrode (209) or the second surface layer element for building an air bearing layer adjacent to at least one portion of at least one surface of the output area element, especially of the second substrate element and/or second surface layer element, when the one or more ultrasonic actuator(s) is/are activated.
7. The interaction element (101, 101', 101”, 101'”, 201, 301, 401, 501, 601, 703) according to any of the preceding claims characterized in that the first and/or second surface layer element comprises at least one conductor, preferably a plurality of conductors, preferably for capacitive coupling, especially the one or more conductor(s) are arranged at least in sections and/or at least partly (a) within the second surface layer element, (b) parallel to at least one surface, especially the top and/or bottom surface, of the second surface layer element and/or (c) parallel to each other.
8. The interaction element (101, 101', 101”, 101'”, 201, 301, 401, 501, 601, 703) according to any of the preceding claims characterized in that
(a) at least some of the first and/or second surface layer elements, preferably all of the first and/or second surface layer elements, are formed as one common surface layer element (211, 302b, 403),
(b) at least some of the first and/or second substrate elements, preferably all of the first and/or second substrate elements, are formed as one common substrate element and/or
(c) at least some of the first and/or second electrodes, preferably all of the first and/or second electrodes, are formed as one common electrode, especially as at least one segmented electrode.
9. The interaction element (101, 101', 101”, 101'”, 201, 301, 401, 501, 601, 703) according to any of the preceding claims characterized in that the output area element, especially the second surface layer element, comprises and/or represents at least one edge, at least one bump, at least one protrusion (507), at least one recess (607) and/or at least one detent, whereby especially the output area element, especially the second surface layer element, can be manufactured using a printing process, an injection molding process, a heat forming process and/or a grinding process.
10. The interaction element (101, 101', 101”, 101'”, 201, 301, 401, 501, 601, 703) according to any of the preceding claims characterized in that at least one, preferably all, output area element(s) can, especially by means of at least one touch controller, interchangeably and/or simultaneously also be adapted to receive touch and/or gesture events issued by at least one finger (109, 109', 109'”) when interacting with at least one part of at least one surface of the output area element (107, 107', 107”, 107'”, 205, 305, 505, 605) and/or the touch surface element (103, 103', 103”, 103'”), especially by interchanging use of the second electrode(s) (209) as first electrode(s) (207).
11. The interaction element (101, 101', 101”, 101'”, 201, 301, 401, 501, 601, 703) according to any of the preceding claims characterized in that the interaction element (101, 101', 101”, 101'”, 201, 301, 401, 501, 601, 703) further comprises at least one light guide (115'”, 407), preferably for illuminating through the input area element(s) and/or output area element(s) from beneath, especially the light guide (115'”, 407) being arranged and/or extending at least in sections and/or in certain areas directly or indirectly below and/or parallel to the first and/or second electrode(s) (209) and/or below and/or parallel to the first and/or second substrate element(s).
12. The interaction element (101, 101', 101”, 101'”, 201, 301, 401, 501, 601, 703) according to any of the preceding claims characterized in that the interaction element (101, 101', 101”, 101'”, 201, 301, 401, 501, 601, 703) further comprises
(a) at least one light source (117'”, 409), especially adapted for coupling light into a light guide (407),
(b) at least one printed wire board (119'”), preferably the light source (117'”, 409) and/or at least one element of the group comprising first/second surface layer element, first/second electrode (207, 209) and first/second substrate element being mounted at least partly on the printed wire board (119'”), and/or
(c) at least one tactile feedback device (121'”) for generating a tactile feedback to the user interacting with the interaction element (101, 101', 101”, 101'”, 201, 301, 401, 501, 601, 703).
13. The interaction element (101, 101', 101”, 101'”, 201, 301, 401, 501, 601, 703) according to any of the preceding claims characterized in that the interaction element (101, 101', 101”, 101'”, 201, 301, 401, 501, 601, 703), especially the touch surface element (103, 103', 103”, 103'”), is designed as a free form element.
14. The interaction element (101, 101', 101”, 101'”, 201, 301, 401, 501, 601, 703) according to claim 13, characterized by having at least in certain areas a curved surface and/or having at least one first area extending in at least one first plane and at least one second area extending in at least one second plane.
15. The interaction element (101, 101', 101”, 101'”, 201, 301, 401, 501, 601, 703) according to claim 14, characterized in that the second plane is angled with respect to the first plane, preferably the angle is between 115° and 155°.
16. The interaction element (101, 101', 101”, 101'”, 201, 301, 401, 501, 601, 703) according to any of the preceding claims characterized in that the input area element (105, 105', 105”, 105'”, 203, 303, 503, 603) is designed such that a touch state can be detected, especially such that at least a finger-on state can be distinguished from a finger-push state,
by means of evaluation of the capacitive change value of the input area element (105, 105', 105”, 105'”, 203, 303, 503, 603) selected by a user during interaction of the user's finger (109, 109', 109'”) and said input area element (105, 105', 105”, 105'”, 203, 303, 503, 603) and/or
by means of evaluation of the capacitive change value of input area elements (105, 105', 105”, 105”', 203, 303, 503, 603) adjacent and/or neighboring to the selected input area element (105, 105', 105”, 105'”, 203, 303, 503, 603) during interaction of the user's finger (109, 109', 109'”) and the selected input area element (105, 105', 105”, 105'”, 203, 303, 503, 603) and the input area elements (105, 105', 105”, 105”', 203, 303, 503, 603) adjacent and/or neighboring to said input area element.
17. A control element (701, 801), for controlling the operation of at least one function of at least one device, comprising at least one or more, especially different, interaction element(s) (101, 101', 101”, 101”', 201, 301, 401, 501, 601, 703) of any one of the claims 1 to 16.
18. A motor vehicle comprising at least one or more, especially different, interaction element(s) (101, 101', 101”, 101'”, 201, 301, 401, 501, 601, 703) of any one of the claims 1 to 16 and/or at least one or more, especially different, control element(s) (701, 801) of claim 17.
PCT/EP2020/050140 2019-01-11 2020-01-06 Interaction element, control element and motor vehicle WO2020144138A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US17/420,897 US11822723B2 (en) 2019-01-11 2020-01-06 Interaction element, control element and motor vehicle
EP20700190.0A EP3908909A1 (en) 2019-01-11 2020-01-06 Interaction element, control element and motor vehicle

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DE102019100657.0 2019-01-11
DE102019100657 2019-01-11

Publications (1)

Publication Number Publication Date
WO2020144138A1 true WO2020144138A1 (en) 2020-07-16

Family

ID=69143598

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2020/050140 WO2020144138A1 (en) 2019-01-11 2020-01-06 Interaction element, control element and motor vehicle

Country Status (3)

Country Link
US (1) US11822723B2 (en)
EP (1) EP3908909A1 (en)
WO (1) WO2020144138A1 (en)


Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130106765A1 (en) * 2011-10-26 2013-05-02 Nokia Corporation Apparatus and Associated Methods
EP2778855A2 (en) * 2013-03-15 2014-09-17 Immersion Corporation User interface device
US9063610B2 (en) * 2012-08-23 2015-06-23 Lg Electronics Inc. Display device and method for controlling the same
EP2889727A1 (en) * 2013-12-31 2015-07-01 Immersion Corporation Friction augmented controls and method to convert buttons of touch control panels to friction augmented controls
DE102014008040A1 (en) 2014-05-28 2015-12-03 GM Global Technology Operations LLC (n. d. Gesetzen des Staates Delaware) Motor vehicle input / output device and method
DE202017101606U1 (en) 2016-04-05 2017-07-07 Google Inc. Computer equipment with wiping surfaces

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101529916B1 (en) * 2008-09-02 2015-06-18 엘지전자 주식회사 Portable terminal
CN211293787U (en) * 2018-08-24 2020-08-18 苹果公司 Electronic watch


Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102022116435A1 (en) 2022-06-30 2024-01-04 Audi Aktiengesellschaft System for controlling functions
DE102022116435B4 (en) 2022-06-30 2024-05-02 Audi Aktiengesellschaft System for controlling functions

Also Published As

Publication number Publication date
US11822723B2 (en) 2023-11-21
US20220066560A1 (en) 2022-03-03
EP3908909A1 (en) 2021-11-17

Similar Documents

Publication Publication Date Title
US7863822B2 (en) Operating element for a vehicle
EP2467760B1 (en) Keyless entry assembly having capacitance sensor operative for detecting objects
US9740324B2 (en) Vehicle accessory control interface having capacitive touch switches
EP3783802B1 (en) Control system for vehicle interior
CN104750309A (en) Friction augmented controls and method to convert buttons of touch control panels to friction augmented controls
US20120032915A1 (en) Operating element for actuation by a user and operating element module
EP3631609B1 (en) Capacitive sensor device
EP3017545A1 (en) Capacitive touch panel
EP3690612B1 (en) Vehicle interior panel with shape-changing surface
US10938391B2 (en) Device having a touch-sensitive free-form surface and method for the production thereof
WO2014143675A1 (en) Human machine interfaces for pressure sensitive control in a distracted operating environment and method of using the same
JP2016129135A (en) Multi-stage switch
US20140028614A1 (en) Portable terminal having input unit and method of driving the input unit
CN113002446A (en) Shape-changeable operating unit and method for controlling vehicle functions
US11822723B2 (en) Interaction element, control element and motor vehicle
CN106547349B (en) Touch sensitive device with haptic feedback
CN114555403A (en) Operating system for a vehicle, motor vehicle having an operating system, and method for operating an operating system for a vehicle
EP4025448A1 (en) Operating system for a vehicle and method for operating an operating system for a vehicle
WO2023157966A1 (en) Switch device and user interface device
KR102178052B1 (en) A key-pad with the hybrid switch
CN110709274B (en) Actuating element for a vehicle and method for actuating an element
US20240103674A1 (en) Human-machine interface device and vehicle comprising such a human-machine interface device
CN115599201A (en) Guiding device for guiding a hand of a user of a vehicle and method for operating a guiding device
EP3299939B1 (en) Input apparatus with touch sensitive input device
CN117178244A (en) Haptic feedback based on electroadhesion on three-dimensional surfaces of user controls

Legal Events

Date Code Title Description
121 EP: The EPO has been informed by WIPO that EP was designated in this application
    Ref document number: 20700190; Country of ref document: EP; Kind code of ref document: A1
NENP Non-entry into the national phase
    Ref country code: DE
ENP Entry into the national phase
    Ref document number: 2020700190; Country of ref document: EP; Effective date: 20210811