EP3947009A1 - Method and device for detecting a parameter value in a vehicle - Google Patents

Method and device for detecting a parameter value in a vehicle

Info

Publication number
EP3947009A1
Authority
EP
European Patent Office
Prior art keywords
gesture
actuation
parameter value
area
input gesture
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
EP20713201.0A
Other languages
German (de)
English (en)
Inventor
Dirk PAPENDIECK
Oliver Jungeblut
Stefan Brosig
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Volkswagen AG
Original Assignee
Volkswagen AG
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Volkswagen AG filed Critical Volkswagen AG
Publication of EP3947009A1 publication Critical patent/EP3947009A1/fr
Pending legal-status Critical Current

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K35/00Arrangement of adaptations of instruments
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60HARRANGEMENTS OF HEATING, COOLING, VENTILATING OR OTHER AIR-TREATING DEVICES SPECIALLY ADAPTED FOR PASSENGER OR GOODS SPACES OF VEHICLES
    • B60H1/00Heating, cooling or ventilating [HVAC] devices
    • B60H1/00642Control systems or circuits; Control members or indication devices for heating, cooling or ventilating devices
    • B60H1/00985Control systems or circuits characterised by display or indicating devices, e.g. voice simulators
    • B60K35/10
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0354Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
    • G06F3/03547Touch pads, in which fingers can move on a surface
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04847Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • HELECTRICITY
    • H03ELECTRONIC CIRCUITRY
    • H03KPULSE TECHNIQUE
    • H03K17/00Electronic switching or gating, i.e. not by contact-making and -breaking
    • H03K17/94Electronic switching or gating, i.e. not by contact-making and -breaking characterised by the way in which the control signals are generated
    • H03K17/96Touch switches
    • H03K17/962Capacitive touch switches
    • H03K17/9622Capacitive touch switches using a plurality of detectors, e.g. keyboard
    • B60K2360/111
    • B60K2360/115
    • B60K2360/119
    • B60K2360/139
    • B60K2360/1434
    • B60K2360/146
    • B60K2360/1472
    • B60K2360/332
    • B60K2360/34
    • B60K2360/345
    • HELECTRICITY
    • H03ELECTRONIC CIRCUITRY
    • H03KPULSE TECHNIQUE
    • H03K2217/00Indexing scheme related to electronic switching or gating, i.e. not by contact-making or -breaking covered by H03K17/00
    • H03K2217/94Indexing scheme related to electronic switching or gating, i.e. not by contact-making or -breaking covered by H03K17/00 characterised by the way in which the control signal is generated
    • H03K2217/94052Indexing scheme related to electronic switching or gating, i.e. not by contact-making or -breaking covered by H03K17/00 characterised by the way in which the control signal is generated with evaluation of actuation pattern or sequence, e.g. tapping
    • HELECTRICITY
    • H03ELECTRONIC CIRCUITRY
    • H03KPULSE TECHNIQUE
    • H03K2217/00Indexing scheme related to electronic switching or gating, i.e. not by contact-making or -breaking covered by H03K17/00
    • H03K2217/94Indexing scheme related to electronic switching or gating, i.e. not by contact-making or -breaking covered by H03K17/00 characterised by the way in which the control signal is generated
    • H03K2217/96Touch switches
    • H03K2217/96066Thumbwheel, potentiometer, scrollbar or slider simulation by touch switch
    • HELECTRICITY
    • H03ELECTRONIC CIRCUITRY
    • H03KPULSE TECHNIQUE
    • H03K2217/00Indexing scheme related to electronic switching or gating, i.e. not by contact-making or -breaking covered by H03K17/00
    • H03K2217/94Indexing scheme related to electronic switching or gating, i.e. not by contact-making or -breaking covered by H03K17/00 characterised by the way in which the control signal is generated
    • H03K2217/96Touch switches
    • H03K2217/9607Capacitive touch switches
    • H03K2217/960785Capacitive touch switches with illumination

Definitions

  • the present invention relates to a method and a device for detecting a parameter value in a vehicle.
  • In a vehicle, a large number of electronic devices are provided, the setting and operation of which must be made possible for the driver or another vehicle occupant.
  • These devices include, for example, an air conditioning unit through which air can be directed into certain areas of the vehicle interior and by means of which further elements such as seat heaters can also be controlled.
  • Other devices include a navigation system, driver assistance systems, and communication and multimedia units, such as a telephone system or devices for reproducing music and speech, such as a radio or CD player.
  • The set parameters must be easy for the user to grasp in order to be able to assess whether a currently set parameter value should be changed or whether suitable settings can be retained.
  • DE 10 2016 200 110 A1 discloses a device for operating a heating / air conditioning system of a means of transport, in which a finger groove is formed in an essentially flat surface, in which swiping gestures can be detected.
  • The present invention is based on the object of providing a method and a device of the type mentioned at the outset which enable operation that is as versatile and quick as possible.
  • In the method according to the invention, an input gesture is detected in a detection area and the detected input gesture is assigned to a first or a second gesture type. If the input gesture was assigned to the first gesture type, an actuation trajectory is determined based on the detected input gesture and the parameter value is changed by an amount dependent on the length of the actuation trajectory. If the input gesture was assigned to the second gesture type, an actuation position is determined and a predetermined parameter value assigned to that actuation position is set.
  • An input gesture of the first gesture type therefore changes the parameter value relative to the current value, while an absolute value is recorded for an input gesture of the second gesture type.
  • the method therefore advantageously combines the advantages of relative and absolute input options for parameter values.
  • With an input gesture of the first gesture type, the parameter value is changed only relative to the currently set parameter value. For example, a certain length of the actuation trajectory, for instance when swiping along a section of a slider element of a certain length, is assigned a certain amount by which the parameter value is increased or decreased.
  • With an input gesture of the second gesture type, the set parameter value is assigned directly to the actuation position. This means that an actuation, for example of a slider element, at a specific actuation position leads directly to the setting of a specific parameter value; there is thus an absolute correlation between the actuation position and the parameter value.
  • the parameter can be set particularly precisely by setting it using relative operation, for example by changing it in several small steps.
  • By selecting the appropriate input gesture, the user can quickly and easily decide which type of operation is most useful in the current situation.
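The two input modes described above can be sketched in code. The following is a minimal illustration, not the patent's implementation; the function names, value range, slider length and scaling factor are all assumptions:

```python
def apply_relative(current, trajectory_length, direction,
                   units_per_mm=0.1, lo=0.0, hi=10.0):
    """First gesture type: change the current value by an amount
    proportional to the length of the actuation trajectory; the sign
    of `direction` decides increase vs. decrease."""
    delta = trajectory_length * units_per_mm
    new = current + delta if direction >= 0 else current - delta
    return min(hi, max(lo, new))  # clamp to the parameter range


def apply_absolute(position, slider_length=100.0, lo=0.0, hi=10.0):
    """Second gesture type: the actuation position on the slider maps
    directly to a parameter value (absolute correlation)."""
    frac = min(1.0, max(0.0, position / slider_length))
    return lo + frac * (hi - lo)
```

Under these assumed units, a 20 mm swipe to the right would raise a value of 5.0 to 7.0, while holding a finger at the middle of the 100 mm slider would set the value directly to 5.0.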
  • an actuation object is used to carry out an input gesture, in particular the hand or a finger of the user.
  • In the following it is assumed that the user's finger is the actuation object; however, the statements apply analogously to other actuation objects.
  • In the context of this description, an “input gesture” means a certain position of the actuation object or a certain movement carried out with the actuation object.
  • Input gestures can be designed in a manner known per se. They include, in particular, tapping gestures, swiping gestures and holding gestures as well as combinations of several such gestures, possibly executed immediately one after the other. Further gestures, for example combined with a rotary movement, can also be provided.
  • the gestures are carried out in the detection area, which in particular comprises a surface of a detection unit.
  • gestures can be carried out in a detection area of practically any configuration and, if necessary, detected with different detection methods.
  • a gesture can also be recorded in three-dimensional space.
  • A gesture can also be detected with reference to a virtual operating object, for example a virtual button: the gesture is recorded in a spatial region in which the virtual operating object is generated, for example by means of a projection or by virtual-reality methods.
  • a position, a gesture trajectory, a direction and / or a speed of a gesture are recorded for this purpose.
  • The detection of the gesture can take place, for example, by optical or electronic detection methods, for instance using a laser.
  • the input gesture includes, for example, a touch at a specific position within the detection area, an actuation position and duration being detected until the touch is released. It can further comprise a swiping gesture, a movement from a start to an end position being carried out during the touch.
  • During the touch, the movement of the actuation object describes an actuation trajectory, that is, a chronologically ordered and coherent sequence of actuation positions.
  • the parameter value is then set according to the entry.
  • With an input gesture of the first gesture type, the currently set parameter value is changed; that is, a relative change is made.
  • a length is determined for the actuation trajectory, in particular the distance between the start and end position of an actuation, and a difference to the currently set parameter value is determined on the basis of the length.
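An actuation trajectory of this kind can be represented as an ordered list of touch positions, its length being the sum of the segment lengths. A small sketch (coordinates and units are illustrative):

```python
import math


def trajectory_length(points):
    """Length of an actuation trajectory given as a chronologically
    ordered sequence of (x, y) actuation positions."""
    return sum(math.dist(a, b) for a, b in zip(points, points[1:]))
```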
  • The parameter value can be increased or decreased by this amount, for example depending on the direction of the input gesture. For example, with a swipe from left to right the parameter value can be increased, and with a swipe in the opposite direction it can be decreased. Other direction-dependent changes, such as vertical or oblique, are also conceivable.
  • the user can easily determine how the parameter value should be changed.
  • different setting options for the parameter value can therefore be implemented.
  • With an input gesture of the second gesture type, a specific parameter value can be selected directly by selecting an actuation position, while an input gesture of the first gesture type effects a specific change of the parameter value.
  • the captured input gesture can also be assigned to a third gesture type, wherein, if the captured input gesture has been assigned to the third gesture type, an actuation position is determined based on the captured input gesture and the parameter value is changed by an increment determined based on the actuation position.
  • a fixed, predetermined increment is added to or subtracted from the current parameter value with each actuation by an input gesture of the third gesture type.
  • Different sub-areas of the detection area can be used for addition and subtraction, or for different amounts of the increment.
  • In this way, the user can perform different operations with the same control element.
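The incremental third gesture type could be sketched as follows, with a tap in the left half decrementing and a tap in the right half incrementing the value; the half-and-half split, step size and value range are assumptions:

```python
def apply_increment(current, position, width=100.0, step=1.0,
                    lo=0.0, hi=10.0):
    """Third gesture type: each tap changes the value by a fixed
    increment; the tapped sub-area selects the sign."""
    delta = -step if position < width / 2 else step
    return min(hi, max(lo, current + delta))  # clamp to the range
```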
  • the first type of gesture comprises a swiping gesture.
  • the parameter value can thus advantageously be set particularly precisely and easily, but nevertheless quickly.
  • a direction, a speed and / or a distance is determined on the basis of the swiping gesture and the parameter value is changed as a function of the determined direction, speed and / or the determined distance.
  • For this purpose, a start and an end position of the swiping gesture are determined, at which an actuation in the detection area begins and ends, for example by touching and later releasing the touch; the distance and/or a direction between these positions can be determined. Furthermore, the time between the start and end of the swiping gesture can be determined in order to then determine the speed of the swiping gesture.
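From the start and end samples of a touch, distance, direction and speed can be derived as just described. A sketch with assumed units (positions in mm, times in seconds):

```python
import math


def swipe_metrics(start, end, t_start, t_end):
    """Distance (mm), direction (degrees) and speed (mm/s) of a swipe
    derived from its start/end touch positions and times."""
    dx, dy = end[0] - start[0], end[1] - start[1]
    distance = math.hypot(dx, dy)
    direction = math.degrees(math.atan2(dy, dx))
    speed = distance / (t_end - t_start)
    return distance, direction, speed
```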
  • the parameter value can be increased in one direction and decreased in another direction, for example by changing the amount of the change based on the distance between the start and end position and a difference value per unit of length.
  • the parameter value is then changed, for example, by a greater amount, the greater the certain distance between the start and end position.
  • the difference value can be determined as a function of the speed, for example a larger difference value at a higher speed;
  • another gesture type can be defined in this way, for example a “swipe” at a higher speed, which can lead, for example, to a larger change in the amount of the parameter value per unit of length.
  • In particular, the change in the parameter value can depend on whether the speed of the swiping gesture exceeds a threshold value. A threshold value for the distance and/or the speed can be defined, the swiping gesture leading to a predetermined change in the parameter value when the threshold value is exceeded.
  • a swipe can automatically lead to the setting of a maximum or minimum parameter value.
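The distance- and speed-dependent behaviour, including a fast "flick" that jumps to the minimum or maximum, might look like the following sketch; the speed threshold, scaling factor and range are assumptions:

```python
def change_for_swipe(current, distance, duration,
                     units_per_mm=0.1, flick_speed=300.0,
                     lo=0.0, hi=10.0):
    """Change a parameter value for a swipe: the signed distance sets
    amount and direction; above the speed threshold the value jumps
    to the maximum (rightward flick) or minimum (leftward flick)."""
    speed = abs(distance) / duration
    if speed >= flick_speed:
        return hi if distance > 0 else lo
    return min(hi, max(lo, current + distance * units_per_mm))
```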
  • In one embodiment, the second gesture type comprises a tapping gesture. The parameter value can thus advantageously be set particularly precisely without the user's attention having to be focused on the operating process for a long time.
  • To distinguish between swiping and tapping gestures, in particular a distance between the start and end positions of the touch, or a maximum distance along the actuation trajectory, can be evaluated. For example, the distance can be compared with a threshold value: if the distance exceeds the threshold value, the input gesture is a swiping gesture; if it falls below it, it is a tapping gesture.
  • Furthermore, an input gesture can be assigned to the second or third gesture type depending on the duration of an actuation. For this purpose, a start time and an end time of a touch are detected. A contact duration can be determined on the basis of these points in time and compared with a threshold value. If the threshold value, for example 400 ms or 800 ms, is exceeded, the input is an input gesture of the second gesture type; if it is not reached, it is, for example, an input gesture of the third gesture type.
  • the second type of gesture corresponds to a hold or a so-called “long push” or “long press”, while the third type of gesture corresponds to a tap.
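Putting the two criteria together, the classification into the three gesture types can be sketched as below. The 5 mm movement threshold is an assumption; 400 ms is one of the example hold thresholds from the text:

```python
def classify_gesture(distance_mm, duration_ms,
                     move_threshold_mm=5.0, hold_threshold_ms=400):
    """Classify an input gesture: movement beyond the distance
    threshold is a swipe (first type); a long touch is a hold
    (second type); a short touch is a tap (third type)."""
    if distance_mm > move_threshold_mm:
        return "swipe"
    return "hold" if duration_ms >= hold_threshold_ms else "tap"
```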
  • an input gesture of the first gesture type can be carried out in the entire detection area and always results in the same change in the parameter value regardless of the specific location of the actuation. This means that only the length of the actuation trajectory is evaluated, but not its position within the detection area.
  • two sub-areas on the left and right in the detection area are designed in such a way that a tap, that is, an input gesture of the third gesture type, leads to an incremental decrease or increase in the parameter value.
  • Furthermore, the detection area is designed in such a way that holding the actuation, that is to say an input gesture of the second gesture type, leads to directly setting a specific parameter value: a sub-area arranged on the left sets a minimum parameter value, a sub-area arranged on the right sets a maximum parameter value, and a sub-area arranged in the middle sets a certain intermediate parameter value.
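The left/middle/right sub-area mapping for a hold could be sketched as follows; splitting into equal thirds and the concrete values are assumptions:

```python
def hold_value(position, width=100.0, lo=0.0, mid=5.0, hi=10.0):
    """Second gesture type on a slider: holding in the left third sets
    the minimum, the middle third a predetermined intermediate value,
    and the right third the maximum parameter value."""
    if position < width / 3:
        return lo
    if position < 2 * width / 3:
        return mid
    return hi
```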
  • the parameter value is changed in such a way that the parameter value assumes the next value in each case from an ordered series of setting values with each actuation.
  • This advantageously makes it possible to switch between predefined settings particularly quickly and easily. In particular, it is a so-called toggle switch.
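Toggle behaviour over an ordered series of setting values might be sketched as follows (the series itself is illustrative):

```python
def toggle_next(current, settings=("off", "low", "high")):
    """Each actuation advances to the next value of an ordered series
    of settings, wrapping around at the end (toggle switch)."""
    i = settings.index(current)
    return settings[(i + 1) % len(settings)]
```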
  • The switchover can also take place during a long hold, in particular switching to the next setting value after a certain time interval has elapsed.
  • In one embodiment, the detection area is a surface area of a touch-sensitive surface; such a surface can thereby advantageously be used particularly efficiently.
  • the detection area is designed in particular on a touch screen or in some other known manner.
  • capacitive or resistive sensors can be used to detect an actuation.
  • the detection area can be designed in a different way. For example, as a spatial area, in particular above a surface or in the vicinity of a control element.
  • In one embodiment, the detection area has a longitudinal extent and a transverse extent, the transverse extent running perpendicular to the longitudinal extent and the longitudinal extent being at least twice, preferably three times, the transverse extent. The detection area is thus elongated and therefore particularly well suited for detecting swiping gestures. It can be, for example, a slider element through which the functionality of an analog slider is implemented.
  • In one embodiment, acoustic feedback is generated when an input gesture is detected; the user can thereby recognize that the input has been registered. The acoustic feedback can alternatively or additionally also be generated when the parameter value is changed. It is generated in a manner known per se; different responses can be selected or generated dynamically, for example depending on the recognized gesture type, the set parameter value or other influencing variables.
  • The inventive device for detecting a parameter value in a vehicle comprises a detection unit which has a detection area and which is set up to detect an input gesture in the detection area, and a control unit which is set up to assign the detected input gesture to a first or a second gesture type.
  • The control unit is further set up, if the input gesture was assigned to the first gesture type, to determine an actuation trajectory based on the detected input gesture and to change the parameter value by an amount dependent on the length of the actuation trajectory, and, if the input gesture was assigned to the second gesture type, to determine an actuation position and to set a predetermined parameter value assigned to the actuation position.
  • the device according to the invention is designed in particular to implement the method according to the invention described above.
  • the device thus has the same advantages as the method according to the invention.
  • the device has a surface structure in the area of the touch-sensitive surface, in particular a depression or elevation.
  • a feeler aid is provided in this way, with the aid of which the user can detect the position and extent of the detection areas.
  • a tactilely detectable surface deformation can include a locally changed roughness.
  • a substantially punctiform elevation or depression can be formed on the surface, or an elongate depression or elevation can be provided.
  • a depression can also run in a straight line or along a curved line.
  • the device further comprises a sensor for detecting an approach to the touch-sensitive surface in a manner known per se. If such an approach is detected, the lighting elements can be controlled in such a way that they emit at least light of the basic intensity in order to display the position of the lighting elements to a user or to indicate an input option, for example.
  • the parameter value relates to a setting of a temperature, a fan or a media playback device of the vehicle.
  • Figure 1 shows a vehicle with an embodiment of the device according to the invention
  • FIG. 2 shows further details of the embodiment of the invention
  • Figures 3A to 3C show an embodiment of an output of a parameter value by means of a segment display
  • FIGS. 4A and 4B show an exemplary embodiment for setting parameter values by means of a slider element
  • FIGS. 5A to 5F show an exemplary embodiment for setting an air distribution by a fan
  • Figures 6A to 6C show a further exemplary embodiment for setting a parameter value.
  • a vehicle 1 comprises a detection unit 2 which is coupled to a control unit 3.
  • An air conditioning unit 4 is also coupled to the control unit 3.
  • the detection unit 2 has a surface facing a user in the vehicle 1. Different symbols are arranged on this surface, some of which can be backlit by light sources, in particular LEDs. There are also areas with luminous surfaces that are covered by a layer of paint so that the luminous surfaces are essentially only visible to the user when they are actually illuminated, while they are practically invisible when the luminous surfaces are not illuminated. In particular, a display designed as a so-called black panel is used.
  • The surface of the detection unit 2 can be flat; it can also be curved, for example. The detection unit 2 also comprises a film produced using the IML process (in-mold labeling) and back-injected with plastic. In the exemplary embodiment, it further includes sensor elements Sa, Sb, Sc, which are designed here as capacitive sensor elements.
  • the sensor elements Sa to Sc are arranged behind the surface of the detection unit 2 in such a way that they are not visible to the user.
  • the sensor elements Sa to Sc are designed in a manner known per se such that they can detect an actuation by an actuating element.
  • they each have a detection area which, for example, comprises an area of the surface of the detection unit 2 and / or a spatial area arranged above the surface.
  • a finger of the user can be used as an actuating element.
  • For example, the sensor elements Sa to Sc detect an actuation based on the entry of the actuating element into the detection area, based on a touch of a surface, based on its distance from a sensor element Sa to Sc, based on a movement in the detection area and/or based on the period of time during which the actuating element is detected. This actuation is then evaluated by the detection unit 2 and/or the control unit 3.
  • the sensor elements Sa to Sc are arranged equidistant from one another along a straight line.
  • a slide or slider element is implemented along this line. Its function is explained in detail below.
  • the detection unit 2 alternatively or additionally has touch-sensitive surface areas embodied in a different manner known per se. Through this, an actuation by an actuating element can be detected in a manner analogous to the mode of operation of the sensor elements Sa to Sc explained above.
  • When an actuation is detected, the detection unit 2 generates a control signal and transmits it to the control unit 3.
  • In this way, a parameter value can be set, whereby either the detection unit 2 itself already processes the input to such an extent that it assigns a specific parameter value to it, or the control unit 3 takes over this processing of the input or of the control signal generated by the detection unit 2.
  • The detection unit 2 comprises luminous elements La, Lb, Lc, Ld, Le, which are arranged adjoining one another in the form of a segment display along a linear direction of extent.
  • the lighting elements La to Le can be controlled by the control unit 3 independently of one another.
  • the air conditioning unit 4 is formed in a manner known per se and, in the exemplary embodiment, includes, inter alia, a heater for the vehicle 1, seat heaters for the driver and front passenger seats, a steering wheel heater, window heaters and a fan for introducing air into the interior of the vehicle 1, where the direction, distribution, intensity and temperature of the incoming air can be adjusted.
  • FIG. 2 shows a view of the surface of the detection unit 2 facing the user in the interior of the vehicle 1.
  • This surface is designed essentially as a horizontally stretched rectangle.
  • Button elements 101, 102, 103, 104, 105, 106, 107 are arranged next to one another in the upper area. In the exemplary embodiment, these are designed as touch-sensitive surface areas which can be actuated by touching an actuating element, in particular a finger of the user.
  • The individual button elements 101 to 107 are assigned touch-sensitive areas, indicated by dashed lines. Within these areas, luminous areas are also formed which, by activating an LED, can be illuminated with light of a certain intensity and/or color, for example in order to output the status, the activity or a setting of the function assigned to the respective button element 101 to 107.
  • The control of the detection unit 2 and, if necessary, the evaluation of signals acquired by the detection unit 2 is carried out by the control unit 3.
  • Button elements 108, 109, 110, 111 are also formed; by means of these, additional functions can be called up, activated or set.
  • a menu display on a display in the vehicle 1 can be called up by pressing the button element 110 “MENU”.
  • the air conditioning unit 4 can be switched off by pressing the button element 111 “OFF”.
  • the air conditioning system of the air conditioning unit 4 of the vehicle 1 can be activated by means of the button element 109 “A / C”
  • An automatic mode of the air-conditioning unit 4 can be activated using the button element 108 “AUTO”.
  • In other embodiments, the button elements 101 to 111 can be designed as mechanical switches, in particular pushbutton switches. In further embodiments, other functions can be provided alternatively or additionally.
  • Segment displays 115, 116 are also arranged on the surface; in the exemplary embodiment, these are suitable for outputting a two-digit temperature value with one decimal place. Furthermore, slider elements 112, 113, 114 for setting a temperature and a fan level are arranged here.
  • the respective adjustable functions of the air conditioning unit 4 are indicated by symbols on the surface.
  • the slider elements 112 to 114 each comprise a horizontal straight line of a certain length, along which a depression is formed on the surface of the detection unit 2. Behind it, covered by the surface, sensor elements Sa to Sc are indicated, through which a touch in the area of a slider element 112 to 114 can be detected, in particular a position of the touch and possibly a movement along the slider element 112 to 114 being detected.
  • The slider element assigned to the fan, that is to say the fan slider 112, can be illuminated segment by segment by lighting elements La to Le arranged behind it.
  • With reference to FIGS. 3A to 3C, an exemplary embodiment of an output of a set level of a fan of the air-conditioning unit 4 by means of a segment display is explained. The embodiment of the device explained above with reference to FIGS. 1 and 2 is assumed; the segment display is arranged in particular in the area of the fan slider 112, and the control is carried out by the control unit 3.
  • In the exemplary embodiment, seven luminous areas LED1 to LED7 are arranged next to one another along a straight line and can be controlled independently of one another.
  • The number of illuminated luminous areas LED1 to LED7 corresponds to the activated fan level; that is, as many levels are provided as there are luminous areas LED1 to LED7.
  • In the case shown in FIG. 3A, the fan of the air-conditioning unit 4 is deactivated.
  • The graph shows the intensity of the light emission on the Y-axis, while the individual luminous areas LED1 to LED7 are assigned positions along the X-axis. None of the luminous areas is lit, which is shown in the diagram of FIG. 3A by columns that are practically invisible.
  • In the cases shown in FIGS. 3B and 3C, the fan of the air-conditioning unit 4 is activated.
  • the graphs show the intensity of the light emitted by the illuminated areas LED1 to LED7 as a function of the position or of the respective illuminated area LED1 to LED7.
  • In FIGS. 3B and 3C, a night mode and a day mode of the segment display, respectively, are activated.
  • The first three luminous areas are controlled in such a way that they light up at 60% or 100% of a maximum intensity, while the remaining four luminous areas are controlled in such a way that they light up at only 10% or 20% of the maximum intensity. That is, in the exemplary embodiment, “overlighting” from a brightly illuminated luminous area into an adjacent unlit or less brightly illuminated luminous area is concealed in that all luminous areas are illuminated with at least a basic intensity; only the luminous areas actually used for the display are illuminated with the higher display intensity. All the luminous areas that are not used to indicate the fan level thus appear evenly illuminated, instead of being illuminated at different intensities due to overlighting.
  • In the exemplary embodiment, the first three LEDs LED1, LED2, LED3 are operated with a first current for 60% of the maximum intensity, the overlighting leading to a certain intensity already being emitted in the area of the two adjacent luminous areas LED4 and LED5.
  • the LEDs of these luminous areas LED4 and LED5 are therefore only operated with a lower current than the LEDs of luminous areas LED6 and LED7 positioned further away, in order to achieve a uniform basic intensity of the LEDs LED4 to LED7.
  • the directly adjacent luminous area LED4 is operated with 5% and the further adjacent luminous area LED5 with 7%, while the more distant luminous areas LED6 and LED7 are operated with 10%.
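The compensation described above can be sketched in a few lines. The percentages (60% display intensity, 10% base intensity, neighbour drive currents of 5% and 7%) are the example figures from the text; the function name, the per-neighbour spill values and the assumption that the active group sits at the left end (as in FIG. 3B) are illustrative choices, not part of the source.

```python
# Illustrative sketch of the overlighting compensation described above.
# Function name and the spill values received by the neighbours are
# assumptions; the percentages are the example figures from the text.
def drive_intensities(active=(0, 1, 2), display_pct=60, base_pct=10,
                      n_leds=7, spill_received=(5, 3)):
    """Per-LED drive percentage for the seven-segment fan display.

    LEDs in `active` are driven at the display intensity; the others are
    driven at the base intensity minus the light they already receive
    from the active group ("overlighting"), so that all inactive LEDs
    appear uniformly lit. Assumes the active group is left-aligned.
    """
    last_active = max(active)
    drive = []
    for i in range(n_leds):
        if i in active:
            drive.append(display_pct)
        else:
            dist = i - last_active  # 1 = directly adjacent luminous area
            spill = spill_received[dist - 1] if dist - 1 < len(spill_received) else 0
            drive.append(max(base_pct - spill, 0))
    return drive
```

With the defaults this reproduces the values from the text: LED4 is driven at 5%, LED5 at 7%, and LED6/LED7 at 10%.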
  • other light sources that in particular shine into the luminous areas LED1 to LED7 can be taken into account and compensated for by suitable control of the LEDs.
  • the intensities can be adjusted, for example, to the ambient brightness. Furthermore, values of basic intensity and display intensity assigned to one another can be firmly predetermined.
  • the basic intensity can, for example, be a certain fraction of the display intensity, or the basic intensity can be determined in some other way, for example using a physical model in which the intensity of an overlighting is determined as a function of the display intensity, and a basic intensity is then formed in such a way that the overlighting is concealed.
  • all the luminous areas are illuminated with a basic intensity that is determined, for example, on the basis of an ambient brightness.
  • the illuminated surfaces can be used as design elements and to display a level “0”. The user can then in particular recognize that a display for setting the fan is located in a certain area and / or that an operation for setting the fan can be performed in such an area.
  • the luminous areas can be used to display another parameter. They can also be used in connection with various slider elements 112, 113, 114 or other displays and controlled in the manner described.
  • a higher number of luminous areas can be used, in particular more luminous areas than there are settable levels.
  • with reference to FIGS. 4A and 4B, an exemplary embodiment for setting parameter values by means of a slider element is explained.
  • the embodiment of the device explained above with reference to FIGS. 1 and 2 is assumed.
  • the control takes place in particular by means of the control unit 3.
  • the method is explained using the slider element 112, which in the exemplary embodiment, as fan slider 112 of the detection unit 2, is assigned to the setting of a fan of the air-conditioning unit 4.
  • the method can also be used for other slider elements 112, 113, 114 and for recording other parameter values.
  • a fan symbol 112b is arranged for a maximum active state of the fan.
  • Light sources are arranged behind the surface, LEDs in the exemplary embodiment, by means of which both the line 112 and the symbols 112a, 112b can be illuminated.
  • the line 112 can also be illuminated as a segment display; that is to say, light sources are arranged behind it in a row next to one another in such a way that individual areas of the line 112 can be illuminated independently of one another with different intensities.
  • the set level of the fan is output according to the method explained above with reference to FIGS. 3A to 3C.
  • the symbols 112a, 112b and the line 112 can be permanently visibly printed on, or can be made visible by means of a black panel technique only when they are illuminated from behind.
  • touch-sensitive areas 149, 141a, 141b, 142a to 142i are indicated by dashed lines. In these areas, the sensor elements Sa to Sc detect an actuation by an actuation object, as already explained above, or the detection can take place in another way.
  • the surface of the detection unit 2 is touched by the actuating object in a touch-sensitive area.
  • the actuation object is located in a specific spatial area or at a position, for example just above the surface of the detection unit 2.
  • an actuation at a specific position is detected in that the sensor elements Sa to Sc detect signals of different strengths depending on the position of the actuating object.
  • the strength of a signal detected by a capacitive sensor depends on the distance from an actuating object entering a detection area.
  • the user touches the slider element 112 at any point or shifts the position of his movement along the slider element 112.
  • different signal strengths are thus detected by the sensor elements Sa to Sc.
  • the position is determined on the basis of these signal strengths and a parameter value is set as a function of the determined position.
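One plausible way to determine the position from the sensor signal strengths, as described above, is a signal-weighted centroid: each capacitive sensor's signal grows as the actuating object approaches it. The patent does not prescribe a particular estimator, so the following Python sketch (all names are my own) only illustrates the principle.

```python
def estimate_position(signal_strengths, sensor_positions):
    """Estimate the actuation position from capacitive signal strengths.

    A signal-weighted centroid of the sensor positions: sensors closer
    to the actuating object report stronger signals, so the weighted
    average approximates the touch position. Returns None when no
    signal is present (no actuation detected).
    """
    total = sum(signal_strengths)
    if total == 0:
        return None
    return sum(s * p for s, p in zip(signal_strengths, sensor_positions)) / total
```

This is how fewer sensors than detectable positions can still yield a fine-grained position, as the text notes.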
  • different spatial areas, in particular surface areas on the surface of the detection unit 2, can be defined as areas that are separate from one another in order to detect actuations therein.
  • these actuations can also be evaluated differently depending on the respective surface area.
  • a surface area can be configured as a button element with a specific response behavior, that is to say, for example, with certain threshold values for time intervals for actuation, or as a slider element with a different response behavior.
  • a coherent area can be formed within which the position of an actuation or an actuation trajectory is detected, or individual areas can be formed in which an actuation is detected when it is assigned to any position within these individual areas.
  • the set level is displayed by a segment display in the area of the slider element 112. This takes place in the manner explained above with reference to FIGS. 3A to 3C.
  • the slider element 112 is actuated at a position, this is illuminated at the corresponding position and the corresponding level of the fan is set.
  • the exemplary embodiment not only detects the area in which the actuating position is currently located; an approach to surrounding areas is also detected.
  • the user can move his finger along the slider element 112 and thereby approach an area that is assigned to a next stage of the fan.
  • the lighting element arranged in this area is illuminated with increasing intensity the closer the user gets to the area of the next step.
  • the user reaches the next area it is illuminated with the normal display intensity.
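The approach-dependent illumination described above can be sketched as a simple fade: the closer the finger gets to the next step's area, the brighter that area's lighting element, reaching the normal display intensity on arrival. The fade-in distance and all names below are assumptions for illustration, not values from the source.

```python
def approach_intensity(distance_mm, fade_start_mm=20.0, display_pct=100.0):
    """Intensity of the next step's lighting element during an approach.

    Beyond `fade_start_mm` the element stays dark; at distance 0 (the
    user has reached the area) it is lit with the full display
    intensity. All constants are illustrative assumptions.
    """
    if distance_mm >= fade_start_mm:
        return 0.0
    return display_pct * (1.0 - distance_mm / fade_start_mm)
```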
  • the parameter value in the exemplary embodiment of the method is detected by means of fewer sensors, particularly flexibly and / or with a higher resolution. While known methods provide at least one sensor element for each detectable position, in the method explained fewer sensors are used in a particularly space-saving, cost-efficient and simple manner.
  • a start position at which the contact begins, an end position at which the contact ends, and a trajectory along which the actuating object moves along the surface from the start to the end position can be detected.
  • the duration of the contact and / or the stay at a specific position can be recorded. If necessary, a direction and / or speed of a movement along the surface can also be determined and evaluated.
  • the following actuations can be defined for button or slider elements both in touch-sensitive surface areas and for mechanical switches.
  • the user can actuate the touch-sensitive area by “tapping”, with a time interval Δt being recorded between the beginning and the end of the touch that is shorter than a certain threshold value t0: Δt < t0.
  • the threshold value t0 can be, for example, 400 ms or 800 ms.
  • the point in time at which an actuation event is detected and, for example, a parameter value is changed, is here typically the point in time at which the touch is released.
  • an actuation event can also be recorded when the touch begins, in which case each touch already triggers a tap event.
  • Typical applications for tapping are, for example, switching a function on and off, changing parameter values incrementally or directly selecting a parameter value by tapping a button.
  • the user can also hold the touch at a specific position or in a specific surface area for a time interval Δt longer than a threshold value t1: Δt > t1.
  • the threshold value t1 can be, for example, 400 ms or 800 ms.
  • Such an operation can be referred to as “hold”, “long press” or “long push”.
  • a corresponding hold event can be triggered as soon as the held time interval Δt exceeds the threshold value t1 or when the contact is released.
  • further conditions can be defined, for example that the touch must be released at a certain position or in a certain area in order to trigger a hold event; in this case, the user can prevent the triggering by moving the actuating object to another area, for example to another button element.
  • a “multiple hold” can be carried out in that the touch of a first surface area lasts longer than a threshold value t1 and then changes over to a second surface area, which is then also touched for longer than the threshold value t1.
  • several pushbutton elements can be actuated without having to lift the actuating object.
  • a first button element is actuated with a “hold” gesture and then the user with the actuation object slides on to another button element without releasing the touch.
  • an actuation is recorded for a time interval Δt longer than a threshold value t0, and a renewed actuation is recorded for each multiple of the threshold value t0.
  • the user can trigger a multiple actuation by holding the actuation for a corresponding multiple of the threshold value t0.
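The duration-based classification above can be condensed into a small function. A minimal Python sketch, assuming the 400 ms example threshold and a (tap/hold, count) return format of my own choosing:

```python
def classify_touch(duration_ms, t0_ms=400):
    """Classify a touch by its duration Δt, as described above.

    Δt < t0  -> a single "tap".
    Δt >= t0 -> a "hold", with one actuation recorded for each multiple
    of t0 that the touch is held (the "multiple actuation").
    Threshold value and return format are illustrative assumptions.
    """
    if duration_ms < t0_ms:
        return ("tap", 1)
    return ("hold", duration_ms // t0_ms)
```

A touch held for 1300 ms with t0 = 400 ms would thus trigger three actuations.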
  • a “wiping” can be detected as an actuation, for example when the actuation object stays in a surface area for a time interval Δt shorter than a threshold value t0 and then actuates an adjacent surface area.
  • the actuation can then be detected for the adjacent surface area, for example, when the touch is released, with particular consideration being given to whether the user touched the adjacent surface area for a time interval Δt shorter than the threshold value t0 or whether, for example, a hold is carried out here.
  • a “swipe” can be detected as an actuation, the position of the contact moving from a first to a second surface area and, in the process, in particular further surface areas being crossed.
  • the speed at which the position changes is also taken into account and, for example, a parameter value can be changed more quickly than when wiping.
  • a “swipe” can in particular be detected when the speed of the swiping gesture exceeds a threshold value.
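The wipe/swipe distinction above reduces to a speed threshold on a movement between surface areas. A sketch under assumed units and threshold (the source gives neither), with all names my own:

```python
def classify_movement(start_area, end_area, speed_px_per_s,
                      swipe_speed_threshold=500.0):
    """Distinguish the movement gestures described above.

    A movement between different surface areas is a "wipe"; if its
    speed exceeds a threshold, it is detected as the faster "swipe",
    which may change the parameter value more quickly. Threshold value
    and units are illustrative assumptions.
    """
    if start_area == end_area:
        return "none"
    if speed_px_per_s > swipe_speed_threshold:
        return "swipe"
    return "wipe"
```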
  • the fan slider is defined as a contiguous, active slider area 149 which extends over the entire length of the line 112 and the adjacent symbols 112a, 112b.
  • the positions of touches and actuations can be detected within this active slider area 149. If the actuation object moves along the longitudinal extent of the fan slider 112, the position of the contact, in particular during a swiping gesture, is continuously detected and the set level of the fan follows this position. For example, a lowest level is assigned to the left area of the fan slider 112 or the left fan symbol 112a, while a highest level is assigned to the right area of the fan slider 112 or the right fan symbol 112b. In between, the further levels are distributed over the length of the fan slider 112.
  • a level is set when the operating object reaches a position assigned to this level.
  • the step is not set until the touch is released, the step then being set which is assigned to the position when the touch is released.
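The position-to-level mapping described above can be sketched as an even division of the slider length into level bands. The number of levels (seven, matching the luminous areas) comes from the text; the linear mapping itself and all names are illustrative assumptions:

```python
def fan_level_for_position(x, slider_length, n_levels=7):
    """Map a touch position along the fan slider to a fan level.

    The lowest level (1) is assigned to the left end, the highest
    (n_levels) to the right end, and the remaining levels are spread
    evenly in between. The even division is an assumption.
    """
    x = min(max(x, 0.0), slider_length)       # clamp to the slider
    level = int(x / slider_length * n_levels) + 1
    return min(level, n_levels)               # right edge maps to max level
```

Whether the level is applied continuously while the finger moves, or only when the touch is released, is then a policy choice on top of this mapping, as the text describes.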
  • touch-sensitive surface areas 142a to 142i are formed. These are operated by tapping, so that the user can set the fan level directly by selecting one of the touch-sensitive surface areas 142a to 142i.
  • the two outermost left 142a, 142b and right 142h, 142i touch-sensitive surface areas are also combined to form enlarged actuation areas 141a, 141b.
  • by touching one of these enlarged actuation areas 141a, 141b, the user can increase or decrease the level of the fan incrementally.
  • by continuously holding in one of the enlarged actuation areas 141a, 141b, the level is gradually increased or decreased, depending on the time interval during which the contact is held.
  • the enlargement of the actuation areas 141a, 141b restricts the possibilities for the direct selection of a level of the fan insofar as the lowest and highest levels cannot be selected directly. Instead, these steps are only achieved in that, starting from the next adjacent step, the corresponding enlarged actuation area 141a, 141b is tapped again or maintained permanently.
  • the configurations shown in FIGS. 4A and 4B are not to be understood as static configurations of the detection unit 2 in the exemplary embodiment. Rather, the detection unit 2 switches dynamically between the configurations.
  • the pushbutton elements 108 to 111, which are arranged adjacent to the slider elements 112, 113, 114, are blocked for the detection of an actuation after a swiping gesture in the area of one of the slider elements 112, 113, 114 has been detected. This prevents the user from accidentally actuating one of the pushbutton elements 108 to 111 if he continues a movement of the actuating object beyond the area of a slider element 112, 113, 114 with a swiping gesture.
  • the blocking of touch-sensitive surfaces or switching elements is carried out for a specific blocking time interval.
  • this blocking time interval begins in particular at the point in time at which the contact with the slider element 112, 113, 114 is ended, and it can in particular be determined dynamically, for example based on the speed of the swiping gesture and/or a driving speed of vehicle 1.
  • alternatively or additionally, a distance from a slider element 112, 113, 114 is defined, within which no actuation is detected in the blocking time interval. This distance can also be determined dynamically, for example on the basis of the speed of the swiping gesture, a longitudinal extent of the slider element 112, 113, 114 and/or on the basis of the driving speed of the vehicle 1. In particular, a surface area can be defined which continues a longitudinal extension of the slider element 112, 113, 114; for example, an area above or below a horizontally running slider element 112, 113, 114 is not blocked, while laterally adjoining surface areas are blocked during the blocking time interval.
  • the adjacent pushbutton elements 108 to 111 are only blocked when it has been detected that a swipe gesture has been carried out at least as far as a lateral end of the slider element 112, 113, 114 or beyond.
  • the blocking of certain surface areas of the detection unit 2 can be triggered by events other than a wiping gesture, for example by any actuation in a certain surface area.
  • the user first presses a first key, then changes his selection and slides to another key.
  • the actuation is only detected when the user releases the touch. Only then is the blocking time interval triggered.
  • the user can slide from a first to a second key without releasing the touch and press the second key when lifting his finger. Only then does the blocking time interval begin and no other key can be pressed.
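The dynamically determined blocking time interval described above can be sketched as a simple linear function of the gesture speed and the driving speed. All constants, units and names below are illustrative assumptions; the source only states that the interval may depend on these quantities:

```python
def blocking_interval_ms(swipe_speed, vehicle_speed_kmh,
                         base_ms=300.0, speed_gain=0.5, vehicle_gain=2.0):
    """Blocking time interval for adjacent pushbutton elements.

    After a swiping gesture on a slider, adjacent buttons are blocked
    for an interval that grows with the speed of the swiping gesture
    and with the driving speed of the vehicle. The linear form and all
    constants are assumptions for illustration.
    """
    return base_ms + speed_gain * swipe_speed + vehicle_gain * vehicle_speed_kmh
```

A faster swipe, or a higher driving speed (where accidental touches are more likely), thus blocks the neighbouring buttons for longer.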
  • audible feedback is generated when an input gesture or a gesture type is recognized.
  • the acoustic feedback is generated when a control signal is generated on the basis of a recorded input. This enables the user to recognize whether his input has been accepted.
  • the acoustic feedback can alternatively or additionally also be generated when the parameter value is changed. It is generated in a manner known per se, with different responses being able to be output; these can be formed dynamically, for example on the basis of a recognized gesture type, a set parameter value or other influencing variables, for example by forming a pitch depending on the set parameter value.
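One plausible realisation of "forming a pitch depending on the set parameter value" is a linear mapping of the value onto a frequency range. The frequency range and all names in this sketch are assumptions, not values from the source:

```python
def feedback_pitch_hz(value, v_min, v_max, f_min=400.0, f_max=1200.0):
    """Dynamically formed acoustic feedback pitch.

    Linearly maps the set parameter value onto a pitch range, so that
    e.g. a higher fan level or temperature produces a higher tone.
    Frequency range is an illustrative assumption.
    """
    frac = (value - v_min) / (v_max - v_min)
    return f_min + frac * (f_max - f_min)
```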
  • the surface of the detection unit 2 shown in FIG. 2 comprises a button element 104, with which a distribution of the air introduced into the interior of the vehicle 1 by a fan of the air-conditioning unit 4 can be adjusted.
  • a display of the set distribution is also output in the area of this button element 104.
  • the distribution is output by arrows 132, 133, 134, which are arranged at different heights relative to a passenger representation 131.
  • the arrows 132, 133, 134 are formed by illuminated areas that can be illuminated independently of one another, while the passenger representation 131 is printed on the surface and is therefore permanently visible.
  • the arrows 132, 133, 134 are arranged approximately at the level of a head, torso or foot area of the passenger representation 131.
  • the button element 104 is used as a “toggle” switch. That is, there is a fixed sequence of different settings, and each time the button element 104 is actuated, the following setting in the sequence is set. When the last setting is reached, there is a jump to the first setting in the sequence, in particular in the manner of a periodic boundary condition.
  • in further embodiments, the sequence can be reversed when the last setting is reached, in particular in the manner of a reflective boundary condition.
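The two boundary conditions above can be sketched as a small state update for the toggle switch. The function signature and return format are my own; only the periodic/reflective behaviour comes from the text:

```python
def next_setting(index, n_settings, mode="periodic", direction=1):
    """Advance a "toggle" switch through a fixed sequence of settings.

    With a periodic boundary the sequence wraps from the last setting
    back to the first; with a reflective boundary the direction is
    reversed at either end. Returns the new (index, direction).
    """
    if mode == "periodic":
        return (index + 1) % n_settings, 1
    nxt = index + direction
    if nxt < 0 or nxt >= n_settings:
        direction = -direction          # bounce off the boundary
        nxt = index + direction
    return nxt, direction
```

With six air-distribution settings (FIGS. 5A to 5F), the periodic mode jumps from 5F back to 5A, while the reflective mode walks back down through 5E, 5D, and so on.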
  • FIG. 5A air is introduced in the upper region of the vehicle interior.
  • FIG. 5B an introduction also takes place in the foot area of the interior.
  • FIG. 5C the air flows only into the foot area.
  • FIG. 5D the air flows into the head, torso and foot area of the vehicle interior, while in the case of FIG. 5E it is introduced into the torso and foot area. Finally, in the case of FIG. 5F, the air is introduced in such a way that it encounters a passenger in vehicle 1, for example in the torso area.
  • the air distributions can be arranged in a different order or formed in a different manner.
  • a further exemplary embodiment for setting a parameter value by means of a slider element is explained with reference to FIGS. 6A to 6C. This is based on the exemplary embodiments explained above.
  • the temperature slider 113 comprises a horizontal straight line 113, at the ends of which temperature symbols 113a, 113b are arranged. In the exemplary embodiment, these are colored blue on the left-hand side and red on the right-hand side in order to symbolize low and high temperatures, respectively. Active slider areas 150, 150a, 150b, 150c are indicated by dashed lines, analogously to FIGS. 2, 4A and 4B.
  • the active slider area 150 extends over the entire length of the line 113 and a narrow area in its vicinity.
  • the user can change the value of the temperature parameter for the air-conditioning unit 4 by a swiping gesture.
  • the set temperature is increased when a wiping gesture directed to the right is detected.
  • the set temperature is lowered if a swiping gesture directed to the left is detected.
  • the difference by which the temperature is changed depends in the exemplary embodiment on the actuation trajectory along which the swiping gesture is carried out.
  • the temperature can be increased or decreased by a certain interval, here up to 4 ° C.
  • the slider element 113 represents a relative scale for the relative change in the set temperature.
  • a swipe gesture is also provided, for which a speed is detected that exceeds a specific threshold value. If such a swipe gesture is detected, the temperature parameter can be changed more quickly, for example by jumping to a maximum or minimum temperature or by changing by a larger interval, approximately twice the interval provided for a wiping gesture, that is to say 8 °C.
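The relative scale described above can be sketched as a change proportional to the length of the actuation trajectory, capped at the ±4 °C interval from the text and doubled for a fast swipe. The proportional mapping itself and all names are illustrative assumptions:

```python
def temperature_delta(trajectory_len, slider_len, fast=False,
                      max_delta=4.0, fast_factor=2.0):
    """Relative temperature change for a wiping/swiping gesture.

    The slider acts as a relative scale: the change grows with the
    length of the actuation trajectory, up to +/-4 deg C for a wipe over
    the full slider length, and twice that (8 deg C) for a fast "swipe".
    A negative trajectory length denotes a leftward gesture. The linear
    mapping is an assumption.
    """
    delta = max_delta * trajectory_len / slider_len
    delta = max(-max_delta, min(max_delta, delta))   # clamp to the interval
    return delta * (fast_factor if fast else 1.0)
```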
  • active slider areas 150a, 150b are formed which contain the left and right temperature symbols 113a, 113b and a left or right portion of the line 113, respectively.
  • the active slider areas 150a, 150b can be actuated here by tapping, holding or permanent holding, whereby the set temperature parameter is changed incrementally.
  • the temperature can be decreased by 0.5 ° C. with each tap in the left area 150a and increased by 0.5 ° C. with each tap in the right area 150b.
  • the increase takes place in several steps one after the other, the size of the steps being able to be determined, for example, as a function of the duration of the holding, so that, for example, after holding for a certain time, the parameter is changed in steps of 1 ° C to enable faster changes.
  • active slider regions 150a, 150b similar to those in the case shown in FIG. 6B are formed.
  • a central active slider area 150c is now provided, which is arranged between the two lateral active slider areas 150a, 150b.
  • the middle active slider region 150c extends over approximately 20% of the length of the line 113, while the two lateral active slider regions 150a, 150b to the right and left thereof each take up approximately 40% of the length.
  • the user can set a minimum, maximum or average parameter value directly.
  • a hold in which a touch in one of the active slider areas 150a, 150b, 150c is held longer than a certain threshold value, is detected as an actuation and evaluated as a direct selection.
  • a minimum parameter value “LO” is set directly for the temperature if the hold gesture was detected in the left active slider area 150a.
  • a maximum parameter value “HI” is set in the case of a hold gesture in the right active slider area 150b, and a predetermined parameter value of 22 °C is set for a holding gesture in the middle active slider area 150c.
  • other parameter values can be selected directly.
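The direct selection described above is a simple mapping from the hold-gesture area to a target value. The area labels below are my own names for the areas 150a, 150c, 150b; the values "LO", 22 °C and "HI" come from the text:

```python
def direct_selection(area):
    """Direct selection by a hold gesture in one of the three areas.

    Maps the left, middle and right active slider areas (150a, 150c,
    150b) to the minimum "LO", the predetermined 22 deg C, and the
    maximum "HI". Area labels are illustrative.
    """
    return {"left": "LO", "middle": 22.0, "right": "HI"}[area]
```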
  • different areas can be provided as active slider areas 150a, 150b, 150c, for example in different numbers or with different dimensions.
  • other actuations can be provided for the direct selection, for example tapping an active slider area 150, 150a, 150b, 150c, in particular with several fingers at the same time.
  • further gestures can be provided, for example a simultaneous actuation of the outer active slider areas 150a, 150b or an actuation of a slider area 150, 150a, 150b, 150c with several fingers.
  • certain gestures can also be used to call certain functions or to set certain parameters. For example, a “SYNC” mode of the air-conditioning unit 4 of the vehicle 1 can be activated, with the same settings being set for different areas of the interior of the vehicle 1, for example for the driver and front passenger areas.
  • the configurations and arrangements of active slider areas 150, 150a, 150b, 150c shown in FIGS. 6A to 6C can be designed as static configurations of the detection unit 2. However, it is provided in the exemplary embodiment that the detection unit 2 changes dynamically between the configurations, depending on how the temperature slider 113 was operated, that is to say which use case was detected. If an actuation by means of a wipe or swipe gesture is detected, it is evaluated in accordance with an input in the configuration shown in FIG. 6A with a narrow active slider area 150. If, on the other hand, a tapping, holding or permanent holding gesture is detected, it is possible to automatically switch to detection in accordance with a configuration of FIG. 6B or 6C, in particular to enable direct selection.
  • the configurations mentioned can be combined or designed in other ways.
  • the vehicle 1 alternatively or additionally comprises another device for which a parameter value is detected by means of the detection unit 2.
  • Such other devices can relate to media playback or a navigation system of vehicle 1, for example.
  • the inputs are then detected analogously to the exemplary embodiments of the detection unit 2 and the method for inputting a parameter value explained above.
  • the detection unit 2 can comprise different operating elements, which can also be arranged differently.
  • slider elements 112, 113, 114 can also run vertically or in other spatial directions instead of horizontally.
  • control takes place in the exemplary embodiments by the control unit 3.
  • the control unit 3 takes over the control and/or evaluates detected inputs, preprocesses them and generates a control signal, for example in order to set the parameter value.

Abstract

In the method for detecting a parameter value in a vehicle, an input gesture is detected in a detection area. The detected input gesture is assigned to a first or a second gesture type. If the detected input gesture has been assigned to the first gesture type, an actuation trajectory is determined on the basis of the detected input gesture and the parameter value is changed by an amount that depends on the length of the actuation trajectory. If the detected input gesture has been assigned to the second gesture type, an actuation position is determined on the basis of the detected input gesture and a predetermined parameter value associated with the actuation position is set. The device for detecting a parameter value in a vehicle comprises a detection unit (2) which has a detection area and is designed to detect an input gesture in the detection area, and a control unit (3) which is designed to assign the detected input gesture to the first or the second gesture type. The device is designed to carry out the method.
EP20713201.0A 2019-03-25 2020-03-11 Procédé et dispositif de détection d'une valeur de paramètre dans un véhicule Pending EP3947009A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DE102019204051.9A DE102019204051A1 (de) 2019-03-25 2019-03-25 Verfahren und Vorrichtung zum Erfassen eines Parameterwerts in einem Fahrzeug
PCT/EP2020/056537 WO2020193144A1 (fr) 2019-03-25 2020-03-11 Procédé et dispositif de détection d'une valeur de paramètre dans un véhicule

Publications (1)

Publication Number Publication Date
EP3947009A1 true EP3947009A1 (fr) 2022-02-09

Family

ID=69941327

Family Applications (1)

Application Number Title Priority Date Filing Date
EP20713201.0A Pending EP3947009A1 (fr) 2019-03-25 2020-03-11 Procédé et dispositif de détection d'une valeur de paramètre dans un véhicule

Country Status (4)

Country Link
EP (1) EP3947009A1 (fr)
CN (1) CN113573936A (fr)
DE (1) DE102019204051A1 (fr)
WO (1) WO2020193144A1 (fr)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102019204047A1 (de) * 2019-03-25 2020-10-01 Volkswagen Aktiengesellschaft Verfahren und Vorrichtung zum Einstellen eines Parameterwerts in einem Fahrzeug

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9079498B2 (en) * 2009-10-05 2015-07-14 Tesla Motors, Inc. Morphing vehicle user interface
US20130275924A1 (en) * 2012-04-16 2013-10-17 Nuance Communications, Inc. Low-attention gestural user interface
DE102013215905A1 (de) * 2013-08-12 2015-02-12 Volkswagen Aktiengesellschaft Bedienvorrichtung mit berührungsempfindlicher Oberfläche
CN105916720B (zh) * 2014-01-20 2019-06-14 大众汽车有限公司 用户界面和用于借助触敏的显示单元控制音量的方法
DE102014226760A1 (de) * 2014-12-22 2016-06-23 Volkswagen Aktiengesellschaft Infotainmentsystem, Fortbewegungsmittel und Vorrichtung zur Bedienung eines Infotainmentsystems eines Fortbewegungsmittels
DE102015200036A1 (de) * 2015-01-05 2016-07-07 Volkswagen Aktiengesellschaft Verfahren und Vorrichtung in einem Kraftfahrzeug zur Eingabe von Daten mit zwei Eingabearten und haptischer Rückkopplung
DE102016200110A1 (de) 2016-01-07 2017-07-13 Volkswagen Aktiengesellschaft Armaturentafel, Fortbewegungsmittel und Vorrichtung zur Bedienung eines Heiz-Klima-Systems eines Fortbewegungsmittels
DE102016207611A1 (de) * 2016-05-03 2017-11-09 Volkswagen Aktiengesellschaft Anzeige- und Bediensystem
JP6902340B2 (ja) * 2016-09-01 2021-07-14 株式会社デンソーテン 入力装置、プログラムおよび検出方法

Also Published As

Publication number Publication date
CN113573936A (zh) 2021-10-29
WO2020193144A1 (fr) 2020-10-01
DE102019204051A1 (de) 2020-10-01


Legal Events

Date Code Title Description
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: UNKNOWN

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE

PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

17P Request for examination filed

Effective date: 20211025

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

DAV Request for validation of the european patent (deleted)
DAX Request for extension of the european patent (deleted)