EP3479201A1 - Control method and control interface for a motor vehicle - Google Patents

Control method and control interface for a motor vehicle

Info

Publication number
EP3479201A1
Authority
EP
European Patent Office
Prior art keywords
pointing element
sensory feedback
target boundary
touch surface
displacement
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
EP17732477.9A
Other languages
German (de)
French (fr)
Inventor
Jean-Marc Tissot
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Dav SA
Original Assignee
Dav SA
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Dav SA filed Critical Dav SA
Publication of EP3479201A1 publication Critical patent/EP3479201A1/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/016Input arrangements with force or tactile feedback as computer generated output to the user
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0354Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
    • G06F3/03547Touch pads, in which fingers can move on a surface
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/16Sound input; Sound output
    • G06F3/167Audio in a user interface, e.g. using voice commands for navigating, audio feedback
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K35/00Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
    • B60K35/10Input arrangements, i.e. from user to vehicle, associated with vehicle functions or specially adapted therefor
    • B60K35/20Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor
    • B60K35/21Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor using visual output, e.g. blinking lights or matrix displays
    • B60K35/22Display screens
    • B60K35/25Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor using haptic output
    • B60K35/26Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor using acoustic output
    • B60K35/50Instruments characterised by their means of attachment to or integration in the vehicle
    • B60K35/60Instruments characterised by their location or relative disposition in or on vehicles
    • B60K35/80Arrangements for controlling instruments
    • B60K35/90Calibration of instruments, e.g. setting initial or reference parameters; Testing of instruments, e.g. detecting malfunction
    • B60K2360/00Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
    • B60K2360/141Activation of instrument input devices by approaching fingers or pens
    • B60K2360/143Touch sensitive instrument input devices
    • B60K2360/1438Touch screens
    • B60K2360/55Remote control arrangements
    • B60K2360/56Remote control arrangements using mobile devices
    • B60K2360/566Mobile devices displaying vehicle information
    • B60K2360/573Mobile devices controlling vehicle functions

Definitions

  • the present invention relates to a control method in particular for a motor vehicle for generating at least one sensory feedback to a user via a control interface such as a human-machine interface.
  • the invention also relates to an interface configured for implementing at least some steps of such a control method.
  • the car has become a real living space, perceived as a personal and interconnected communication center, with, for example, an MP3 player, GPS, and connection to mobile phones.
  • the introduction of these new features results in an increase in the number of buttons on the dashboard of a car cockpit.
  • the number of buttons cannot be increased indefinitely, in particular because of the complexity generated, the limited space, accessibility, and cognitive load.
  • the interaction of the driver with the on-board systems in the car can result in an attention-overload situation in which the driver may not be able to handle all the information of the driving task at best, resulting in errors and longer detection delays.
  • moreover, buttons are not customizable.
  • touch screens have three other major advantages: they allow direct interaction (co-location of display and input), they are flexible (the display can easily be configured for any number of functions), and they are intuitive (a familiar interaction method, such as pointing).
  • unlike a push button, when the driver interacts with a touch surface such as a touch screen, he receives no feedback directly related to his action on the interface, other than the mere contact of his finger pressing on the touch surface.
  • a possible application of a haptic feedback is for example to vibrate the touch surface when crossing a target boundary.
  • by "target boundary" is meant a delineation, not haptically detectable, between two functional areas of the touch surface.
  • a first zone may be provided for controlling a first function, such as for example lowering the volume of an audio system, and a second adjacent zone for controlling a second function, for example increasing the volume.
  • the passage between the two areas is not detectable by the finger of a user.
  • the touch surface is generally associated with a screen disposed behind the touch surface and which displays pictograms associated with the control areas. It is therefore important that the haptic feedback is synchronized with the passage of the control finger between the two zones.
  • This shift can be disconcerting for the user because there is no longer any correspondence between the visual displayed on the screen and the haptic feedback.
  • the finger or the stylus is located elsewhere than where the haptic feedback signal is felt, for example elsewhere than on a boundary delimiting a virtual key or a virtual control button, when the haptic feedback signal is issued.
  • This time shift is all the greater as the speed of movement of the finger or the stylus is high.
  • one solution would be to use faster electronic processing means, to minimize the time difference between the position of the finger or the stylus at a target boundary of the tactile surface, for example a boundary crossing between two areas of the tactile surface, and the perception of the sensory feedback signal such as a haptic feedback.
  • An object of the present invention is therefore to provide a control method and an associated control interface, to at least partially overcome the aforementioned drawbacks by generating sensory feedback perceived at a convenient time.
  • the subject of the present invention is a sensory feedback control method for a control interface, in particular for a motor vehicle, the control interface comprising a tactile surface, characterized in that the control method comprises the following steps:
  • detecting the displacement of the pointing element towards a target boundary;
  • when the pointing element is detected at a threshold distance from the target boundary, generating at least one sensory feedback, so that the sensory feedback is perceived by a user of said interface substantially at the time of crossing the target boundary, avoiding a time lag between the perception of the sensory feedback and the crossing of the target boundary.
  • This solution can be independent of the speed of displacement of the pointing element, and makes it possible to reduce costs by notionally delimiting a sensory feedback activation threshold distance upstream of the target boundary that marks the actual change of zone.
  • We anticipate the crossing of the target boundary, such as a boundary between two zones or delimiting a given zone, which makes it possible to generate the sensory feedback at the right moment, that is to say when the pointing element, such as the user's finger, is actually on the target boundary.
  • the user experience, in other words the perception of a sensory feedback characteristic of a change of zones, is thereby improved.
  • the control method may further comprise one or more of the following features, taken alone or in combination:
  • the threshold distance is predetermined and fixed with respect to the target boundary
  • at least one displacement parameter of the pointing element on the tactile surface is evaluated over time, and, as a function of said displacement parameter, the threshold distance is adjusted with respect to the target boundary;
  • the displacement parameter of the pointing element on the touch-sensitive surface over time is at least one variable chosen from the speed of displacement of the pointing element and a function of the speed, such as the acceleration of the pointing element;
  • the control method provides for the generation of at least two sensory feedbacks of different natures, and the generation of each sensory feedback is controlled simultaneously or shifted in time.
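The speed-adaptive activation threshold described in the features above can be sketched as follows. This is a minimal illustration, not the patent's implementation: the latency constant, the function names, and all numeric values are our own assumptions.

```python
# Hypothetical sketch of the speed-adaptive activation threshold.
# T_LATENCY models the delay between commanding the actuators and the
# vibration actually being felt; all names and values are illustrative.

T_LATENCY = 0.02  # assumed seconds between feedback command and perception

def threshold_distance(speed_mm_s: float, d_min: float = 1.0) -> float:
    """Distance (mm) upstream of the target boundary at which feedback
    must be triggered so that it is felt at the moment of crossing."""
    # During the latency the pointing element travels speed * T_LATENCY;
    # keep at least d_min so very slow movements still trigger in time.
    return max(d_min, speed_mm_s * T_LATENCY)

def should_trigger(distance_to_boundary_mm: float, speed_mm_s: float) -> bool:
    """True once the pointing element is within the activation threshold."""
    return distance_to_boundary_mm <= threshold_distance(speed_mm_s)
```

With these example values, a finger moving at 100 mm/s would trigger feedback 2 mm before the boundary, while a slow approach would still trigger at the 1 mm floor.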
  • the invention also relates to a control interface for a motor vehicle configured to implement at least partially a control method according to any one of the preceding claims, the control interface comprising:
  • a touch surface configured to determine the position and movement of a pointing element on the touch surface
  • a sensory feedback module configured to generate at least one sensory feedback
  • Such an interface makes it possible to activate the sensory feedback when it is established that the pointing element will change zone and cross or pass on a target boundary to the extent that the pointing element has reached the threshold distance.
  • the interface allows this sensory feedback to be activated so as to be perceived by the user when the pointing element is actually on the target boundary.
  • such a control interface offers a less expensive solution compared to a solution comprising processing means with very fast response times.
  • control interface may further include one or more of the following features taken alone or in combination:
  • the processing unit is configured to evaluate at least one parameter of displacement of the pointing element on the touch surface over time and, according to the at least one evaluated displacement parameter, to adapt the threshold distance with respect to the target boundary;
  • the sensory feedback module is configured to vibrate the touch surface so as to generate a haptic feedback, and/or to generate a sound feedback, and/or to generate a visual feedback.
  • FIG. 1a schematically represents a control interface such as a human-machine interface for a motor vehicle
  • FIG. 1b schematically represents a side view of the interface of FIG. 1a;
  • FIGS. 2a to 4d show schematically different examples of display on a tactile surface of the interface of FIGS. 1a and 1b for anticipating sensory feedback generation;
  • FIG. 5 is a flowchart of the steps of a control method according to the invention for generating at least one anticipatory sensory feedback
  • Fig. 6 is a flowchart detailing the steps of the control method.
  • Control interface called human-machine interface
  • FIG. 1a represents a control interface 1 for a motor vehicle.
  • Such an interface 1 is commonly referred to as a human-machine interface. It is advantageously a reconfigurable human-machine interface.
  • the control interface 1 comprises: a tactile surface 2 configured to detect a touch of a pointing element 3 (for example a user's finger or a stylus) and its displacement on the tactile surface 2, and
  • a sensory feedback module 4 configured to generate at least one sensory feedback.
  • the control interface 1 may furthermore comprise a display screen 5.
  • the touch surface 2 can thus form an input device allowing the users of the control interface 1 to interact with it through touch.
  • the touch surface 2 is for example configured to determine the spatial coordinates of the point where the user presses with his finger or another pointing element 3 on the touch surface 2.
  • the touch surface 2 thus makes it possible to locate the position of the pointing element 3. When the pointing element 3 moves, the touch surface 2 is configured to determine successive spatial coordinates corresponding to at least two successive points on the touch surface 2.
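From two such successive samples, a speed and direction of displacement can be estimated, for instance as in the following sketch (an illustration of ours; the patent does not specify coordinate units or a sampling scheme):

```python
import math

def displacement_vector(p1, p2, dt):
    """Estimate (speed, unit direction) of the pointing element from two
    successive touch points p1, p2 (x, y tuples) sampled dt seconds apart."""
    dx, dy = p2[0] - p1[0], p2[1] - p1[1]
    dist = math.hypot(dx, dy)  # distance travelled between the two samples
    speed = dist / dt
    direction = (dx / dist, dy / dist) if dist else (0.0, 0.0)
    return speed, direction
```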
  • the tactile surface 2 comprises a capacitive touch screen.
  • the capacitive touch panel comprises at least one capacitive sensor 21 for detecting at least one variation of the capacitance at the surface of the capacitive touch-sensitive panel.
  • the capacitive sensor 21 comprises, for example, an array of electrodes, for example made of ITO (indium-tin oxide).
  • the capacitive touch screen may further comprise a front plate 22 (or contact plate), for example polycarbonate or glass.
  • the front plate 22 is arranged on the capacitive sensor 21 and is intended to face the user once mounted in the passenger compartment of the vehicle. This faceplate 22 may be rigid so as to provide the desired rigidity to the capacitive touchscreen.
  • the advantageously flat touch surface 2 of the capacitive touch-sensitive panel is thus formed by the surface of the front plate 22.
  • the touch surface 2 can use pressure sensitive resistors to detect a contact and a displacement of a pointing element such as a finger of the user on the touch surface 2.
  • the touch surface 2 then comprises a pressure sensor, such as one using FSR (Force Sensing Resistor) technology.
  • the display screen 5 can be offset from the touch surface 2.
  • the front plate 22 of the capacitive touch screen can be painted with an opaque color so as to hide the elements arranged behind.
  • the capacitive touch screen can then form what is called a touch pad ("touchpad") or a push button.
  • the display screen 5 is arranged facing and in contact with the touch surface 2 such as a capacitive touch screen, more precisely under this touch surface 2, so as to form a touchscreen.
  • the display screen 5 is for example fixed by gluing to the back of the support of the capacitive sensor 21 of the capacitive touch screen.
  • by "back" is meant the portion opposite to the portion carrying the capacitive sensor 21.
  • the touch surface 2 is then transparent to display the images of the display screen 5 through the tactile surface 2.
  • the sensory feedback module 4 can be connected to the touch surface 2 and / or the display screen 5 in the case of a touch screen.
  • the sensory feedback module 4 can in particular be configured to generate a haptic feedback by vibrating the touch surface 2.
  • the sensory feedback module 4 may comprise at least one actuator 41, 42 connected to the touch surface 2 and configured to drive the touch surface 2 in motion, so as to generate the haptic feedback as a function of a control signal.
  • Two actuators 41, 42 are shown schematically in FIG.
  • the haptic feedback is a vibratory signal, such as a vibration produced by a sinusoidal control signal or by a control signal comprising one pulse or a succession of pulses, sent to the actuators 41 and/or 42.
  • the vibration can be directed in the plane of the tactile surface 2 or orthogonally to the plane of the tactile surface 2 or in a combination of these two directions.
  • as for the actuators 41, 42, these can be arranged under the touch surface 2, in different positions (in the center or on one side) or in different orientations (in the direction of the pressing on the surface or along another axis).
  • Such actuators 41, 42 are known and will not be described in more detail in the present description.
  • a parameter of the haptic feedback can be chosen from among the intensity of the acceleration, the frequency, the amplitude, the duration, the duration between two identical signals, the phase. For example, it is possible to simulate different textures of the tactile surface 2, such as different surface roughnesses.
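As an illustration of how such parameters combine, a sinusoidal drive signal could be generated as below (a sketch under our own assumptions; the patent does not define any signal-generation API):

```python
import math

def haptic_waveform(frequency_hz, amplitude, duration_s, sample_rate=1000):
    """Return a sinusoidal actuator drive signal as a list of samples.
    Varying frequency/amplitude is one way to simulate different textures."""
    n = int(duration_s * sample_rate)  # number of samples for the pulse
    return [amplitude * math.sin(2 * math.pi * frequency_hz * i / sample_rate)
            for i in range(n)]
```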
  • the sensory feedback generation module 4 can be configured to generate a sound feedback to the user.
  • a sound feedback parameter can be chosen from among the volume, the phase, the frequency, the duration, and the duration between two identical signals.
  • an image displayed on the touch screen can represent marked patterns, such as micro-bumps, at the position of the pointing element 3. In this case, one speaks of visual feedback.
  • the visual feedback can also be generated by the sensory feedback module 4.
  • the sensory feedback module 4 can be configured to generate all the sensory feedbacks, that is to say haptic and/or sound and/or visual, at the same time.
  • the sensory feedback module 4 can be configured to generate one or other of the feedbacks shifted in time with respect to another feedback.
  • For example, when a haptic feedback is generated, a sound feedback can be generated before, at the same time as, or after the haptic feedback. Indeed, for the user to perceive a haptic feedback and a sound feedback at the same time, it may be better not to generate them simultaneously but in an advanced or delayed manner with respect to each other. The same applies to a visual feedback, which can be generated before, at the same time as, or after a haptic feedback and/or a sound feedback.
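Such relative advances or delays can be expressed as per-modality offsets, as in this sketch (the offset values are purely hypothetical examples, not taken from the patent):

```python
# Offsets of each feedback modality relative to the haptic event, in
# seconds; negative means the modality fires earlier. Values are
# hypothetical examples only.
FEEDBACK_OFFSETS_S = {"haptic": 0.0, "sound": -0.01, "visual": 0.005}

def feedback_schedule(t_haptic_s: float) -> list:
    """Return (firing_time, modality) pairs sorted by firing time."""
    return sorted((t_haptic_s + off, name)
                  for name, off in FEEDBACK_OFFSETS_S.items())
```

With these example offsets, the sound is commanded slightly before the haptic event and the visual slightly after, so the three returns are perceived together.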
  • the sensory feedback module 4 is configured to generate one or more sensory feedbacks, of the same nature or of a different nature, when the pointing element 3, here the finger 3, passes or crosses a target boundary 6. Since the touch surface is smooth, this target boundary 6 is not materialized haptically on the touch surface 2.
  • It may be a boundary 6 separating two zones Z1, Z2 of the display screen 5, for example of the touch screen, as illustrated in FIG. 2a. It may also be a boundary 6 indicating that one enters or leaves an area, as shown schematically in FIG. 2b, where the finger 3 is leaving the zone Z2.
  • with a haptic feedback, it is possible to simulate on the tactile surface 2 a texture at this target boundary 6, between two zones Z1, Z2 for example, or delimiting the entry to or exit from a zone.
  • the control interface 1 further comprises a processing unit 7 shown schematically in FIGS. 1a and 1b.
  • the processing unit 7 is configured to exchange information and / or data with the touch surface 2 and also with the sensory feedback module 4.
  • This processing unit 7 may comprise one or more processing means such as one or more microcontrollers or computers, having memories and programs particularly adapted to receive position and displacement or sliding information of the pointing element 3, this information having been detected in particular by the tactile surface 2.
  • the processing unit 7 is for example the on-board computer of the motor vehicle.
  • the touch-sensitive surface 2 can detect successive spatial coordinates corresponding to support points, and can transmit this information with the coordinates to the processing unit 7.
  • the processing unit 7 therefore comprises at least one means for receiving the information transmitted by the tactile surface 2.
  • the processing unit 7 further comprises one or more processing means, such as a control circuit, configured to drive the sensory feedback module 4 in order to generate at least one sensory feedback such as a haptic and/or sound and/or visual feedback.
  • the control circuit may include means for transmitting a sensory feedback generation command to the sensory feedback module 4.
  • the control interface 1 may further include a measuring device 9 shown very schematically in this FIG.
  • the measuring device 9 comprises for example one or more pressure sensors configured to measure the pressure exerted on the tactile surface 2.
  • This is for example one or more strain gauges.
  • the strain gauge or gauges are arranged in direct connection with the touch surface 2 and are cleverly distributed as required.
  • it is possible to provide a strain gauge substantially in the middle of the tactile surface 2, for example in the case of a tactile pad called "Touchpad”.
  • one or more strain gauges are arranged on one or more edges of the tactile surface 2.
  • four strain gauges may be provided, each placed at a corner of the touch surface 2.
  • the or each strain gauge can be arranged at the level of dampers provided under the touch surface 2, so as to measure the displacement of the touch surface 2 during contact and sliding of the pointing element 3 on the touch surface 2.
  • the processing unit 7 comprises at least one processing means for receiving measurement signals from the measuring device 9, in particular the or each pressure sensor.
  • the pressure measurement(s) are advantageously taken into account.
  • the pressure measurements make it possible to identify the way in which the user presses on the tactile surface 2 by means of the pointing element 3, so as to generate the signal appropriate for the texture simulation accordingly.
  • the processing unit 7 is advantageously configured to anticipate that the pointing element 3 will subsequently cross a target boundary 6 of the touch surface 2, and to control the sensory feedback module 4 before the actual crossing of the target boundary 6, in order to generate at least one sensory feedback perceived by the user when the pointing element 3 actually crosses the target boundary 6.
  • the processing unit 7 is therefore configured to anticipate the crossing of a target boundary 6 when a displacement is detected on the touch surface 2, and not when a single point of contact is detected.
  • the processing unit 7 anticipates the crossing of the target boundary 6 when the pointing element 3 is at a threshold distance d, for activation of at least one sensory feedback, from the target boundary 6.
  • processing unit 7 is configured to:
  • To determine the threshold distance d, it is possible to establish, for example by experiment, an average speed of displacement Vmoy of the pointing element on the touch surface 2, and to determine the delay Tu between the detection of the pointing element 3 at a given location and an effective vibration on the touch surface 2 due to activation of the actuators 41 and 42, for example.
  • In a first approach, it is possible to set an overall average speed common to all the functions to be controlled.
  • In a second approach, it is possible to set an average speed for each control function, in particular according to the morphology of the control areas, such as their dimensions, their number on the tactile surface, their proximity or their geometry. These average speeds can, for example, be recorded in a memory of the processing unit 7.
  • Thus, if the pointing element 3 is at a distance d from the target boundary 6 and moves at the average velocity Vmoy, the pointing element 3 will cross the target boundary at the moment when the haptic feedback can be felt on the tactile surface 2.
  • Said at least one sensory feedback is then perceived by a user of said interface 1 when the pointing element 3 crosses the target boundary 6.
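The relation between the average speed Vmoy and the delay Tu described above amounts to computing d = Vmoy × Tu. The following is only an illustrative sketch of that rule; the function name and the numeric values are assumptions, not taken from the patent:

```python
# Illustrative sketch: if actuating the feedback takes Tu seconds and the
# pointing element moves at an average speed Vmoy, triggering when it is at
# distance d = Vmoy * Tu from the boundary makes the vibration coincide with
# the actual crossing. Units and values below are assumed for illustration.

def threshold_distance(v_avg_mm_per_s: float, latency_s: float) -> float:
    """Distance (mm) before the target boundary at which to trigger feedback."""
    return v_avg_mm_per_s * latency_s

# Example: a finger sweeping at 150 mm/s with a ~20 ms actuation delay
d = threshold_distance(150.0, 0.020)
print(round(d, 1))  # threshold distance in mm
```

A faster sweep yields a larger threshold distance, which is why the patent later allows d to be adapted to the measured speed.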
  • One can provide a fictitious delimitation, such as a fictitious line 13, upstream of the target boundary 6 and arranged at the threshold distance d from the target boundary 6.
  • The term "upstream" is used here with reference to the direction of displacement of the pointing element 3 on the tactile surface 2.
  • the fictitious delimitation may extend substantially parallel to the boundary 6 with which it is associated.
  • the processing unit 7 is therefore configured to generate the sensory feedback substantially at the instant at which the pointing element 3 is detected at the threshold distance d, for example on the fictitious delimitation 13.
  • the threshold distance d, for example here the distance of the fictitious delimitation 13 with respect to the target boundary 6, can be predefined and fixed.
  • alternatively, the threshold distance d, for example here the distance of the fictitious delimitation 13 with respect to the target boundary 6, can be parameterized and can therefore be variable.
  • the processing unit 7 comprises one or more processing means for setting or defining the threshold distance d.
  • the processing unit 7 can be configured to analyze the displacement of the pointing element 3 so as to anticipate the crossing of the target boundary 6.
  • the processing unit 7 is therefore further configured to evaluate at least one parameter of displacement of the pointing element 3 on the touch surface 2 over time. This includes evaluating the speed of movement of the pointing element 3, or a function of the speed such as its derivative, i.e. the acceleration.
  • processing unit 7 may comprise one or more processing means configured for:
  • the processing unit 7 can analyze the location information of successive points transmitted by the tactile surface 2. Thus, with reference to FIGS. 2a and 2b, the processing unit 7 can determine that the pointing element 3 moves in the direction shown schematically by the arrow F1, which is here horizontal and to the right.
  • the processing unit 7 can detect rectilinear movements, whether horizontal (arrow F1 in FIG. 2b) or vertical (arrow F2 in FIG. 3b), but also circular (arrow F3 in FIG. 4a) or at an angle to the touch surface 2 (arrow F4 in FIG. 4c), in one direction or the other, for example to the right (arrow F1 in FIGS. 2b and 4b, and arrow F4 in FIG. 4c) or to the left (arrow F5 in FIG. 4d).
  • the vertical and horizontal terms, left and right, are here used with reference to the arrangement of the elements as shown in FIGS. 2a to 4d.
  • the processing unit 7 may comprise at least one calculation means making it possible to deduce the speed of the pointing element 3 from the location information of successive points transmitted by the tactile surface 2, as a function of an internal clock of the man-machine interface, for example.
  • the processing unit 7 may comprise at least one means for calculating the derivative of the speed, and therefore the acceleration, for example from location information of points further apart, transmitted by the tactile surface 2.
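The speed estimation described above, from successive timestamped contact points, can be sketched as follows; the sample coordinates, units and function names are illustrative assumptions, not part of the patent:

```python
# Sketch of deriving speed (and, by finite difference, acceleration) from
# successive touch samples reported by the surface together with an internal
# clock. Sample data and units (mm, seconds) are assumptions.

def speeds(samples):
    """samples: list of (t_seconds, x_mm, y_mm) -> list of speeds in mm/s."""
    out = []
    for (t0, x0, y0), (t1, x1, y1) in zip(samples, samples[1:]):
        dist = ((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5  # Euclidean step length
        out.append(dist / (t1 - t0))
    return out

# Three contact reports, 10 ms apart, along a horizontal sweep
pts = [(0.00, 0.0, 0.0), (0.01, 1.0, 0.0), (0.02, 2.5, 0.0)]
v = speeds(pts)                 # speeds over the two intervals, in mm/s
accel = (v[1] - v[0]) / 0.01    # finite-difference acceleration (mm/s^2)
print(v, accel)
```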
  • the processing unit 7 may comprise at least one processing means, for example a software part, making it possible, starting from at least one quantity characteristic of the evolution of the displacement of the pointing element 3 on the touch surface 2 over time, to adapt the threshold distance d with respect to the target boundary 6.
  • the processing unit 7 may be configured to take into account at least one parameter of displacement of the pointing element on the touch surface over time, evaluated for the determination of the threshold distance d.
  • the processing unit 7 can anticipate that the finger 3 will cross the target boundary 6 between the two zones Z1 and Z2.
  • the processing unit 7 can send in advance, as soon as the finger 3 is detected at the threshold distance d, a control signal to the sensory feedback module 4, which generates the sensory feedback(s). When the finger 3 actually arrives on the boundary 6, the user perceives the sensory feedback(s).
  • the two zones Z1, Z2 may be open zones, that is to say without any marked boundary between them, but in this case it is the crossing of one of the boundaries marked around one of the zones, here Z2, which is anticipated.
  • the processing unit 7 makes it possible to anticipate that a pointing element 3 will enter an area (FIG. 2a) but also that it will exit an area (FIG. 2b).
  • the tactile surface 2 may comprise a succession of distinct zones B1, B2, B3, B4 simulating control buttons, each zone B1, B2, B3, B4 being delimited by a closed contour forming a target boundary 6.
  • the processing unit 7 is then configured to anticipate whether the pointing element 3 will enter or leave one or other of these areas B1, B2, B3, B4.
  • the pointing element 3 is detected in the zone B2, and the user moves his finger 3 vertically downwards with reference to the arrangement of the elements in FIG.
  • the processing unit 7 is configured to determine that the pointing element 3 will leave the zone B2 when it is located at a threshold distance d from the boundary 6 at the bottom of the zone B2, with reference to the arrangement of the elements in FIG. 3b, and to activate in advance the generation of a sensory feedback signalling the exit from B2. If the pointing element 3 continues its downward movement illustrated by the arrow F2, the processing unit 7 can anticipate whether and when the pointing element 3 will enter the next zone B3, namely when the pointing element 3 is located at a threshold distance d from the boundary 6 at the top of the zone B3, with reference to the arrangement of the elements in FIG.
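The entry/exit anticipation for closed zones such as B2 and B3 can be sketched for a vertical downward movement as in FIG. 3b; the zone coordinates, the threshold value and all names below are illustrative assumptions:

```python
# Sketch of anticipating exit from / entry into simulated buttons along a
# downward (increasing-y) movement. Coordinates in mm; all values assumed.

ZONES = {"B2": (20.0, 40.0), "B3": (45.0, 65.0)}  # (y_top, y_bottom) per zone
D = 3.0  # threshold distance d, e.g. average speed times actuation delay

def anticipated_events(y: float, moving_down: bool):
    """Feedback to trigger now, so it is felt at the boundary crossing."""
    events = []
    for name, (top, bottom) in ZONES.items():
        if moving_down:
            if 0.0 <= bottom - y <= D and y >= top:
                events.append(("exit", name))   # about to leave via bottom edge
            if 0.0 <= top - y <= D:
                events.append(("enter", name))  # about to enter via top edge
    return events

print(anticipated_events(37.5, True))  # finger 2.5 mm above B2's bottom edge
```

With the assumed geometry, a contact at y = 37.5 mm triggers the "exit B2" feedback early, so the vibration is felt when the finger actually reaches y = 40 mm.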
  • a different sensory feedback may be provided depending on whether the pointing element 3 enters or leaves a closed zone.
  • With two distinct haptic feedbacks, one can simulate the feeling of a user pressing and releasing a key.
  • the simulated keys or control buttons may be arranged in a line or in a corridor.
  • the virtual keys may be arranged in a circular manner as shown diagrammatically in FIG. 4a.
  • the processing unit 7 can anticipate, from the displacement in a circular motion schematized by the arrow F3, that the pointing element 3 will cross a boundary 6 delimiting an area, here B2, when the pointing element 3 is for example located on the fictitious delimitation 13 along a radius of the circle defined by the virtual keys, so as to control the generation of the sensory feedback(s) so that it is (they are) presented to the user at the right moment.
  • in this case, the threshold distance d is an angular distance.
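For the circular key layout of FIG. 4a, the same anticipation test becomes angular. A minimal sketch, assuming an illustrative circle centre, boundary angle and angular threshold (none of these values come from the patent):

```python
# Sketch of the angular variant: the contact position is converted to an
# angle around the centre of the circular key layout, and feedback is
# triggered when the contact is within an angular threshold d of a key edge.
import math

def angle_deg(x: float, y: float, cx: float, cy: float) -> float:
    """Angular position (degrees, 0-360) of the contact around the centre."""
    return math.degrees(math.atan2(y - cy, x - cx)) % 360.0

def near_key_boundary(theta: float, boundary_deg: float, d_deg: float) -> bool:
    """True when the contact, moving counter-clockwise, is within the
    angular threshold of a key edge located at boundary_deg."""
    diff = (boundary_deg - theta) % 360.0
    return diff <= d_deg

theta = angle_deg(9.9, 1.4, 0.0, 0.0)   # contact a little before the 10-degree edge
print(near_key_boundary(theta, 10.0, 2.0))
```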
  • a list of items A, B, C can be displayed on the display screen 5, for example on the touch screen.
  • the user moves the pointing element 3 on the touch surface 2, and when a contact pointing at an item, here A, in the list is detected on the touch-sensitive surface 2, a sub-list of items A1, A2, A3 may be displayed.
  • similarly, when a contact pointing at an item of the sub-list, here A3, is detected, a further sub-list of items A31, A32, A33 can be displayed.
  • the processing unit 7 can anticipate that the finger 3 will leave the zone corresponding to the item A3 when the finger 3 is located at a threshold distance d from the right boundary 6 of the item A3, with reference to the arrangement of the elements in FIG. 4b.
  • the processing unit 7 can anticipate that the finger 3 will enter the zone corresponding to the sub-item A31 when the finger 3 is located at a threshold distance d from the left boundary 6 of the sub-item A31, with reference to the arrangement of the elements in FIG. 4b.
  • the processing unit 7 can thus trigger the appropriate sensory feedback at the right time.
  • It is also possible to provide, as illustrated in FIG. 4c, a cascading arrangement of windows A, B, C, D.
  • the processing unit 7 can anticipate each window crossing and activate in advance the generation of the appropriate sensory feedback at the right moment.
  • the processing unit 7 can likewise anticipate the crossing of successive lines 11, when the pointing element 3 is located at a threshold distance d from a given line 11, and activate in advance the generation of at least one appropriate sensory feedback associated with the crossing of each line 11.
  • each line 11 forms a target boundary.
  • a sensory feedback control method for a human-machine interface, in particular for a motor vehicle, is described, the human-machine interface comprising a touch-sensitive surface 2. It is advantageously a control interface 1 as described above with reference to FIGS. 1a to 4d.
  • the control method comprises a step E1 in which the displacement of a pointing element 3, such as a finger of a user of the control interface 1 or a stylus, on the tactile surface 2 in the direction of a target boundary 6 or 11 of the touch surface 2 is detected.
  • This detection step E1 is for example carried out by the touch surface 2.
  • the control method further comprises a step E2, for example called the anticipation step, in which, at the threshold distance d, at least one sensory feedback, in particular haptic, is generated.
  • At least one parameter of displacement of the pointing element 3 on the touch surface 2 over time, such as the acceleration of the pointing element 3 and/or the speed of displacement of the pointing element 3, can be used for the definition of the threshold distance d with respect to the target boundary 6 or 11 at this step E2.
  • step E2 can comprise substeps E200 to E220.
  • During a sub-step E200, the position of the pointing element 3 on the touch surface 2 is located, and the direction of movement of the pointing element 3 on the touch surface 2 is evaluated. The locating can be implemented by the touch surface 2, and the evaluation of the direction by the processing unit 7.
  • At least one variable characteristic of the evolution of the displacement of the pointing element 3 on the tactile surface 2 over time, such as speed and/or acceleration, can also be evaluated.
  • The evaluation of one or more quantities, such as speed and/or acceleration, can be implemented by the processing unit 7.
  • If the threshold distance d is not predetermined in a fixed manner, during a step E210 it is possible to define this threshold distance d, for example by defining the position of a fictitious delimitation 13 upstream of the target boundary 6, according to the direction of displacement of the pointing element 3.
  • the threshold distance d, for example here the location or position of the fictitious delimitation 13, always upstream of the target boundary 6 or 11, can be adapted according to the acceleration and/or the speed of displacement of the pointing element 3. For example, when the pointing element 3 moves very fast, the fictitious delimitation 13 can be moved further away from the target boundary 6 or 11; in other words, the threshold distance d can be increased with respect to the target boundary 6 or 11.
  • This step E210 can be implemented by the processing unit 7.
  • During step E220, when the pointing element 3 arrives at this threshold distance d, this is detected, for example here when the pointing element 3 passes the fictitious delimitation 13, and the haptic feedback is generated.
  • This step E220 can be implemented by the processing unit 7.
  • a control method as described according to one or the other embodiment makes it possible to present to a user at least one sensory feedback at the right moment, that is to say that the user perceives this sensory feedback when he actually changes zone or crosses a target boundary 6 or 11.
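The steps above (locating the contact, evaluating its speed and direction, adapting the threshold distance d, and firing the feedback once the fictitious delimitation is passed) can be sketched end to end. This is a simplifying illustration: the ~20 ms latency figure comes from the background discussion, but the one-dimensional geometry and all names are assumptions.

```python
# End-to-end sketch of the E200-E220 pipeline in one dimension: the contact
# moves along x towards a boundary at BOUNDARY_X. All values are assumed.

LATENCY_S = 0.020   # assumed actuation delay Tu (~20 ms per the background)
BOUNDARY_X = 50.0   # target boundary position (mm), 1-D for clarity

def process(samples):
    """samples: list of (t, x) contact reports; returns the x at which the
    feedback command fires, or None if it never fires."""
    for (t0, x0), (t1, x1) in zip(samples, samples[1:]):
        speed = (x1 - x0) / (t1 - t0)   # E200: direction and speed of movement
        if speed <= 0:
            continue                     # not moving towards the boundary
        d = speed * LATENCY_S            # E210: adapt threshold distance d
        if BOUNDARY_X - x1 <= d:         # E220: fictitious delimitation passed
            return x1                    # command the sensory feedback now
    return None

# A sweep at 200 mm/s sampled every 10 ms: the command fires 4 mm early,
# so the vibration is felt as the finger reaches the boundary at 50 mm.
samples = [(0.00, 40.0), (0.01, 42.0), (0.02, 44.0), (0.03, 46.0), (0.04, 48.0)]
print(process(samples))
```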


Abstract

The invention relates to a sensory feedback control method for a control interface, in particular for a motor vehicle, said control interface comprising a touch-sensitive surface (2), characterised in that the method comprises the following steps: detecting the movement of a pointing element (3) towards a target boundary (6); when the pointing element (3) is detected at a threshold distance (d) from the target boundary, generating at least one sensory feedback, such that the sensory feedback is perceived by a user of the interface (1) substantially at the moment at which the target boundary (6) is crossed, preventing a time lag between the perception of the sensory feedback and the crossing of the target boundary (6). The invention also relates to a control interface (1) for a motor vehicle, configured to carry out the aforementioned control method at least partially.

Description

Control method and control interface for a motor vehicle
The present invention relates to a control method, in particular for a motor vehicle, allowing the generation of at least one sensory feedback to a user via a control interface such as a human-machine interface. The invention also relates to an interface configured to implement at least some steps of such a control method.
In recent years, motor vehicles, in particular cars, have become easier to handle with the appearance of new emerging technologies (for example power steering, ABS, cruise control, reversing radar, etc.), resulting in an increase in the number of functions to be controlled while driving.
The car has become a real living space, perceived as a personal and interconnected communication centre: with, for example, the MP3 player, the GPS, the connection with mobile phones. The introduction of these new functions results in an increase in the number of buttons on the dashboard of a car cockpit. However, the number of buttons cannot be increased indefinitely, in particular because of the complexity generated, the limited space, accessibility or cognitive load. In addition, the interaction of the driver with the on-board systems of the car can produce a situation of attentional overload, in which the driver may not optimally process all the information of the driving task, resulting in errors and a longer detection time.
One possibility is to centralize the buttons by replacing them, at least in part, with a tactile surface such as a touch pad or a touch screen. This makes it possible to continue to increase the number of functions, these becoming programmable and reconfigurable, and displayed temporarily or permanently depending on the context or the activated function. The tactile surface thus offers multifunctionality, while dematerializing the buttons. Touch screens are moreover customizable.
In addition, touch screens have three other major advantages: they allow direct interaction (the co-location of display and input), they are flexible (the display can easily be configured for a number of functions), and they are intuitive (a familiar interaction method, such as "pointing"). However, unlike the case of a push button, when the driver interacts with a tactile surface such as a touch screen, he receives no feedback directly related to his action on the interface, other than the mere contact of his finger pressing on the tactile surface.
In order to compensate for the loss of information caused by the replacement of conventional mechanical control members with tactile surfaces such as touch screens, it is known to add one or more sensory feedbacks, such as a haptic feedback, to provide feedback from the system to the user. This feedback makes it possible to avoid any ambiguity as to whether the user's action has been taken into account by the system, an ambiguity liable to favour the appearance of dangerous situations.
Thus, it is known to vibrate a tactile surface to provide a haptic feedback informing the user, for example, that a command has been taken into account.
One possible application of a haptic feedback consists, for example, in vibrating the tactile surface when a target boundary is crossed. "Target boundary" denotes a delimitation of the tactile surface, not haptically detectable, between two functional zones.
As an example, a first zone may be provided for controlling a first function, such as lowering the volume of an audio system, and a second adjacent zone for controlling a second function, for example increasing the volume.
As the tactile surface is smooth, the transition between the two zones cannot be detected by a user's finger.
To overcome this drawback, a haptic feedback can be generated on the tactile surface to materialize the target boundary between the two zones, so that the user can orient himself on the tactile surface even without looking at it.
To visually materialize the boundary between two zones, the tactile surface is generally associated with a screen arranged behind the tactile surface, which displays pictograms associated with the control zones. It is therefore important that the haptic feedback is synchronized with the passage of the control finger between the two zones.
However, between the detection of the user's finger and the effective generation of a haptic feedback signal, there is a delay, difficult to compress, of about 20 ms. As a result, when the finger or stylus in contact with the tactile surface moves, this delay can induce a difference between the crossing of the target boundary at an instant t1 and the place where the haptic feedback is then felt, once the user has continued his movement.
This offset can be disconcerting for the user, because there is no longer any correspondence between the visual displayed on the screen and the haptic feedback.
In other words, the finger or stylus is located somewhere other than where the haptic feedback signal is felt, for example somewhere other than on a boundary delimiting a virtual key or a virtual control button, when the haptic feedback signal is emitted. This time offset can be all the greater as the speed of movement of the finger or stylus is high.
To improve the situation, one solution would be to use faster electronic processing means, making it possible to minimize this time offset between the position of the finger or stylus on a target boundary of the tactile surface, for example a boundary crossing between two zones of the tactile surface, and the perception of the sensory feedback signal such as a haptic feedback. However, such a solution would be too expensive for the automotive industry.
An object of the present invention is therefore to provide a control method and an associated control interface making it possible to at least partially overcome the aforementioned drawbacks, by generating a sensory feedback perceived at the appropriate moment.
To this end, the present invention relates to a sensory feedback control method for a control interface, in particular for a motor vehicle, the control interface comprising a tactile surface, characterized in that the control method comprises the following steps:
the displacement of the pointing element towards a target boundary is detected; when the pointing element is detected at a threshold distance from the target boundary, at least one sensory feedback is generated, so that the sensory feedback is perceived by a user of said interface substantially at the moment of crossing the target boundary, avoiding a time offset between the perception of the sensory feedback and the crossing of the target boundary.
This solution can be independent of the speed of movement of the pointing element, and makes it possible to reduce costs by fictitiously delimiting, upstream of the target boundary characteristic of the actual change of zone, a threshold distance for activating the sensory feedback. The crossing of the target boundary, such as a boundary between two zones or delimiting a given zone, is anticipated, which makes it possible to generate the sensory feedback at the right moment, that is to say when the pointing element, such as the user's finger, is actually on the target boundary. One is thus as close as possible to what the user expects, and there is a match between the visual and the sensory for the user. The user experience, in other words the perception of a sensory feedback characteristic of a change of zone, is thereby improved.
Le procédé de commande peut en outre comporter une ou plusieurs des caractéristiques suivantes prises seules ou en combinaison :  The control method may further comprise one or more of the following features taken alone or in combination:
la distance seuil est prédéterminée et fixe par rapport à la frontière cible ;  the threshold distance is predetermined and fixed with respect to the target boundary;
- on évalue au moins un paramètre de déplacement de l'élément de pointage sur la surface tactile dans le temps, et en fonction dudit paramètre de déplacement, on adapte la distance seuil par rapport à la frontière cible ; at least one displacement parameter of the pointing element is evaluated on the tactile surface in time, and as a function of said displacement parameter, the threshold distance is adjusted with respect to the target boundary;
- the displacement parameter of the pointing element on the touch surface over time is at least one quantity chosen from the speed of displacement of the pointing element and a function of that speed, such as the acceleration of the pointing element;
- the control method provides for the generation of at least two sensory feedbacks of different natures, and the generation of each sensory feedback is controlled either simultaneously or shifted in time.
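The adaptation of the threshold distance to the displacement parameter can be sketched in a few lines. This is an illustrative model, not part of the patent text: the function name, the assumed feedback-chain latency and the fixed floor are all hypothetical; the idea is simply that the faster the pointing element moves, the further upstream of the target boundary the feedback must be triggered.

```python
def threshold_distance(speed_mm_s: float,
                       latency_s: float = 0.030,
                       minimum_mm: float = 1.0) -> float:
    """Distance upstream of the target boundary at which the feedback
    should be triggered so that it is perceived at the crossing.

    speed_mm_s: speed of the pointing element on the touch surface.
    latency_s:  assumed end-to-end latency of the feedback chain
                (sensing + processing + actuator response).
    minimum_mm: fixed floor used when the finger moves very slowly.
    """
    # During the latency, the finger travels speed * latency;
    # trigger that far ahead of the boundary, never less than the floor.
    return max(speed_mm_s * latency_s, minimum_mm)

# A finger sweeping at 200 mm/s with 30 ms of latency needs the
# feedback to start about 6 mm before the boundary.
print(threshold_distance(200.0))
```

A fixed threshold distance corresponds to calling this with a constant speed estimate; the adaptive variant re-evaluates it at every sample.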
The invention also relates to a control interface for a motor vehicle configured to at least partially implement a control method according to any one of the preceding claims, the control interface comprising:
- a touch surface configured to determine the position and the displacement of a pointing element on the touch surface, and
- a sensory feedback module configured to generate at least one sensory feedback,
characterized in that the control interface further comprises a processing unit configured to:
- detect the displacement of the pointing element towards a target boundary,
- generate, at a threshold distance between the pointing element and the target boundary, at least one sensory feedback, so that the sensory feedback is perceived by a user of said interface substantially at the moment the target boundary is crossed, avoiding a temporal offset between the perception of the sensory feedback and the crossing of the target boundary.
Such an interface makes it possible to activate the sensory feedback once it is established that the pointing element is going to change zone and cross, or pass over, a target boundary, insofar as the pointing element has reached the threshold distance. The interface allows this sensory feedback to be activated so as to be perceived by the user when the pointing element is actually on the target boundary. In addition, such a control interface offers a less expensive solution than one comprising processing means with very fast response times.
The control interface may further comprise one or more of the following features, taken alone or in combination:
- the processing unit is configured to evaluate at least one displacement parameter of the pointing element on the touch surface over time and, as a function of the at least one evaluated displacement parameter, to adapt the threshold distance with respect to the target boundary;
- the sensory feedback module is configured to vibrate the touch surface so as to generate a haptic feedback, and/or to generate an audible feedback, and/or to generate a visual feedback.
Other advantages and features will become apparent on reading the description of the invention and from the appended figures, which show a non-limiting exemplary embodiment of the invention and in which:
- figure 1a schematically shows a control interface, such as a human-machine interface, for a motor vehicle,
- figure 1b schematically shows a side view of the interface of figure 1a,
- figures 2a to 4d schematically show various examples of display on a touch surface of the interface of figures 1a and 1b, for anticipating the generation of a sensory feedback,
- figure 5 is a flowchart of the steps of a control method according to the invention for generating at least one sensory feedback by anticipation,
- figure 6 is a flowchart detailing the steps of the control method.
In these figures, identical elements bear the same reference numbers.
The following embodiments are examples. Although the description refers to one or more embodiments, this does not necessarily mean that each reference relates to the same embodiment, or that the features apply only to a single embodiment. Individual features of different embodiments may also be combined or interchanged to provide other embodiments.
In the description, certain elements may be indexed, for example as first element or second element. This is simple indexing, used to differentiate and name elements that are similar but not identical. It does not imply a priority of one element over another, and such denominations can easily be interchanged without departing from the scope of the present description. Nor does this indexing imply an order in time.
Control interface, known as a human-machine interface
Figure 1a shows a control interface 1 for a motor vehicle. Such an interface 1 is commonly referred to as a human-machine interface. It is advantageously a reconfigurable human-machine interface. The control interface 1 comprises:
- a touch surface 2 configured to detect contact by a user's finger or any other pointing element 3 (for example a stylus or a finger) and its displacement on the touch surface 2, and
- a sensory feedback module 4 configured to generate at least one sensory feedback.
The control interface 1 may furthermore comprise a display screen 5.
Touch surface
Once mounted in the passenger compartment of the vehicle, the touch surface 2 can thus form an input device allowing users of the control interface 1 to interact with it through touch.
The touch surface 2 is, for example, configured to determine the spatial coordinates of the point where the user presses with a finger or another pointing element 3 on the touch surface 2. The touch surface 2 thus makes it possible to locate the position of the pointing element 3 on the touch surface 2. When the pointing element 3 moves, the touch surface 2 is configured to determine successive spatial coordinates corresponding to at least two successive points on the touch surface 2.
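The successive spatial coordinates reported by the touch surface are sufficient to estimate the displacement parameters mentioned above. The following is a minimal sketch under assumed conventions (coordinates in millimetres, a known sampling interval, illustrative function names), not an excerpt from the patent:

```python
import math

def displacement_speed(p0, p1, dt_s: float) -> float:
    """Speed of the pointing element between two successive contact
    points p0 and p1 (x, y coordinates in mm), sampled dt_s seconds
    apart."""
    dx = p1[0] - p0[0]
    dy = p1[1] - p0[1]
    # Euclidean distance travelled divided by the sampling interval.
    return math.hypot(dx, dy) / dt_s

# Two samples 10 ms apart, 3 mm along x and 4 mm along y:
# about 500 mm/s.
print(displacement_speed((0.0, 0.0), (3.0, 4.0), 0.010))
```

An acceleration estimate, the other displacement parameter cited, can be obtained the same way by differencing two successive speed values.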
According to one embodiment, the touch surface 2 comprises a capacitive touch panel. According to the example illustrated in figure 1b, the capacitive touch panel comprises at least one capacitive sensor 21 for detecting at least one variation of the capacitance at the surface of the capacitive touch panel. To this end, the capacitive sensor 21 comprises, for example, an array of electrodes, for example made of ITO (indium tin oxide). The capacitive touch panel may further comprise a front plate 22 (or contact plate), for example made of polycarbonate or glass. The front plate 22 is arranged on the capacitive sensor 21 and is intended to face the user once mounted in the passenger compartment of the vehicle. This front plate 22 may be rigid so as to give the capacitive touch panel the desired rigidity. The advantageously flat touch surface 2 of the capacitive touch panel is thus formed by the surface of the front plate 22.
According to another example, not shown, the touch surface 2 can use pressure-sensitive resistors to detect contact and displacement of a pointing element, such as a user's finger, on the touch surface 2. The touch surface 2 then comprises a pressure sensor, for example one using FSR ("Force Sensing Resistor") technology.
The display screen 5 can be remote from the touch surface 2. According to a variant embodiment, the front plate 22 of the capacitive touch panel can be painted an opaque color so as to hide the elements arranged behind it. The capacitive touch panel can then form what is called a touchpad or a push button.
According to another variant, illustrated in figure 1b, the display screen 5 is arranged facing and in contact with the touch surface 2, such as a capacitive touch panel, more precisely under this touch surface 2, so as to form a touchscreen. In this case, the display screen 5 is, for example, bonded to the back of a support of the capacitive sensor 21 of the capacitive touch panel. "Back" means the side opposite the side carrying the capacitive sensor 21. The touch surface 2 is then transparent, so as to display the images of the display screen 5 through the touch surface 2. In other words, in the example of a capacitive touch panel comprising the capacitive sensor 21 and the front plate 22, both are transparent.
Sensory feedback module
According to an exemplary embodiment, the sensory feedback module 4 can be connected to the touch surface 2 and/or, in the case of a touchscreen, to the display screen 5.
The sensory feedback module 4 can in particular be configured to generate a haptic feedback by vibrating the touch surface 2. "Haptic" denotes feedback through touch. In this case, the sensory feedback module 4 can comprise at least one actuator 41, 42 connected to the touch surface 2 and configured to set the touch surface in motion, so as to generate the haptic feedback as a function of a control signal.
Two actuators 41, 42 are shown schematically in figure 1b. The haptic feedback is a vibratory signal, such as a vibration produced by a sinusoidal control signal or by a control signal comprising one pulse or a succession of pulses, sent to the actuator 41 and/or 42. The vibration can be directed in the plane of the touch surface 2, orthogonally to that plane, or in a combination of these two directions. In the case of several actuators 41, 42, they can be arranged under the touch surface 2 in different positions (in the center or on one side) or in different orientations (in the direction of the press on the surface, or along another axis). Such actuators 41, 42 are known and will not be described in more detail in the present description. A parameter of the haptic feedback can be chosen from the intensity of the acceleration, the frequency, the amplitude, the duration, the interval between two identical signals, and the phase. It is thus possible, for example, to simulate different textures of the touch surface 2, such as different surface roughnesses.
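The two control-signal shapes mentioned, a sinusoid or a succession of pulses, can be sketched as a small waveform generator. This is an illustrative assumption, not the patent's implementation: the sample rate, amplitude convention and function name are invented for the example.

```python
import math

def haptic_waveform(kind: str, freq_hz: float, duration_s: float,
                    amplitude: float = 1.0, rate_hz: int = 8000):
    """Return control-signal samples for an actuator: either a
    sinusoid or a square pulse train at the requested frequency."""
    n = int(duration_s * rate_hz)
    samples = []
    for i in range(n):
        t = i / rate_hz
        phase = 2.0 * math.pi * freq_hz * t
        if kind == "sine":
            samples.append(amplitude * math.sin(phase))
        elif kind == "pulses":
            # Square pulse train: on for half a period, off for half.
            samples.append(amplitude if math.sin(phase) >= 0.0 else 0.0)
        else:
            raise ValueError("kind must be 'sine' or 'pulses'")
    return samples

# A 20 ms burst at 200 Hz, sampled at 8 kHz, gives 160 samples.
burst = haptic_waveform("sine", freq_hz=200.0, duration_s=0.02)
print(len(burst))  # 160
```

Varying the frequency, amplitude, duration or inter-burst interval of such a waveform corresponds to the haptic-feedback parameters listed above.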
As a variant of, or in addition to, the haptic feedback, the sensory feedback module 4 can be configured to generate an audible feedback for the user. A parameter of the audible feedback can be chosen from the volume, the phase, the frequency, the duration, and the interval between two identical signals.
Provision can also be made to display on the display screen 5 one or more patterns corresponding to the haptic and/or audible feedback. Thus, advantageously in the case of a touchscreen, an image displayed on the touchscreen can show marked patterns, such as micro-bumps, at the position of the pointing element 3. This is referred to as visual feedback. The visual feedback can also be generated by the sensory feedback module 4.
The sensory feedback module 4 can be configured to generate all the sensory feedbacks, that is to say haptic and/or audible and/or visual, at the same time. Conversely, the sensory feedback module 4 can be configured to generate one or other of the feedbacks shifted in time relative to another feedback. In other words, by way of non-limiting illustrative example, when a haptic feedback is generated, an audible feedback can be generated before, at the same time as, or after the generation of the haptic feedback. Indeed, for the user to perceive a haptic feedback and an audible feedback at the same time, it is preferable not to generate them at the same time, but advanced or delayed relative to one another. The same applies to the generation of a visual feedback, which can be generated before, at the same time as, or after the generation of a haptic feedback and/or an audible feedback.
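The advance or delay between feedback channels can be expressed as per-channel start offsets derived from each channel's own command-to-perception latency, so that all channels are perceived at the same instant. The following sketch, with hypothetical latency figures, illustrates this scheduling idea; it is not taken from the patent:

```python
def start_offsets(latencies):
    """Given each feedback channel's latency (command to perception,
    any consistent time unit), return the delay to apply to each
    command so that all channels are perceived simultaneously:
    the slowest channel is commanded first."""
    slowest = max(latencies.values())
    return {name: slowest - lat for name, lat in latencies.items()}

# Assumed latencies in ms: the haptic actuator responds slower than
# the speaker, so the haptic command is issued first and the sound
# is delayed by the difference.
offsets = start_offsets({"haptic": 25, "audio": 8})
print(offsets)  # {'haptic': 0, 'audio': 17}
```

The same table extends naturally to a third "visual" entry for the display-based feedback.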
In particular, with reference to figures 2a and 2b, the sensory feedback module 4 is configured to generate one or more sensory feedbacks, of the same nature or of different natures, when the pointing element 3, here the finger 3, passes over or crosses a target boundary 6. Since the touch surface is smooth, this target boundary 6 is not haptically materialized on the touch surface 2.
It can be a boundary 6 separating two zones Z1, Z2 of the display screen 5, for example of the touchscreen, as illustrated in figure 2a. It can also be a boundary 6 indicating that a zone is being entered or exited, as shown schematically in figure 2b, in which the finger 3 is leaving the zone Z2. In particular, thanks to a haptic feedback, a texture can be simulated on the touch surface 2 at this target boundary 6, for example between two zones Z1, Z2, or delimiting a zone entry or exit.
Processing unit
The control interface 1 further comprises a processing unit 7, shown schematically in figures 1a and 1b. The processing unit 7 is configured to exchange information and/or data with the touch surface 2 and also with the sensory feedback module 4.
This processing unit 7 can comprise one or more processing means, such as one or more microcontrollers or computers, having memories and programs adapted in particular to receive information on the position and the displacement or sliding of the pointing element 3, this information having been detected in particular by the touch surface 2. The processing unit 7 is, for example, the on-board computer of the motor vehicle. As stated above, the touch surface 2 can record successive spatial coordinates corresponding to contact points, and can transmit this information, with the coordinates, to the processing unit 7. The processing unit 7 therefore comprises at least one means for receiving the information transmitted by the touch surface 2.
The processing unit 7 further comprises one or more processing means, such as a driving circuit configured to drive the sensory feedback module 4 so as to generate at least one sensory feedback, such as a haptic and/or audible and/or visual feedback. The driving circuit can comprise means for sending a sensory feedback generation command to the sensory feedback module 4.
With reference to figure 1b, the control interface 1 can further comprise a measuring device 9, shown very schematically in that figure. The measuring device 9 comprises, for example, one or more pressure sensors configured to measure the pressure exerted on the touch surface 2, for example one or more strain gauges. The strain gauge or gauges are arranged in direct connection with the touch surface 2 and are judiciously distributed as required. By way of non-limiting example, a strain gauge can be provided substantially in the middle of the touch surface 2, for example in the case of a touchpad. Advantageously, one or more strain gauges are arranged on one or more edges of the touch surface 2. For example, four strain gauges can be provided, each placed at a corner of the touch surface 2. The or each strain gauge can be arranged at dampers provided under the touch surface 2, so as to measure the displacement of the touch surface 2 during contact and sliding of the pointing element 3 on the touch surface 2.
In the case of such a measuring device 9, the processing unit 7 comprises at least one processing means for receiving measurement signals from the measuring device 9, in particular from the or each pressure sensor. The pressure measurement(s) are advantageously taken into account. By way of non-limiting example, the pressure measurements make it possible to identify the way in which the user presses on the touch surface 2 by means of the pointing element 3, so as to generate the appropriate signal for the texture simulation accordingly.
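One simple way the pressure measurement could shape the texture-simulation signal is by scaling the drive amplitude with the measured press force. The mapping below, its constants and its name are purely illustrative assumptions, not the patent's method:

```python
def texture_amplitude(pressure_n: float,
                      base: float = 0.2,
                      gain: float = 0.4,
                      max_amp: float = 1.0) -> float:
    """Scale the haptic drive amplitude with the measured press
    force (in newtons): a firmer press yields a stronger simulated
    texture, clamped to the actuator's maximum amplitude."""
    return min(base + gain * pressure_n, max_amp)

# A firm 3 N press saturates at the actuator maximum.
print(texture_amplitude(3.0))  # 1.0
```

Any monotonic mapping would serve the same purpose; the clamp simply protects the actuator.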
In addition, the processing unit 7 is advantageously configured to anticipate that the pointing element 3 will subsequently cross a target boundary 6 of the touch surface 2, and to drive the sensory feedback module 4 before the target boundary 6 is actually crossed, so as to generate at least one sensory feedback that is perceived by the user when the pointing element 3 actually crosses the target boundary 6.
This avoids a temporal offset between the perception of the sensory feedback and the passage of the pointing element 3 over the target boundary 6.
The processing unit 7 is therefore configured to anticipate the crossing of a target boundary 6 when a displacement is detected on the touch surface 2, and not when a mere point of contact is detected. A particular embodiment is now described with reference to FIG. 2a. According to this embodiment, the processing unit 7 anticipates the crossing of the target boundary 6 when the pointing element 3 is at a threshold distance d, for activating at least one sensory feedback, from the target boundary 6.
More specifically, the processing unit 7 is configured to:
- detect the displacement of the pointing element 3 in the direction of a target boundary 6 of the touch surface 2,
- when the pointing element 3 is located at the threshold distance d, generate at least one sensory feedback via the sensory feedback module 4.
To set the threshold distance d, an average displacement speed Vmoy of the pointing element on the touch surface 2 can be determined, for example experimentally, along with the delay Tu between the localization of the pointing element 3 at a given location and an effective vibration on the touch surface 2 resulting from the activation of the actuators 41 and 42, for example. According to a first approach, an overall average speed common to all the functions to be controlled can then be set. According to a second approach, an average speed can be set for each control function, in particular according to the morphology of the control areas, such as their dimensions, their number on the touch surface, their proximity, or their geometry. These average speeds can, for example, be stored in a memory of the processing unit 7.
Thus, if the pointing element 3 is at a distance d from the target boundary 6 and moves at the average speed Vmoy, the pointing element 3 will cross the target boundary at the moment when the haptic feedback can be felt on the touch surface 2.
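Under this reasoning, the threshold distance follows directly from the average speed Vmoy and the actuation delay Tu. A minimal sketch of this relation; the variable names come from the description, while the numeric values and units are hypothetical:

```python
def threshold_distance(v_moy_mm_s: float, tu_s: float) -> float:
    """Distance (mm) covered at the average speed Vmoy during the actuation
    delay Tu: triggering the feedback when the pointing element is at this
    distance makes the vibration coincide with the actual boundary crossing."""
    return v_moy_mm_s * tu_s

# Hypothetical values: 80 mm/s average sliding speed, 30 ms actuator latency.
d = threshold_distance(80.0, 0.030)  # 2.4 mm upstream of the target boundary
```

With these example values, the fictitious line would be placed 2.4 mm upstream of the target boundary.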
Said at least one sensory feedback is then perceived by a user of said interface 1 when the pointing element 3 crosses the target boundary 6.
Admittedly, there may be a difference between the actual displacement speed of the pointing element 3 and the average speed, and therefore a slight offset between the actual crossing of the target boundary 6 and the feeling of the haptic feedback, but triggering the generation of the haptic feedback in anticipation makes it possible to reduce, or even cancel, the offset between the visual and haptic perceptions.
For example, a fictitious delimitation, such as a fictitious line 13, can be defined upstream of the target boundary 6 and arranged at the threshold distance d from the target boundary 6. The term "upstream" is used here with reference to the direction of displacement of the pointing element 3 on the touch surface 2. The fictitious delimitation may extend substantially parallel to the boundary 6 with which it is associated.
According to a first variant, the processing unit 7 is therefore configured to generate the sensory feedback substantially at the instant at which the pointing element 3 is detected at the threshold distance d, for example on the fictitious delimitation 13.
The threshold distance d, for example here the distance of the fictitious delimitation 13 from the target boundary 6, can be predefined and fixed.
According to a second variant, the threshold distance d, for example here the distance of the fictitious delimitation 13 from the target boundary 6, can be parameterized and can therefore be variable. In other words, the threshold distance d with respect to the target boundary 6 can be adapted as a function of the speed and/or the acceleration of the pointing element 3 on the touch surface 2. To do this, the processing unit 7 comprises one or more processing means for setting or defining the threshold distance d.
For example, the greater the displacement speed, the farther the fictitious delimitation 13 is moved from the target boundary 6, 11, and vice versa.
For this purpose, the processing unit 7 can be configured to analyze the displacement of the pointing element 3 so as to anticipate the crossing of the target boundary 6. The processing unit 7 is therefore further configured to evaluate at least one displacement parameter of the pointing element 3 on the touch surface 2 over time. This includes, in particular, the evaluation of the displacement speed of the pointing element 3, or of a function of the speed such as its derivative, the acceleration.
More specifically, the processing unit 7 may comprise one or more processing means configured to:
- evaluate the direction of displacement of the pointing element 3 on the touch surface 2, and
- evaluate at least one displacement parameter of the pointing element 3 on the touch surface 2 over time.
For the evaluation of the direction, the processing unit 7 can analyze the position location information of successive points transmitted by the touch surface 2. Thus, with reference to FIGS. 2a and 2b, the processing unit 7 can determine that the pointing element 3 moves in the direction shown schematically by the arrow F1, which is here horizontal and to the right. Of course, the processing unit 7 can detect rectilinear movements, whether horizontal or vertical (arrow F2 in FIG. 3b), but also circular movements (arrow F3 in FIG. 4a), or movements at an angle across the touch surface 2 (arrow F4 in FIG. 4c), in one direction or the other, for example to the right (arrow F1 in FIGS. 2b and 4b, and arrow F4 in FIG. 4c) or to the left (arrow F5 in FIG. 4d). The terms vertical and horizontal, left and right, are used here with reference to the arrangement of the elements as shown in FIGS. 2a to 4d.
In addition, the processing unit 7 may comprise at least one calculation means making it possible to deduce the speed of the pointing element 3 from the position location information of successive points transmitted by the touch surface 2, and as a function of an internal clock of the human-machine interface, for example. In order to improve performance, the processing unit 7 may comprise at least one means for calculating the derivative of the speed, and therefore the acceleration, for example from position location information of points spaced farther apart, transmitted by the touch surface 2.
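Such a calculation means could operate on successive timestamped touch points; a minimal sketch, assuming the touch surface reports (time, x, y) samples in seconds and millimetres:

```python
import math

def speed_and_acceleration(samples):
    """Estimate the current speed and acceleration of the pointing element
    from the last three (t, x, y) samples reported by the touch surface,
    using finite differences against the internal clock timestamps."""
    (t0, x0, y0), (t1, x1, y1), (t2, x2, y2) = samples[-3:]
    v1 = math.hypot(x1 - x0, y1 - y0) / (t1 - t0)  # speed over first interval
    v2 = math.hypot(x2 - x1, y2 - y1) / (t2 - t1)  # speed over second interval
    accel = (v2 - v1) / (t2 - t1)                  # derivative of the speed
    return v2, accel

# A finger sliding 1 mm every 10 ms: constant 100 mm/s, zero acceleration.
v, a = speed_and_acceleration([(0.0, 0.0, 0.0), (0.01, 1.0, 0.0), (0.02, 2.0, 0.0)])
```

Spacing the samples farther apart, as the description suggests, smooths the finite-difference estimate at the cost of responsiveness.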
Finally, the processing unit 7 may comprise at least one processing means, for example a software part, making it possible, from at least one quantity characteristic of the evolution of the displacement of the pointing element 3 on the touch surface 2 over time, to adapt the threshold distance d with respect to the target boundary 6.
As a variant or in addition, the processing unit 7 may be configured to take into account at least one evaluated displacement parameter of the pointing element on the touch surface over time for the determination of the threshold distance d.
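The adaptation rule described above (faster movement, farther fictitious line) could be sketched as follows; the latency value and the clamping bounds d_min/d_max are hypothetical, not taken from the description:

```python
def adaptive_threshold(v_mm_s: float, tu_s: float = 0.030,
                       d_min: float = 0.5, d_max: float = 10.0) -> float:
    """Place the fictitious delimitation 13 farther from the target boundary
    for fast movements and closer for slow ones, clamped to hypothetical
    bounds (mm) so the line never degenerates or leaves the control zone."""
    return min(d_max, max(d_min, v_mm_s * tu_s))

# Fast swipe: line moved well upstream; slow drag: line kept near the boundary.
d_fast = adaptive_threshold(200.0)  # 6.0 mm
d_slow = adaptive_threshold(5.0)    # clamped to 0.5 mm
```

The ranges considered high or small remain an application-specific choice, as the description notes.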
Non-limiting application examples:
Thus, in the example of FIG. 2a, at least two zones Z1 and Z2 are displayed on the screen, here the touch screen. When the pointing element 3, here the user's finger 3, is detected in zone Z1 at the threshold distance d from the target boundary 6, for example here on the fictitious delimitation 13, while moving horizontally to the right along the arrow F1, the processing unit 7 can anticipate that the finger 3 will cross the target boundary 6 between the two zones Z1 and Z2.
The processing unit 7 can generate in anticipation, as soon as the finger 3 is detected at the threshold distance d, a control signal intended for the sensory feedback module 4, which generates the sensory feedback(s). When the finger 3 actually arrives on the boundary 6, the user perceives the sensory feedback(s).
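The anticipation test for this Z1/Z2 example could be sketched as a simple predicate; the 1-D geometry (a vertical boundary at a known x-coordinate) and the numeric values are simplifying assumptions:

```python
def should_fire(x: float, vx: float, boundary_x: float, d: float) -> bool:
    """Return True when the feedback command should be sent in anticipation:
    the pointing element is moving toward the vertical boundary at boundary_x
    and has reached the fictitious line 13, i.e. is within the threshold
    distance d of the boundary."""
    approaching = (vx > 0 and x < boundary_x) or (vx < 0 and x > boundary_x)
    return approaching and abs(boundary_x - x) <= d

# Finger in zone Z1 sliding right toward the Z1/Z2 boundary at x = 50 mm:
assert should_fire(x=48.0, vx=80.0, boundary_x=50.0, d=2.4)      # on line 13
assert not should_fire(x=40.0, vx=80.0, boundary_x=50.0, d=2.4)  # still too far
assert not should_fire(x=48.0, vx=-80.0, boundary_x=50.0, d=2.4) # moving away
```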
As illustrated in FIG. 2b, the two zones Z1 and Z2 may be open zones, that is to say without a boundary between the two zones Z1, Z2; in this case, it is the crossing of one of the boundaries marked around one of the zones, here Z2, that is anticipated.
Of course, the processing unit 7 makes it possible to anticipate that a pointing element 3 will enter a zone (FIG. 2a), but also that it will exit a zone (FIG. 2b).
Of course, the invention is not limited to the examples of FIGS. 2a and 2b. Other applications can be envisaged. For example, as shown schematically in FIG. 3a, the touch surface 2 may comprise a succession of distinct zones B1, B2, B3, B4 simulating control buttons, each zone B1, B2, B3, B4 being delimited by a closed surface forming boundaries 6. The processing unit 7 is then configured to anticipate whether the pointing element 3 will enter or exit one or other of these zones B1, B2, B3, B4. By way of non-limiting example, if the pointing element 3 is detected in zone B2, and if the user moves his finger 3 vertically downward, with reference to the arrangement of the elements in FIG. 3b and as shown schematically by the arrow F2, the processing unit 7 is configured to determine that the pointing element 3 will exit zone B2 when it is located at a threshold distance d from the boundary 6 at the bottom of zone B2, with reference to the arrangement of the elements in FIG. 3b, and to activate in anticipation the generation of a sensory feedback signifying the exit from B2. If the pointing element 3 continues its downward movement illustrated by the arrow F2, the processing unit 7 can anticipate whether and when the pointing element 3 will enter the next zone B3, when the pointing element 3 is located at a threshold distance d from the boundary 6 at the top of zone B3, with reference to the arrangement of the elements in FIG. 3b, and command in advance the generation of at least one sensory feedback associated with the entry into zone B3. In addition, a different sensory feedback can be provided depending on whether the pointing element 3 enters or leaves a closed zone. In particular, with two distinct haptic feedbacks, the feeling of a user pressing and then releasing a key can be simulated.
In the example of FIGS. 3a and 3b, the simulated control buttons or keys are shown in a line or in a corridor. Of course, any other arrangement can be provided. For example, the virtual keys can be arranged in a circle, as shown schematically in FIG. 4a. As previously, the processing unit 7 can anticipate, from the displacement along a circular movement shown schematically by the arrow F3, that the pointing element 3 will cross a boundary 6 delimiting a zone, here B2, when the pointing element 3 is, for example, located on the fictitious delimitation 13 extending along a radius of the circle defined by the virtual keys, so as to command the generation of the sensory feedback(s) so that it (they) is (are) presented to the user at the right moment. In this case, the threshold distance d is an angular distance.
According to another variant illustrated in FIG. 4b, a list of items A, B, C can be displayed on the display screen 5, for example on the touch screen. In operation, to navigate in the list of items A to C, the user moves the pointing element 3 on the touch surface 2, and as soon as a contact pointing to an item in the list, here A, is detected on the touch surface 2, a sub-list of items A1, A2, A3 can be displayed. For each sub-list, yet another sub-list, here A31, A32, A33, can in turn be displayed. Thus, in this example, when the user slides his finger 3 or another pointing element over the touch surface 2, and when the finger 3 is detected on item A3, for example, if the direction of movement is to the right as shown schematically by the arrow F1, the processing unit 7 can anticipate that the finger 3 will leave the zone corresponding to item A3 when the finger 3 is located at a threshold distance d from the right-hand boundary 6 of item A3, with reference to the arrangement of the elements in FIG. 4b. Likewise, the processing unit 7 can anticipate that the finger 3 will enter the zone corresponding to sub-item A31 when the finger 3 is located at a threshold distance d from the left-hand boundary 6 of sub-item A31, with reference to the arrangement of the elements in FIG. 4b. The processing unit 7 can thus command the appropriate sensory feedback in anticipation and at the right moment.
It is also possible to provide, as illustrated in FIG. 4c, a cascading arrangement of windows A, B, C, D. The processing unit 7 can anticipate each window crossing and activate in anticipation the generation of the appropriate sensory feedback at the right moment.
Alternatively, the processing unit 7 can also anticipate the crossing of successive lines 11 when the pointing element 3 is located at a threshold distance d from a given line 11, and activate in anticipation the generation of at least one appropriate sensory feedback associated with the crossing of each line 11. In this example, each line 11 forms a target boundary.
Examples of display on a touch screen have been described here. Of course, the invention also applies when the display takes place on a display screen 5 remote from the touch surface 2. In this case, in a known manner, each zone of the remote display screen 5 is associated with a zone on the touch surface 2.

Control method
With reference to FIG. 5, a sensory feedback control method is now described for a human-machine interface, in particular for a motor vehicle, the human-machine interface comprising a touch surface 2. It is advantageously a control interface 1 as described above with reference to FIGS. 1a to 4d.
The control method comprises a step E1 in which the displacement of a pointing element 3, such as a finger of a user of the control interface 1 or a stylus, is detected on the touch surface 2 in the direction of a target boundary 6 or 11 of the touch surface 2. This detection step E1 is, for example, carried out by the touch surface 2. The control method further comprises a step E2, for example called an anticipation step, at which, at the threshold distance d, at least one sensory feedback, in particular haptic, is generated.
Advantageously, at least one displacement parameter of the pointing element 3 on the touch surface 2 over time, such as the acceleration of the pointing element 3 and/or the displacement speed of the pointing element 3, can be used at this step E2 for defining the threshold distance d with respect to the target boundary 6 or 11.
A particular embodiment is shown schematically in FIG. 6. According to this embodiment, following step E1, step E2 can comprise sub-steps E200 to E220.
Advantageously, during a preliminary step E200, the position of the pointing element 3 on the touch surface 2 is located, and the direction of displacement of the pointing element 3 on the touch surface 2 is evaluated. The localization can be carried out by the touch surface 2 and the evaluation of the direction by the processing unit 7.
Optionally, at least one quantity characteristic of the evolution of the displacement of the pointing element 3 on the touch surface 2 over time, such as the speed and/or the acceleration, can also be evaluated during this preliminary step E200. The evaluation of one or more quantities such as the speed and/or the acceleration can be carried out by the processing unit 7.
When the threshold distance d is not predetermined in a fixed manner, this threshold distance d may be defined during a step E210, for example as the distance between the target boundary 6 and a fictitious delimitation 13 located upstream of it in the direction of displacement of the pointing element 3. The threshold distance d, for example here the location or position of the fictitious delimitation 13, always upstream of the target boundary 6 or 11, may be adapted as a function of the acceleration and/or the speed of displacement of the pointing element 3. For example, when the pointing element 3 moves very fast, the fictitious delimitation 13 may be moved correspondingly farther from the target boundary 6 or 11; in other words, the threshold distance d with respect to the target boundary 6 or 11 may be increased. Conversely, for small displacement speeds, the fictitious delimitation 13 may be brought closer to the target boundary 6 or 11; in other words, the threshold distance d with respect to the target boundary 6 or 11 may be reduced. The ranges regarded as high or small are left to the judgment of the person skilled in the art according to the application. This step E210 may be carried out by the processing unit 7.
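The adaptation of step E210 could, for instance, take the following shape. It is a minimal sketch under the assumption of a linear relation between speed and threshold distance, clamped between a minimum and a maximum; the units (mm and mm/s), the bounds `d_min`/`d_max` and the `gain` are illustrative values chosen here, not taken from the patent.

```python
def threshold_distance(speed, d_min=2.0, d_max=12.0, gain=0.02):
    """Adapt the threshold distance d (step E210): the faster the pointing
    element moves, the farther upstream of the target boundary the
    fictitious delimitation 13 is placed, clamped to [d_min, d_max].
    Units (mm, mm/s) and the linear gain are illustrative assumptions."""
    return max(d_min, min(d_max, d_min + gain * speed))
```

With these example values, a slow drag triggers about 2 mm before the boundary, while a fast flick triggers up to 12 mm before it, which is the behavior the description attributes to step E210.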
At step E220, it is detected that the pointing element 3 has reached this threshold distance d, for example here that the pointing element 3 has crossed the fictitious delimitation 13, and the haptic feedback is generated. This step E220 may be carried out by the processing unit 7.
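Step E220, detecting that the pointing element has reached the fictitious delimitation 13 and triggering the feedback, might be sketched as follows. For simplicity it assumes a one-dimensional geometry with a vertical target boundary at `x = boundary_x` and a one-shot trigger that re-arms only when the element retreats back upstream; the names and the geometry are assumptions, not the patent's implementation.

```python
def feedback_events(positions, boundary_x, d):
    """Return the indices of the position samples at which a sensory
    feedback would be triggered: the first sample, per approach, at which
    the pointing element comes within the threshold distance d of a
    vertical target boundary at x = boundary_x (i.e. crosses the
    fictitious delimitation placed d upstream of the boundary)."""
    events = []
    armed = True
    for i, x in enumerate(positions):
        dist = boundary_x - x  # signed distance, positive while upstream
        if armed and 0 <= dist <= d:
            events.append(i)   # trigger the sensory feedback here
            armed = False      # do not re-trigger during the same approach
        elif dist > d:
            armed = True       # element moved back upstream: re-arm
    return events
```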
Of course, the order of at least some of the steps may be interchanged.
It is thus understood that a control method as described according to either embodiment makes it possible to present at least one sensory feedback to a user at the right moment, that is to say that the user perceives this sensory feedback when he or she actually changes zone or crosses a target boundary 6 or 11.
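Putting the steps together, an end-to-end simulation might look like the following self-contained sketch. It reuses the same illustrative assumptions as above (timestamped touch samples, a linear speed-to-distance mapping with hypothetical units and constants, a one-dimensional boundary) and shows that a fast-moving pointing element triggers the feedback farther upstream of the boundary than a slow one.

```python
import math

def simulate(samples, boundary_x, d_min=2.0, d_max=12.0, gain=0.02):
    """End-to-end sketch of the method: for each new (t, x, y) touch
    sample, estimate the speed of the pointing element (E200), adapt the
    threshold distance d (E210), and return the index of the sample at
    which the sensory feedback would be triggered (E220), or None if the
    fictitious delimitation is never crossed."""
    for i in range(1, len(samples)):
        (t0, x0, _), (t1, x1, _) = samples[i - 1], samples[i]
        speed = math.hypot(x1 - x0, 0.0) / (t1 - t0)          # E200
        d = max(d_min, min(d_max, d_min + gain * speed))      # E210
        approaching = x1 > x0 and x1 <= boundary_x
        if approaching and boundary_x - x1 <= d:              # E220
            return i  # generate the sensory feedback here
    return None
```

With these assumed constants, a fast swipe toward a boundary at x = 95 fires 5 units early, while the same trajectory traversed slowly does not fire at all at that distance, matching the description's "perceived substantially at the moment of crossing".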

Claims

1. Sensory feedback control method for a control interface (1), in particular for a motor vehicle, the control interface (1) comprising a touch surface (2), characterized in that the control method comprises the following steps:
the displacement of the pointing element (3) towards a target boundary (6, 11) is detected,
when the pointing element (3) is detected at a threshold distance (d) from the target boundary, at least one sensory feedback is generated, so that the sensory feedback is perceived by a user of said interface (1) substantially at the moment of crossing the target boundary (6, 11), avoiding a time lag between the perception of the sensory feedback and the crossing of the target boundary (6, 11).
2. Control method according to claim 1, wherein the threshold distance (d) is predetermined and fixed with respect to the target boundary (6, 11).
3. Control method according to claim 1, wherein:
at least one parameter of the displacement of the pointing element (3) on the touch surface (2) over time is evaluated, and
as a function of said displacement parameter, the threshold distance (d) is adapted with respect to the target boundary (6).
4. Control method according to the preceding claim, wherein the parameter of the displacement of the pointing element (3) on the touch surface (2) over time is at least one quantity chosen from the speed of displacement of the pointing element (3) and a function of the speed, such as the acceleration of the pointing element (3).
5. Control method according to any one of the preceding claims, for the generation of at least two sensory feedbacks of different natures, wherein the generation of each sensory feedback is controlled simultaneously or offset in time.
6. Control interface (1) for a motor vehicle configured to at least partially implement a control method according to any one of the preceding claims, the control interface (1) comprising:
a touch surface (2) configured to determine the position and the displacement of a pointing element (3) on the touch surface (2), and
a sensory feedback module (4) configured to generate at least one sensory feedback,
characterized in that the control interface (1) further comprises a processing unit (7) configured to:
detect the displacement of the pointing element (3) towards a target boundary (6, 11),
generate, at a threshold distance (d) between the pointing element (3) and the target boundary (6, 11), at least one sensory feedback, so that the sensory feedback is perceived by a user of said interface (1) substantially at the moment of crossing the target boundary (6, 11), avoiding a time lag between the perception of the sensory feedback and the crossing of the target boundary (6, 11).
7. Control interface (1) according to the preceding claim, wherein the processing unit (7) is configured to:
evaluate at least one parameter of the displacement of the pointing element (3) on the touch surface (2) over time, and
as a function of the at least one evaluated displacement parameter, adapt the threshold distance (d) with respect to the target boundary (6, 11).
8. Control interface (1) according to claim 6 or 7, wherein the sensory feedback module (4) is configured to:
vibrate the touch surface (2) so as to generate a haptic feedback, and/or
generate an audible feedback, and/or
generate a visual feedback.
EP17732477.9A 2016-06-29 2017-06-28 Control method and control interface for a motor vehicle Ceased EP3479201A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
FR1656139A FR3053489A1 (en) 2016-06-29 2016-06-29 CONTROL METHOD AND CONTROL INTERFACE FOR MOTOR VEHICLE
PCT/EP2017/066074 WO2018002189A1 (en) 2016-06-29 2017-06-28 Control method and control interface for a motor vehicle

Publications (1)

Publication Number Publication Date
EP3479201A1 2019-05-08

Family

ID=57348799

Family Applications (1)

Application Number Title Priority Date Filing Date
EP17732477.9A Ceased EP3479201A1 (en) 2016-06-29 2017-06-28 Control method and control interface for a motor vehicle

Country Status (3)

Country Link
EP (1) EP3479201A1 (en)
FR (1) FR3053489A1 (en)
WO (1) WO2018002189A1 (en)

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2778856A1 (en) * 2013-03-14 2014-09-17 Immersion Corporation Systems and methods for haptic and gesture-driven paper simulation

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6219034B1 (en) * 1998-02-23 2001-04-17 Kristofer E. Elbing Tactile computer interface
US8332755B2 (en) * 2009-05-27 2012-12-11 Microsoft Corporation Force-feedback within telepresence
US20120249461A1 (en) * 2011-04-01 2012-10-04 Analog Devices, Inc. Dedicated user interface controller for feedback responses
FR3015383B1 (en) * 2013-12-19 2017-01-13 Dav CONTROL DEVICE FOR MOTOR VEHICLE AND CONTROL METHOD
FR3026868B1 (en) * 2014-10-02 2019-08-16 Dav DEVICE AND CONTROL METHOD FOR MOTOR VEHICLE


Also Published As

Publication number Publication date
WO2018002189A1 (en) 2018-01-04
FR3053489A1 (en) 2018-01-05

Similar Documents

Publication Publication Date Title
EP1450247B1 (en) Human-computer interface with force feedback for pressure pad
FR3026866B1 (en) DEVICE AND CONTROL METHOD FOR MOTOR VEHICLE
BE1020021A3 (en) MULTIMODE TOUCH SCREEN DEVICE.
FR3026868B1 (en) DEVICE AND CONTROL METHOD FOR MOTOR VEHICLE
EP3221781B1 (en) Graphical interface and method for managing said graphical interface during the touch-selection of a displayed element
WO2017108896A1 (en) Control interface for a motor vehicle
WO2022090091A1 (en) Three-dimensional touch interface providing haptic feedback
WO2018002186A1 (en) Control method and control interface for a motor vehicle
WO2016051115A1 (en) Control device and method for a motor vehicle
EP3918446A1 (en) Method for generating a haptic feedback for an interface, and associated interface
WO2015092165A1 (en) Control device for a motor vehicle and control method
FR3030071A1 (en) DEVICE AND CONTROL METHOD FOR MOTOR VEHICLE
EP3479201A1 (en) Control method and control interface for a motor vehicle
EP3394713A1 (en) Control interface for automotive vehicle
FR3058938B1 (en) INTERFACE FOR MOTOR VEHICLE AND INTERFACING METHOD
WO2018219832A1 (en) Method for generating a sensory feedback for an interface and associated interface
EP4232883A1 (en) Haptic-feedback touch device with spatialized textures
EP3084584A1 (en) Man/machine interface for controlling at least two functions of a motor vehicle
WO2017211835A1 (en) Control module and method for motor vehicle
FR2971864A1 (en) Virtual reality equipment i.e. immersive virtual reality environment equipment, for virtual reality interaction with human-machine interface car, has contact device with touch pad positioned at point where interface is intended to appear
FR2996652A1 (en) Tactile detector for designating target in field of vision of driver of car, has tactile surface forming restricted detection zones, where relief is arranged on periphery of restricted detection zones to form guide for finger
FR2979722A1 (en) Portable electronic device i.e. mobile phone, has activation unit activating processing rule application unit upon detection of movement of phone by motion sensor, where activation unit is inhibited in absence of selection of graphic object

Legal Events

Date Code Title Description
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: UNKNOWN

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE

PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

17P Request for examination filed

Effective date: 20190205

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

AX Request for extension of the european patent

Extension state: BA ME

DAV Request for validation of the european patent (deleted)
DAX Request for extension of the european patent (deleted)
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: EXAMINATION IS IN PROGRESS

17Q First examination report despatched

Effective date: 20200918

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: EXAMINATION IS IN PROGRESS

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: EXAMINATION IS IN PROGRESS

REG Reference to a national code

Ref country code: DE

Ref legal event code: R003

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION HAS BEEN REFUSED

18R Application refused

Effective date: 20230219

P01 Opt-out of the competence of the unified patent court (upc) registered

Effective date: 20230528