EP3479212A1 - Control method and control interface for a motor vehicle - Google Patents
Control method and control interface for a motor vehicle
- Publication number
- EP3479212A1 (application EP17733843A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- pointing element
- sensory feedback
- target boundary
- time window
- crossing
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/016—Input arrangements with force or tactile feedback as computer generated output to the user
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K35/00—Arrangement of adaptations of instruments
-
- B60K35/10—
-
- B60K35/25—
-
- B60K35/80—
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/0354—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
- G06F3/03547—Touch pads, in which fingers can move on a surface
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
-
- B60K2360/141—
-
- B60K2360/1438—
-
- B60K2360/1468—
-
- B60K2360/566—
-
- B60K2360/573—
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/16—Sound input; Sound output
- G06F3/167—Audio in a user interface, e.g. using voice commands for navigating, audio feedback
Definitions
- The present invention relates to a control method, in particular for a motor vehicle, for generating at least one sensory feedback to a user via a control interface such as a human-machine interface.
- The invention also relates to an interface configured to implement at least some steps of such a control method.
- The car has become a real living space, perceived as a personal and interconnected communication hub, with, for example, an MP3 player, GPS navigation and connectivity with mobile phones.
- The introduction of these new features has increased the number of buttons on the dashboard of a car cockpit.
- The number of buttons cannot be increased indefinitely, in particular because of the resulting complexity, the limited space, accessibility and cognitive load.
- The driver's interaction with the on-board systems in the car can result in an attention-overload situation in which the driver may no longer be able to handle all the information of the driving task properly, resulting in errors and longer detection times.
- Compared with buttons, touch screens are also more customizable.
- Touch screens have three other major advantages: they allow direct interaction (co-location of display and input), they are flexible (the display can easily be configured for any number of functions), and they are intuitive (a familiar interaction method such as pointing).
- Unlike with a push button, when the driver interacts with a touch surface such as a touch screen, he receives no feedback directly related to his action on the interface, other than the mere sensation of his finger pressing on the touch surface.
- A possible application of haptic feedback is, for example, to vibrate the touch surface when a target boundary is crossed.
- By "target boundary" is meant a delineation, not detectable by touch, between two functional zones of the touch surface.
- A first zone may be provided for controlling a first function, for example lowering the volume of an audio system, and a second adjacent zone for controlling a second function, for example increasing the volume.
- The passage between the two zones cannot be felt by the user's finger.
- The touch surface is generally associated with a screen arranged behind it, which displays pictograms associated with the control zones. It is therefore important that the haptic feedback be synchronized with the passage of the controlling finger between the two zones.
- In practice, however, the time needed to detect the position of the finger and to drive the feedback introduces a shift between the crossing and the perceived feedback. This shift can be disconcerting for the user, because there is no longer any correspondence between the visual displayed on the screen and the haptic feedback.
- When the haptic feedback signal is emitted, the finger or the stylus is located somewhere other than where the feedback should be felt, for example no longer on the boundary delimiting a virtual key or a virtual control button.
- This time shift is all the greater as the speed of movement of the finger or the stylus increases.
- One solution would be to use faster electronic processing means in order to minimize the time difference between the moment the finger or the stylus is on a target boundary of the touch surface, for example a boundary between two zones of the touch surface, and the perception of the sensory feedback signal such as a haptic feedback.
- An object of the present invention is therefore to provide a control method and an associated control interface that at least partially overcome the aforementioned drawbacks by generating a sensory feedback perceived at the right moment.
- To this end, the present invention relates to a sensory feedback control method for a control interface for a motor vehicle, the control interface comprising a touch surface, the method comprising the following steps:
- a provisional time window is determined during which the target boundary will be crossed by the pointing element, taking into account the displacement parameter of the pointing element towards the target boundary; an instant t, prior to the provisional time window, is calculated for issuing a sensory feedback generation command, taking into account the determined crossing time window,
- By anticipating the crossing of a target boundary, such as a boundary between two zones or a boundary delimiting a given zone, such a method makes it possible to generate the sensory feedback at the right moment, that is to say when the pointing element, such as the user's finger, is actually on the target boundary. The result is thus closer to what the user expects, and there is concordance between the visual displayed on the screen and the sensory feedback.
- The user experience, in other words the perception of a sensory feedback characteristic of a change of zone, is thereby improved.
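- As a purely illustrative sketch of this principle (names, units and the latency value below are assumptions, not taken from the patent): the provisional crossing window is predicted from the displacement parameter, and the generation command is emitted earlier than the window by the known latency of the feedback chain, so that the feedback is perceived inside the window.

```cpp
// Minimal sketch: schedule the sensory-feedback command so that the feedback
// is *perceived* inside the predicted crossing window.
// Identifiers and the latency value are illustrative assumptions.
#include <cstdio>

struct TimeWindow {           // provisional crossing window [lower, upper], in ms
    double lower_ms;
    double upper_ms;
};

// Known (calibrated) delay between emitting the command and the user actually
// feeling the vibration: command transmission plus actuator rise time.
constexpr double kFeedbackLatencyMs = 8.0;   // assumed value

// Emission instant t: earlier than the lower bound of the crossing window by
// exactly the latency of the feedback chain.
double emissionInstant(const TimeWindow& w) {
    return w.lower_ms - kFeedbackLatencyMs;
}

int main() {
    TimeWindow w{120.0, 121.0};              // crossing predicted at ~120-121 ms
    std::printf("emit command at t = %.1f ms\n", emissionInstant(w));
    // The vibration then reaches the finger at ~120 ms, i.e. while the
    // pointing element is actually on the target boundary.
}
```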
- The control method may further comprise one or more of the following features, taken alone or in combination:
- the displacement parameter of the pointing element comprises the speed of movement of the pointing element towards the target boundary
- the displacement parameter of the pointing element comprises the acceleration of the pointing element towards the target boundary
- the displacement parameter of the pointing element comprises the distance between the pointing element and the target boundary
- at least one parameter representative of the pressure exerted by the pointing element on the touch surface is measured, and the pressure measurement is used in the step of determining the time window for crossing the target boundary;
- the duration of the provisional time window for crossing the target boundary is less than 5 ms, for example less than 2 ms, preferably less than 1 ms.
- the invention also relates to a control interface for a motor vehicle configured to implement a control method as defined above, the control interface comprising:
- a touch surface configured to detect at least one parameter of movement of the pointing element on the touch surface
- The control interface further comprises a processing unit configured to:
- the interface may further include one or more of the following features taken alone or in combination:
- the processing unit is configured to determine the time window for crossing the target boundary from the position, direction and time evolution data of the displacement of the pointing element;
- the touch surface comprises at least one pressure sensor configured to measure at least one parameter representative of the pressure exerted on the touch surface, and the processing unit is configured to take into account the measurement of at least one parameter representative of the pressure for determining the time window for crossing the target boundary;
- the sensory feedback module is configured to vibrate the touch surface so as to generate a haptic feedback, and/or to generate an audio feedback, and/or to generate a visual feedback.
- Such an interface makes it possible to activate the sensory feedback when it is established that the pointing element will change zone and cross or pass over a target boundary, and also allows this sensory feedback to be activated so that it is perceived by the user when the pointing element is actually on the target boundary.
- Such a control interface offers a less expensive solution than one comprising processing means with very fast response times.
- FIG. 1a schematically represents a control interface such as a human-machine interface for a motor vehicle
- FIG. 1b schematically represents a side view of the interface of FIG. 1a;
- FIGS. 2a to 4d show schematically different examples of display on a touch surface of the interface of FIGS. 1a and 1b for anticipating the sensory feedback generation,
- FIG. 5 is a flowchart of the steps of a control method according to the invention for generating at least one anticipatory sensory feedback
- Fig. 6 is a flowchart detailing the steps of the control method.
- Control interface called human-machine interface
- Figure 1a shows a control interface 1 for a motor vehicle.
- Such an interface 1 is commonly referred to as a human-machine interface. It is advantageously a man-machine interface that is reconfigurable.
- The control interface 1 comprises: a touch surface 2 configured to detect contact by a pointing element 3, such as a user's finger or a stylus, and its displacement on the touch surface 2, and
- a sensory feedback module 4 configured to generate at least one sensory feedback.
- the control interface 1 may furthermore comprise a display screen 5.
- the touch surface 2 can thus form an input device allowing the users of the control interface 1 to interact with it through touch.
- the touch surface 2 is for example configured to determine the spatial coordinates of the point where the user presses with his finger or another pointing element 3 on the touch surface 2.
- The touch surface 2 thus makes it possible to locate the position of the pointing element 3. When the pointing element 3 moves, the touch surface 2 is configured to determine successive spatial coordinates corresponding to at least two successive points on the touch surface 2.
- the tactile surface 2 comprises a capacitive touch screen.
- the capacitive touch panel comprises at least one capacitive sensor 21 for detecting at least one variation of the capacitance at the surface of the capacitive touch-sensitive panel.
- the capacitive sensor 21 comprises, for example, an array of electrodes, for example made of ITO (indium-tin oxide).
- the capacitive touch screen may further comprise a front plate 22 (or contact plate), for example polycarbonate or glass.
- the front plate 22 is arranged on the capacitive sensor 21 and is intended to face the user once mounted in the passenger compartment of the vehicle. This front plate 22 may be rigid so as to provide the desired rigidity to the capacitive touch screen.
- the advantageously flat touch surface 2 of the capacitive touch-sensitive panel is thus formed by the surface of the front plate 22.
- the touch surface 2 can use pressure sensitive resistors to detect a contact and a displacement of a pointing element such as a finger of the user on the touch surface 2.
- The touch surface 2 then comprises a pressure sensor, for example using FSR ("Force Sensing Resistor") technology.
- the display screen 5 can be offset from the touch surface 2.
- the front plate 22 of the capacitive touch panel can be painted with an opaque color so as to hide the elements arranged behind.
- The capacitive touch panel can then form what is called a touchpad, or a push button.
- the display screen 5 is arranged facing and in contact with the touch surface 2 such as a capacitive touch screen, more precisely under this touch surface 2, so as to form a touchscreen.
- The display screen 5 is, for example, fixed by gluing to the back of the support of the capacitive sensor 21 of the capacitive touch panel.
- By "back" is meant the portion opposite the portion carrying the capacitive sensor 21.
- the touch surface 2 is then transparent to display the images of the display screen 5 through the tactile surface 2.
- In the case of a capacitive touch panel comprising the capacitive sensor 21 and the front plate 22, both of these are transparent.
- the sensory feedback module 4 can be connected to the touch surface 2 and / or the display screen 5 in the case of a touch screen.
- the sensory feedback module 4 can in particular be configured to generate a haptic feedback by vibrating the touch surface 2.
- The sensory feedback module 4 may comprise at least one actuator 41, 42 connected to the touch surface 2 and configured to set the touch surface in motion, so as to generate the haptic feedback as a function of a command signal.
- Two actuators 41, 42 are shown schematically in FIG.
- the haptic feedback is a vibratory signal such as a vibration produced by a sinusoidal control signal or by a control signal having one or a succession of pulses, sent to the actuator 41 and / or 42.
- the vibration may be directed in the plane of the tactile surface 2 or orthogonal to the plane of the tactile surface 2 or in a combination of these two directions .
- As for the actuators 41, 42, these can be arranged under the touch surface 2, in different positions (in the center or on one side) and in different orientations (in the direction of pressing on the surface or along another axis).
- Such actuators 41, 42 are known and will not be described in more detail in the present description.
- A parameter of the haptic feedback can be chosen from among the intensity of the acceleration, the frequency, the amplitude, the duration, the time between two identical signals, and the phase. For example, it is possible to simulate different textures on the touch surface 2, such as different surface roughnesses.
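- Purely by way of illustration, such a parameter set could be represented in software as follows; the field names, units and the two example textures are assumptions, not values from the patent.

```cpp
// Hypothetical representation of the haptic-feedback parameters listed above.
// Field names, units and example values are illustrative assumptions.
struct HapticPattern {
    double acceleration_intensity_g; // peak acceleration of the vibration
    double frequency_hz;             // vibration frequency
    double amplitude_um;             // displacement amplitude
    double duration_ms;              // duration of one signal
    double repeat_interval_ms;       // time between two identical signals
    double phase_rad;                // phase of the drive signal
};

// Two different "textures" are then simply two different parameter sets,
// e.g. a fine roughness (high frequency, low amplitude) versus a coarse one.
constexpr HapticPattern kFineTexture  {0.5, 250.0,  5.0, 10.0, 20.0, 0.0};
constexpr HapticPattern kCoarseTexture{1.5,  60.0, 20.0, 15.0, 40.0, 0.0};
```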
- The sensory feedback module 4 can also be configured to generate an audio feedback to the user.
- An audio feedback parameter can be chosen from among the volume, the phase, the frequency, the duration, and the time between two identical signals.
- An image displayed on the touch screen can show marked patterns, such as micro-bumps, at the position of the pointing element 3. In this case, one speaks of visual feedback.
- the visual feedback can also be generated by the sensory feedback module 4.
- The sensory feedback module 4 can be configured to generate all the sensory feedbacks, that is to say haptic and/or audio and/or visual, at the same time. Alternatively, the sensory feedback module 4 can be configured to generate one of the feedbacks in a time-shifted manner with respect to another.
- For example, when a haptic feedback is generated, an audio feedback can be generated before, at the same time as, or after the haptic feedback. Indeed, for the user to perceive a haptic feedback and an audio feedback at the same time, it may be better not to generate them simultaneously but to advance or delay one with respect to the other. The same applies to a visual feedback, which can be generated before, at the same time as, or after a haptic feedback and/or an audio feedback.
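- A minimal sketch of this staggering, with assumed latency values: since each modality has its own delay between command and perception, the commands are offset so that the perceptions coincide.

```cpp
// Illustrative only: stagger the haptic, audio and visual commands so that the
// three feedbacks are *perceived* at the same instant. The latency values are
// assumptions for the example, not figures from the patent.
#include <cstdio>

struct ModalityLatency {
    double haptic_ms = 8.0;   // actuator rise time
    double audio_ms  = 3.0;   // audio pipeline delay
    double visual_ms = 16.0;  // roughly one display frame
};

int main() {
    ModalityLatency lat;
    double perceive_at_ms = 120.0;  // target perception instant (boundary crossing)

    // Each command is emitted earlier by its own latency.
    double cmd_haptic = perceive_at_ms - lat.haptic_ms;
    double cmd_audio  = perceive_at_ms - lat.audio_ms;
    double cmd_visual = perceive_at_ms - lat.visual_ms;

    std::printf("visual cmd %.1f ms, haptic cmd %.1f ms, audio cmd %.1f ms\n",
                cmd_visual, cmd_haptic, cmd_audio);
    // The slowest modality (here the visual one) is commanded first and the
    // fastest one last; all three are then seen, felt and heard together.
}
```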
- The sensory feedback module 4 is configured to generate one or more sensory feedbacks, of the same or of different natures, when the pointing element 3, here the finger 3, passes over or crosses a target boundary 6. As the touch surface is smooth, this target boundary 6 is not physically materialized on the touch surface 2.
- It may be a boundary 6 separating two zones Z1, Z2 of the display screen 5, for example of the touch screen, as illustrated in FIG. 2a. It may also be a boundary 6 indicating that one is entering or leaving a zone, as shown schematically in FIG. 2b, which shows the finger 3 leaving the zone Z2.
- With a haptic feedback, it is possible to simulate on the touch surface 2 a texture at this target boundary 6, for example between two zones Z1, Z2, or delimiting the entry into or the exit from a zone.
- the control interface 1 further comprises a processing unit 7 shown schematically in FIGS. 1a and 1b.
- the processing unit 7 is configured to exchange information and / or data with the touch surface 2 and also with the sensory feedback module 4.
- This processing unit 7 may comprise one or more processing means such as one or more microcontrollers or computers, having memories and programs particularly adapted to receive position and displacement or sliding information of the pointing element 3, this information having been detected in particular by the tactile surface 2.
- the processing unit 7 is for example the on-board computer of the motor vehicle.
- The touch surface 2 can record successive spatial coordinates corresponding to contact points, and can transmit this information to the processing unit 7.
- The processing unit 7 therefore comprises at least one means for receiving the information transmitted by the touch surface 2.
- The processing unit 7 further comprises one or more processing means, such as a control circuit, configured to drive the sensory feedback module 4 in order to generate at least one sensory feedback such as a haptic and/or audio and/or visual feedback.
- the control circuit may include means for transmitting a sensory feedback generation command to the sensory feedback module 4.
- the control interface 1 may further include a measuring device 9 shown very schematically in this FIG.
- the measuring device 9 comprises for example one or more pressure sensors configured to measure the pressure exerted on the tactile surface 2. This is for example one or more strain gauges.
- the strain gauge or gauges are arranged in direct connection with the touch surface 2 and are cleverly distributed as required.
- For example, a strain gauge may be arranged substantially in the middle of the touch surface 2, for example in the case of a touchpad.
- one or more strain gauges are arranged on one or more edges of the tactile surface 2.
- For example, four strain gauges may be provided, each placed at a corner of the touch surface 2.
- The or each strain gauge can be arranged at the level of dampers provided under the touch surface 2 so as to measure the displacement of the touch surface 2 during contact and sliding of the pointing element 3 on the touch surface 2.
- the processing unit 7 comprises at least one processing means for receiving measurement signals from the measuring device 9, in particular the or each pressure sensor.
- the pressure measurement (s) are advantageously taken into account.
- The pressure measurements make it possible to identify the way in which the user presses on the touch surface 2 with the pointing element 3, so as to generate the signal appropriate for the texture simulation.
- The processing unit 7 is advantageously configured to anticipate that the pointing element 3 will subsequently cross a target boundary 6 of the touch surface 2, and to drive the sensory feedback module 4 before the effective crossing of the target boundary 6, in order to generate at least one sensory feedback perceived by the user when the pointing element 3 actually crosses the target boundary 6.
- processing unit 7 is configured to:
- The time window for crossing the target boundary 6 by the pointing element 3 therefore comprises a lower bound and an upper bound:
- the upper bound corresponds to a later time than the lower bound.
- the duration of the provisional time window for crossing the target boundary is less than 5 ms, for example less than 2 ms, preferably less than 1 ms.
- the processing unit 7 can determine a time of crossing the target boundary 6 by the pointing element 3. This corresponds in this case to a time window whose lower and upper bound are equal.
- By "provisional time window" is meant either a time interval whose lower and upper bounds are different, or a given instant, that is to say a time window whose upper bound is equal to the lower bound.
- The processing unit 7 is furthermore configured to calculate an emission instant t of a sensory feedback generation command, the emission instant t being prior to the provisional time window for crossing the target boundary 6. In other words, the emission instant t is earlier than the lower bound of the determined time window.
- The processing unit 7 is configured to emit, at the emission instant t, the command for generating at least one sensory feedback by the sensory feedback module 4. Said at least one sensory feedback is then perceived by the user of the interface during the determined time window for crossing the target boundary 6. This avoids a temporal shift between the perception of the sensory feedback and the passage of the pointing element 3 over the target boundary 6.
- the processing unit 7 is thus configured to provide for the crossing of a target boundary 6 in order to be able to associate therewith a haptic feedback.
- The processing unit 7 is configured to analyze the displacement of the pointing element 3 so as to predict the crossing of the target boundary 6, that is to say so as to be able to determine that the pointing element 3 is moving towards a target boundary 6 of the touch surface 2, and to determine, from the displacement of the pointing element 3, a provisional time window for crossing the target boundary 6 by the pointing element 3.
- The processing unit 7 is configured to evaluate at least one displacement parameter of the pointing element 3 on the touch surface 2 over time. These include the distance between the pointing element 3 and the target boundary 6, the speed of movement of the pointing element 3, or a function of the speed such as its derivative, the acceleration.
- The processing unit 7 is furthermore configured to use this displacement parameter, for example the evaluated speed or acceleration, to estimate by calculation the provisional time window for crossing the target boundary 6, as illustrated by the sketch below.
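- As a purely illustrative calculation (names, units and the margin are assumptions): with the current distance d to the boundary, the speed v towards it and the acceleration a, the crossing instant solves d = v·t + ½·a·t², and an uncertainty margin around that instant gives the provisional window.

```cpp
// Illustrative sketch: estimate the provisional crossing window from the
// distance to the boundary, the speed towards it and the acceleration, using
// a constant-acceleration model. Names, units and margins are assumptions.
#include <cmath>
#include <optional>

struct TimeWindow { double lower_ms, upper_ms; };

// Solve d = v*t + 0.5*a*t^2 for the smallest positive t (in ms).
std::optional<double> timeToBoundary(double d_mm, double v_mm_ms, double a_mm_ms2) {
    if (std::fabs(a_mm_ms2) < 1e-9) {                 // near-constant speed
        if (v_mm_ms <= 0.0) return std::nullopt;      // not moving towards the boundary
        return d_mm / v_mm_ms;
    }
    double disc = v_mm_ms * v_mm_ms + 2.0 * a_mm_ms2 * d_mm;
    if (disc < 0.0) return std::nullopt;              // decelerating, boundary never reached
    double t = (-v_mm_ms + std::sqrt(disc)) / a_mm_ms2;
    return (t > 0.0) ? std::optional<double>(t) : std::nullopt;
}

// Provisional crossing window centred on the predicted instant; the +/- 0.5 ms
// margin keeps the window below 1 ms, consistent with the durations mentioned above.
std::optional<TimeWindow> crossingWindow(double now_ms, double d_mm,
                                         double v_mm_ms, double a_mm_ms2) {
    auto t = timeToBoundary(d_mm, v_mm_ms, a_mm_ms2);
    if (!t) return std::nullopt;
    constexpr double kMarginMs = 0.5;                 // assumed uncertainty margin
    return TimeWindow{now_ms + *t - kMarginMs, now_ms + *t + kMarginMs};
}
```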
- the processing unit 7 may comprise one or more processing means configured for:
- The processing unit 7 can analyze the position information of successive points transmitted by the touch surface 2. Thus, with reference to FIGS. 2a and 2b, the processing unit 7 can determine that the pointing element 3 moves in the direction shown schematically by the arrow F1, which is here horizontal and to the right. Of course, the processing unit 7 can detect rectilinear movements, whether horizontal or vertical (arrow F2 in FIG. 3b), but also circular movements (arrow F3 in FIG. 4a) or oblique movements on the touch surface 2 (arrow F4, FIG. 4c), in one direction or the other, for example to the right (arrow F1 in FIGS. 2b and 4b, and arrow F4 in FIG. 4c) or to the left (arrow F5 in FIG. 4d).
- the vertical and horizontal terms, left and right, are here used with reference to the arrangement of the elements as shown in FIGS. 2a to 4d.
- The processing unit 7 may comprise at least one calculation means making it possible to deduce the speed of the pointing element 3 from the location information of successive points transmitted by the touch surface 2, as a function of an internal clock of the human-machine interface, for example.
- The processing unit 7 may also comprise at least one means for calculating the derivative of the speed, and therefore the acceleration, for example from the location information of points that are further apart, transmitted by the touch surface 2.
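- An illustrative way of obtaining these quantities (the names are assumptions): the speed follows from two successive timestamped contact points, and the acceleration from the change of speed over points that are further apart.

```cpp
// Illustrative finite-difference estimation of speed and acceleration from
// successive timestamped contact points reported by the touch surface.
// Identifiers and units are assumptions for the sketch.
#include <cmath>

struct TouchSample { double x_mm, y_mm, t_ms; };   // position + internal-clock time

// Speed between two successive samples, in mm/ms.
double speed(const TouchSample& p0, const TouchSample& p1) {
    double dx = p1.x_mm - p0.x_mm, dy = p1.y_mm - p0.y_mm;
    double dt = p1.t_ms - p0.t_ms;
    return (dt > 0.0) ? std::hypot(dx, dy) / dt : 0.0;
}

// Acceleration as the derivative of the speed, taken over samples that are
// further apart to smooth out measurement noise, in mm/ms^2.
double acceleration(const TouchSample& p0, const TouchSample& p1, const TouchSample& p2) {
    double v01 = speed(p0, p1);
    double v12 = speed(p1, p2);
    double dt = 0.5 * (p2.t_ms - p0.t_ms);
    return (dt > 0.0) ? (v12 - v01) / dt : 0.0;
}
```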
- The processing unit 7 may comprise a software part which, on the basis of this position, direction and time-evolution information, makes a diagnosis of the user's intention, in other words performs a trajectory projection of the pointing element 3.
- The person skilled in the art can adapt the calculation model used by the software part, which may in particular, and without limitation, be a neural network or be based on fuzzy logic.
- the processing unit 7 may comprise means for deciding whether or not to generate one or more sensory feedbacks from the diagnosis.
- The processing unit 7 can take into account the pressure measured by the pressure sensor(s) of the measuring device 9 when determining the time window for crossing the target boundary 6 by the pointing element 3. Indeed, the pressure measurement(s) make it possible to confirm whether or not the user will actually cross the target boundary 6. For example, if the pressure on the touch surface 2 is of a continuous nature, that is to say it evolves little or not at all, this indicates that the user is not about to abruptly lift the pointing element 3, such as his finger, and will on the contrary continue along the trajectory; the decision means can therefore confirm the generation of a sensory feedback in this case.
- The pressure measurement(s) may also be taken into account by the processing unit 7 to determine the time available to perform the diagnosis, that is to say here to determine the time window for crossing the target boundary 6. Indeed, if the pressure tends to increase, this reflects a slower movement, which increases the time available to perform the diagnosis, for example to further improve its reliability.
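- A sketch of how the pressure measurement could enter this diagnosis; the thresholds and names are assumptions chosen for the illustration: a roughly constant pressure confirms the trajectory, whereas an increasing pressure signals a slowing movement and therefore a larger time budget for the diagnosis.

```cpp
// Illustrative use of the pressure measurement in the diagnosis described
// above. Thresholds and identifiers are assumptions, not patent values.
#include <cmath>

struct PressureTrend {
    double current_n;     // latest pressure reading, in newtons
    double previous_n;    // previous pressure reading
};

// A roughly constant pressure suggests the user will keep sliding towards the
// boundary rather than lifting the finger: the feedback generation is confirmed.
bool confirmCrossing(const PressureTrend& p) {
    constexpr double kStableBandN = 0.3;              // assumed tolerance
    return std::fabs(p.current_n - p.previous_n) < kStableBandN;
}

// An increasing pressure reflects a slower movement, which leaves more time to
// refine the diagnosis before the boundary is reached.
double diagnosisBudgetMs(const PressureTrend& p, double base_budget_ms) {
    constexpr double kRiseThresholdN = 0.2;           // assumed threshold
    return (p.current_n - p.previous_n > kRiseThresholdN) ? 2.0 * base_budget_ms
                                                          : base_budget_ms;
}
```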
- The processing unit 7 can thus make a predictive calculation to determine the provisional time window during which the finger 3 will cross the boundary 6 between the two zones Z1 and Z2. Since the delay between the emission of a haptic feedback command signal and the moment when the actuators 41 and/or 42 actually vibrate and the user can feel the haptic feedback is known, the processing unit 7 can determine when to emit, in advance with respect to the determined time window, a control signal to the sensory feedback module 4 which generates the haptic feedback(s).
- When the finger 3 actually arrives on the boundary 6, the user perceives the sensory feedback(s) in accordance with the visual display, if the zones Z1 and Z2 are displayed, for example.
- Alternatively, the two zones Z1, Z2 may be open zones, that is to say without any boundary between the two zones Z1, Z2; in this case it is the crossing of one of the boundaries marked around one of the zones, here Z2, that is anticipated.
- the processing unit 7 makes it possible to anticipate or diagnose that a pointing element 3 will enter an area (FIG. 2a) but also that it will exit an area (FIG. 2b).
- The touch surface 2 may comprise a succession of distinct zones B1, B2, B3, B4 simulating control buttons, each zone B1, B2, B3, B4 being delimited by a closed contour forming a target boundary 6.
- the processing unit 7 is then configured to anticipate / diagnose whether the pointing element 3 will enter or exit from one or other of these areas B1, B2, B3, B4.
- For example, the pointing element 3 is detected in the zone B2, and the user moves his finger 3 vertically downwards, with reference to the arrangement of the elements in the figure.
- In this case, the processing unit 7 is configured to anticipate whether and when the pointing element 3 will leave the zone B2, that is to say to determine that the pointing element 3 is moving towards a boundary 6 delimiting the zone B2 and to determine the time window for crossing this boundary 6, in order to activate in advance the generation of a sensory feedback indicating the exit from B2. If the pointing element 3 continues its downward movement illustrated by the arrow F2, the processing unit 7 can anticipate if and when the pointing element 3 will enter the next zone B3, that is to say it can determine that the pointing element 3 is moving towards a boundary 6 delimiting the zone B3 and determine the time window for crossing this boundary 6, in order to issue in advance the command for generating at least one sensory feedback associated with the entry into the zone B3. In addition, a different sensory feedback may be provided depending on whether the pointing element 3 enters or leaves a closed zone, as sketched below. In particular, with two distinct haptic feedbacks, one can simulate the sensation of a user pressing and releasing a key.
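- As an illustrative sketch (the names and the press/release mapping are assumptions), the crossing direction relative to a closed zone can simply select one of two haptic patterns, giving the press and release sensations mentioned above.

```cpp
// Illustrative sketch: select a different haptic pattern depending on whether
// the pointing element enters or leaves a closed zone (e.g. B1..B4), which can
// simulate pressing and releasing a key. Identifiers are assumptions.
struct Zone { double x0, y0, x1, y1; };                       // axis-aligned zone

bool inside(const Zone& z, double x, double y) {
    return x >= z.x0 && x <= z.x1 && y >= z.y0 && y <= z.y1;
}

enum class Crossing { None, Enter, Exit };

Crossing classifyCrossing(const Zone& z, double prev_x, double prev_y,
                          double next_x, double next_y) {
    bool was = inside(z, prev_x, prev_y);
    bool now = inside(z, next_x, next_y);
    if (!was && now) return Crossing::Enter;                  // key "press" feedback
    if (was && !now) return Crossing::Exit;                   // key "release" feedback
    return Crossing::None;                                    // no boundary crossed
}
```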
- The simulated keys or control buttons may be arranged in a line or in a strip.
- Alternatively, the virtual keys may be arranged in a circle, as shown schematically in FIG. 4a.
- The processing unit 7 can anticipate, from the circular displacement shown schematically by the arrow F3, that the pointing element 3 will cross a boundary 6 delimiting a zone, here B2, and when; in other words, the processing unit can determine, from the displacement of the pointing element 3, a time window during which the pointing element 3 will cross the boundary 6 delimiting the zone B2, so as to control the generation of the sensory feedback(s) such that they are perceived by the user at the right time.
- a list of items A, B, C can be displayed on the display screen 5, for example on the touch screen.
- The user moves the pointing element 3 on the touch surface 2, and when a contact pointing at an item, here A, in the list is detected on the touch surface 2, a sub-list of items A1, A2, A3 may be displayed.
- Similarly, when a contact pointing at a sub-item, here A3, is detected, a further sub-list of items A31, A32, A33 can be displayed.
- The processing unit 7 can anticipate that the finger 3 will leave the zone corresponding to the item A3 and enter the zone corresponding to the sub-item A31, and can thus command the appropriate sensory feedback in advance and at the right time.
- the processing unit 7 can anticipate each zone crossing, that is to say, determine a time window of crossing a border for each zone A, B, C or D, and activate in advance the generation of the appropriate sensory feedback at the right time.
- The processing unit 7 can also anticipate the crossing of successive lines 11, that is to say determine a time window for crossing each line 11, and activate in advance the generation of at least one appropriate sensory feedback associated with the crossing of each line 11.
- each line 11 forms a target boundary.
- each zone of the remote display screen 5 is associated with a zone on the touch surface 2.
- A sensory feedback control method for a human-machine interface, in particular for a motor vehicle, is now described, the human-machine interface comprising a touch surface 2. It is advantageously a control interface 1 as described above with reference to FIGS. 1a to 4d.
- The control method comprises a step E1 in which at least one displacement parameter of a pointing element 3, such as a finger of a user of the control interface 1 or a stylus, moving on the touch surface 2 towards a target boundary 6 or 11, is detected.
- This detection step E1 is for example carried out by the touch surface 2.
- The control method further comprises a step E2, for example called an anticipation or prediction step, in which, from the displacement parameter(s) of the pointing element 3, a provisional time window for crossing the target boundary 6 or 11 of the touch surface 2 by the pointing element 3 is determined, and in which, from this determined provisional crossing window, an emission instant t for a sensory feedback generation command is calculated.
- The emission instant t is earlier than the time window, more precisely earlier than the lower bound of this time window.
- This step E2 may be performed at least in part by one or more processing means of the processing unit 7 as described above.
- At least one displacement parameter of the pointing element 3 on the touch surface 2 over time is the speed of movement and/or the acceleration of the pointing element 3 and/or the distance between the pointing element 3 and the target boundary 6.
- The control method further comprises a control step E3 in which the command for generating at least one sensory feedback is issued at the emission instant t, that is to say before the effective crossing of the target boundary 6 or 11.
- This is therefore an advance control step.
- This control step E3 is a step of driving the sensory feedback module 4 as described above.
- The control step E3 is for example carried out by one or more processing means, such as the control circuit of the processing unit 7 described above.
- Upon receipt of one or more control signals, in a step E4, at least one sensory feedback is generated.
- This step E4 can be implemented by the sensory feedback module 4 previously described.
- The sensory feedback is then perceived by the user when the pointing element 3, for example his finger, crosses the target boundary 6 or 11.
- More precisely, the sensory feedback is perceived by the user during the determined time window for crossing the target boundary 6 or 11. This avoids a time lag between the perception of the sensory feedback and the actual crossing of the target boundary 6 or 11.
- In step E4, one or more sensory feedbacks of the same nature can be generated or, alternatively, several sensory feedbacks of different natures can be generated. In the latter case, the generation of the different sensory feedbacks can be controlled simultaneously or, on the contrary, with a time shift.
- Advancing or delaying the generation of an audio or visual feedback with respect to the generation of a haptic feedback, for example, makes it possible in particular for the user to perceive the different feedbacks at the same time.
- The step E2 may comprise substeps E20 to E24.
- In a step E20, the position of the pointing element 3 on the touch surface 2 is located, the direction of displacement of the pointing element 3 on the touch surface 2 is evaluated, and at least one displacement parameter of the pointing element 3 on the touch surface 2 over time is evaluated.
- the location can be achieved by the touch surface 2 and the evaluation of the direction and one or more quantities such as speed and / or acceleration can be implemented by the processing unit 7.
- A step E21 may be provided for measuring at least one parameter representative of the pressure exerted by the pointing element 3 on the touch surface 2.
- In a step E22, a diagnosis is made to determine the user's intention to cross the target boundary 6 or 11, from the position, direction and time-evolution data of the movement of the pointing element 3 obtained in step E20, and possibly from the parameter representative of the pressure measured in step E21. In this step E22, it is in particular possible to project the trajectory of the pointing element 3 in order to identify the user's intention.
- This step E22 can be implemented by the processing unit 7.
- From the diagnosis made in step E22, it is decided, during a decision step E23, whether or not to generate one or more sensory feedbacks. If yes, that is to say if one or more sensory feedbacks must be generated, the method proceeds to the next step E24.
- This decision step can be implemented by the processing unit 7. If not, that is to say if no sensory feedback needs to be generated, the method can start again at step E1.
- In step E24, the provisional time window during which the pointing element 3 will cross the target boundary 6 or 11 is determined and, from this determined provisional crossing window, the emission instant t for the advance command of the sensory feedback generation is calculated.
- The control step E3 can then be implemented. The substeps E20 to E24 are summarized in the sketch below.
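- Pulling the substeps together, a minimal end-to-end sketch of step E2 might look as follows; all function names, thresholds and latency values are assumptions, and a simple constant-speed model is used for brevity.

```cpp
// Illustrative end-to-end sketch of step E2 (substeps E20-E24): characterise
// the movement, confirm the intention with the pressure, decide, then compute
// the crossing window and the advance emission instant t handed to step E3.
// All identifiers, thresholds and latency values are assumptions.
#include <cmath>
#include <optional>

struct Sample  { double x_mm, t_ms, pressure_n; };   // 1-D slice towards the boundary
struct Command { double emit_at_ms; };               // when step E3 must emit

std::optional<Command> anticipate(const Sample& prev, const Sample& cur,
                                  double boundary_x_mm) {
    // E20: direction and speed of the displacement.
    double dt = cur.t_ms - prev.t_ms;
    if (dt <= 0.0) return std::nullopt;
    double v = (cur.x_mm - prev.x_mm) / dt;                  // mm/ms, signed
    double d = boundary_x_mm - cur.x_mm;                     // signed distance to boundary
    if (v * d <= 0.0) return std::nullopt;                   // not moving towards it

    // E21-E23: pressure-based confirmation of the intention (diagnosis + decision).
    if (std::fabs(cur.pressure_n - prev.pressure_n) > 0.3)   // assumed threshold
        return std::nullopt;                                 // finger may be lifting off

    // E24: provisional crossing window and advance emission instant t.
    double t_cross = cur.t_ms + d / v;                       // constant-speed model
    constexpr double kMarginMs  = 0.5;                       // window half-width
    constexpr double kLatencyMs = 8.0;                       // command-to-perception delay
    double window_lower = t_cross - kMarginMs;
    return Command{window_lower - kLatencyMs};               // step E3 emits at this instant
}
```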
- A control method as described makes it possible to present at least one sensory feedback to a user at the right moment, that is to say the user perceives this sensory feedback when he actually changes zone or crosses a target boundary 6 or 11.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
FR1656138A FR3053488A1 (en) | 2016-06-29 | 2016-06-29 | CONTROL METHOD AND CONTROL INTERFACE FOR MOTOR VEHICLE |
PCT/EP2017/066071 WO2018002186A1 (en) | 2016-06-29 | 2017-06-28 | Control method and control interface for a motor vehicle |
Publications (1)
Publication Number | Publication Date |
---|---|
EP3479212A1 true EP3479212A1 (en) | 2019-05-08 |
Family
ID=57348798
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP17733843.1A Ceased EP3479212A1 (en) | 2016-06-29 | 2017-06-28 | Control method and control interface for a motor vehicle |
Country Status (3)
Country | Link |
---|---|
EP (1) | EP3479212A1 (en) |
FR (1) | FR3053488A1 (en) |
WO (1) | WO2018002186A1 (en) |
Families Citing this family (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE102016007357A1 (en) * | 2016-06-15 | 2017-12-21 | GM Global Technology Operations LLC (n. d. Ges. d. Staates Delaware) | Motor vehicle device with a movable window pane, motor vehicle with such a motor vehicle device and method for operating such a motor vehicle device |
US10533362B2 (en) | 2017-06-16 | 2020-01-14 | GM Global Technology Operations LLC | Systems and methods for memory and touch position window |
CN112181263B (en) * | 2019-07-02 | 2024-04-09 | 三六零科技集团有限公司 | Touch screen drawing operation response method and device and computing equipment |
Family Cites Families (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6219034B1 (en) * | 1998-02-23 | 2001-04-17 | Kristofer E. Elbing | Tactile computer interface |
US8332755B2 (en) * | 2009-05-27 | 2012-12-11 | Microsoft Corporation | Force-feedback within telepresence |
WO2012135373A2 (en) * | 2011-04-01 | 2012-10-04 | Analog Devices, Inc. | A dedicated user interface controller for feedback responses |
FR3026868B1 (en) * | 2014-10-02 | 2019-08-16 | Dav | DEVICE AND CONTROL METHOD FOR MOTOR VEHICLE |
FR3026866B1 (en) * | 2014-10-02 | 2019-09-06 | Dav | DEVICE AND CONTROL METHOD FOR MOTOR VEHICLE |
-
2016
- 2016-06-29 FR FR1656138A patent/FR3053488A1/en active Pending
-
2017
- 2017-06-28 EP EP17733843.1A patent/EP3479212A1/en not_active Ceased
- 2017-06-28 WO PCT/EP2017/066071 patent/WO2018002186A1/en unknown
Also Published As
Publication number | Publication date |
---|---|
FR3053488A1 (en) | 2018-01-05 |
WO2018002186A1 (en) | 2018-01-04 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
FR3026866B1 (en) | | DEVICE AND CONTROL METHOD FOR MOTOR VEHICLE |
EP1450247B1 (en) | | Human-computer interface with force feedback for pressure pad |
FR3026868B1 (en) | | DEVICE AND CONTROL METHOD FOR MOTOR VEHICLE |
CN109643219B (en) | | Method for interacting with image content presented on a display device in a vehicle |
EP3084570A1 (en) | | Control device for motor vehicle and control method |
WO2018002186A1 (en) | | Control method and control interface for a motor vehicle |
EP1515211B1 (en) | | Capacitive touch switch for electrical window regulator or open roof |
EP3234758A1 (en) | | Device and method for control for automotive vehicle |
EP3918446A1 (en) | | Method for generating a haptic feedback for an interface, and associated interface |
WO2015092165A1 (en) | | Control device for a motor vehicle and control method |
EP3201734A1 (en) | | Control device and method for a motor vehicle |
EP3084569A1 (en) | | Control device for a motor vehicle and control method |
FR3058938B1 (en) | | INTERFACE FOR MOTOR VEHICLE AND INTERFACING METHOD |
WO2018219832A1 (en) | | Method for generating a sensory feedback for an interface and associated interface |
FR3030071A1 (en) | | DEVICE AND CONTROL METHOD FOR MOTOR VEHICLE |
WO2018002189A1 (en) | | Control method and control interface for a motor vehicle |
EP4232884A1 (en) | | Three-dimensional touch interface providing haptic feedback |
WO2017211835A1 (en) | | Control module and method for motor vehicle |
EP4232883A1 (en) | | Haptic-feedback touch device with spatialized textures |
FR3041446A1 (en) | | SELECTION BUTTON WITH MULTIPLE FINGER DETECTION |
Legal Events
Code | Title | Description |
---|---|---|
STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: UNKNOWN |
STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE |
PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase | Free format text: ORIGINAL CODE: 0009012 |
STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE |
17P | Request for examination filed | Effective date: 20190205 |
AK | Designated contracting states | Kind code of ref document: A1; Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
AX | Request for extension of the european patent | Extension state: BA ME |
DAV | Request for validation of the european patent (deleted) | |
DAX | Request for extension of the european patent (deleted) | |
STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: EXAMINATION IS IN PROGRESS |
17Q | First examination report despatched | Effective date: 20200918 |
STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: EXAMINATION IS IN PROGRESS |
STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: EXAMINATION IS IN PROGRESS |
REG | Reference to a national code | Ref country code: DE; Ref legal event code: R003 |
STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: THE APPLICATION HAS BEEN REFUSED |
18R | Application refused | Effective date: 20230225 |
P01 | Opt-out of the competence of the unified patent court (upc) registered | Effective date: 20230528 |