EP3479201A1 - Control method and control interface for a motor vehicle - Google Patents
Control method and control interface for a motor vehicle
Info
- Publication number
- EP3479201A1 (application EP17732477.9A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- pointing element
- sensory feedback
- target boundary
- touch surface
- displacement
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Links
- 238000000034 method Methods 0.000 title claims abstract description 28
- 230000001953 sensory effect Effects 0.000 claims abstract description 80
- 230000008447 perception Effects 0.000 claims abstract description 9
- 238000012545 processing Methods 0.000 claims description 53
- 238000006073 displacement reaction Methods 0.000 claims description 37
- 230000006870 function Effects 0.000 claims description 15
- 230000001133 acceleration Effects 0.000 claims description 10
- 230000000007 visual effect Effects 0.000 claims description 10
- 238000011144 upstream manufacturing Methods 0.000 description 5
- 238000011156 evaluation Methods 0.000 description 4
- 230000004913 activation Effects 0.000 description 3
- 230000008859 change Effects 0.000 description 3
- 238000001514 detection method Methods 0.000 description 3
- 230000003993 interaction Effects 0.000 description 3
- 235000019587 texture Nutrition 0.000 description 3
- 230000009471 action Effects 0.000 description 2
- 238000013459 approach Methods 0.000 description 2
- 238000009530 blood pressure measurement Methods 0.000 description 2
- 238000005516 engineering process Methods 0.000 description 2
- 230000015654 memory Effects 0.000 description 2
- 230000002123 temporal effect Effects 0.000 description 2
- 238000004026 adhesive bonding Methods 0.000 description 1
- 230000003466 anti-cipated effect Effects 0.000 description 1
- 230000000454 anti-cipatory effect Effects 0.000 description 1
- 238000004364 calculation method Methods 0.000 description 1
- 230000001149 cognitive effect Effects 0.000 description 1
- 230000001934 delay Effects 0.000 description 1
- 230000003111 delayed effect Effects 0.000 description 1
- 238000002474 experimental method Methods 0.000 description 1
- 239000011521 glass Substances 0.000 description 1
- indium-tin oxide Chemical compound 0.000 description 1
- 238000005259 measurement Methods 0.000 description 1
- 229920000515 polycarbonate Polymers 0.000 description 1
- 239000004417 polycarbonate Substances 0.000 description 1
- 230000004044 response Effects 0.000 description 1
- 238000004088 simulation Methods 0.000 description 1
- 238000006467 substitution reaction Methods 0.000 description 1
- 230000003746 surface roughness Effects 0.000 description 1
- 230000001360 synchronised effect Effects 0.000 description 1
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/016—Input arrangements with force or tactile feedback as computer generated output to the user
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K35/00—Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
- B60K35/10—Input arrangements, i.e. from user to vehicle, associated with vehicle functions or specially adapted therefor
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K35/00—Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
- B60K35/20—Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor
- B60K35/21—Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor using visual output, e.g. blinking lights or matrix displays
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K35/00—Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
- B60K35/20—Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor
- B60K35/21—Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor using visual output, e.g. blinking lights or matrix displays
- B60K35/22—Display screens
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K35/00—Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
- B60K35/20—Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor
- B60K35/25—Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor using haptic output
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K35/00—Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
- B60K35/50—Instruments characterised by their means of attachment to or integration in the vehicle
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K35/00—Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
- B60K35/60—Instruments characterised by their location or relative disposition in or on vehicles
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K35/00—Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
- B60K35/80—Arrangements for controlling instruments
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K35/00—Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
- B60K35/90—Calibration of instruments, e.g. setting initial or reference parameters; Testing of instruments, e.g. detecting malfunction
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/0354—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
- G06F3/03547—Touch pads, in which fingers can move on a surface
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/16—Sound input; Sound output
- G06F3/167—Audio in a user interface, e.g. using voice commands for navigating, audio feedback
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K2360/00—Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
- B60K2360/141—Activation of instrument input devices by approaching fingers or pens
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K2360/00—Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
- B60K2360/143—Touch sensitive instrument input devices
- B60K2360/1438—Touch screens
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K2360/00—Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
- B60K2360/55—Remote control arrangements
- B60K2360/56—Remote control arrangements using mobile devices
- B60K2360/566—Mobile devices displaying vehicle information
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K2360/00—Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
- B60K2360/55—Remote control arrangements
- B60K2360/56—Remote control arrangements using mobile devices
- B60K2360/573—Mobile devices controlling vehicle functions
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K35/00—Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
- B60K35/20—Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor
- B60K35/26—Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor using acoustic output
Definitions
- the present invention relates to a control method in particular for a motor vehicle for generating at least one sensory feedback to a user via a control interface such as a human-machine interface.
- the invention also relates to an interface configured for implementing at least some steps of such a control method.
- the car has become a real living space, perceived as a personal and interconnected communication center: with for example MP3 player, GPS, connection with mobile phones.
- the introduction of these new features results in an increase in the number of buttons on the dashboard of a car cockpit.
- the number of buttons can not be increased to infinity, especially because of the complexity generated, limited space, accessibility or cognitive load.
- the interaction of the driver with the on-board systems in the car can result in an attention-overload situation in which the driver may not be able to handle all the information of the driving task at best, resulting in errors and longer detection delays.
- unlike buttons, touch screens are more easily customizable.
- the touch screens have three other major advantages: they allow on the one hand a direct interaction (the co-implementation of the display and input), on the other hand they are flexible (the display can be easily configured for a number of functions), and finally they are intuitive (familiar interaction method such as "point").
- unlike with a push button, when the driver interacts with a touch surface such as a touch screen, he receives no feedback directly related to his action on the interface, other than the mere contact of his finger pressing on the touch surface.
- a possible application of a haptic feedback is for example to vibrate the touch surface when crossing a target boundary.
- By target boundary is meant a delineation of the touch surface, not detectable haptically, between two functional areas.
- a first zone may be provided for controlling a first function, such as for example lowering the volume of an audio system, and a second adjacent zone for controlling a second function, for example increasing the volume.
- the passage between the two areas is not detectable by the finger of a user.
- the touch surface is generally associated with a screen disposed behind it, which displays pictograms associated with the control areas. It is therefore important that the haptic feedback be synchronized with the passage of the control finger between the two zones.
- This shift can be disconcerting for the user because there is no longer any correspondence between the visual displayed on the screen and the haptic feedback.
- the finger or the stylus is located elsewhere than where the haptic feedback signal is felt, for example elsewhere than on a boundary delimiting a virtual key or a virtual control button, when the haptic feedback signal is issued.
- This time shift is all the greater as the speed of movement of the finger or the stylus is high.
- one solution would be to use faster electronic processing means, to minimize the time difference between the position of the finger or the stylus on a target boundary of the tactile surface, for example a boundary crossing between two areas of the tactile surface, and the perception of the sensory feedback signal such as a haptic feedback.
- An object of the present invention is therefore to provide a control method and an associated control interface, to at least partially overcome the aforementioned drawbacks by generating sensory feedback perceived at a convenient time.
- the subject of the present invention is a sensory feedback control method for a control interface, in particular for a motor vehicle, the control interface comprising a tactile surface, characterized in that the control method comprises the following steps:
- detecting the displacement of the pointing element towards a target boundary,
- when the pointing element is detected at a threshold distance from the target boundary, generating at least one sensory feedback, so that the sensory feedback is perceived by a user of said interface substantially at the time of crossing the target boundary, avoiding a time lag between the perception of the sensory feedback and the crossing of the target boundary.
- This solution can be independent of the speed of displacement of the pointing element, and makes it possible to reduce costs by fictitiously delimiting a sensory-feedback activation threshold distance upstream of the target boundary that characterizes the pointing element's actual change of area.
- The crossing of the target boundary, such as a boundary between two zones or a boundary delimiting a given zone, is anticipated, which makes it possible to generate the sensory feedback at the right moment, that is to say when the pointing element, such as the user's finger, is actually on the target boundary.
- the user experience, in other words the perception of a sensory feedback characteristic of a change of zones, is improved.
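The anticipation steps above can be sketched in code. The following is a minimal illustrative sketch in Python, not part of the patent: the function name, the 2 mm default threshold, and the units are assumptions.

```python
# Illustrative sketch (not from the patent): trigger sensory feedback once the
# pointing element comes within a threshold distance upstream of a target
# boundary, so the feedback is perceived roughly when the boundary is crossed.

def should_trigger_feedback(position_mm: float,
                            boundary_mm: float,
                            moving_toward_boundary: bool,
                            threshold_mm: float = 2.0) -> bool:
    """Return True once the pointing element is within `threshold_mm`
    of the boundary and still moving toward it."""
    if not moving_toward_boundary:
        return False
    return abs(boundary_mm - position_mm) <= threshold_mm

# Example: finger at 48 mm approaching a boundary at 50 mm.
print(should_trigger_feedback(48.0, 50.0, True))   # within 2 mm -> True
print(should_trigger_feedback(45.0, 50.0, True))   # 5 mm away   -> False
```

The feedback command is thus issued while the pointing element is still upstream of the boundary, compensating for processing and actuation latency.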
- control method may further comprise one or more of the following features taken alone or in combination:
- the threshold distance is predetermined and fixed with respect to the target boundary
- At least one displacement parameter of the pointing element is evaluated on the tactile surface in time, and as a function of said displacement parameter, the threshold distance is adjusted with respect to the target boundary;
- the displacement parameter of the pointing element on the touch-sensitive surface over time is at least one variable chosen from the speed of displacement of the pointing element and a function of that speed, such as the acceleration of the pointing element;
- the control method provides for the generation of at least two sensory feedbacks of a different nature, and the generation of each sensory feedback is controlled simultaneously or shifted in time.
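The speed-dependent adjustment of the threshold distance can be sketched as follows. This is an illustrative assumption, not the patent's implementation: the 20 ms latency and the minimum threshold are hypothetical placeholder values.

```python
# Illustrative sketch: adapt the activation threshold distance to the measured
# speed of the pointing element, so that feedback generated `latency_s`
# seconds later is perceived at the moment of crossing.

def adaptive_threshold_mm(speed_mm_per_s: float,
                          latency_s: float = 0.02,
                          min_threshold_mm: float = 0.5) -> float:
    """Distance upstream of the boundary at which to start generating
    feedback: roughly the distance covered during the processing latency."""
    return max(min_threshold_mm, speed_mm_per_s * latency_s)

print(adaptive_threshold_mm(100.0))   # slow finger: ~2 mm
print(adaptive_threshold_mm(500.0))   # fast finger: ~10 mm
```

A faster-moving finger thus gets an earlier trigger point, which is consistent with the observation above that the time shift grows with displacement speed.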
- the invention also relates to a control interface for a motor vehicle configured to implement at least partially a control method according to any one of the preceding claims, the control interface comprising:
- a touch surface configured to determine the position and movement of a pointing element on the touch surface
- a sensory feedback module configured to generate at least one sensory feedback
- Such an interface makes it possible to activate the sensory feedback when it is established that the pointing element will change zone and cross or pass on a target boundary to the extent that the pointing element has reached the threshold distance.
- the interface allows this sensory feedback to be activated so as to be perceived by the user when the pointing element is actually on the target boundary.
- such a control interface offers a less expensive solution compared to a solution comprising processing means with very fast response times.
- control interface may further include one or more of the following features taken alone or in combination:
- the processing unit is configured to evaluate at least one displacement parameter of the pointing element on the touch surface over time, and, according to the at least one evaluated displacement parameter, to adapt the threshold distance with respect to the target boundary;
- the sensory feedback module is configured to vibrate the touch surface so as to generate a haptic feedback, and/or to generate a sound feedback, and/or to generate a visual feedback.
- FIG. 1a schematically represents a control interface such as a human-machine interface for a motor vehicle
- FIG. 1b schematically represents a side view of the interface of FIG. 1a;
- FIGS. 2a to 4d show diagrammatically different examples of display on a tactile surface of the interface of FIGS. 1a and 1b for anticipated sensory feedback generation,
- FIG. 5 is a flowchart of the steps of a control method according to the invention for generating at least one anticipatory sensory feedback
- Fig. 6 is a flowchart detailing the steps of the control method.
- Control interface called human-machine interface
- Figure 1a represents a control interface 1 for a motor vehicle.
- Such an interface 1 is commonly referred to as a human-machine interface. It is advantageously a reconfigurable man-machine interface.
- the control interface 1 comprises: a tactile surface 2 configured to detect a touch of a pointing element 3 (for example a stylus or a user's finger) and its displacement on the tactile surface 2, and
- a sensory feedback module 4 configured to generate at least one sensory feedback.
- the control interface 1 may furthermore comprise a display screen 5.
- the touch surface 2 can thus form an input device allowing the users of the control interface 1 to interact with it through touch.
- the touch surface 2 is for example configured to determine the spatial coordinates of the point where the user presses with his finger or another pointing element 3 on the touch surface 2.
- the touch surface 2 thus makes it possible to locate the position of the pointing element 3. When the pointing element 3 moves, the touch surface 2 is configured to determine successive spatial coordinates corresponding to at least two successive points on the touch surface 2.
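From these successive coordinates, the displacement speed used elsewhere in the method can be estimated. The sketch below is illustrative only; the sampling interval and millimeter units are assumptions, not values from the patent.

```python
# Illustrative sketch: estimate the displacement speed of the pointing element
# from two successive touch coordinates and their timestamps.
import math

def displacement_speed(p0, p1, t0_s, t1_s):
    """Speed in mm/s from two successive (x, y) points in mm."""
    dt = t1_s - t0_s
    if dt <= 0:
        return 0.0  # degenerate or out-of-order samples
    return math.hypot(p1[0] - p0[0], p1[1] - p0[1]) / dt

# A 5 mm displacement sampled 10 ms apart.
print(displacement_speed((0.0, 0.0), (3.0, 4.0), 0.0, 0.01))  # ~500 mm/s
```

Differencing a further pair of speed samples would similarly give the acceleration mentioned above as an alternative displacement parameter.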
- the tactile surface 2 comprises a capacitive touch screen.
- the capacitive touch panel comprises at least one capacitive sensor 21 for detecting at least one variation of the capacitance at the surface of the capacitive touch-sensitive panel.
- the capacitive sensor 21 comprises, for example, an array of electrodes, for example made of ITO (indium-tin oxide).
- the capacitive touch screen may further comprise a front plate 22 (or contact plate), for example polycarbonate or glass.
- the front plate 22 is arranged on the capacitive sensor 21 and is intended to face the user once mounted in the passenger compartment of the vehicle. This faceplate 22 may be rigid so as to provide the desired rigidity to the capacitive touchscreen.
- the advantageously flat touch surface 2 of the capacitive touch-sensitive panel is thus formed by the surface of the front plate 22.
- the touch surface 2 can use pressure sensitive resistors to detect a contact and a displacement of a pointing element such as a finger of the user on the touch surface 2.
- the touch surface 2 then comprises a pressure sensor, such as one using FSR ("Force Sensing Resistor") technology.
- the display screen 5 can be offset from the touch surface 2.
- the front plate 22 of the capacitive touch screen can be painted with an opaque color so as to hide the elements arranged behind.
- the capacitive touch screen can then form what is called a touchpad, or a push button ("push").
- the display screen 5 is arranged facing and in contact with the touch surface 2 such as a capacitive touch screen, more precisely under this touch surface 2, so as to form a touchscreen.
- the display screen 5 is for example fixed by gluing to the back of a capacitive sensor support 21 of the capacitive touch screen.
- By back is meant the portion opposite to the portion carrying the capacitive sensor 21.
- the touch surface 2 is then transparent so as to display the images of the display screen 5 through the tactile surface 2.
- the sensory feedback module 4 can be connected to the touch surface 2 and / or the display screen 5 in the case of a touch screen.
- the sensory feedback module 4 can in particular be configured to generate a haptic feedback by vibrating the touch surface 2.
- the sensory feedback module 4 may comprise at least one actuator 41, 42 connected to the touch surface 2, and configured to drive the touch surface in motion, so as to generate the haptic feedback as a function of a control signal.
- Two actuators 41, 42 are shown schematically in FIG.
- the haptic feedback is a vibratory signal, such as a vibration produced by a sinusoidal control signal or by a control signal comprising one pulse or a succession of pulses, sent to the actuator 41 and/or 42.
- the vibration can be directed in the plane of the tactile surface 2 or orthogonally to the plane of the tactile surface 2 or in a combination of these two directions.
- As for the actuators 41, 42, these can be arranged under the touch surface 2, in different positions (in the center or on one side) or in different orientations (in the direction of the support on the surface or along another axis).
- Such actuators 41, 42 are known and will not be described in more detail in the present description.
- a parameter of the haptic feedback can be chosen from among the intensity of the acceleration, the frequency, the amplitude, the duration, the duration between two identical signals, and the phase. For example, it is possible to simulate different textures of the tactile surface 2, such as different surface roughnesses.
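The parameters just listed can be combined into a drive waveform for the actuators. The sketch below is an illustrative assumption, not the patent's signal: the 175 Hz frequency, 8 kHz sample rate, and 10 ms duration are hypothetical placeholder values.

```python
# Illustrative sketch: synthesize a sinusoidal haptic drive signal from the
# parameters above (frequency, amplitude, duration).
import math

def haptic_waveform(freq_hz=175.0, amplitude=1.0, duration_s=0.01,
                    sample_rate_hz=8000):
    """Sampled sinusoidal control signal for a vibration actuator."""
    n = int(duration_s * sample_rate_hz)
    return [amplitude * math.sin(2 * math.pi * freq_hz * i / sample_rate_hz)
            for i in range(n)]

samples = haptic_waveform()
print(len(samples))  # 80 samples for a 10 ms pulse at 8 kHz
```

Repeating such a pulse with a chosen inter-pulse delay would cover the "duration between two identical signals" parameter, and varying the amplitude along the displacement would simulate different textures.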
- the sensory feedback generation module 4 can be configured to generate a sound feedback to the user.
- a sound feedback parameter can be chosen from the intensity of the volume, the phase, the frequency, the duration, and the duration between two identical signals.
- an image displayed on the touch screen can represent marked patterns, such as micro-bumps, at the position of the pointing element 3. In this case, we speak of visual feedback.
- the visual feedback can also be generated by the sensory feedback module 4.
- the sensory feedback module 4 can be configured to generate all the sensory feedbacks, that is to say, haptic and / or sound and / or visual, at the same time.
- the sensory feedback module 4 can be configured to generate one or the other of the feedbacks in a time-shifted manner with respect to another feedback.
- When a haptic feedback is generated, a sound feedback can be generated before, at the same time as, or after the generation of the haptic feedback. Indeed, so that the user perceives a haptic feedback and a sound feedback at the same time, it may be better not to generate them simultaneously but advanced or delayed with respect to each other. The same applies to the generation of a visual feedback, which can be generated before, at the same time as, or after the generation of a haptic feedback and/or a sound feedback.
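This staggered triggering can be sketched as a small scheduler. The offsets below are hypothetical placeholders chosen purely for illustration; the patent does not specify values or which modality should lead.

```python
# Illustrative sketch: schedule haptic, sound, and visual feedback with small
# relative offsets so that they are *perceived* simultaneously.
# The offset values are assumptions, not taken from the patent.

FEEDBACK_OFFSETS_S = {
    "sound": 0.000,    # assumed slowest pipeline: start first
    "haptic": 0.005,
    "visual": 0.010,
}

def feedback_schedule(t_trigger_s: float) -> list:
    """Return (start_time, modality) pairs, earliest first."""
    events = [(t_trigger_s + off, name)
              for name, off in FEEDBACK_OFFSETS_S.items()]
    return sorted(events)

for start, name in feedback_schedule(1.0):
    print(f"{start:.3f}s  {name}")
```

In practice the offsets would be tuned from the measured latency of each feedback chain (actuator, loudspeaker, display refresh).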
- the sensory feedback module 4 is configured to generate one or more sensory feedbacks, of the same nature or of a different nature, when the pointing element 3, here the finger 3, passes or crosses a target boundary 6. Since the touch surface is smooth, this target boundary 6 is not materialized haptically on the touch surface 2.
- It may be a boundary 6 separating two zones Z1, Z2 of the display screen 5, for example of the touch screen, as illustrated in FIG. 2a. It may also be a boundary 6 indicating that one enters or leaves an area, as schematized in Figure 2b, which shows the finger 3 leaving the zone Z2.
- With a haptic feedback, it is possible to simulate on the tactile surface 2 a texture at this target boundary 6, between two zones Z1, Z2 for example, or delimiting an entry to or an exit from a zone.
- the control interface 1 further comprises a processing unit 7 shown schematically in FIGS. 1a and 1b.
- the processing unit 7 is configured to exchange information and / or data with the touch surface 2 and also with the sensory feedback module 4.
- This processing unit 7 may comprise one or more processing means such as one or more microcontrollers or computers, having memories and programs particularly adapted to receive position and displacement or sliding information of the pointing element 3, this information having been detected in particular by the tactile surface 2.
- the processing unit 7 is for example the on-board computer of the motor vehicle.
- the touch-sensitive surface 2 can detect successive spatial coordinates corresponding to support points, and can transmit this information with the coordinates to the processing unit 7.
- the processing unit 7 therefore comprises at least one means for receiving the information transmitted by the tactile surface 2.
- the processing unit 7 further comprises one or more processing means, such as a control circuit, configured to drive the sensory feedback module 4 in order to generate at least one sensory feedback, such as a haptic and/or sound and/or visual feedback.
- the control circuit may include means for transmitting a sensory feedback generation command to the sensory feedback module 4.
- the control interface 1 may further include a measuring device 9, shown very schematically in the figure.
- the measuring device 9 comprises for example one or more pressure sensors configured to measure the pressure exerted on the tactile surface 2.
- This is for example one or more strain gauges.
- the strain gauge or gauges are arranged in direct connection with the touch surface 2 and are cleverly distributed as required.
- it is possible to provide a strain gauge substantially in the middle of the tactile surface 2, for example in the case of a tactile pad called a "touchpad".
- one or more strain gauges are arranged on one or more edges of the tactile surface 2.
- four strain gages may be provided, each placed at an angle of the touch surface 2.
- the or each strain gauge can be arranged at dampers provided under the touch surface 2, so as to measure the displacement of the touch surface 2 during contact and sliding of the pointing element 3 on the touch surface 2.
- the processing unit 7 comprises at least one processing means for receiving measurement signals from the measuring device 9, in particular the or each pressure sensor.
- the pressure measurement(s) are advantageously taken into account.
- the pressure measurements make it possible to identify the way in which the user presses on the tactile surface 2 by means of the pointing element 3, so as to generate the appropriate signal for the texture simulation accordingly.
- the processing unit 7 is advantageously configured to anticipate that the pointing element 3 will subsequently cross a target boundary 6 of the touch surface 2, and to control the sensory feedback module 4 before the actual crossing of the target boundary 6, in order to generate at least one sensory feedback perceived by the user when the pointing element 3 actually crosses the target boundary 6.
- the processing unit 7 is therefore configured to anticipate the crossing of a target boundary 6 when a displacement is detected on the touch surface 2, and not when a single point of support is detected.
- the processing unit 7 anticipates the crossing of the target boundary 6 when the pointing element 3 is at a threshold distance d, for activation of at least one sensory feedback, from the target boundary 6.
- the processing unit 7 is configured to:
- To determine the threshold distance d, it is possible to determine, for example by experiment, an average displacement speed Vmoy of the pointing element on the touch surface 2, and the delay Tu between the localization of the pointing element 3 at a given location and an effective vibration on the touch surface 2 due to the activation of the actuators 41 and 42, for example.
- In a first approach, it is possible to set a global average speed common to all the functions to be controlled.
- In a second approach, depending in particular on the morphology of the control areas, such as their dimensions, their number on the tactile surface, their proximity or their geometry, it is possible to set an average speed for each control function. These average speeds can for example be stored in a memory of the processing unit 7.
- Thus, if the pointing element 3 is at a distance d from the target boundary 6 and moves at the average speed Vmoy, the pointing element 3 will cross the target boundary at the moment when the haptic feedback can be felt on the tactile surface 2.
- Said at least one sensory feedback is then perceived by a user of said interface 1 when the pointing element 3 crosses the target boundary 6.
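The timing rule described above can be sketched as follows (an illustrative sketch, not part of the patent text; the numeric values for the average speed Vmoy and the actuator delay Tu are assumptions chosen for the example):

```python
# Illustrative sketch of the anticipation rule described above: the actuators
# need a delay Tu between the command and the effective vibration, so the
# feedback is commanded while the pointing element is still a distance
# d = Vmoy * Tu upstream of the target boundary.

V_MOY = 0.15   # assumed average sliding speed, in m/s (set by experiment)
T_U = 0.02     # assumed command-to-vibration latency of the actuators, in s

def threshold_distance(v_moy: float = V_MOY, t_u: float = T_U) -> float:
    """Distance upstream of the boundary at which to command the feedback."""
    return v_moy * t_u

def should_trigger(distance_to_boundary: float) -> bool:
    """True once the pointing element is within the threshold distance."""
    return distance_to_boundary <= threshold_distance()
```

With these assumed values, d = 0.15 m/s x 0.02 s = 3 mm: the feedback command is issued 3 mm before the boundary, so that the vibration is felt just as the boundary is crossed.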
- It is possible to define a fictitious delimitation, such as a fictitious line 13, upstream of the target boundary 6 and arranged at the threshold distance d from the target boundary 6.
- The term "upstream" is used here with reference to the direction of displacement of the pointing element 3 on the tactile surface 2.
- the fictitious delimitation may extend substantially parallel to the boundary 6 with which it is associated.
- the processing unit 7 is therefore configured to generate the sensory feedback substantially at the instant at which the pointing element 3 is detected at the threshold distance d, for example on the fictitious delimitation 13.
- the threshold distance d, for example here the distance of the fictitious delimitation 13 from the target boundary 6, can be predefined and fixed.
- the threshold distance d, for example here the distance of the fictitious delimitation 13 from the target boundary 6, can alternatively be parameterized and can therefore be variable.
- the processing unit 7 comprises one or more processing means for setting or defining the threshold distance d.
- the processing unit 7 can be configured to analyze the displacement of the pointing element 3 so as to anticipate the crossing of the target boundary 6.
- the processing unit 7 is therefore further configured to evaluate at least one displacement parameter of the pointing element 3 on the touch surface 2 over time. This includes evaluating the speed of movement of the pointing element 3, or a function of the speed such as its derivative, that is to say the acceleration.
- the processing unit 7 may comprise one or more processing means configured for:
- the processing unit 7 can analyze the location information of successive points transmitted by the tactile surface 2. Thus, with reference to FIGS. 2a and 2b, the processing unit 7 can determine that the pointing element 3 moves in the direction shown schematically by the arrow F1, which is here horizontal and to the right.
- the processing unit 7 can detect rectilinear movements, whether horizontal or vertical (arrow F2 in FIG. 3b), but also circular movements (arrow F3 in FIG. 4a) or movements at an angle to the touch surface 2 (arrow F4 in FIG. 4c), in one direction or the other, for example to the right (arrow F1 in FIGS. 2b and 4b, and arrow F4 in FIG. 4c) or to the left (arrow F5 in FIG. 4d).
- the vertical and horizontal terms, left and right, are here used with reference to the arrangement of the elements as shown in FIGS. 2a to 4d.
- the processing unit 7 may comprise at least one calculation means making it possible to deduce the speed of the pointing element 3 from the location information of successive points transmitted by the tactile surface 2, as a function of an internal clock of the man-machine interface, for example.
- the processing unit 7 may comprise at least one means for calculating the derivative of the speed, and therefore the acceleration, for example from the location information of points further apart from one another, transmitted by the tactile surface 2.
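The speed and acceleration estimates described in the two points above can be sketched as follows (an illustrative sketch only; the (x, y, t) sample format, combining the coordinates transmitted by the tactile surface with internal-clock timestamps, is an assumption):

```python
# Illustrative estimation of speed and acceleration from successive
# (x, y, t) samples: coordinates from the tactile surface, timestamps
# from the internal clock of the man-machine interface.

def speed(p0, p1):
    """Mean speed between two samples (x, y, t)."""
    (x0, y0, t0), (x1, y1, t1) = p0, p1
    dist = ((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5
    return dist / (t1 - t0)

def acceleration(p0, p1, p2):
    """Derivative of the speed, estimated over three samples further apart."""
    v01 = speed(p0, p1)          # speed over the first interval
    v12 = speed(p1, p2)          # speed over the second interval
    t_mid0 = (p0[2] + p1[2]) / 2  # midpoint times of each interval
    t_mid1 = (p1[2] + p2[2]) / 2
    return (v12 - v01) / (t_mid1 - t_mid0)
```

Using points further apart, as the text suggests, makes the finite-difference derivative less sensitive to the noise of individual touch samples.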
- the processing unit 7 may comprise at least one processing means, for example a software part, allowing, on the basis of at least one characteristic quantity of the evolution of the displacement of the pointing element 3 on the touch surface 2 over time, the threshold distance d with respect to the target boundary 6 to be adapted.
- the processing unit 7 may be configured to take into account at least one displacement parameter of the pointing element on the touch surface over time, evaluated for the determination of the threshold distance d.
- the processing unit 7 can anticipate that the finger 3 will cross the target boundary 6 between the two zones Z1 and Z2.
- the processing unit 7 can generate in advance, as soon as the finger 3 is detected at the threshold distance d, a control signal to the sensory feedback module 4, which generates the sensory feedback(s). When the finger 3 actually arrives at the boundary 6, the user perceives the sensory feedback(s).
- the two zones Z1, Z2 may be open zones, that is to say without any marked boundary between the two zones Z1, Z2; in this case it is the crossing of one of the boundaries marked around one of the zones, here Z2, which is anticipated.
- the processing unit 7 makes it possible to anticipate that a pointing element 3 will enter an area (FIG. 2a) but also that it will exit an area (FIG. 2b).
- the tactile surface 2 may comprise a succession of distinct zones B1, B2, B3, B4 simulating control buttons, each zone B1, B2, B3, B4 being delimited by a closed contour forming a boundary 6.
- the processing unit 7 is then configured to anticipate whether the pointing element 3 will enter or leave one or other of these areas B1, B2, B3, B4.
- If the pointing element 3 is detected in the zone B2 and the user moves his finger 3 vertically downwards, with reference to the arrangement of the elements in FIG. 3b, the processing unit 7 is configured to determine that the pointing element 3 will leave the zone B2 when it is located at a threshold distance d from the boundary 6 at the bottom of the zone B2, and to activate in advance the generation of a sensory feedback signifying the exit from B2. If the pointing element 3 continues its downward movement illustrated by the arrow F2, the processing unit 7 can anticipate whether and when the pointing element 3 will enter the next zone B3, namely when the pointing element 3 is located at a threshold distance d from the boundary 6 at the top of the zone B3, again with reference to the arrangement of the elements in FIG. 3b.
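The B2 to B3 walkthrough above can be sketched for a vertical column of button zones as follows (an illustrative sketch only; the zone geometry, coordinate convention with y increasing downwards, and function names are assumptions, not taken from the patent):

```python
# Sketch of the zone exit/entry anticipation for a vertical column of
# simulated button zones, as in the B2 -> B3 walkthrough above.
# y increases downwards; each zone is (name, y_top, y_bottom).

def anticipated_events(y, vy, zones, d):
    """Boundary events to pre-arm when the finger at ordinate y, moving
    downwards (vy > 0), comes within the threshold distance d of a boundary."""
    events = []
    if vy <= 0:
        return events            # only downward movement handled in this sketch
    for name, y_top, y_bottom in zones:
        if y_top <= y <= y_bottom and y_bottom - y <= d:
            events.append(("exit", name))    # about to leave this zone
        if y < y_top and y_top - y <= d:
            events.append(("enter", name))   # about to enter the next zone
    return events

zones = [("B2", 10, 20), ("B3", 22, 32)]
# Finger at y=19 moving down with d=2: the exit from B2 is pre-armed,
# while the top boundary of B3 (y=22) is still 3 away, so not yet pre-armed.
```

A different feedback can then be commanded for "exit" and "enter" events, matching the distinct press/release sensations described below.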
- a different sensory feedback may be provided depending on whether the pointing element 3 enters or leaves a closed zone.
- With two distinct haptic feedbacks, one can simulate the feeling of a user pressing and then releasing a key.
- the simulated keys or control buttons may be represented in a line or in a corridor.
- the virtual keys may be arranged in a circular manner as shown diagrammatically in FIG. 4a.
- the processing unit 7 can anticipate, from the displacement in a circular motion schematized by the arrow F3, that the pointing element 3 will cross a boundary 6 delimiting an area, here B2, when the pointing element 3 is for example located on the fictitious delimitation 13 along a radius of the circle defined by the virtual keys, so as to control the generation of the sensory feedback(s) so that it is (they are) presented to the user at the right moment.
- In this case, the threshold distance d is an angular distance.
- a list of items A, B, C can be displayed on the display screen 5, for example on the touch screen.
- the user moves the pointing element 3 on the touch surface 2, and when a contact pointing to an item, here A, in the list is detected on the touch-sensitive surface 2, a sub-list of items A1, A2, A3 may be displayed.
- Similarly, a sub-list of sub-items A31, A32, A33 can be displayed.
- the processing unit 7 can anticipate that the finger 3 will leave the zone corresponding to the item A3 when the finger 3 is located at a threshold distance d from the right boundary of item A3 with reference to the arrangement of the elements in FIG. 4b.
- the processing unit 7 can anticipate that the finger 3 will enter the zone corresponding to the sub-item A31 when the finger 3 is located at a threshold distance d from the left boundary 6 of the sub-item A31, with reference to the arrangement of the elements in FIG. 4b.
- the processing unit 7 can thus deliver the appropriate sensory feedback at the right time.
- It is also possible to provide, as illustrated in FIG. 4c, a cascading arrangement of windows A, B, C, D.
- the processing unit 7 can anticipate each window crossing and activate in advance the generation of the appropriate sensory feedback at the right moment.
- the processing unit 7 can likewise anticipate the crossing of successive lines 11 when the pointing element 3 is located at a threshold distance d from a given line 11, and activate in advance the generation of at least one appropriate sensory feedback associated with the crossing of each line 11.
- each line 11 forms a target boundary.
- A sensory feedback control method for a human-machine interface, in particular for a motor vehicle, is also described, the human-machine interface comprising a touch-sensitive surface 2. It is advantageously a control interface 1 as described above with reference to FIGS. 1a to 4d.
- the control method comprises a step E1 in which the displacement of a pointing element 3, such as a finger of a user of the control interface 1 or a stylus, on the tactile surface 2 in the direction of a target boundary 6 or 11 of the touch surface 2 is detected.
- This detection step E1 is for example carried out by the touch surface 2.
- the control method further comprises a step E2, called the anticipation step, in which, at the threshold distance d, at least one sensory feedback, in particular haptic, is generated.
- At least one parameter of displacement of the pointing element 3 on the touch surface 2 in time such as the acceleration of the pointing element 3 and / or the speed of displacement of the pointing element 3 can be used for the definition of the threshold distance d with respect to the target border 6 or 11, at this step E2.
- step E2 can comprise substeps E200 to E220.
- In a substep E200, the position of the pointing element 3 on the touch surface 2 is located, and the direction of movement of the pointing element 3 on the touch surface 2 is evaluated. The localization can be implemented by the touch surface 2 and the evaluation of the direction by the processing unit 7.
- At least one characteristic variable of the evolution of the displacement of the pointing element 3 on the tactile surface 2 over time, such as the speed and/or the acceleration, can also be evaluated.
- Evaluation of one or more quantities such as speed and / or acceleration can be implemented by the processing unit 7.
- If the threshold distance d is not predetermined in a fixed manner, it is possible, during a step E210, to define this threshold distance d, for example as the distance of a fictitious delimitation 13 upstream of the target boundary 6 according to the direction of displacement of the pointing element 3.
- the threshold distance d, for example here the location or position of the fictitious delimitation 13, always upstream of the target boundary 6 or 11, can be adapted as a function of the acceleration and/or the speed of displacement of the pointing element 3. For example, when the pointing element 3 moves very fast, the fictitious delimitation 13 can be moved further away from the target boundary 6 or 11; in other words, the threshold distance d can be increased with respect to the target boundary 6 or 11.
- This step E210 can be implemented by the processing unit 7.
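The adaptation of step E210 described above can be sketched as follows (an illustrative sketch only; the latency value and the clamping bounds on d are assumptions chosen for the example):

```python
# Sketch of step E210: the threshold distance d (the position of the
# fictitious line 13 upstream of the boundary) is adapted to the measured
# speed of the pointing element -- the faster it moves, the further
# upstream the fictitious line is placed.

T_U = 0.02                    # assumed command-to-vibration latency, in s
D_MIN, D_MAX = 0.001, 0.015   # assumed bounds on d, in m

def adaptive_threshold(measured_speed: float) -> float:
    """Threshold distance d proportional to the current sliding speed,
    clamped so the fictitious line stays within the touch surface."""
    d = measured_speed * T_U
    return max(D_MIN, min(D_MAX, d))
```

Replacing the fixed average speed Vmoy by the measured speed in this way keeps the perceived feedback aligned with the actual crossing instant even when the user slides unusually fast or slowly.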
- In step E220, the arrival of the pointing element 3 at this threshold distance d is detected, for example here when the pointing element 3 passes the fictitious delimitation 13, and the haptic feedback is generated.
- This step E220 can be implemented by the processing unit 7.
- A control method as described according to one or the other embodiment makes it possible to present to a user at least one sensory feedback at the right moment, that is to say that the user perceives this sensory feedback when he actually changes zone or crosses a target boundary 6 or 11.
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Transportation (AREA)
- Mechanical Engineering (AREA)
- Chemical & Material Sciences (AREA)
- Combustion & Propulsion (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Multimedia (AREA)
- Health & Medical Sciences (AREA)
- Audiology, Speech & Language Pathology (AREA)
- General Health & Medical Sciences (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
Description
Claims
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
FR1656139A FR3053489A1 (en) | 2016-06-29 | 2016-06-29 | CONTROL METHOD AND CONTROL INTERFACE FOR MOTOR VEHICLE |
PCT/EP2017/066074 WO2018002189A1 (en) | 2016-06-29 | 2017-06-28 | Control method and control interface for a motor vehicle |
Publications (1)
Publication Number | Publication Date |
---|---|
EP3479201A1 true EP3479201A1 (en) | 2019-05-08 |
Family
ID=57348799
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP17732477.9A Ceased EP3479201A1 (en) | 2016-06-29 | 2017-06-28 | Control method and control interface for a motor vehicle |
Country Status (3)
Country | Link |
---|---|
EP (1) | EP3479201A1 (en) |
FR (1) | FR3053489A1 (en) |
WO (1) | WO2018002189A1 (en) |
Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP2778856A1 (en) * | 2013-03-14 | 2014-09-17 | Immersion Corporation | Systems and methods for haptic and gesture-driven paper simulation |
Family Cites Families (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6219034B1 (en) * | 1998-02-23 | 2001-04-17 | Kristofer E. Elbing | Tactile computer interface |
US8332755B2 (en) * | 2009-05-27 | 2012-12-11 | Microsoft Corporation | Force-feedback within telepresence |
US20120249461A1 (en) * | 2011-04-01 | 2012-10-04 | Analog Devices, Inc. | Dedicated user interface controller for feedback responses |
FR3015383B1 (en) * | 2013-12-19 | 2017-01-13 | Dav | CONTROL DEVICE FOR MOTOR VEHICLE AND CONTROL METHOD |
FR3026868B1 (en) * | 2014-10-02 | 2019-08-16 | Dav | DEVICE AND CONTROL METHOD FOR MOTOR VEHICLE |
-
2016
- 2016-06-29 FR FR1656139A patent/FR3053489A1/en active Pending
-
2017
- 2017-06-28 EP EP17732477.9A patent/EP3479201A1/en not_active Ceased
- 2017-06-28 WO PCT/EP2017/066074 patent/WO2018002189A1/en unknown
Patent Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP2778856A1 (en) * | 2013-03-14 | 2014-09-17 | Immersion Corporation | Systems and methods for haptic and gesture-driven paper simulation |
Also Published As
Publication number | Publication date |
---|---|
WO2018002189A1 (en) | 2018-01-04 |
FR3053489A1 (en) | 2018-01-05 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
EP1450247B1 (en) | Human-computer interface with force feedback for pressure pad | |
FR3026866B1 (en) | DEVICE AND CONTROL METHOD FOR MOTOR VEHICLE | |
BE1020021A3 (en) | MULTIMODE TOUCH SCREEN DEVICE. | |
FR3026868B1 (en) | DEVICE AND CONTROL METHOD FOR MOTOR VEHICLE | |
EP3221781B1 (en) | Graphical interface and method for managing said graphical interface during the touch-selection of a displayed element | |
WO2017108896A1 (en) | Control interface for a motor vehicle | |
WO2022090091A1 (en) | Three-dimensional touch interface providing haptic feedback | |
WO2018002186A1 (en) | Control method and control interface for a motor vehicle | |
WO2016051115A1 (en) | Control device and method for a motor vehicle | |
EP3918446A1 (en) | Method for generating a haptic feedback for an interface, and associated interface | |
WO2015092165A1 (en) | Control device for a motor vehicle and control method | |
FR3030071A1 (en) | DEVICE AND CONTROL METHOD FOR MOTOR VEHICLE | |
EP3479201A1 (en) | Control method and control interface for a motor vehicle | |
EP3394713A1 (en) | Control interface for automotive vehicle | |
FR3058938B1 (en) | INTERFACE FOR MOTOR VEHICLE AND INTERFACING METHOD | |
WO2018219832A1 (en) | Method for generating a sensory feedback for an interface and associated interface | |
EP4232883A1 (en) | Haptic-feedback touch device with spatialized textures | |
EP3084584A1 (en) | Man/machine interface for controlling at least two functions of a motor vehicle | |
WO2017211835A1 (en) | Control module and method for motor vehicle | |
FR2971864A1 (en) | Virtual reality equipment i.e. immersive virtual reality environment equipment, for virtual reality interaction with human-machine interface car, has contact device with touch pad positioned at point where interface is intended to appear | |
FR2996652A1 (en) | Tactile detector for designating target in field of vision of driver of car, has tactile surface forming restricted detection zones, where relief is arranged on periphery of restricted detection zones to form guide for finger | |
FR2979722A1 (en) | Portable electronic device i.e. mobile phone, has activation unit activating processing rule application unit upon detection of movement of phone by motion sensor, where activation unit is inhibited in absence of selection of graphic object |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: UNKNOWN |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE |
|
PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase |
Free format text: ORIGINAL CODE: 0009012 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE |
|
17P | Request for examination filed |
Effective date: 20190205 |
|
AK | Designated contracting states |
Kind code of ref document: A1 Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
|
AX | Request for extension of the european patent |
Extension state: BA ME |
|
DAV | Request for validation of the european patent (deleted) | ||
DAX | Request for extension of the european patent (deleted) | ||
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: EXAMINATION IS IN PROGRESS |
|
17Q | First examination report despatched |
Effective date: 20200918 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: EXAMINATION IS IN PROGRESS |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: EXAMINATION IS IN PROGRESS |
|
REG | Reference to a national code |
Ref country code: DE Ref legal event code: R003 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: THE APPLICATION HAS BEEN REFUSED |
|
18R | Application refused |
Effective date: 20230219 |
|
P01 | Opt-out of the competence of the unified patent court (upc) registered |
Effective date: 20230528 |