EP3908911A1 - Interactive system - Google Patents
Interactive system
- Publication number
- EP3908911A1 (application EP20705404.0A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- occupant
- comfort index
- emotional
- interactive system
- score
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Links
- 230000002452 interceptive effect Effects 0.000 title claims description 21
- 230000002996 emotional effect Effects 0.000 claims abstract description 52
- 238000012545 processing Methods 0.000 claims abstract description 7
- 238000005259 measurement Methods 0.000 claims description 17
- 230000037007 arousal Effects 0.000 claims description 10
- 238000000034 method Methods 0.000 claims description 9
- 230000006870 function Effects 0.000 claims description 6
- 230000008451 emotion Effects 0.000 claims description 4
- 239000002304 perfume Substances 0.000 claims description 4
- 238000013210 evaluation model Methods 0.000 claims description 3
- 230000003993 interaction Effects 0.000 claims description 3
- 239000000341 volatile oil Substances 0.000 claims description 3
- 239000006199 nebulizer Substances 0.000 claims description 2
- 230000000875 corresponding effect Effects 0.000 description 11
- 238000013507 mapping Methods 0.000 description 6
- 230000008569 process Effects 0.000 description 4
- 230000009471 action Effects 0.000 description 3
- 238000004458 analytical method Methods 0.000 description 2
- 238000010276 construction Methods 0.000 description 2
- 230000002596 correlated effect Effects 0.000 description 2
- 230000000694 effects Effects 0.000 description 2
- 238000011156 evaluation Methods 0.000 description 2
- 239000011521 glass Substances 0.000 description 2
- 238000002663 nebulization Methods 0.000 description 2
- 239000003016 pheromone Substances 0.000 description 2
- 235000019640 taste Nutrition 0.000 description 2
- 241000251468 Actinopterygii Species 0.000 description 1
- 241000894006 Bacteria Species 0.000 description 1
- 230000003213 activating effect Effects 0.000 description 1
- 239000013566 allergen Substances 0.000 description 1
- 230000003321 amplification Effects 0.000 description 1
- 238000013473 artificial intelligence Methods 0.000 description 1
- 210000004556 brain Anatomy 0.000 description 1
- 238000004364 calculation method Methods 0.000 description 1
- 230000008859 change Effects 0.000 description 1
- 238000012512 characterization method Methods 0.000 description 1
- 239000003086 colorant Substances 0.000 description 1
- 239000004020 conductor Substances 0.000 description 1
- 235000009508 confectionery Nutrition 0.000 description 1
- 238000013135 deep learning Methods 0.000 description 1
- 238000010586 diagram Methods 0.000 description 1
- 238000009792 diffusion process Methods 0.000 description 1
- 230000000763 evoking effect Effects 0.000 description 1
- 230000001815 facial effect Effects 0.000 description 1
- 239000000796 flavoring agent Substances 0.000 description 1
- 235000019634 flavors Nutrition 0.000 description 1
- 150000002500 ions Chemical class 0.000 description 1
- 239000011859 microparticle Substances 0.000 description 1
- 238000003199 nucleic acid amplification method Methods 0.000 description 1
- 150000002894 organic compounds Chemical class 0.000 description 1
- 238000005067 remediation Methods 0.000 description 1
- 230000000241 respiratory effect Effects 0.000 description 1
- 230000004044 response Effects 0.000 description 1
- 230000033764 rhythmic process Effects 0.000 description 1
- 238000005070 sampling Methods 0.000 description 1
- 230000007704 transition Effects 0.000 description 1
- 235000019583 umami taste Nutrition 0.000 description 1
- XLYOFNOQVPJJNP-UHFFFAOYSA-N water Substances O XLYOFNOQVPJJNP-UHFFFAOYSA-N 0.000 description 1
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K35/00—Arrangement of adaptations of instruments
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60H—ARRANGEMENTS OF HEATING, COOLING, VENTILATING OR OTHER AIR-TREATING DEVICES SPECIALLY ADAPTED FOR PASSENGER OR GOODS SPACES OF VEHICLES
- B60H3/00—Other air-treating devices
- B60H3/0007—Adding substances other than water to the air, e.g. perfume, oxygen
-
- B60K35/25—
-
- B60K35/26—
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60Q—ARRANGEMENT OF SIGNALLING OR LIGHTING DEVICES, THE MOUNTING OR SUPPORTING THEREOF OR CIRCUITS THEREFOR, FOR VEHICLES IN GENERAL
- B60Q3/00—Arrangement of lighting devices for vehicle interiors; Lighting devices specially adapted for vehicle interiors
- B60Q3/80—Circuits; Control arrangements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/015—Input arrangements based on nervous system activity detection, e.g. brain waves [EEG] detection, electromyograms [EMG] detection, electrodermal response detection
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/016—Input arrangements with force or tactile feedback as computer generated output to the user
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2540/00—Input parameters relating to occupants
- B60W2540/22—Psychological state; Stress level or workload
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/01—Indexing scheme relating to G06F3/01
- G06F2203/011—Emotion or mood input determined on the basis of sensed human body parameters such as pulse, heart rate or beat, temperature of skin, facial expressions, iris, voice pitch, brain activity patterns
Definitions
- the invention relates to an interactive system with an occupant of a motor vehicle.
- the invention also relates to a method of interaction with an occupant of a motor vehicle.
- the transition to the autonomous car presents more and more challenges in terms of creating an empathetic vehicle that will make driving and / or travel more pleasant.
- the remaining progress can be grouped into three areas, including safety, comfort and entertainment.
- the system used includes, if possible, a camera, for example of the GoPro® type, or even a webcam. Such a device makes it possible to perform facial recognition of the occupant and thus adapt the interactive system.
- a known model consists in obtaining, from the various parameters, a map of the emotional state of the passenger.
- a map of the emotional state of the passenger can, for example, be represented by a point in a two-dimensional space formed by an abscissa axis corresponding to the valence (the intrinsically pleasant or unpleasant quality of a stimulus or situation) and an ordinate axis corresponding to the arousal (the strength of the emotional stimulus).
- the invention aims to improve known systems by proposing an interactive system making it possible to assess the emotional state of an occupant of a vehicle, using an emotional index that makes it easy to determine the stimulus or stimuli to be applied to said occupant and thus have an impact on the safety and/or comfort and/or entertainment of the occupants of the vehicle.
- the subject of the invention is an interactive system with an occupant of a motor vehicle comprising:
- Interactive system with an occupant of a motor vehicle comprising:
- a measuring device comprising at least one sensor arranged to acquire at least one parameter related to the occupant of said vehicle,
- This invention is remarkable in that an emotional comfort index is calculated from the representative data and that at least one actuator is configured to activate at least one multi-sensory stimulus to interact with the occupant, said stimulus making it possible to modify the emotional state of said occupant.
- the emotional comfort index can be calculated using a formula, CI being the emotional comfort index and CISn being a score of the comfort index determined at a time n.
- the score of the comfort index is a function of the valence and the arousal.
- the score of the comfort index can be calculated from a formula, vn being the valence and an the arousal, CISn being the comfort index score determined at a time n.
- n is between 10 milliseconds and 300 milliseconds.
- the values of the comfort index score are collected for a period of between 0.1 seconds and 30 seconds, particularly 0.1 seconds and 5 seconds.
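The sampling scheme above (CISn scores taken every 10 to 300 ms, collected over a window of 0.1 to 30 s, then aggregated into the index CI) can be pictured with a minimal Python sketch. The aggregation by simple mean is an assumption for illustration only; the patent's own formula is not reproduced in this excerpt.

```python
from statistics import mean

def comfort_index(cis_scores):
    """Aggregate per-sample comfort index scores CIS_n, collected over a
    time window, into a single emotional comfort index CI.

    Assumption: a plain mean over the window; the patented formula may
    weight or filter the samples differently."""
    if not cis_scores:
        raise ValueError("need at least one CIS_n sample")
    return mean(cis_scores)

# e.g. scores sampled every 100 ms over a 0.5 s window
print(comfort_index([0.2, 0.3, 0.25, 0.35, 0.3]))
```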
- the sensors are composed of at least one ultra-wideband radar and an infrared camera.
- the ultra wideband radar has a frequency between 10GHz and 1THz, in particular between 50GHz and 160GHz.
- the infrared camera detects wavelengths between 0.7 µm and 100 µm, in particular between 25 µm and 100 µm.
- the at least one multi-sensory stimulus is a perfume and / or essential oils and / or a nebulizer and / or lighting and / or sound and / or music and / or vibration and / or a massage and / or a flow of air and / or light.
- the interactive system has a learning loop making it possible to improve the processing of the emotion evaluation model.
- the learning loop makes it possible to create a database.
- the interactive system comprises at least one display means, particularly a user interface.
- the at least one display means can be a portable device, in particular a watch and / or glasses and / or a bracelet and / or a belt and / or a shirt.
- the at least one display means can be a text and / or a color and / or a sound and / or a vibration.
- the representative data is pretreated before activating the at least one multi-sensory stimulus.
- the representative data placed in 3D space will be correlated to one or more maps corresponding to different stimuli.
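One hedged way to picture the correlation of a representative data point in 3D space with stimulus maps is a nearest-centroid lookup: each stimulus is represented by the point in the emotional space it tends to evoke, and the closest map is selected. The stimulus names and centroid coordinates below are invented for illustration and are not taken from the patent.

```python
# Hypothetical stimulus maps: (valence, arousal, dominance) centroids.
STIMULUS_MAPS = {
    "relaxing music":      (0.6, -0.4, 0.0),
    "energising lighting": (0.5,  0.7, 0.2),
    "perfume diffusion":   (0.8,  0.0, 0.1),
}

def correlate(point, maps=STIMULUS_MAPS):
    """Return the stimulus whose map centroid is closest (squared
    Euclidean distance) to the occupant's representative data point."""
    dist = lambda c: sum((p - q) ** 2 for p, q in zip(point, c))
    return min(maps, key=lambda name: dist(maps[name]))

print(correlate((0.7, 0.1, 0.0)))
```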
- the pretreatment consists in taking into account additional data, in particular the culture and / or education of the occupant and / or the life and personal experience of said occupant, and / or age and / or gender and / or type of clothing.
- the additional data can be determined by means of an autonomous measurement system allowing the measurement of air quality and / or pollution, and / or microparticles and / or allergens and / or the sound environment and / or the light environment and / or the temperature and / or the humidity.
- the evaluation model is created using artificial intelligence.
- the multi-sensory stimulus allows a remediation of the occupant's emotional state.
- the at least one sensor can be chosen from: a camera, in particular far infrared and / or near infrared and / or in the visible, a microphone, a portable device, a sensor installed in the vehicle in particular a conductive element placed in a seat and / or in a steering wheel and / or in an armrest.
- the interactive system can include at least one far infrared camera, several vital sign sensors and a microphone.
- the invention also relates to a method of interaction with an occupant of a motor vehicle comprising the steps consisting in:
- Figure 1 is a schematic diagram of at least part of the process according to the invention.
- Figure 2 is a schematic representation of a space
- Figure 3 is a schematic representation of the step of analyzing the process of Figure 1,
- Figure 4 is a schematic representation of the step of analyzing the process of Figure 1,
- Figure 5 is a schematic representation of the multi-sensory mapping used in the process of Figure 1,
- Figure 6 is a schematic representation of the construction of the cartography of Figure 5.
- the interactive system proposed in the invention uses a method having five steps. These five steps, shown diagrammatically in [FIG. 1], consist of: a) a measurement step M carried out by a measurement device, during which one or more parameters relating to the occupant(s) are collected, b) an analysis step A to map the emotional state of the occupant, c) a calculation step C during which an emotional comfort index CI is determined, d) an action step Ac during which at least one multi-sensory stimulus is applied so as to modify the emotional state of the occupant(s). The stimulus or stimuli are chosen as a function of the value of the emotional comfort index CI calculated in the previous step, e) an optional information step Inf during which the occupant is informed of his emotional state and of the stimuli applied.
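The five steps could be sketched as a loop in hypothetical Python. Every name, threshold, and heuristic below is an assumption made for illustration; the patent does not specify how parameters map to valence and arousal or how a stimulus is picked.

```python
from dataclasses import dataclass

@dataclass
class Reading:
    heart_rate: float   # e.g. from a vital-sign sensor (step M)
    voice_pitch: float  # e.g. from a microphone (step M)

def analyse(r: Reading) -> tuple[float, float]:
    """Step A: map raw parameters to a (valence, arousal) point.
    These heuristics are placeholders, not the patented model."""
    valence = 1.0 - abs(r.heart_rate - 70) / 70  # calmer pulse -> more pleasant
    arousal = r.voice_pitch / 300                # higher pitch -> stronger affect
    return valence, arousal

def choose_stimulus(ci: float) -> str:
    """Step Ac: pick a multi-sensory stimulus as a function of CI."""
    return "ambient lighting" if ci >= 0.5 else "perfume nebulisation"

reading = Reading(heart_rate=84, voice_pitch=180)
v, a = analyse(reading)
ci = (v + a) / 2  # step C: placeholder aggregation, not the patented formula
print(choose_stimulus(ci))  # step Inf would also report ci to the occupant
```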
- the measurement device will acquire one or more parameters describing the status and therefore the emotional state of the occupant (what is described for an occupant can of course be applied to several occupants of the vehicle simultaneously). Parameters describing the environment in which the occupant is located can also be collected in order to deduce their possible effect on the emotional state of the occupant.
- the device described in the preceding paragraphs is composed of one or more sensors. These sensors are preferably sensors on board the vehicle, such as one or more cameras, vital sign sensors, one or more microphones or even contact sensors.
- the measurements are made by cameras, in particular infrared cameras which take images in the infrared range. These cameras are directed towards the expected positions of the various occupants of the vehicle: driver's seat, passenger seat, rear seat, etc. In particular, one or more very wide angle cameras (for example of the “fisheye” type) can cover several positions simultaneously. Infrared cameras preferentially detect wavelengths between 0.7 µm and 100 µm, preferably between 25 µm and 100 µm.
- NIR near infrared cameras
- FIR far infrared cameras
- the images from near infrared cameras can for example be used to delimit the position, dimensions and movements of different parts of the body of an occupant of the vehicle.
- the images from far infrared cameras can for example be used to identify the body parts of the occupant exchanging the most heat with the passenger compartment, for example the head and hands, which are not covered with clothing and will therefore appear warmer.
- the data from the FIR and NIR/RGB cameras can be fused, so as to identify a fixed location on the more precise RGB image and follow it on the FIR camera.
- the measurement device also makes it possible to determine the environment of the passenger or passengers, this being able to influence his emotional state.
- data such as temperature, light intensity, noise, vehicle speed, etc. will be able to be collected.
- the measuring device includes at least one microphone.
- the measuring device can also include biosensors which can detect parameters such as organic compounds, ions, bacteria, etc.
- the measuring device can also be composed of vital sign sensors. These can for example be in the form of contactless sensors such as for example a radar, a camera, or even portable elements (watch, shirt, bracelet, etc.).
- Vital sign sensors in contact with the passenger can also be used. They are, for example, in the form of elements
- a preferred embodiment uses an ultra wideband short range radar.
- the radar frequency can for example be between 10 GHz and 1 THz, preferably between 50 GHz and 160 GHz.
- the next analysis step A consists in evaluating and interpreting the parameters obtained during the measurement step M.
- these parameters will be translated into representative data easily represented in a two-dimensional space.
- This 2D space is shown in [Figure 3]. It is formed by a first axis corresponding to the valence vn and a second axis corresponding to the arousal an.
- the valence characterizes the level of pleasure or annoyance associated with an emotion whereas the arousal can be defined as being the intensity of the affects and the response generated by this emotion.
- the individual's control status can also be assessed from measurements made on the vehicle. We then speak of dominance. The space used is then a three-dimensional space and no longer two-dimensional.
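A point in this three-dimensional valence/arousal/dominance space could be modelled as a small data structure. The [-1, 1] axis ranges below are a common convention in affect modelling, assumed here rather than taken from the patent.

```python
from dataclasses import dataclass

@dataclass
class EmotionalState:
    valence: float    # pleasant (+) vs unpleasant (-) quality of the emotion
    arousal: float    # intensity of the emotional stimulus
    dominance: float  # occupant's sense of control, assessed from the vehicle

    def clamp(self):
        """Keep each coordinate inside the assumed [-1, 1] range."""
        for f in ("valence", "arousal", "dominance"):
            setattr(self, f, max(-1.0, min(1.0, getattr(self, f))))
        return self

state = EmotionalState(valence=0.4, arousal=1.7, dominance=-0.2).clamp()
print(state.arousal)  # out-of-range arousal clipped to 1.0
```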
- the mapping is then modified taking into account the impact of the culture and/or education of the occupant.
- This step can for example be carried out by deep learning, making it possible to obtain a user profile for each occupant.
- the last step in the construction of the mapping concerns the impact of the personal life and/or the experience of the occupant. This step is preferably done by statistical learning (machine learning), thus making it possible to adapt the mapping in real time.
- the computation step C of the emotional comfort index CI is carried out by means of several measurements of the score of the comfort index CISn, which can be deduced from the representative data described above and therefore from the values of the valence vn and of the arousal an at a given time.
- the CISn score is calculated from a function whose variables are valence vn and arousal an, respectively representing the positive or negative experience of the occupant, as well as the importance of said experience.
- the CISn comfort index score can be calculated from a formula in which vn is the valence and an is the arousal.
- the score of the CISn comfort index can also be calculated from a more complex formula depending on the valence vn and the arousal an.
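Since the excerpt elides both formulas, here is one plausible stand-in for CISn(vn, an), chosen only to illustrate the stated roles of the two variables: the sign of the valence encodes the positive or negative experience, and the distance from the origin its importance. This is an assumption, not the patented formula.

```python
import math

def cis(valence: float, arousal: float) -> float:
    """Illustrative comfort-index score CIS_n: signed magnitude of the
    (valence, arousal) point.  Sign follows the valence (pleasant vs
    unpleasant); magnitude grows with the intensity of the experience."""
    magnitude = math.hypot(valence, arousal)
    return math.copysign(magnitude, valence)

print(round(cis(0.6, 0.8), 3))   # pleasant and intense
print(round(cis(-0.3, 0.4), 3))  # unpleasant and moderate
```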
- the Ac action step shown in [Figure 1] corresponds to the application of one or more stimuli.
- the choice of stimulus depends on the desired effect.
- different types of actions can be selected.
- the multisensory stimuli used can, for example, be: a) For sight: use of interior lighting with different colors
- the interactive system may have one or more display means. This may be, for example, in the form of a portable device, in particular a watch and / or glasses and / or a bracelet and / or a belt and / or a shirt and / or a mobile telephone.
- the display means can also be in the form of text and / or a color and / or a sound and / or a vibration.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
FR1900118A FR3091603B1 (en) | 2019-01-07 | 2019-01-07 | Interactive system |
PCT/FR2020/050012 WO2020144424A1 (en) | 2019-01-07 | 2020-01-06 | Interactive system |
Publications (1)
Publication Number | Publication Date |
---|---|
EP3908911A1 true EP3908911A1 (en) | 2021-11-17 |
Family
ID=67383852
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP20705404.0A Pending EP3908911A1 (en) | 2019-01-07 | 2020-01-06 | Interactive system |
Country Status (4)
Country | Link |
---|---|
EP (1) | EP3908911A1 (en) |
CN (1) | CN113260955A (en) |
FR (1) | FR3091603B1 (en) |
WO (1) | WO2020144424A1 (en) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113212337A (en) * | 2021-06-01 | 2021-08-06 | 北京现代汽车有限公司 | Cabin control method and system and vehicle |
Family Cites Families (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR101006191B1 (en) * | 2002-08-06 | 2011-01-07 | 윤재민 | Emotion and Motion Extracting Method of Virtual Human |
DE102004006910A1 (en) * | 2004-02-12 | 2005-08-25 | Bayerische Motoren Werke Ag | Vehicle control procedure senses driver and passenger health using contactless biosensors and uses vehicle environment control equipment to improve situation |
JP2008068664A (en) * | 2006-09-12 | 2008-03-27 | Fujitsu Ten Ltd | Vehicle control apparatus and vehicle control method |
US9507413B2 (en) * | 2010-12-03 | 2016-11-29 | Continental Automotive Systems, Inc. | Tailoring vehicle human machine interface |
US9493130B2 (en) * | 2011-04-22 | 2016-11-15 | Angel A. Penilla | Methods and systems for communicating content to connected vehicle users based detected tone/mood in voice input |
WO2014144690A2 (en) * | 2013-03-15 | 2014-09-18 | Edwards David A | Systems, methods and articles to provide olfactory sensations |
KR101508059B1 (en) * | 2013-06-26 | 2015-04-07 | 숭실대학교산학협력단 | Apparatus and Method for pleasant-unpleasant quotient of word |
DE102015200775A1 (en) * | 2015-01-20 | 2016-07-21 | Bayerische Motoren Werke Aktiengesellschaft | Independent assessment of an emotional state and a cognitive burden |
CN106652378A (en) * | 2015-11-02 | 2017-05-10 | 比亚迪股份有限公司 | Driving reminding method and system for vehicle, server and vehicle |
WO2018033819A1 (en) * | 2016-08-16 | 2018-02-22 | Resolve Digital Health Inc. | Digital health ecosystem |
US10150351B2 (en) * | 2017-02-08 | 2018-12-11 | Lp-Research Inc. | Machine learning for olfactory mood alteration |
-
2019
- 2019-01-07 FR FR1900118A patent/FR3091603B1/en active Active
-
2020
- 2020-01-06 CN CN202080007703.0A patent/CN113260955A/en active Pending
- 2020-01-06 EP EP20705404.0A patent/EP3908911A1/en active Pending
- 2020-01-06 WO PCT/FR2020/050012 patent/WO2020144424A1/en unknown
Also Published As
Publication number | Publication date |
---|---|
FR3091603A1 (en) | 2020-07-10 |
US20220063408A1 (en) | 2022-03-03 |
WO2020144424A1 (en) | 2020-07-16 |
FR3091603B1 (en) | 2022-01-21 |
CN113260955A (en) | 2021-08-13 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
EP3883472A1 (en) | System interacting with an occupant of a motor vehicle | |
US11479258B1 (en) | Smart ring system for monitoring UVB exposure levels and using machine learning technique to predict high risk driving behavior | |
CN110268456A (en) | Driver's monitoring arrangement, driver monitor method, learning device and learning method | |
KR102045569B1 (en) | Appratus for controlling integrated supervisory of pilots status and method for guiding task performance ability of pilots using the same | |
EP3424030A1 (en) | Personalized device and method for monitoring a motor vehicle driver | |
EP3882097A1 (en) | Techniques for separating driving emotion from media induced emotion in a driver monitoring system | |
EP1788536B1 (en) | Method for the evaluation of the state of vigilance of a vehicle conductor | |
FR3038770A1 (en) | SYSTEM FOR MONITORING THE STATUS OF VIGILANCE OF AN OPERATOR | |
WO2013092214A2 (en) | Method for rating the noise of the brakes of a motor vehicle | |
EP3908911A1 (en) | Interactive system | |
FR3028741A1 (en) | DEVICE FOR MEASURING THE HEART RATE OF THE DRIVER OF A VEHICLE | |
FR3099610A1 (en) | EMOTION EVALUATION SYSTEM | |
JP7183782B2 (en) | Emotion estimation device, environment providing system, vehicle, emotion estimation method, and information processing program | |
US20230129746A1 (en) | Cognitive load predictor and decision aid | |
US11970059B2 (en) | Interactive system | |
EP3492015A1 (en) | Device and method for detecting emotion | |
CN115428093A (en) | Techniques for providing user-adapted services to users | |
FR3104283A1 (en) | Interactive system and associated interaction method | |
Rundo et al. | Advanced temporal dilated convolutional neural network for a robust car driver identification | |
WO2022017737A1 (en) | Interactive system and associated interaction method | |
EP4069543A1 (en) | Interactive system and associated interaction method | |
FR3112222A1 (en) | Interactive system, in particular for a motor vehicle | |
FR3116472A1 (en) | Thermal management method and corresponding thermal management system | |
FR3115976A1 (en) | Information processing device and method for determining biological state |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: UNKNOWN |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE |
|
PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase |
Free format text: ORIGINAL CODE: 0009012 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE |
|
17P | Request for examination filed |
Effective date: 20210616 |
|
AK | Designated contracting states |
Kind code of ref document: A1 Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
|
DAV | Request for validation of the european patent (deleted) | ||
DAX | Request for extension of the european patent (deleted) | ||
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: EXAMINATION IS IN PROGRESS |
|
17Q | First examination report despatched |
Effective date: 20230208 |
|
P01 | Opt-out of the competence of the unified patent court (upc) registered |
Effective date: 20230528 |