EP3590071A1 - Device for determining the attentiveness of a driver of a vehicle, on-board system comprising such a device, and associated method - Google Patents
- Publication number
- EP3590071A1 (application EP18708630.1A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- image
- driver
- hand
- detection zone
- vehicle
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Withdrawn
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/59—Context or environment of the image inside of a vehicle, e.g. relating to seat occupancy, driver state or inner lighting conditions
- G06V20/597—Recognising the driver's state or behaviour, e.g. attention or drowsiness
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W40/00—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
- B60W40/08—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to drivers or passengers
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W50/00—Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
- B60W50/08—Interaction between the driver and the control system
- B60W50/082—Selecting or switching between different modes of propelling
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W60/00—Drive control systems specially adapted for autonomous road vehicles
- B60W60/005—Handover processes
- B60W60/0053—Handover processes from vehicle to occupant
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W60/00—Drive control systems specially adapted for autonomous road vehicles
- B60W60/005—Handover processes
- B60W60/0059—Estimation of the risk associated with autonomous or manual driving, e.g. situation too complex, sensor failure or driver incapacity
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/24—Classification techniques
- G06F18/241—Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
- G06F18/2413—Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches based on distances to training or reference patterns
- G06F18/24133—Distances to prototypes
- G06F18/24143—Distances to neighbourhood prototypes, e.g. restricted Coulomb energy networks [RCEN]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/24—Classification techniques
- G06F18/243—Classification techniques relating to the number of classes
- G06F18/2433—Single-class perspective, e.g. one-against-all classification; Novelty detection; Outlier detection
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/70—Arrangements for image or video recognition or understanding using pattern recognition or machine learning
- G06V10/764—Arrangements for image or video recognition or understanding using pattern recognition or machine learning using classification, e.g. of video objects
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/70—Arrangements for image or video recognition or understanding using pattern recognition or machine learning
- G06V10/82—Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/107—Static hand or arm
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2420/00—Indexing codes relating to the type of sensors based on the principle of their operation
- B60W2420/40—Photo or light sensitive means, e.g. infrared sensors
- B60W2420/403—Image sensing, e.g. optical camera
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2540/00—Input parameters relating to occupants
- B60W2540/223—Posture, e.g. hand, foot, or seat position, turned or inclined
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2540/00—Input parameters relating to occupants
- B60W2540/229—Attention level, e.g. attentive to driving, reading or sleeping
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/204—Image signal generators using stereoscopic image cameras
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/271—Image signal generators wherein the generated image signals comprise depth maps or disparity maps
Definitions
- Device for determining the attention state of a vehicle driver, on-board system comprising such a device, and associated method
- the present invention relates to a device for determining the state of attention of a vehicle driver.
- a monitoring device is known that is adapted to determine a state of alertness of the driver, and in particular to prevent drowsy driving. Depending on the state of alertness determined, the monitoring device alerts the driver to prevent him from getting into a dangerous situation.
- Such a monitoring device deduces the state of vigilance of the driver according to behavioral parameters associated with the driver and/or operating parameters of the vehicle.
- the behavioral parameters are, for example, the eyelid closure rate or the gaze direction
- the operating parameters of the vehicle, for example the parameters relating to a steering wheel rotation angle, to a vehicle speed or to the action of the driver on certain controls, are obtained from physical sensors of the vehicle.
- the present invention proposes an on-board device for determining a state of attention of a vehicle driver.
- a device for determining a state of attention of a vehicle driver comprising:
- an image capture unit embedded in said vehicle, said image capture unit being adapted to capture at least one image of a detection zone situated in said vehicle, and
- an image processing unit adapted to receive said captured image and programmed to determine the state of attention of the driver, as a function of the detection of the presence of a distraction object in one of the driver's hands located in the detection zone.
- a distraction object is an object other than a driving member of the vehicle. It is an object capable of distracting the driver from driving and of occupying his hand, so that said distraction object prevents the driver from interacting safely with the driving members of the vehicle.
- the device makes it possible to determine the state of attention of the driver as a function of a state of occupation of at least one of the driver's hands, among a "busy" state in which the hand holds a distraction object, and a "free" state in which it is free to interact with the driving members of the vehicle because it holds no distraction object.
- when at least one hand is in a busy state, the driver is in a lower state of attention than when both hands are in a free state, because the driver may be hindered from intervening quickly on the driving members of the vehicle if he holds a distraction object in his hand.
- the image capture unit comprises at least one sensor adapted to capture a three-dimensional image of the detection zone, said three-dimensional image including information relating to the distance, with respect to said sensor, of said distraction object and/or said hand located in the detection zone;
- the image capture unit comprises at least one sensor adapted to capture at least one image of a first nature comprising a first type of information relating to said distraction object and/or to said hand located in the detection zone, and a sensor adapted to capture at least one image of a second nature distinct from the first nature, comprising a second type of information relating to said distraction object and/or said hand located in the detection zone;
- the image of first nature is chosen from: a three-dimensional image comprising information relating to the distance, with respect to said sensor, of at least said distraction object and/or said hand located in the detection zone, a two-dimensional image comprising information relating to the luminance of at least said distraction object and/or said hand located in the detection zone, and a thermal image including information relating to the temperature of at least said distraction object and/or said hand located in the detection zone (the image of second nature can also be chosen from the aforementioned image types, while being of a nature distinct from the first nature as already indicated);
- the image capture unit further comprises at least one sensor adapted to capture an image of a third nature distinct from said first and second natures, comprising a third type of information relating to said object and/or to said hand located in the detection zone
- the image of the third nature is chosen from: a three-dimensional image comprising information relating to the distance, with respect to said sensor, of at least said distraction object and/or said hand located in the detection zone, a two-dimensional image comprising information relating to the luminance of at least said distraction object and/or said hand located in the detection zone, and a thermal image including information relating to the temperature of at least said distraction object and/or said hand located in the detection zone;
- the image processing unit is programmed to implement the following steps:
- the image processing unit is programmed to implement step b1) according to the following substeps:
- the image processing unit is programmed to implement step b2) according to the following substeps:
- the image processing unit implements step b1) from the image of first nature, and step b2) from the image of a nature distinct from the first nature;
- the image processing unit implements the steps b1) to b2) from each image received from the image capture unit, and implements an additional step according to which it determines a confidence index associated with the detection of the presence of the distraction object in the driver's hand, from which it deduces the state of attention of the driver;
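The confidence-index step above could be sketched as follows. This is an illustrative assumption, not the patent's disclosed rule: the function names, the averaging rule and the 0.5 threshold are all hypothetical.

```python
def fuse_confidence(scores):
    # Average the per-image detection scores (each in [0, 1]) for
    # "distraction object present in the hand" into one confidence index.
    if not scores:
        raise ValueError("at least one detection score is required")
    return sum(scores) / len(scores)

def attention_state(scores, threshold=0.5):
    # Deduce the driver's attention state from the fused index:
    # "low" attention when the object is confidently detected in hand.
    return "low" if fuse_confidence(scores) > threshold else "high"
```

Any other fusion rule (weighted average, max, learned combiner) would fit the same interface.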
- the image processing unit comprises processing means using a trained neural network to determine the state of attention of the driver from the image of the detection zone captured by the image capture unit.
- the invention also relates to an on-board vehicle system comprising:
- a device adapted to determine the state of attention of the driver
- an autonomous driving unit of said vehicle programmed to control driving members of said vehicle independently of the driver
- a decision unit programmed to authorize the driver to control at least part of the driving members of the vehicle when a free state of the driver's hand (or of at least one of the two hands, or of both hands) is determined, and/or to alert the driver when a busy state of the driver's hand is determined, and/or to switch the autonomous driving into a secure mode.
- when the vehicle is an autonomous motor vehicle, that is to say one whose driving members are controlled independently of the driver, the driver will not be able to resume control, even partial, of the driving members of the vehicle if the occupation state of his hand is determined as "busy".
- the autonomous vehicle can be brought by the decision unit of the on-board system to pull over onto the roadside (or onto the hard shoulder), according to the aforementioned secure mode.
- the detection of a distraction object in at least one of the driver's hands may for example be followed by a warning to the driver about the risk taken.
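The decision unit's behavior described in the bullets above can be sketched as follows. The function name, the returned fields and the "all monitored hands must be free" policy are assumptions made for illustration; the patent only states the three possible actions (authorize, alert, secure mode).

```python
def decide(hand_states, autonomous_mode=True):
    # All monitored hands must be "free" before manual control is
    # authorized; a "busy" hand triggers a driver alert and, while the
    # vehicle is driving autonomously, a switch to the secure mode.
    all_free = all(state == "free" for state in hand_states)
    return {
        "authorize_manual_control": all_free,
        "alert_driver": not all_free,
        "secure_mode": (not all_free) and autonomous_mode,
    }
```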
- the invention finally proposes a method for determining a state of attention of a vehicle driver, according to which
- an image capture unit embedded in said vehicle captures at least one image of a detection zone located in said vehicle
- an image processing unit receives said captured image and determines the state of attention of the driver, according to the detection of the presence of a distraction object in at least one of the driver's hands located in the detection zone.
- when the driving members of said vehicle are controlled independently of the driver, there is further provided a step according to which the driver is authorized to control at least part of the driving members of the vehicle in case of determination of a free state of one of the driver's hands, and/or the driver is alerted when a busy state of one of said hands is determined, and/or the autonomous driving is switched into a secure mode.
- the method that has just been proposed may also optionally include steps such as those proposed above with respect to the device for determining a state of attention of the driver (in particular steps b1) and b2) and their possible substeps).
- FIG. 1 shows schematically, in front view, a motor vehicle comprising a device according to the invention
- FIG. 2 diagrammatically represents the device of FIG. 1 according to two possible embodiments: a first embodiment (in solid lines) in which the image capture unit 11 comprises a single sensor 12, and a second embodiment in which the image capture unit 11 comprises two sensors 12, 13 (the sensor 13 being shown in dashed lines); and
- FIG. 3 represents a flowchart of the main steps of the method according to the invention.
- FIG. 1 shows the front of a vehicle 1 equipped with a device 10 for determining a state of attention of a driver of the vehicle 1.
- such a device 10 is adapted to determine the state of attention of the driver 4 according to a state of occupation of at least one of the hands of said driver 4 of the vehicle 1.
- such a device 10 is adapted to detect at least one of the driver's hands in a detection zone D of the vehicle 1, and to determine the state of occupation of this (or these) hand(s) to deduce the state of attention of the driver.
- the state of occupation is determined from a "busy” state in which the hand holds an object of distraction, and a “free” state in which it does not hold any object of distraction.
- the hand can for example be engaged in driving the vehicle, that is to say act on a driving member of the vehicle, or be empty, for example at rest on an armrest.
- the object of distraction is an object other than a driving member of the vehicle.
- This is for example a mobile phone, a book, a road map, a GPS etc.
- the driving members accessible to the driver's hands are, for example, the steering wheel 3, the gearshift lever, the controls (turn signal or windshield wiper levers), the switches (such as the hazard warning lights) or the parking brake.
- the device comprises: an image capture unit 11 embedded in said vehicle 1, said image capture unit 11 being adapted to capture at least one image of the detection zone D situated in said vehicle 1, and
- an image processing unit 15 adapted to receive said captured image and programmed to determine the state of occupation of the hand of the driver 4 located in the detection zone D, as a function of the detection of the presence of a distraction object in said hand.
- the image capture unit 11 is embedded in the motor vehicle, that is to say disposed inside the vehicle 1, more precisely inside the passenger compartment of the vehicle 1.
- the image capture unit 11 comprises at least one sensor 12, 13 adapted to capture an image of a first nature of the detection zone D.
- the detection zone D is located between the gear lever 2 and the front door of the driver.
- the detection zone D thus encompasses the steering wheel 3 of the vehicle 1 and contains both hands of the driver.
- the sensor 12, 13 of the image capture unit 11 is for example placed in the front ceiling light of the motor vehicle 1, so that it captures a top view of the detection zone D.
- the sensor could be placed on the dashboard of the vehicle, in a central area of the latter, so that the detection zone would be seen from the front.
- the detection zone may contain only one of the driver's hands.
- the sensor could be placed behind the steering wheel 3 of the vehicle, at the dashboard.
- the detection zone could then easily contain the driver's right hand and left hand.
- the image capture unit comprises a single sensor 12 adapted to capture a three-dimensional image of the detection zone D, said three-dimensional image comprising information relating to the distance, with respect to said sensor, of at least a part of the elements of the space contained in the detection zone D.
- elements of the space include the hands of the driver 4 and the object of distraction possibly present in the detection zone D.
- the elements of the space can also include elements of the environment of the driver, for example elements of the passenger compartment of the vehicle and the vehicle driving members such as the gear lever, steering wheel, armrest, etc.
- the three-dimensional image includes a point cloud representing the envelope of the space elements present in the detection zone D, including the driver's hand, the driver's forearm, and the distraction object that may be present there.
- the point cloud thus gives information as to the position in space of the elements present in the detection zone D, in particular information relating to their distance with respect to said sensor.
- Such a sensor 12 adapted to capture three-dimensional images is known to those skilled in the art and will not be described in detail. It will simply be specified that it could be a time-of-flight sensor, such as a time-of-flight ("Time of Flight" or TOF) camera adapted to emit light toward the driver 4 and to measure the time this light takes to return to said time-of-flight sensor, in order to deduce the three-dimensional image of the detection zone D.
- it could also be a stereoscopic sensor comprising at least two cameras, each capturing an image of the detection zone from its own point of view, the images of each camera then being combined to deduce the three-dimensional image of the detection zone. It can also be a structured-light sensor adapted to project a pattern onto the detection zone and to analyze the deformation of this pattern to deduce the three-dimensional image of the driver 4.
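The time-of-flight principle mentioned above reduces to a one-line computation: the emitted light travels to the scene and back, so distance = c * t / 2. A minimal sketch (the function name is an assumption, not from the patent):

```python
SPEED_OF_LIGHT = 299_792_458.0  # metres per second

def tof_distance(round_trip_time_s):
    # Distance from the sensor to the reflecting point: the measured
    # round-trip time covers the path there and back, hence the / 2.
    return SPEED_OF_LIGHT * round_trip_time_s / 2.0
```

For example, a round trip of 4 nanoseconds corresponds to a point roughly 0.6 m from the sensor, a plausible cabin-scale distance.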
- the image processing unit 15 is adapted to receive the three-dimensional image from the sensor, and programmed to determine, by means of this image, the state of occupation of at least one of the driver's hands, according to the detection of the presence of the distraction object in said hand.
- the image processing unit 15 is programmed to implement the following steps: b1) detecting at least one of the hands of the driver in said image received from the image capture unit 11, and
- the image processing unit 15 comprises processing means programmed to detect the image of the driver's hand in the image captured by the image capture unit 11, as well as the presence of the object of distraction in this hand.
- the image processing unit is programmed to implement step b1) according to the following substeps:
- the processing means of the processing unit 15 are programmed to identify, among the points of the point cloud, those associated with the image of the arm or the forearm of the driver.
- the recognition of the shape of the arm or the forearm of the driver (step b1)) is here based on a shape recognition algorithm.
- the processing means are also programmed to recognize characteristic shapes of space elements that may be present in the detection zone D, such as a part of the gear lever 2, a part of the steering wheel 3, part of an armrest, etc.
- the "shape" of the elements of space corresponds here to its external envelope. The detection of these space elements can facilitate the detection of the driver's arm.
- the processing means of the processing unit 15 identify a portion of the point cloud in which the hand of the driver is likely to be.
- the recognition of the shape of the hand is here based on a shape recognition algorithm, applied to the determined portion of the point cloud.
- the processing means are also programmed to identify at least two different regions in said three-dimensional image captured by the image capture unit 11, for example a first region formed by the images of the points closest to the sensor (foreground) and a second region formed by the images of the points farthest from the sensor (background), and to recognize the shape of the arm or forearm of the driver 4 in these regions.
- the processing means are adapted to determine the location of the driver's arm, and then to deduce the location of the driver's hand in the real space of the detection zone D.
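Step b1) as described above (depth-based region splitting, then hand localization at the end of the recognized arm) can be sketched on a toy point cloud. Both function names, the tuple representation and the farthest-from-shoulder heuristic are illustrative assumptions; a real implementation would use the shape recognition algorithms the patent refers to.

```python
def split_by_depth(points, depth_threshold):
    # Partition a point cloud of (x, y, z) tuples, z being the distance
    # to the sensor, into foreground and background regions.
    foreground = [p for p in points if p[2] <= depth_threshold]
    background = [p for p in points if p[2] > depth_threshold]
    return foreground, background

def locate_hand(arm_points, shoulder):
    # Heuristic stand-in for hand localization: take the arm point
    # farthest from the shoulder, since the hand lies at the end of
    # the recognized arm.
    def sq_dist(p, q):
        return sum((a - b) ** 2 for a, b in zip(p, q))
    return max(arm_points, key=lambda p: sq_dist(p, shoulder))
```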
- the image processing unit is programmed to implement step b2) according to the following substeps:
- the processing means are also programmed to identify the nature of the detected objects, according to the shape recognized at the estimated location of the hand in the image.
- the processing unit 15 is programmed to deduce the state of occupation of the driver's hand, according to the shape recognized at the level of the driver's hand in the three-dimensional image.
- this recognized shape may be that associated with a distraction object (a mobile phone, a glass or a cup, a book, a road map, etc.), or that of elements otherwise present in the detection zone D.
- the image processing unit 15 is programmed to determine that the driver's hand is in a busy state when it is very likely, given the shape recognized at the driver's hand in the three-dimensional image, that the driver holds this distraction object in his hand.
- the image processing unit could be programmed to determine that the hand is in a busy state when the distance between the hand and the distraction object, in the captured three-dimensional image, is less than a predetermined threshold value.
- the image processing unit could be programmed to determine that the hand is in a free state when the distance between the hand and the detected distraction object, in the captured three-dimensional image, is greater than a predetermined threshold value.
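The distance-threshold rule in the two bullets above is straightforward to sketch. The 5 cm threshold and the function name are illustrative assumptions; the patent only speaks of "a predetermined threshold value".

```python
import math

def occupation_state(hand_xyz, object_xyz, threshold_m=0.05):
    # The hand is "busy" when the detected distraction object lies
    # within the threshold distance of it in the 3-D image, and
    # "free" otherwise. The 5 cm default is an assumed value.
    return "busy" if math.dist(hand_xyz, object_xyz) < threshold_m else "free"
```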
- the driver can then react faster on the manual driving members of the vehicle, such as the steering wheel, turn signals, the horn, the hazard warning lights, or the gearshift, if needed.
- the device according to the invention is thus adapted to determine that the overall attention state of the driver is lower when his hand is in a busy state than when it is in a free state.
- the image processing unit 15 comprises processing means including a trained neural network to directly recognize the state of occupation of the hand, or even the state of attention of the driver, from an image of the detection zone D.
- the neural network is trained prior to its use in the device 10 according to the invention.
- the neural network is fed as input with a plurality of images of the detection zone D in which the hand is in a free state (i.e. the hand is empty of any distraction object, or holds a driving member of the vehicle), and the neural network is told that for these images the state of the hand is free.
- the neural network is also fed with a plurality of images of the detection zone D in which the hand is in a busy state (that is to say, it holds a distraction object), and the neural network is told that for these images the state of the hand is busy.
- the neural network, then receiving as input the image of the detection zone D captured by the image capture unit 11, is programmed to output the free or busy state of the driver's hand, or even directly the state of high attention (if the hand is in a free state) or low attention (if the hand is in a busy state) of the driver.
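The patent does not disclose a network architecture, so the supervised free/busy training loop described above can only be illustrated with a deliberately minimal stand-in: a single logistic neuron fitted by gradient descent on hand-crafted feature vectors. All names, features and hyperparameters below are assumptions for illustration, not the patent's method.

```python
import math

def train_hand_classifier(samples, epochs=500, lr=0.5):
    # samples: (feature_vector, label) pairs, where label 1 means the
    # hand is "busy" (holds a distraction object) and 0 means "free".
    n = len(samples[0][0])
    w, b = [0.0] * n, 0.0
    for _ in range(epochs):
        for x, y in samples:
            z = sum(wi * xi for wi, xi in zip(w, x)) + b
            p = 1.0 / (1.0 + math.exp(-z))   # sigmoid activation
            g = p - y                         # gradient of the log-loss
            w = [wi - lr * g * xi for wi, xi in zip(w, x)]
            b -= lr * g
    return w, b

def is_busy(model, features):
    # Inference: apply the trained neuron to a new feature vector.
    w, b = model
    z = sum(wi * xi for wi, xi in zip(w, features)) + b
    return 1.0 / (1.0 + math.exp(-z)) > 0.5
```

A production system would replace the hand-crafted features with a convolutional network operating directly on the captured images, as the patent suggests.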
- the image capture unit 11 comprises at least
- a sensor 12 (solid line in FIG. 2) adapted to capture at least one image of a first nature comprising a first type of information relating to the elements of the space contained in the detection zone D, and
- a sensor 13 (in dashed line in FIG. 2) adapted to capture at least one image of a second nature distinct from the first nature, comprising a second type of information relating to the elements of the space contained in the detection zone D.
- the elements of the space contained in the detection zone D include in particular the hand of the driver and / or the object of distraction whose presence it is desired to detect. They may also include elements of the driver's environment naturally present in the detection zone D, such as the steering wheel, the gear lever or the armrest of one of the seats.
- the image capture unit 11 comprises either a single sensor (not shown) adapted to capture both the first and second nature images, or at least two distinct sensors 12, 13 adapted to capture respectively the images of first and second natures.
- when the image capture unit 11 comprises two separate sensors 12, 13, it is conceivable that they be arranged at the same place in the passenger compartment of the vehicle 1, for example in the ceiling light, behind the steering wheel 3, or in a central region of the dashboard as previously described. It is also conceivable that each separate sensor be disposed at a different location of the passenger compartment of the vehicle, in particular among the previously described locations.
- the first and second nature images are chosen from: a three-dimensional image comprising information relating to the distance, relative to said sensor, of at least a part of the elements of the space contained in the detection zone D, that is to say here at least of the distraction object and/or the driver's hand,
- a two-dimensional image comprising information relating to the luminance of at least part of the elements contained in the detection zone D, that is to say at least the distraction object and/or the driver's hand, and
- a thermal image comprising information relating to the temperature of at least part of the elements contained in the detection zone D, that is to say at least the distraction object and/or the driver's hand.
- the sensor 12, 13 may be one of those described previously in the first embodiment of the device 10, namely a time-of-flight sensor, a stereoscopic sensor, or a structured-light sensor.
- This type of sensor may in some cases also be suitable for capturing two-dimensional images.
- a conventional photographic type sensor, or a camera is also capable of capturing two-dimensional images.
- the two-dimensional images are images giving information on the luminance of the elements present in the detection zone D, including the hand and the forearm of the driver.
- the two-dimensional images comprise pixels representing regions of the detection zone D, that is to say a pixel corresponding to the image of a region of the detection zone D. Each pixel is more or less bright depending on the corresponding luminance of the region of the detection zone D that it represents.
- the two-dimensional images are in black and white, but one could also have two-dimensional color images, in which case each pixel would represent the chrominance of each corresponding region of the detection zone.
- the sensor may be a thermal camera, for example a long-wavelength infrared camera (Long Wavelength InfraRed, LWIR).
- the luminous intensity of the pixels of the thermal images depends on the temperature of the regions of the detection zone D corresponding to each pixel: the higher the temperature, the brighter the pixel, the lower the temperature, the darker the pixel.
- the forearm and the driver's hand will be represented by bright pixels, as well as the battery of the mobile phone, or a cup filled with hot liquid.
- the gear lever, a book or a road map will be represented by darker pixels.
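The bright/dark contrast described above amounts to thresholding the thermal image by temperature: warm regions (forearm, hand, phone battery, hot cup) rise above the threshold while cool objects (gear lever, book, road map) fall below. A minimal sketch, in which the 30 °C threshold and the function name are assumptions:

```python
def warm_mask(thermal_image, threshold_c=30.0):
    # thermal_image: 2-D grid (list of rows) of temperatures in deg C,
    # one value per pixel. Returns a boolean mask marking warm regions
    # such as the driver's forearm and hand, a phone battery, or a cup
    # of hot liquid; cool objects like a book map to False.
    return [[t >= threshold_c for t in row] for row in thermal_image]
```

Such a mask can help the processing unit separate the forearm and hand from the rest of the scene before shape recognition.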
- it is thus also easier for the processing unit to recognize the nature of the elements of the space included in the detection zone D, and in particular to discern the driver's forearm, terminated by the driver's hand, from the rest of these space elements.
- the image capture unit 11 may further comprise at least one sensor adapted to capture an image of a third nature, distinct from said first and second natures, comprising a third type of information relating to the elements of the space contained in the detection zone D.
- the image capture unit 11 will in this case comprise at least two separate sensors, possibly arranged at different locations in the passenger compartment of the vehicle, to capture the images of first, second and third natures.
- the third nature image is chosen from the images described above, namely a three-dimensional image, a two-dimensional image or a thermal image.
- the sensors described above are for example used.
- the image processing unit 15 takes into account at least three images of different natures, namely a three-dimensional image, a two-dimensional image and a thermal image, for determining the state of occupation of the driver's hand.
- the image processing unit 15 is programmed to implement steps b1) and b2) explained above, namely:
- the image processing unit 15 comprises processing means programmed to implement the steps b1) and b2) in a so-called sequential implementation or in a so-called concurrent implementation.
- steps b1) and b2) are carried out from one or two images of the same nature.
- the image processing unit 15 implements step b1) from at least one first captured image of a first nature, and step b2) from at least one second captured image of a nature different from the first.
- the first image of first nature is for example one of the three-dimensional images captured by the image capture unit 11.
- the processing means are programmed to segment this three-dimensional image, that is to say to identify at least two different regions in said three-dimensional image, to recognize the shape of the forearm or arm of the driver 4 in each of these regions, as described in the first embodiment of the device 10, and to deduce therefrom the position of the driver's hand, logically located at the end of the recognized arm.
- the two aforementioned regions correspond for example, as mentioned above, to a near region and to a region remote from the sensor.
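Step b1) as described above can be sketched as a near/far split of a depth image followed by picking an extremity of the near region as the hand candidate. Everything here is an assumption for illustration: the depth units (metres), the 0.8 m split, and the crude "rightmost near pixel" stand-in for real arm-shape recognition.

```python
import numpy as np

# Crude sketch of step b1): segment a depth image (assumed in metres) into
# a region near the sensor and a region far from it, then take an extremity
# of the near (arm-shaped) region as the hand position.
def segment_near_far(depth: np.ndarray, split_m: float = 0.8):
    near = depth < split_m
    return near, ~near

def hand_candidate(near: np.ndarray):
    """Return the (row, col) of the rightmost near pixel, a placeholder
    for 'the end of the recognized arm' (real shape analysis omitted)."""
    rows, cols = np.nonzero(near)
    i = int(np.argmax(cols))
    return int(rows[i]), int(cols[i])

depth = np.array([[1.2, 1.2, 1.2, 1.2],
                  [0.6, 0.6, 0.7, 1.1],
                  [1.3, 1.2, 0.7, 1.2]])
near, far = segment_near_far(depth)
print(hand_candidate(near))
```

In the patent's terms, `near` and `far` play the role of the two segmented regions, and the returned coordinates feed step b2).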
- the processing means of the image processing unit 15 are programmed to detect the possible presence of an object in the driver's hand by means of the second image, of second nature, captured by the image capture unit 11.
- the processing means of the image processing unit 15 are programmed to recognize the characteristic shape of objects possibly present in the detection zone D, at the position of the hand as evaluated as explained above, from two-dimensional and / or thermal images.
- the processing means are then programmed to determine, by combining the information of the first and second images (of first and second natures), whether an object is present in the driver's hand or not and, if so, to determine the nature of this object in order to estimate whether it is a distraction object (or a driving device).
- otherwise, the image processing unit 15 is programmed to determine that the hand is in a free state.
- the processing unit 15 is programmed to identify the nature of the detected object.
- the image processing unit 15 is then programmed to determine that the hand is in a busy state if the detected object is a distraction object (and to determine that the hand is in a free state, in the sense of available for driving, if the detected object is a driving device).
- the shape recognized at the level of the hand in the first and second nature images may confirm the probability that it holds a distraction object, or greatly reduce this probability.
- the shape of the contour of the hand can thus be taken into account by the image processing unit 15 for determining the state of occupation of the hand.
- the sequential implementation makes it possible to use the most relevant information of each of the first and second nature images, depending on the step to be performed.
- thermal or two-dimensional images may be used as the first-nature image,
- three-dimensional, two-dimensional or thermal images may be used as the second-nature image, provided the latter is of a nature distinct from the first.
- the image processing unit 15 implements steps b1) and b2) from each image received from the image capture unit 11, and implements a step in which it determines a confidence index associated with the detection of the presence of the object in the driver's hand, from which it deduces the state of occupation of the driver's hand.
- the image processing unit 15 implements steps b1) and b2) independently for each of the first, second and third images.
- steps b1) and b2) are implemented with the three-dimensional images as described in the case of the first embodiment of the device 10 according to the invention.
- the processing means of the image processing unit 15 are programmed to identify, from among the pixels of each image, those associated with the image of the hand and/or of the forearm of the driver, and those associated with the image of a distraction object possibly present in the detection zone D, using shape recognition algorithms specific to said thermal and two-dimensional images.
- the image processing unit 15 is programmed to deduce the state of occupation of the hand of the driver in these images.
- the image processing unit 15 determines the confidence index of the processing, that is to say determines whether the processing has led to a result, and whether this result is reliable or rather uncertain.
- the confidence index is at its maximum when the palm and five fingers are recognized in the image; the index is lower when fewer elements are recognized, for example when only the palm and two fingers are recognized.
- the image processing unit 15 determines the actual occupation state (busy or free) of the driver's hand from the image whose processing has led to the highest confidence index, that is to say, whose result is the most reliable.
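The concurrent implementation above amounts to scoring each image's result and keeping the best-scored one. The scoring follows the palm-and-fingers example from the text; the exact weights and the per-image result records are assumptions for illustration.

```python
# Sketch of the concurrent implementation: each image is processed
# independently, a confidence index is derived from how many hand features
# were recognized, and the result with the highest index is retained.
# The weighting (palm counts as 1, each finger as 1/5) is an assumption.
def confidence_index(palm_recognized: bool, fingers_recognized: int) -> float:
    """Maximal (2.0) when the palm and all five fingers are recognized."""
    return (1.0 if palm_recognized else 0.0) + fingers_recognized / 5.0

results = [
    {"source": "3d",      "state": "busy", "index": confidence_index(True, 5)},
    {"source": "2d",      "state": "free", "index": confidence_index(True, 2)},
    {"source": "thermal", "state": "busy", "index": confidence_index(False, 3)},
]

best = max(results, key=lambda r: r["index"])
print(best["source"], best["state"])
```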
- the image processing unit 15 comprises processing means including a trained neural network for directly recognizing the state of occupation of the hand from an image of the detection zone D.
- the neural network is trained prior to its use in the device according to the invention according to the same principle as that described in the case of the first embodiment, except that it is trained with images of first, second and possibly third natures.
- the neural network thus trained is adapted to receive first-, second- and possibly third-nature images as input in order to determine the state of occupation of the driver's hand.
- using images of different natures at the input of the neural network increases the reliability of the result given at the output of said neural network.
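One common way to feed several image natures to a single classifier is to concatenate per-modality feature vectors before a final decision layer. The sketch below illustrates only that fusion idea: the feature sizes, the sigmoid head and the random weights are placeholders, standing in for a genuinely trained network.

```python
import numpy as np

# Minimal sketch of multi-nature input fusion: feature vectors extracted
# from the 3D, 2D and thermal images are concatenated and fed to one
# classifier. Shapes and the (untrained, random) weights are placeholders.
rng = np.random.default_rng(0)

def fuse_and_classify(features_by_modality, weights, bias):
    x = np.concatenate(features_by_modality)   # fusion by concatenation
    logit = float(weights @ x + bias)
    p_busy = 1.0 / (1.0 + np.exp(-logit))      # sigmoid decision head
    return "busy" if p_busy >= 0.5 else "free"

f3d = rng.normal(size=8)       # stand-in features from the 3D image
f2d = rng.normal(size=8)       # ... from the 2D image
fth = rng.normal(size=8)       # ... from the thermal image
w, b = rng.normal(size=24), 0.0

print(fuse_and_classify([f3d, f2d, fth], w, b))
```

With trained weights, misleading evidence in one modality (e.g. a warm cup in the thermal image) can be outweighed by the other modalities, which is the reliability gain the text refers to.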
- the device according to the invention finally finds a particularly useful application in an on-board system 100 for a vehicle comprising:
- the device according to the invention, adapted to determine the state of occupation of at least one of the hands of the driver 4 present in the detection zone D,
- an autonomous driving unit 50 of said vehicle 1 programmed to control driving members of said vehicle independently of the driver
- a decision unit (not shown) programmed to allow the driver 4 to control at least part of the driving members of the vehicle 1 in the event of determination of a free state of the driver's hand, and/or to alert the driver in the event of determination of a busy state of the driver's hand, and/or to switch the autonomous driving to a secure mode.
- the driving members include the steering wheel 3, the acceleration and brake pedals, the gear lever, the turn signals, the headlights, the wipers.
- the driving members of the vehicle include all the vehicle elements used for driving.
- the autonomous driving unit 50 is adapted to control the various driving members so that said vehicle is driven without intervention of the driver.
- the driver is then allowed to be in a state of insufficient vigilance, that is to say to be distracted, and may for example read a book or look at his mobile phone without danger for the driving.
- the decision unit is programmed to allow the driver to at least partially control the driving members only when his state of global attention is high, in particular when at least one of his hands is in a free state.
- the decision unit does not then allow a takeover by the driver, but may for example command the display of a warning message to the driver to encourage him to be attentive, and/or switch the autonomous driving to a secure mode in which the autonomous driving unit 50 controls, for example, the parking of the vehicle on the roadside shoulder (or on the emergency stop lane).
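The decision unit's policy can be summarized as a small rule table over the hand state and a takeover request. The action names below are illustrative, not terms from the patent.

```python
# Sketch of the decision unit's policy: takeover is allowed only when a
# hand is free; otherwise the driver is warned and, as a fallback, the
# autonomous driving switches to a secure mode (e.g. parking on the
# shoulder). Action names are illustrative assumptions.
def decide(hand_state: str, driver_requests_takeover: bool) -> str:
    if hand_state == "free" and driver_requests_takeover:
        return "allow_manual_control"
    if hand_state == "busy":
        if driver_requests_takeover:
            return "deny_takeover_and_warn"
        return "warn_driver"
    return "continue_autonomous"

print(decide("free", True))
print(decide("busy", True))
```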
- FIG. 3 shows in flowchart form the main steps of the method implemented by the device according to the invention.
- the method according to the invention for determining the state of occupation of at least one of the hands of the driver of the vehicle comprises steps according to which:
- the image capture unit 11 embedded in said vehicle 1 captures at least one image of the detection zone D located in said vehicle 1,
- the image processing unit 15 receives said captured image and determines the state of occupation of a hand of the driver 4 located in the detection zone D, as a function of the detection of the presence of a distraction object in said hand.
- in step a), the image capture unit 11 of the device 10 according to the invention captures at least one image of a first nature of the detection zone D.
- preferably, it captures two or even three images of the detection zone D, of the same nature or, preferably, of different natures.
- each image of a different nature is captured at a given moment, that is to say that all the images are captured simultaneously by the image capture unit 11, or within a relatively short time interval, in particular well under a minute. This ensures that the analyzed situation has not changed between the captures of the images of different natures.
- in step b), the image processing unit 15 receives the image or images captured in step a).
- the image processing unit 15 implements the steps described above, namely
- when two images of different natures are captured in step a) by the image capture unit 11, according to the second embodiment of the device 10 according to the invention, the implementation of step b) can be sequential or concurrent, as previously described.
- the image processing unit 15 comprises a trained neural network, represented by the channel (2).
- said image processing unit 15 directly recognizes the state of occupation of the hand from the images received from the image capture unit 11 (block G of FIG. 3).
- at the end of step b), the image processing unit 15 sends the state of occupation of the driver's hand to the requesting devices of the vehicle (block E2 of FIG. 3), in particular to the decision unit of the autonomous vehicle, or to a device monitoring the state of the driver of the vehicle.
- the device according to the invention thus informs the driver of the busy state of at least one of his hands, so as to encourage him to refocus on his driving.
- there may be provided an additional step of determining a state of overall attention of the driver, taking into account the state of occupation of the driver's hand and/or a state of alertness of the driver determined by other known means.
- in a situation where the driving members of said vehicle are controlled independently of the driver, there is further provided a step according to which the driver is authorized to control at least part of the driving members of the vehicle in the event of determination of a free state of the driver's hand and/or, in the event of determination of a busy state of the driver's hand, the driver is alerted and/or the autonomous driving is switched to a secure mode.
- the device, the system and the method according to the invention are particularly advantageous in partially or completely autonomous driving situations, during which the driver is allowed to relax his attention, that is to say to present a state of insufficient vigilance.
- the position of the driver in these situations can change to the point that he is no longer in front of a possible surveillance device adapted to capture an image of his head to assess his level of vigilance. It is then very useful to determine the state of occupation of the driver's hands to apprehend his state of overall attention.
- the invention makes it possible to provide information complementary to that already provided by a possible device for monitoring the driver.
- the invention applies to any type of vehicle, including transport vehicles such as boats, trucks, trains.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
FR1700222A FR3063557B1 (en) | 2017-03-03 | 2017-03-03 | DEVICE FOR DETERMINING THE STATE OF ATTENTION OF A VEHICLE DRIVER, ON-BOARD SYSTEM COMPRISING SUCH A DEVICE, AND ASSOCIATED METHOD |
PCT/EP2018/054586 WO2018158163A1 (en) | 2017-03-03 | 2018-02-23 | Device for determining the attentiveness of a driver of a vehicle, on-board system comprising such a device, and associated method |
Publications (1)
Publication Number | Publication Date |
---|---|
EP3590071A1 true EP3590071A1 (en) | 2020-01-08 |
Family
ID=58645179
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP18708630.1A Withdrawn EP3590071A1 (en) | 2017-03-03 | 2018-02-23 | Device for determining the attentiveness of a driver of a vehicle, on-board system comprising such a device, and associated method |
Country Status (5)
Country | Link |
---|---|
US (1) | US11170241B2 (en) |
EP (1) | EP3590071A1 (en) |
CN (1) | CN110383290B (en) |
FR (1) | FR3063557B1 (en) |
WO (1) | WO2018158163A1 (en) |
Also Published As
Publication number | Publication date |
---|---|
CN110383290B (en) | 2023-09-01 |
US11170241B2 (en) | 2021-11-09 |
US20200012872A1 (en) | 2020-01-09 |
WO2018158163A1 (en) | 2018-09-07 |
FR3063557B1 (en) | 2022-01-14 |
CN110383290A (en) | 2019-10-25 |
FR3063557A1 (en) | 2018-09-07 |