EP3799752B1 - Ego motorcycle on-board awareness raising system, method for detecting and displaying presence of autonomous vehicles - Google Patents


Info

Publication number
EP3799752B1
EP3799752B1
Authority
EP
European Patent Office
Prior art keywords
motorcycle
ego
board
vehicles
vehicle
Prior art date
Legal status
Active
Application number
EP19465569.2A
Other languages
German (de)
French (fr)
Other versions
EP3799752A1 (en)
Inventor
Constantin-Florin Caruntu
Alexandru-Daniel Puscasu
Current Assignee
Continental Autonomous Mobility Germany GmbH
Original Assignee
Continental Automotive GmbH
Priority date
Filing date
Publication date
Application filed by Continental Automotive GmbH
Priority to EP19465569.2A
Publication of EP3799752A1
Application granted
Publication of EP3799752B1
Legal status: Active
Anticipated expiration


Classifications

    • A HUMAN NECESSITIES
    • A42 HEADWEAR
    • A42B HATS; HEAD COVERINGS
    • A42B3/00 Helmets; Helmet covers; Other protective head coverings
    • A42B3/04 Parts, details or accessories of helmets
    • A42B3/0406 Accessories for helmets
    • A42B3/0433 Detecting, signalling or lighting devices
    • A42B3/046 Means for detecting hazards or accidents
    • A42B3/042 Optical devices
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G1/00 Traffic control systems for road vehicles
    • G08G1/16 Anti-collision systems
    • G08G1/161 Decentralised systems, e.g. inter-vehicle communication
    • G08G1/163 Decentralised systems, e.g. inter-vehicle communication, involving continuous checking
    • G08G1/166 Anti-collision systems for active traffic, e.g. moving vehicles, pedestrians, bikes

Definitions

  • The ego motorcycle on-board awareness raising system is placed in an ensemble consisting of an ego motorcycle and an ego motorcycle rider smart helmet SH. Some components of the system are placed in the ego motorcycle, whereas other components are placed in the smart helmet SH of the motorcycle rider, as will hereafter be detailed.
  • Said ego motorcycle rider's smart helmet SH comprises an advanced driver assistance systems camera ADASC, alternatively called camera, acquiring video images, and a smart helmet SH visor for displaying said video images as well as other awareness information, such as images and/or warning messages addressed to the motorcycle rider.
  • The advanced driver assistance systems camera ADASC used in this invention has at least the following characteristics: a resolution of 2 MP (megapixels) and a rate of 30 fps (frames per second).
  • Said camera is mounted in/on the motorcycle rider's smart helmet SH in a known way.
  • The advanced driver assistance systems camera ADASC used in this invention has its field of view facing forward, that is, in the direction of movement of the motorcycle rider.
  • Inclination and/or rotation of the motorcycle rider's head therefore changes the field of view.
  • Said ego motorcycle on-board awareness raising system further comprises the following components:
  • The definition of the field of view of the motorcycle rider facing forward includes the angle of the field of view and the distances with respect to the vehicles driving ahead of the motorcycle rider. Both the angle and the distances are pre-determined depending on the characteristics of the camera and of the radar module RM, respectively.
  • The angle of the field of view may be up to 150° inclusive, measured in the horizontal plane.
  • The radar module RM is placed at the front of the ego motorcycle, facing forward. It is configured to measure the distance between the ego motorcycle and the vehicles driving in the ego motorcycle's field of view, said distance typically ranging between 5 m and 200 m inclusive. Any radar used in the automotive industry may be used in the invention, provided that it is configured to send the measurements to the ego motorcycle on-board detection processing unit DPU.
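The pre-determined angle and range above can be illustrated with a short sketch (Python; the function name and the gating logic are our own illustration, not part of the patent):

```python
import math

# Illustrative limits taken from the description: up to a 150 degree forward
# field of view, radar range 5-200 m inclusive (constant names are our own).
FOV_DEG = 150.0
RADAR_MIN_M = 5.0
RADAR_MAX_M = 200.0

def in_field_of_view(dx_m: float, dy_m: float) -> bool:
    """Return True if a vehicle at offset (dx, dy) metres from the ego
    motorcycle lies inside the 150 degree forward cone and the radar range.
    dx_m is the forward distance, dy_m the lateral offset."""
    distance = math.hypot(dx_m, dy_m)
    if not (RADAR_MIN_M <= distance <= RADAR_MAX_M):
        return False
    # Bearing relative to straight ahead; inside the cone means
    # |bearing| <= half the field-of-view angle.
    bearing_deg = math.degrees(math.atan2(dy_m, dx_m))
    return abs(bearing_deg) <= FOV_DEG / 2.0
```

A vehicle straight ahead at 50 m passes both checks; one closer than 5 m, farther than 200 m, or outside the cone is excluded.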
  • The gyroscope GYRO is placed within the smart helmet SH. It is configured to determine the direction and inclination of the field of view of the smart helmet depending on the direction in which the rider looks. Any gyroscope used in the automotive industry may be used in the invention, provided that it fits within the smart helmet SH and that it is configured to send the results of its determinations to the ego motorcycle on-board detection processing unit DPU.
  • The GPS sensor GPSS is configured to determine the geographical position of the ego motorcycle. Any GPS sensor used in the automotive industry may be used in the invention, provided that it is configured to send the results of its determinations to the ego motorcycle on-board detection processing unit DPU.
  • The GPS sensor GPSS may be placed in the ego motorcycle as a component of said motorcycle, as provided by the motorcycle manufacturer.
  • In an alternative embodiment, the ego motorcycle is not provided with a GPS sensor GPSS, but the ego motorcycle on-board unit MOBU is provided by its manufacturer with a GPS sensor GPSS.
  • In this case, said GPS sensor GPSS of the ego motorcycle on-board unit MOBU is the one configured to determine the geographical position of the ego motorcycle and to send the results of its determinations to the ego motorcycle on-board detection processing unit DPU.
  • In a further alternative embodiment, neither the ego motorcycle nor the ego motorcycle on-board unit MOBU is provided with a GPS sensor GPSS.
  • In this case, the GPS sensor GPSS may be provided by a rider's smartphone configured to determine the geographical position of the ego motorcycle and to send the results of its determinations to the ego motorcycle on-board detection processing unit DPU.
  • The placements of the GPS sensor GPSS shown in the alternative embodiments above have the advantage of providing flexibility to the system and allowing it to be used in a wider range of situations, irrespective of whether the ego motorcycle is provided with a built-in GPS sensor GPSS.
  • The acceleration sensor ACC is placed in the ego motorcycle in the usual place(s). It is configured to measure the acceleration signal of the ego motorcycle on all three axes X, Y, Z. Any acceleration sensor used in the automotive industry measuring the acceleration signal on all three axes X, Y, Z may be used in the invention, provided that it is configured to send the measurements to the ego motorcycle on-board detection processing unit DPU.
  • The speedometer SPEEDO is placed in the ego motorcycle in the usual place(s). It is configured to measure the speed of the ego motorcycle. Any speedometer used in the automotive industry may be used in the invention, provided that it is configured to send the measurements to the ego motorcycle on-board detection processing unit DPU.
  • The motorcycle dynamics unit MDU can be combined with any of the possibilities of use of the GPS sensor GPSS within the system, having the advantage of flexibility.
  • The ego motorcycle on-board unit MOBU is configured to communicate via the Vehicle-to-Everything V2X communication channel with other vehicles, including other motorcycle riders.
  • Communication via the Vehicle-to-Everything V2X communication channel requires that all communicating vehicles be provided with corresponding vehicle on-board units VOBU, said vehicle on-board units VOBU allowing messages to be sent and received through said Vehicle-to-Everything V2X communication channel.
  • The initial configuration of the ego motorcycle on-board unit MOBU is the one commonly known in the state of the art for the vehicle on-board unit VOBU.
  • The ego motorcycle on-board unit MOBU used in the invention is further configured to send via the vehicle bus VB the data received from other vehicles' vehicle on-board units VOBU to the ego motorcycle on-board detection processing unit DPU.
  • It is known that autonomous vehicles are provided with vehicle on-board units VOBU. Additionally, it is known that each autonomous vehicle must be provided with an autonomous vehicle identifier AVI.
  • The ego motorcycle on-board unit MOBU is configured to read the autonomous vehicle identifier AVI of each autonomous vehicle.
  • The ego motorcycle on-board unit MOBU is configured to receive periodically from other vehicles' vehicle on-board units VOBU, including those of autonomous vehicles, the following information for each vehicle: geographical position; speed; acceleration signal on the three axes.
  • The ego motorcycle on-board unit MOBU is configured to receive periodically, apart from the above-mentioned information, the following information for each autonomous vehicle for which the autonomous vehicle identifier was received and read: targeted path; estimation of position, speed and acceleration in a subsequent pre-determined prediction interval.
  • Fig. 2 schematically places the ego motorcycle on-board unit MOBU in the same category as the sensors, as the data received by the ego motorcycle on-board unit MOBU about other vehicles provided with a vehicle on-board unit VOBU, including autonomous vehicles, is used by the method according to the invention in the same way as the information from the other sensors.
  • The ego motorcycle on-board detection processing unit DPU is specially configured to carry out the detection of the autonomous vehicles. Whilst all the other components of the system according to the invention already exist either on the ego motorcycle or on the ego motorcycle rider's smart helmet SH, being specially adapted for the invention, the ego motorcycle on-board detection processing unit DPU does not exist in the absence of the invention.
  • The ego motorcycle on-board detection processing unit DPU is configured to receive input from all the categories of sensors, to detect autonomous vehicles driving in the ego motorcycle's field of view, and to send said result of detection of the autonomous vehicles to the smart helmet SH visor in order to be displayed, as will be further detailed in the description of the steps of the method.
  • The ego motorcycle on-board detection processing unit DPU comprises a dedicated processor having a processing power above 1000 MHz and a capacity to store information of at least 1-2 Gb, and also comprises at least one non-volatile memory.
  • The ego motorcycle on-board detection processing unit DPU is placed in the motorcycle, for example in a location suitable for electronic control units.
  • The at least one ego motorcycle bus system BS is configured to interconnect all the components of the system: the advanced driver assistance camera ADASC, the radar module RM, the gyroscope GYRO, the GPS sensor GPSS, the acceleration sensor ACC, the speedometer SPEEDO, the ego motorcycle on-board unit MOBU, the ego motorcycle on-board detection processing unit DPU and the smart helmet SH visor.
  • The at least one ego motorcycle bus system BS may be any kind of vehicle bus used in the automotive industry, using communication protocols such as, but not limited to, CAN bus, FlexRay, Ethernet or Bluetooth. Depending on the particular configuration of said components of the system, more than one motorcycle bus system BS may be used to interconnect various components of the system, including the case where a different communication protocol is used for each bus system BS.
  • The method for detecting and displaying the presence of autonomous vehicles driving in the field of view of an ego motorcycle rider using the ego motorcycle on-board awareness raising system consists of a sequence of eight steps carried out at regular intervals of time tn when said motorcycle rider is in traffic.
  • A non-limiting example of the regular interval of time tn is between 20 ms and 100 ms.
  • In the first step, the ego motorcycle on-board unit MOBU broadcasts messages through the Vehicle-to-Everything V2X communication channel, said messages having the purpose of sending to other vehicles provided with corresponding vehicle on-board units VOBU data about the position of the ego motorcycle, of gathering data about the presence of said other vehicles provided with corresponding vehicle on-board units VOBU, and of gathering data about which of said other vehicles are provided with a corresponding autonomous vehicle identifier AVI.
  • The broadcasting is carried out regularly, covering a pre-determined broadcasting range.
  • In a non-limiting example, the broadcast is carried out every 100 ms and the broadcasting range is around 200 m around the ego motorcycle in all directions, thus including the field of view.
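The periodic broadcast of the first step can be sketched as follows (an illustrative Python stand-in; the payload field names and class names are our own, not a real V2X API):

```python
import time

BROADCAST_PERIOD_S = 0.100   # every 100 ms, per the non-limiting example
BROADCAST_RANGE_M = 200.0    # nominal range around the ego motorcycle

def build_broadcast_message(position, speed_mps, acceleration_xyz):
    """Assemble the payload the MOBU would broadcast: the ego motorcycle's
    position plus its motion state (field names are illustrative)."""
    return {
        "position": position,              # (latitude, longitude)
        "speed_mps": speed_mps,
        "acceleration_xyz": acceleration_xyz,
        "timestamp": time.time(),
    }

class BroadcastScheduler:
    """Decide, each tick, whether the 100 ms broadcast period has elapsed."""
    def __init__(self, period_s=BROADCAST_PERIOD_S):
        self.period_s = period_s
        self._last = None

    def due(self, now_s):
        """Return True (and restart the period) when a broadcast is due."""
        if self._last is None or now_s - self._last >= self.period_s:
            self._last = now_s
            return True
        return False
```

In use, the MOBU's main loop would call `due()` on every tick and broadcast `build_broadcast_message(...)` only when it returns True.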
  • In the second step, the ego motorcycle on-board detection processing unit DPU receives, via the at least one ego motorcycle bus system BS, signals as input data from the sensors:
  • Sensors send data to the ego motorcycle on-board detection processing unit DPU at different rates, depending on the specific configurations of the components. It is not mandatory for the invention that the rates be synchronized.
  • In the third step, the ego motorcycle on-board detection processing unit DPU performs the processing of the video stream acquired in step 2 from the advanced driver assistance systems camera ADASC, including steps such as:
  • Image segmentation and labelling aim to reveal the relevant objects in the image corresponding to each video stream.
  • The relevant objects are all vehicles in the field of view, irrespective of whether or not they are provided with corresponding vehicle on-board units VOBU.
  • The processing of the video stream is carried out at the same rate as the acquisition rate of said video stream.
  • The resulting processed video stream with the relevant objects labelled does not yet contain the detection of the autonomous vehicles.
  • In the fourth step, the ego motorcycle on-board detection processing unit DPU creates a fused environment road model based on the processed video stream of the third step and on the data received from the radar module RM in the second step.
  • The fused environment road model thus contains more information than the processed video stream, as the distance to the relevant objects revealed in the processed video stream is now added; still, the detection of the autonomous vehicles is not yet carried out.
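One plausible way to attach radar distances to the camera-labelled objects is to pair them by bearing; the sketch below is our own illustration of such a fusion step, with illustrative data structures rather than the patent's actual interfaces:

```python
def fuse_camera_radar(labelled_objects, radar_returns, max_bearing_diff_deg=3.0):
    """Attach a radar distance to each camera-labelled object by pairing the
    object's image bearing with the closest radar return in azimuth.
    Illustrative structures:
      labelled_objects: list of dicts with a 'bearing_deg' key
      radar_returns:    list of (azimuth_deg, distance_m) tuples
    Returns the fused environment road model: the same objects, each with a
    'distance_m' key added when a radar match exists, else None."""
    fused = []
    for obj in labelled_objects:
        best = None  # (bearing difference, distance) of the closest match
        for azimuth_deg, distance_m in radar_returns:
            diff = abs(obj["bearing_deg"] - azimuth_deg)
            if diff <= max_bearing_diff_deg and (best is None or diff < best[0]):
                best = (diff, distance_m)
        fused.append({**obj, "distance_m": best[1] if best else None})
    return fused
```

Objects with no radar return within the bearing tolerance keep `distance_m = None`, so downstream steps can tell fused from camera-only detections.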
  • In the fifth step, the ego motorcycle on-board detection processing unit DPU applies a simultaneous localization and mapping SLAM algorithm based on the fused environment road model created in step 4 and on the following input data received in step 2:
  • The simultaneous localization and mapping SLAM algorithm includes correlation by the ego motorcycle on-board detection processing unit DPU with the data received from the sensors: the inclination of the helmet received from the gyroscope GYRO, the geographical position from the GPS sensor GPSS, the lateral and longitudinal acceleration signal and the yaw rate from the acceleration sensor ACC, and the speed from the speedometer SPEEDO.
  • One non-limiting example is the correlation of the inclination of the helmet received from the gyroscope GYRO with the results provided by the radar module RM that are already processed in the fused environment road model: when the motorcycle rider moves his head, the inclination of the smart helmet SH changes and so does the video acquired by the camera.
  • The result of this step is the simultaneous localization and mapping of the motorcycle rider and his smart helmet SH in the fused environment road model, which includes information about the distances to all vehicles driving in the field of view of the motorcycle rider as measured by the radar module RM; still, the detection of the autonomous vehicles is not yet carried out.
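The localization half of this step and the gyroscope correlation can be sketched as follows (an illustrative Python stand-in, not the patent's SLAM algorithm; function and parameter names are our own):

```python
import math

def update_ego_pose(pose, speed_mps, yaw_rate_dps, dt_s):
    """Dead-reckon the ego motorcycle pose one time step forward from the
    speedometer speed and the yaw rate, as a stand-in for the localization
    part of the SLAM step. pose = (x_m, y_m, heading_deg)."""
    x, y, heading_deg = pose
    heading_deg += yaw_rate_dps * dt_s
    heading_rad = math.radians(heading_deg)
    x += speed_mps * dt_s * math.cos(heading_rad)
    y += speed_mps * dt_s * math.sin(heading_rad)
    return (x, y, heading_deg)

def camera_bearing_to_road_frame(bearing_deg, helmet_yaw_deg):
    """Correlate the gyroscope's helmet yaw with the camera bearings: when
    the rider turns the head by helmet_yaw_deg, every camera bearing shifts
    by the same angle, so adding the helmet yaw expresses an object's
    bearing in the road/motorcycle frame."""
    return bearing_deg + helmet_yaw_deg
```

With the camera bearings rebased into the road frame, they can be compared consistently against the forward-facing radar returns regardless of head movement.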
  • In the sixth step, the ego motorcycle on-board detection processing unit DPU detects in the video stream of step 3 the autonomous vehicles provided with the corresponding autonomous vehicle identifier AVI.
  • The ego motorcycle on-board detection processing unit DPU compares and correlates the data regarding the simultaneous localization and mapping of the motorcycle rider and his smart helmet SH in the fused environment road model, as resulting from step 5, with the data received in step 2 from the ego motorcycle on-board unit MOBU regarding the other vehicles provided with corresponding vehicle on-board units VOBU, including autonomous vehicles.
  • The autonomous vehicles detected are then marked by known marking techniques on the processed video stream.
  • The result of this step is a processed video stream with the detected autonomous vehicles marked, driving in the field of view of the motorcycle rider.
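The comparison between fused-model objects and V2X reports might look like the following sketch (our own illustration; the data structures, the match radius and the function name are assumptions, not the patent's method):

```python
import math

def mark_autonomous_vehicles(fused_objects, v2x_reports, match_radius_m=3.0):
    """Mark which labelled objects in the fused environment road model are
    autonomous: an object is marked when a V2X report carrying an AVI places
    a vehicle within match_radius_m of the object's fused position.
    Illustrative structures:
      fused_objects: list of dicts with an 'xy_m' (x, y) position key
      v2x_reports:   list of dicts with 'xy_m' and a boolean 'has_avi'."""
    marked = []
    for obj in fused_objects:
        is_autonomous = any(
            rep["has_avi"]
            and math.dist(obj["xy_m"], rep["xy_m"]) <= match_radius_m
            for rep in v2x_reports
        )
        marked.append({**obj, "autonomous": is_autonomous})
    return marked
```

Objects matched to an AVI-bearing report get `autonomous = True` and would then be highlighted on the processed video stream.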
  • In the seventh step, corrections are applied to the detection carried out in the previous step.
  • The correction of the position marked for each detected autonomous vehicle is carried out by using a general kinematic estimation algorithm taking into account the values of the subsequent pre-determined prediction interval, and has the purpose of ensuring a greater accuracy of the marking of the detected autonomous vehicles on the processed video stream.
  • The measurement of the radar module RM is considered the more accurate, and the determinations of the GPS sensor GPSS must be adjusted to match the measurements of the radar module RM.
  • The result of this step is a corrected video stream with the detected and marked autonomous vehicles driving in the field of view of the motorcycle rider.
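A minimal sketch of the correction step, assuming a constant-acceleration kinematic model over the prediction interval and the radar-wins rule stated above (function names are ours):

```python
def predict_position(x0_m, v0_mps, a_mps2, dt_s):
    """Constant-acceleration kinematic estimate over the pre-determined
    prediction interval, one axis shown: x = x0 + v0*t + 0.5*a*t^2."""
    return x0_m + v0_mps * dt_s + 0.5 * a_mps2 * dt_s ** 2

def correct_marked_distance(gps_distance_m, radar_distance_m):
    """Per the text, the radar measurement is considered the more accurate,
    so the GPS-derived distance is adjusted to match the radar one."""
    return radar_distance_m
```

For example, a vehicle at 10 m moving at 5 m/s with 2 m/s² acceleration is predicted at 16 m one second later, and any GPS-derived distance for it is replaced by the radar reading.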
  • In the eighth step, the ego motorcycle on-board detection processing unit DPU sends, via the at least one ego motorcycle bus system BS, to the smart helmet SH visor the corrected video stream with the detected and marked autonomous vehicles driving in the field of view of the motorcycle rider.
  • The smart helmet SH visor displays, in a manner understandable to the motorcycle rider, the autonomous vehicles provided with a corresponding autonomous vehicle identifier AVI.
  • In a third aspect, a computer program is provided which, when executed on an ego motorcycle on-board detection processing unit DPU of any of the preferred embodiments, causes said ego motorcycle on-board detection processing unit DPU to execute steps 2 to 7 of the method.

Description

    Field of the invention
  • The invention is related to increasing road safety. In particular, the invention is related to a system and a method for increasing the awareness of motorcycle riders with respect to the presence of autonomous vehicles driving in their field of view, as well as to a computer program for carrying out steps of the method. By increasing the awareness of motorcycle riders, the safety of road participants is increased.
  • Background of the invention
  • It is known that motorcycle riders are more vulnerable to the consequences of road accidents than persons travelling in vehicles, as the motorcycle, by its very construction, is less stable and offers less protection to the rider in case of an accident than a vehicle with at least four wheels.
  • One way to reduce the vulnerability of motorcycle riders is by increasing their awareness with respect to other traffic participants.
  • For example, the invention US 5251333 A presents a simplified display system that is mounted on the helmet of the motorcycle rider.
  • The invention US 20130305437 A1 proposes a helmet with a look-down micro-display that projects a virtual image in-line with the helmet's chin bar.
  • The invention US 8638237 B2 discloses a system that alerts a vehicle driver about a motorcycle approaching from the rear. The system consists of a unit located in the car and a unit located on the motorcycle. The second unit transmits signals toward the traveling lane of the motorcycle, and the car unit is responsible for receiving the transmitted signals and alerting the driver of the motorcycle approaching from the rear.
  • The invention disclosed in US 20140273863 A1 provides a system which establishes communication between a smart helmet and a mobile phone/communicator. It consists of a computer processor, a microphone and one speaker, all connected and integrated in the helmet, to be used by the motorcycle rider for mobile calls.
  • The invention US 20160075338 A1 presents a safety device for motorcycle riders that includes a safety helmet and a camera device, which is situated on the motorcycle rider's helmet; the camera device is connected to a warning device that outputs a warning as a function of data collected by the camera device related to the state of the motorcycle rider, e.g., fatigue monitoring.
  • The invention US 20170176746 A1 presents a system with one or more cameras physically coupled to a helmet, where each camera is configured to generate a video feed which is presented to a user by projecting it onto a surface, such as the visor of the helmet, thereby enabling enhanced situational awareness of the surroundings for the user of the helmet. However, no processing is done on the received images, which are presented directly as captured by the cameras.
  • The invention US 10,219,571 discloses a motorcycle helmet comprising a plurality of electronic components, including internally mounted sensors for detecting objects present in a blind spot of a wearer.
  • The invention JP 2017004426 A describes a traffic safety system which includes a traffic light system with a plurality of wireless tags that measures the distance and position of the communication devices detected via the radio tags owned by pedestrians, bicycles, motorcycles, wheelchairs and automobiles in its proximity, and then sends these positions to the other participants that have radio tags. The system operates based on a combination of sensors using radio signals and laser/LED light emission to detect the position of the surrounding vehicles and to communicate it via V2X (Vehicle-to-X). The traffic information is displayed by means of a head-up display HUD that associates the received position information with the actual position on the road surface.
  • One special category of traffic participants refers to the autonomous vehicles.
  • There is an increasing interest in allowing autonomous vehicles to drive in many world jurisdictions. The advanced driver assistance systems used by autonomous vehicles are still far from providing complete and accurate information that would enable autonomous vehicles to be at least as safe as human-driven vehicles.
  • Disadvantages of prior art
  • Prior art does not disclose a solution capable of distinguishing among the vehicles driving in the ego motorcycle rider's field of view which vehicle is autonomous and which one is not. Fig. 1 depicts a plurality of vehicles driving in the ego motorcycle rider's field of view without any hint of which vehicle among them is autonomous.
  • It is a disadvantage that the prior art does not disclose any solution for detecting the autonomous vehicles driving in the motorcycle rider's field of view, because said motorcycle rider cannot take precautions with respect to said vehicles if he so desires.
  • Problem solved by the invention
  • The problem solved by the invention is to provide a system and a method of detecting and displaying the presence of autonomous vehicles driving in the field of view of the ego motorcycle rider, for the purpose of alerting said ego motorcycle rider about such presence and enabling him to decide on the precautions to take with respect to the detected autonomous vehicles.
  • Summary of the invention
  • In order to solve the problem, the inventors conceived, in a first aspect of the invention, an ego motorcycle on-board awareness raising system placed in an ensemble consisting of an ego motorcycle and an ego motorcycle rider smart helmet, said smart helmet comprising an advanced driver assistance systems camera acquiring video images with the field of view facing forward, and said smart helmet comprising a smart helmet visor for displaying said video images to the motorcycle rider as well as other awareness information, said ego motorcycle on-board awareness raising system further comprising:
    • a radar module placed in front of the ego motorcycle facing forward, configured to measure the distance between the ego motorcycle and the vehicles driving in the ego motorcycle's field of view and configured to send the measurements to an ego motorcycle on-board detection processing unit;
    • a gyroscope placed within the smart helmet configured to determine the direction and inclination of the field of view of the smart helmet depending on the direction in which the rider looks and configured to send the measurements to the ego motorcycle on-board detection processing unit;
    • a GPS sensor placed in the ego motorcycle configured to determine the geographical position of the ego motorcycle and configured to send the measurements to the ego motorcycle on-board detection processing unit;
    • an acceleration sensor placed in the ego motorcycle, configured to measure the acceleration signal of the ego motorcycle on three axes X, Y, Z and configured to send the measurements to the ego motorcycle on-board detection processing unit;
    • a speedometer placed in the ego motorcycle, configured to measure the speed of the ego motorcycle and configured to send the measurements to the ego motorcycle on-board detection processing unit;
    • an ego motorcycle on-board unit configured:
      • to communicate via the Vehicle-to-Everything V2X communication channel with other vehicles provided with a corresponding vehicle on-board unit for the purpose of receiving the geographical position, speed and acceleration signal on three axes X, Y and Z of each said other vehicle provided with a corresponding vehicle on-board unit VOBU;
      • to receive and read an autonomous vehicle identifier sent via the Vehicle-to-Everything V2X communication channel by the corresponding vehicle on-board unit of each respective autonomous vehicle together with the targeted path and estimation of position, speed and acceleration on three axes X, Y and Z in a subsequent pre-determined prediction interval for each autonomous vehicle for which the autonomous vehicle identifier was received and read;
      • to send the measurements to the ego motorcycle on-board detection processing unit;
    • the ego motorcycle on-board detection processing unit configured to carry out detection of the autonomous vehicles driving in the ego motorcycle's field of view and further configured to send said result of detection to the smart helmet;
    • at least one ego motorcycle bus system, configured to interconnect the advanced driver assistance camera, the radar module, the gyroscope, the GPS sensor, the acceleration sensor, the speedometer, the ego motorcycle on-board unit, the ego motorcycle on-board detection processing unit and the smart helmet visor.
  • In a second aspect of the invention, it is provided a method for detecting and displaying presence of autonomous vehicles driving in the field of view of an ego motorcycle rider wearing an ego motorcycle rider smart helmet provided with a smart helmet visor, using the ego motorcycle on-board awareness raising system having the following repetitive sequence of eight steps of the method carried out at regular intervals of time tn:
    • Step 1 Broadcasting by an ego motorcycle on-board unit of messages through Vehicle-to-Everything V2X communication channel, said messages having the purpose to send to other vehicles provided with corresponding vehicle on-board units data about the position of the ego motorcycle and having the purpose to gather data about the presence of said other vehicles provided with corresponding vehicle on-board units and to gather data about which ones of said other vehicles are provided with a corresponding autonomous vehicle identifier.
    • Step 2 Receiving by an ego motorcycle on-board detection processing unit of input data from the sensors via the at least one ego motorcycle bus system:
      • video streams acquired from an advanced driver assistance systems camera;
      • distances to all vehicles situated in front of the ego vehicle from a radar module provided that said all vehicles are driving within the range of the radar module;
      • speed of the ego motorcycle from a speedometer;
      • acceleration signals from X, Y and Z axes of the ego motorcycle from an accelerometer;
      • orientation of the smart helmet of the ego motorcycle from a gyroscope;
      • ego motorcycle geographical position from a GPS sensor;
      • data from the ego motorcycle on-board unit regarding the geographical position, speed and acceleration signals from X, Y and Z axes of other vehicles provided with corresponding vehicle on-board units including autonomous vehicles;
      • data from the ego motorcycle on-board unit for each autonomous vehicle: the autonomous vehicle identifier, the targeted path and the estimation of position, speed and acceleration signals from X, Y and Z axes in a subsequent pre-determined prediction interval.
    • Step 3 Performing by the ego motorcycle on-board detection processing unit (DPU) the processing of video stream acquired in step 2 from the advanced driver assistance systems camera:
      • Applying image segmentation to the video stream for the purpose of identifying in it all the relevant objects;
      • Labelling of the relevant objects in the segmented image, resulting in a processed video stream with the relevant objects labelled.
    • Step 4 Creating by the ego motorcycle on-board detection processing unit of a fused environment road model based on the previous processed video stream and on the data received from the radar module.
    • Step 5 Based on the fused road environment model of step 4 and based on part of the input data received in step 2:
      • inclination of the helmet received from the gyroscope;
      • geographical position from the GPS sensor;
      • the lateral, longitudinal acceleration signal and the yaw rate from the accelerometer and
      • speed from the speedometer,
      applying by the ego motorcycle on-board detection processing unit of a simultaneous localization and mapping algorithm with the purpose of localizing the ego motorcycle in the fused road environment model and localizing the corresponding orientation of the smart helmet,
      resulting in the simultaneous localization and mapping of the motorcycle rider and its smart helmet (SH) in the fused environment road model.
    • Step 6 Based on:
      • the processed video stream with the relevant objects labelled of step 3;
      • the simultaneous localization and mapping of the motorcycle rider and its smart helmet SH in the fused environment road model of step 5 and
      • on the data from the ego motorcycle on-board unit received in step 2 regarding the other vehicles provided with corresponding vehicle on-board units, including autonomous vehicles;
      • comparing and correlating the data regarding simultaneous localization and mapping of the motorcycle rider and its smart helmet in the fused environment road model as resulted from step 5 with the data received in step 2 from the ego motorcycle on-board unit regarding the other vehicles provided with corresponding vehicle on-board unit including autonomous vehicles,
      • detecting in said processed video stream of each autonomous vehicle provided with the corresponding autonomous vehicle identifier, and
      • marking each detected autonomous vehicle in the processed video stream,
      resulting in a processed video stream with detected marked autonomous vehicles driving in the field of view of the motorcycle rider.
    • Step 7 Applying correction to the marking of each detected autonomous vehicle in the processed video stream for ensuring a greater accuracy of marking on the processed video stream of the detected autonomous vehicles and sending via the ego motorcycle bus system data regarding said marked autonomous vehicles to the smart helmet visor,
      resulting in a corrected video stream with detected marked autonomous vehicles driving in the field of view of the motorcycle rider.
    • Step 8 Displaying on the smart helmet visor the marked autonomous vehicles in a manner understandable by the motorcycle rider.
  • In a third aspect of the invention it is provided a computer program comprising instructions which, when the program is executed on an ego motorcycle on-board detection processing unit, cause said ego motorcycle on-board detection processing unit to execute steps 2 to 7 of the method.
  • Advantages of the invention
  • The main advantages of this invention are the following:
    • Providing the motorcycle rider with the possibility to become aware of the autonomous vehicles that drive in his field of view so that he can take special precautions.
    • Carrying out all computations on-board, in the ego motorcycle on-board detection processing unit DPU, without the need for any infrastructure components such as radio-frequency tags, smart traffic signs or light-emitting instruments, which makes the system and the method a self-contained solution.
    Brief description of the drawings
    • Fig. 1 refers to the representation on the smart helmet visor of the vehicles driving in the motorcycle rider's field of view in the prior art;
    • Fig. 2 refers to a diagram of the system and method according to the invention;
    • Fig. 3 refers to the representation on the smart helmet visor of the vehicles driving in the motorcycle rider's field of view with the displaying of the autonomous vehicles as a consequence of applying the method of the invention in the system according to the invention.
  • List of references in the drawings:
  • SH - Smart helmet of the motorcycle rider
  • ADASC - Advanced driver assistance systems camera
  • RM - Radar module
  • GYRO - Gyroscope
  • GPSS - Global Positioning System (GPS) sensor
  • ACC - Acceleration sensor
  • SPEEDO - Speedometer
  • MDU - Motorcycle Dynamics Unit (= Acceleration sensor + Speedometer)
  • DPU - On-board detection processing unit of the ego motorcycle
  • V2X - Vehicle-to-Everything communication channel
  • MOBU - Ego motorcycle on-board unit for communicating through the Vehicle-to-Everything communication channel
  • VOBU - Vehicle on-board unit for communicating through the Vehicle-to-Everything communication channel
  • AVI - Autonomous vehicle identifier
  • tn - regular intervals of time for the sequences of the method.
    Detailed description and example of realization
  • With reference to Fig. 2, the ego motorcycle on-board awareness raising system is placed in an ensemble consisting in an ego motorcycle and an ego motorcycle rider smart helmet SH. Some components of the system are being placed in the ego motorcycle, whereas other components are being placed in the smart helmet SH of the motorcycle rider, as it will be hereafter detailed.
  • Said ego motorcycle rider's smart helmet SH comprises an advanced driver assistance systems camera ADASC, alternatively called camera, acquiring video images and a smart helmet SH visor for displaying said video images as well as other awareness information such as images and/or warning messages addressed to the motorcycle rider.
  • The advanced driver assistance systems camera ADASC used in this invention has at least the following characteristics: a resolution of 2 MP (megapixels) and a rate of 30 fps (frames per second). Said camera is mounted in/on the motorcycle rider's smart helmet SH in a known way.
  • The advanced driver assistance systems camera ADASC used in this invention has the field of view facing forward, that is, in the direction of movement of the motorcycle rider. Inclining and/or rotating the head of the motorcycle rider changes the field of view accordingly.
  • Said ego motorcycle on-board awareness raising system according to the invention further comprises the following components:
    • a radar/radar module RM;
    • a gyroscope GYRO;
    • a GPS sensor GPSS;
    • an acceleration sensor ACC;
    • a speedometer SPEEDO;
    • an ego motorcycle on-board unit MOBU;
    • an ego motorcycle on-board detection processing unit DPU;
    • at least one ego motorcycle bus system BS.
  • The definition of the field of view of the motorcycle rider facing forward includes the angle of the field of view and the distances with respect to the vehicles driving ahead of the motorcycle rider. Both the angle and the distances are pre-determined depending on the characteristics of the camera and of the radar module RM, respectively. The angle of the field of view may be up to 150° inclusively, measured in the horizontal plane.
  • The radar module RM is placed in front of the ego motorcycle facing forward. It is configured to measure the distance between the ego motorcycle and the vehicles driving in the ego motorcycle's field of view, said distance typically ranging between 5 m and 200 m inclusively. Any radar used in the automotive industry may be used in the invention provided that it is configured to send the measurements to the ego motorcycle on-board detection processing unit DPU.
  • The gyroscope GYRO is placed within the smart helmet SH. It is configured to determine the direction and inclination of the field of view of the smart helmet depending on the direction in which the rider looks. Any gyroscope used in the automotive industry may be used in the invention, provided that it fits within the smart helmet SH and provided that it is configured to send the result of determinations to the ego motorcycle on-board detection processing unit.
  • The GPS sensor GPSS is configured to determine the geographical position of the ego motorcycle. Any GPS sensor used in the automotive industry may be used in the invention, provided that it is configured to send the result of determinations to the ego motorcycle on-board detection processing unit DPU.
  • In a preferred embodiment, the GPS sensor GPSS may be placed in the ego motorcycle as a component of said motorcycle as provided by the manufacturer of motorcycles.
  • In an alternative preferred embodiment, ego motorcycle is not provided with GPS sensor GPSS, but the ego motorcycle on-board unit MOBU is provided by its manufacturer with a GPS sensor GPSS. In this case, said GPS sensor GPSS of the ego motorcycle on-board unit MOBU is the one configured to determine the geographical position of the ego motorcycle and to send the result of determinations to the ego motorcycle on-board detection processing unit DPU.
  • In an alternative preferred embodiment, neither ego motorcycle nor ego motorcycle on-board unit MOBU is provided with GPS sensor GPSS. In this case, the GPS sensor GPSS may be provided by a rider's smartphone configured to determine the geographical position of the ego motorcycle and to send the result of determinations to the ego motorcycle on-board detection processing unit DPU.
  • The multiple examples of GPS sensor GPSS as shown in the alternative embodiments above have the advantage of providing flexibility to the system and allowing it to be used in a wider range of situations, irrespective of whether the ego motorcycle is provided with its built-in GPS sensor GPSS.
  • The acceleration sensor ACC is placed in the ego motorcycle in the usual place(s). It is configured to measure the acceleration signal of the ego motorcycle on all three axes X, Y and Z. Any acceleration sensor used in the automotive industry measuring the acceleration signal on all three axes may be used in the invention, provided that it is configured to send the measurements to the ego motorcycle on-board detection processing unit DPU.
  • The speedometer SPEEDO is placed in the ego motorcycle in the usual place(s). It is configured to measure the speed of the ego motorcycle. Any speedometer used in the automotive industry may be used in the invention provided that it is configured to send the measurements to the ego motorcycle on-board detection processing unit DPU.
  • It is possible to combine the acceleration sensor ACC and the speedometer SPEEDO in a single sensor, shown in Fig. 2 as the motorcycle dynamics unit MDU. This saves space, an advantage given the general shortage of space on a motorcycle, and additionally synchronizes the rate at which the acceleration sensor ACC and the speedometer SPEEDO provide data.
  • The use of the motorcycle dynamics unit MDU can be combined with any of the possibilities of use of the GPS sensor GPSS within the system, having the advantage of flexibility.
  • The ego motorcycle on-board unit MOBU is configured to communicate via the Vehicle-to-Everything V2X communication channel with other vehicles, including other motorcycle riders.
  • In general, communication via the Vehicle-to-Everything V2X communication channel requires that all vehicles communicating be provided with corresponding vehicle on-board units VOBU, said vehicle on-board units VOBU allowing the sending and receiving of messages through said Vehicle-to-Everything V2X communication channel.
  • The initial configuration of the ego motorcycle on-board unit MOBU is the one commonly known in the state of art for the vehicle on-board unit VOBU. The ego motorcycle on-board unit MOBU used in the invention is further configured to send via the vehicle bus VB the data received from other vehicles' vehicle on-board unit VOBU to the ego motorcycle on-board detection processing unit DPU.
  • It is known that autonomous vehicles are provided with vehicle on-board units VOBU. Additionally, it is known that each autonomous vehicle must be provided with an autonomous vehicle identifier AVI.
  • In the invention, the ego motorcycle on-board unit MOBU is configured to read the autonomous vehicle identifier AVI of each autonomous vehicle.
  • The ego motorcycle on-board unit MOBU is configured to receive periodically from other vehicles' vehicle on-board units VOBU, including those of autonomous vehicles, the following information for each vehicle: geographical position; speed; acceleration signal on the three axes.
  • In addition, the ego motorcycle on-board unit MOBU is configured to receive periodically, apart from the above-captioned information, the following information referring to autonomous vehicles, for each autonomous vehicle for which the autonomous vehicle identifier was received and read: targeted path; estimation of position, speed and acceleration in a subsequent pre-determined prediction interval.
  • Fig. 2 places schematically the ego motorcycle on-board unit MOBU in the same category as the sensors, as the data received by the ego motorcycle on-board unit MOBU about other vehicles provided with vehicle on-board unit VOBU, including autonomous vehicles, is used by the method according to the invention in the same way as the information from the other sensors.
  • The ego motorcycle on-board detection processing unit DPU is specially configured to carry out the detection of the autonomous vehicles. Whilst all the other components of the system according to the invention already exist either on the ego motorcycle or on the ego motorcycle rider smart helmet SH, being specially adapted for the invention, the ego motorcycle on-board detection processing unit DPU does not exist in the absence of the invention.
  • The ego motorcycle on-board detection processing unit DPU is configured to receive input from all the categories of sensors, configured to detect autonomous vehicles driving in the ego motorcycle's field of view and further configured to send said result of detection of the autonomous vehicles to the smart helmet SH visor in order to be displayed, as it will be further detailed in the description of the steps of the method.
  • In order to carry out the majority of the steps of the method, the ego motorcycle on-board detection processing unit DPU comprises a dedicated processor having a processing power above 1000MHz and a storage capacity of at least 1-2Gb, and also comprises at least one non-volatile memory.
  • The ego motorcycle on-board detection processing unit DPU is placed in the motorcycle, for example, in a location suitable for electronic control units.
  • The at least one ego motorcycle bus system BS is configured to interconnect all the components of the system: the advanced driver assistance camera ADASC, the radar module RM, the gyroscope GYRO, the GPS sensor GPSS, the acceleration sensor ACC, the speedometer SPEEDO, the ego motorcycle on-board unit MOBU, the ego motorcycle on-board detection processing unit DPU and the smart helmet SH visor.
  • The at least one ego motorcycle bus system BS may be any kind of vehicle bus used in the automotive industry, using communication protocols such as but not limited to: CAN bus, FlexRay, Ethernet, Bluetooth, etc. Depending on the particular configuration of said components of the system, more than one motorcycle bus system BS may be used to interconnect various components of the system, including the case when various communication protocols are used respectively for each bus system BS.
  • In a second aspect of the invention it is provided a method for detecting and displaying presence of autonomous vehicles driving in the field of view of an ego motorcycle rider using the ego motorcycle on-board awareness raising system. The method consists of sequences of 8 steps carried out at regular intervals of time tn when said motorcycle rider is in traffic.
  • The method is described below in connection with examples of values of the parameters. Said values shall be considered for exemplification only and shall not be considered limiting the invention, because the range of the parameters' values depend on the configuration of each of the components of the system.
  • A non-limiting example of regular intervals of time tn is between 20ms and 100ms.
  • In the first step of the method, the ego motorcycle on-board unit MOBU broadcasts messages through Vehicle-to-Everything V2X communication channel, said messages having the purpose to send to other vehicles provided with corresponding vehicle on-board units VOBU data about the position of the ego motorcycle and having the purpose to gather data about the presence of said other vehicles provided with corresponding vehicle on-board units VOBU and to gather data about which ones of said other vehicles are provided with a corresponding autonomous vehicle identifier AVI.
  • The broadcasting is carried out regularly covering a pre-determined broadcasting range. In a non-limiting example, broadcasting is carried out every 100ms and the broadcasting range is around 200m around the ego motorcycle in all directions, thus including the field of view.
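The step-1 broadcast can be sketched as a periodic message carrying the ego position and, where applicable, the autonomous vehicle identifier AVI. This is a minimal illustrative sketch only: the patent specifies no message format, so all field names and the JSON encoding below are assumptions.

```python
from dataclasses import dataclass, asdict
from typing import Optional
import json

# Illustrative V2X message; field names are assumptions, not taken
# from the patent or from any V2X standard.
@dataclass
class V2XMessage:
    sender_id: str
    lat: float                 # geographical position of the sender
    lon: float
    avi: Optional[str] = None  # autonomous vehicle identifier, if any

def encode_broadcast(msg: V2XMessage) -> bytes:
    """Serialize the message for broadcast over the V2X channel."""
    return json.dumps(asdict(msg)).encode("utf-8")

def decode_broadcast(payload: bytes) -> V2XMessage:
    """Rebuild a received message; a reply carrying an AVI flags an AV."""
    return V2XMessage(**json.loads(payload.decode("utf-8")))

# The ego motorcycle broadcasts its position roughly every 100 ms and
# collects replies; replies with avi != None identify autonomous
# vehicles within the broadcasting range.
ego = V2XMessage("MOBU-1", 47.16, 27.58)
assert decode_broadcast(encode_broadcast(ego)) == ego
```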
  • In the second step of the method, the ego motorcycle on-board detection processing unit DPU receives via the at least one ego motorcycle bus system BS signals as input data from the sensors:
    • video streams acquired from an advanced driver assistance systems camera ADASC. In a non-limiting example, said video streams are received at acquisition intervals of up to 33ms inclusively;
    • the distances to all vehicles situated in the range of a radar module RM from said radar module. In a non-limiting example, data from the radar module RM is received at equally-spaced time intervals of 50-60ms inclusively and refers to the distances to all the vehicles, irrespective of whether or not they are provided with corresponding vehicle on-board units VOBU, as the radar module RM is not configured to distinguish which vehicles are provided with vehicle on-board units VOBU;
    • speed of the ego motorcycle from a speedometer SPEEDO. In a non-limiting example, data about the speed is received at equally-spaced time intervals of 10ms;
    • acceleration signals from X, Y and Z axes of the ego motorcycle from an accelerometer ACC. In a non-limiting example, data about acceleration signals is received at equally-spaced time intervals of 10ms. It is customary in the automotive industry to synchronize the receiving of data from the speedometer SPEEDO and the accelerometer ACC, irrespective of whether the speedometer SPEEDO and the accelerometer ACC are combined in one sensor.
    • orientation of the smart helmet of the ego motorcycle from a gyroscope GYRO. In a non-limiting example, data about the orientation is received at equally-spaced time intervals of 10ms-15ms;
    • ego motorcycle geographical position from a GPS sensor GPSS. In a non-limiting example, data about geographical position is received at equally-spaced time intervals of 50-60ms inclusively;
    • data from the ego motorcycle on-board unit MOBU regarding the geographical position, speed and acceleration signals from X, Y and Z axes of other vehicles provided with corresponding vehicle on-board units VOBU including autonomous vehicles;
    • data from the ego motorcycle on-board unit MOBU for each autonomous vehicle: the autonomous vehicle identifier AVI, the targeted path and the estimation of position, speed and acceleration signals from X, Y and Z axes in the subsequent pre-determined prediction interval. Typically, the pre-determined prediction interval ranges between 10s and 20s from the moment data is sent by the ego motorcycle on-board unit MOBU and refers to a best prediction of the position, speed and acceleration signals from X, Y and Z axes of said autonomous vehicle at the end of said subsequent pre-determined prediction interval.
  • Sensors send data to the ego motorcycle on-board detection processing unit DPU at different rates, depending on the specific configurations of the components. It is not mandatory for the invention that the rates be synchronized.
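Since the sensor rates need not be synchronized, one way the DPU could consume these asynchronous inputs is to keep only the most recent sample of each sensor and read a consistent snapshot when processing starts. The class below is a minimal sketch under that assumption; names and structure are illustrative, not prescribed by the patent.

```python
# Illustrative buffer for asynchronous sensor inputs (step 2): each
# sensor updates at its own rate and the processing steps simply read
# the latest sample of each.
class SensorBuffer:
    def __init__(self):
        self._latest = {}  # sensor name -> (timestamp_ms, value)

    def update(self, sensor: str, timestamp_ms: int, value):
        # Keep only the newest sample; older out-of-order data is dropped.
        t, _ = self._latest.get(sensor, (-1, None))
        if timestamp_ms >= t:
            self._latest[sensor] = (timestamp_ms, value)

    def snapshot(self) -> dict:
        """Latest value of every sensor, regardless of its update rate."""
        return {name: v for name, (_, v) in self._latest.items()}

buf = SensorBuffer()
buf.update("SPEEDO", 10, 42.0)            # speed, every 10 ms
buf.update("GPSS", 50, (47.16, 27.58))    # position, every 50-60 ms
buf.update("SPEEDO", 20, 43.5)            # newer speed sample wins
assert buf.snapshot() == {"SPEEDO": 43.5, "GPSS": (47.16, 27.58)}
```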
  • In the third step of the method, the ego motorcycle on-board detection processing unit DPU performs the processing of video stream acquired in step 2 from the advanced driver assistance systems camera ADASC, including steps such as:
    • Applying image segmentation to the video stream for the purpose of identifying in it all the relevant objects;
    • Labelling of the relevant objects in the segmented image.
  • Image segmentation and labelling, in general, aim to reveal the relevant objects in the image corresponding to each video stream. In the invention, the relevant objects are all vehicles in the field of view, irrespective of whether or not they are provided with corresponding vehicle on-board units VOBU.
  • The processing of video stream is carried out with the same rate as the acquisition rate of said video stream. The resulting processed video stream with the relevant objects labelled does not contain yet the detection of the autonomous vehicles.
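The segment-then-label idea of step 3 can be illustrated with a toy connected-components pass over a binary grid. Real ADAS pipelines use far more sophisticated segmentation; this flood-fill sketch only shows how distinct regions of a frame receive distinct labels.

```python
# Toy illustration of step 3: segment a binary "image" into connected
# regions and give each region its own label.
def label_segments(image):
    h, w = len(image), len(image[0])
    labels = [[0] * w for _ in range(h)]
    next_label = 0
    for sy in range(h):
        for sx in range(w):
            if image[sy][sx] and not labels[sy][sx]:
                next_label += 1
                stack = [(sy, sx)]
                while stack:  # 4-connected flood fill
                    y, x = stack.pop()
                    if 0 <= y < h and 0 <= x < w and image[y][x] and not labels[y][x]:
                        labels[y][x] = next_label
                        stack += [(y + 1, x), (y - 1, x), (y, x + 1), (y, x - 1)]
    return labels, next_label

frame = [[1, 1, 0, 0],
         [0, 0, 0, 1],
         [0, 0, 1, 1]]
labels, n = label_segments(frame)
assert n == 2  # two relevant objects found in the frame
```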
  • In the fourth step of the method, the ego motorcycle on-board detection processing unit DPU creates a fused environment road model based on the processed video stream carried out in the third step and on the data received from the radar module RM in the second step.
  • The fused environment road model thus contains more information than the processed video stream, as the distance to the relevant objects revealed in the processed video stream is now added; still, the detection of the autonomous vehicles is not yet carried out.
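The fusion of step 4 can be sketched as attaching a radar distance to each labelled camera object, for instance by matching bearings. The matching rule and the 3 degree tolerance below are illustrative assumptions; the patent does not specify the fusion algorithm.

```python
# Illustrative step-4 fusion: camera objects carry only a bearing,
# radar returns carry bearing + distance; nearest-bearing matching
# attaches a distance to each labelled object.
def fuse(camera_objects, radar_returns, max_bearing_err_deg=3.0):
    """camera_objects: [{'label': .., 'bearing': deg}]
       radar_returns:  [{'bearing': deg, 'distance': m}]"""
    fused = []
    for obj in camera_objects:
        best = min(radar_returns,
                   key=lambda r: abs(r["bearing"] - obj["bearing"]),
                   default=None)
        if best and abs(best["bearing"] - obj["bearing"]) <= max_bearing_err_deg:
            fused.append({**obj, "distance": best["distance"]})
        else:
            fused.append({**obj, "distance": None})  # no radar confirmation
    return fused

cams = [{"label": 1, "bearing": -10.0}, {"label": 2, "bearing": 15.0}]
radar = [{"bearing": -9.2, "distance": 42.0}, {"bearing": 40.0, "distance": 120.0}]
model = fuse(cams, radar)
assert model[0]["distance"] == 42.0 and model[1]["distance"] is None
```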
  • In the fifth step of the method, based on the fused environment road model created in step 4 and based on the following input data received in step 2:
    • inclination of the smart helmet SH received from the gyroscope GYRO,
    • geographical position from the GPS sensor GPSS,
    • the lateral, longitudinal acceleration signal and the yaw rate from the accelerometer ACC and
    • speed from the speedometer SPEEDO,
    the ego motorcycle on-board detection processing unit DPU applies simultaneous localization and mapping SLAM algorithm with the purpose of localizing the ego motorcycle in the fused road environment model and localizing the corresponding orientation of the smart helmet SH.
  • Simultaneous localization and mapping SLAM algorithm includes correlation by the ego motorcycle on-board detection processing unit DPU with data received from the sensors: inclination of the helmet received from the gyroscope GYRO, geographical position from the GPS sensor GPSS, the lateral, longitudinal acceleration signal and the yaw rate from the accelerometer ACC and the speed from the speedometer SPEEDO. One non-limiting example includes the correlation of orientation of the inclination of the helmet received from the gyroscope GYRO with the results provided by the radar module RM that are already processed in the fused environment road model because when the motorcycle rider moves his head, the inclination of the smart helmet SH changes and the video acquired by the camera changes.
  • The result of this step is the simultaneous localization and mapping of the motorcycle rider and its smart helmet SH in the fused environment road model, which includes information about the distances to all vehicles driving in the field of view of the motorcycle rider as measured by the radar module RM; still, the detection of the autonomous vehicles is not yet carried out.
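As a highly simplified illustration of the localization half of step 5, the ego pose can be dead-reckoned from the speed and yaw rate and then blended with the GPS fix. A real SLAM algorithm also builds and updates the map and fuses the helmet orientation; the motion model and the gain below are made-up values for the sketch.

```python
import math

# Simplified stand-in for the localization part of step 5: dead-reckon
# the ego pose from speed and yaw rate, then pull it toward the GPS fix
# with a complementary-filter gain (illustrative value).
def update_pose(pose, speed, yaw_rate, dt, gps_xy, gps_gain=0.1):
    x, y, heading = pose
    heading += yaw_rate * dt             # yaw-rate integration
    x += speed * dt * math.cos(heading)  # motion model
    y += speed * dt * math.sin(heading)
    gx, gy = gps_xy                      # blend in the GPS fix
    x += gps_gain * (gx - x)
    y += gps_gain * (gy - y)
    return (x, y, heading)

pose = (0.0, 0.0, 0.0)
# 10 steps of 0.02 s at 20 m/s straight ahead; GPS agrees with motion.
for i in range(1, 11):
    pose = update_pose(pose, speed=20.0, yaw_rate=0.0, dt=0.02,
                       gps_xy=(0.4 * i, 0.0))
assert abs(pose[0] - 4.0) < 0.1 and abs(pose[1]) < 1e-9
```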
  • In the sixth step of the method, the ego motorcycle on-board detection processing unit DPU detects in the video stream of step 3 the autonomous vehicles provided with the corresponding autonomous vehicle identifier AVI.
  • In order to do so, the ego motorcycle on-board detection processing unit DPU compares and correlates the data regarding simultaneous localization and mapping of the motorcycle rider and its smart helmet SH in the fused environment road model as resulting from step 5 with the data received in step 2 from the ego motorcycle on-board unit MOBU regarding the other vehicles provided with corresponding vehicle on-board units VOBU, including autonomous vehicles. Specifically,
    • simultaneous localization and mapping of the motorcycle rider and its smart helmet SH refer to the localization of the motorcycle rider and its smart helmet SH in respect to all vehicles, autonomous or not, provided with vehicle on-board unit VOBU driving in the field of view of the motorcycle rider within the range of the radar module RM. In this case the distances from the motorcycle rider are measured by the radar module RM,
    • the data provided by the ego motorcycle on-board unit MOBU refer to the presence of only the vehicles provided with corresponding vehicle on-board unit VOBU including autonomous vehicles running within said pre-determined broadcasting range. In this case the distances from the motorcycle rider are determined as a difference between the GPS location of each vehicle provided with corresponding vehicle on-board unit VOBU and the GPS location of the ego motorcycle.
  • The comparison and the correlation have as result the detection in said processed video stream of the autonomous vehicles driving in the field of view of the motorcycle rider that satisfy both conditions:
    • the autonomous vehicles are placed within said pre-determined broadcasting range, because otherwise their autonomous vehicle identifier AVI cannot be read by the ego motorcycle on-board unit MOBU, and
    • the autonomous vehicles are placed within the range of the radar module RM, which is why said autonomous vehicles are found among the relevant objects labelled in step 3, based on which the fused environment road model was created in step 4, serving for the application of the simultaneous localization and mapping SLAM algorithm in step 5.
  • The autonomous vehicles detected are then marked by known marking techniques on the processed video stream. The result of this step is a processed video stream with detected marked autonomous vehicles driving in the field of view of the motorcycle rider.
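The comparison and correlation of step 6 can be sketched as matching each object of the fused model against the V2X reports: an object is marked as autonomous when a report carrying an AVI agrees with it within a tolerance. Matching on distance alone and the 5 m tolerance are illustrative assumptions; a real implementation would correlate richer state.

```python
# Illustrative step-6 correlation: compare the radar-measured distance
# of each fused-model object with the GPS-derived distance of each V2X
# report; a match against a report carrying an AVI marks the object.
def mark_autonomous(fused_objects, v2x_reports, tol_m=5.0):
    """fused_objects: [{'label': .., 'distance': m}]
       v2x_reports:   [{'avi': str or None, 'gps_distance': m}]"""
    marked = []
    for obj in fused_objects:
        hit = None
        for rep in v2x_reports:
            if rep["avi"] and abs(rep["gps_distance"] - obj["distance"]) <= tol_m:
                hit = rep["avi"]
                break
        marked.append({**obj, "avi": hit})  # avi=None -> not autonomous
    return marked

fused = [{"label": 1, "distance": 40.0}, {"label": 2, "distance": 90.0}]
v2x = [{"avi": "AV-7", "gps_distance": 42.5},  # autonomous vehicle
       {"avi": None, "gps_distance": 88.0}]    # ordinary VOBU vehicle
out = mark_autonomous(fused, v2x)
assert out[0]["avi"] == "AV-7" and out[1]["avi"] is None
```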
  • In the seventh step of the method, corrections are applied to the detection carried out in previous step.
  • Such corrections are needed for various reasons, such as but not limited to:
    • the sensors send information at different rates to the ego motorcycle on-board detection processing unit DPU;
    • during the time elapsed between carrying out step 1 and step 6 of the method, all vehicles and the ego motorcycle rider have changed position;
    • the distance is, on one hand, measured by the radar module RM and, on the other hand, determined as a difference of GPS locations as measured by the GPS sensor GPSS.
  • The correction of the position marked for each detected autonomous vehicle is carried out by using a general kinematic estimation algorithm taking into account the values of the subsequent pre-determined prediction interval, and has the purpose of ensuring a greater accuracy of marking the detected autonomous vehicles on the processed video stream. In case the discrepancy between the distance measured by the radar module RM and the distance determined as a difference of GPS locations as measured by the GPS sensor GPSS exceeds a pre-determined ratio, the measurement of the radar module RM is considered the more accurate one and the determinations of the GPS sensor GPSS must be adjusted to match the measurements of the radar module RM.
  • The result of this step is a corrected video stream with detected and marked autonomous vehicles driving in the field of view of the motorcycle rider.
  • Once the correction is carried out, the ego motorcycle on-board detection processing unit DPU sends, via the at least one ego motorcycle bus system BS, the corrected video stream with detected and marked autonomous vehicles driving in the field of view of the motorcycle rider to the smart helmet SH visor.
  • In the eighth step of the method, the smart helmet SH visor displays the autonomous vehicles provided with the corresponding autonomous vehicle identifier AVI in a manner understandable by the motorcycle rider.
  • Comparing Fig. 1, illustrating the image as displayed on the smart helmet SH visor in the absence of the invention, with Fig. 3, illustrating the image as displayed on the smart helmet SH visor of the motorcycle rider when applying the invention, one can see that in Fig. 3 the autonomous vehicles are represented with a dotted line, said dotted line being one non-limiting example of displaying the detected and marked autonomous vehicles in a manner understandable by the motorcycle rider.
  • In a third aspect of the invention, a computer program is provided which, when executed on an ego motorcycle on-board detection processing unit DPU of any of the preferred embodiments, causes said ego motorcycle on-board detection processing unit DPU to execute steps 2 to 7 of the method.
  • While the description of the invention was disclosed in detail in connection to preferred embodiments, those skilled in the art will appreciate that changes may be made to adapt a particular situation without departing from the scope of the invention as defined by the claims.

Claims (7)

  1. Ego motorcycle on-board awareness raising system placed in an ensemble consisting in an ego motorcycle and an ego motorcycle rider smart helmet (SH), said smart helmet (SH) comprising an advanced driver assistance systems camera (ADASC) acquiring video images having the field of view facing forward and said smart helmet (SH) comprising a smart helmet (SH) visor for displaying said video images to the motorcycle rider as well as other awareness information, characterized in that said ego motorcycle on-board awareness raising system further comprises:
    - a radar module (RM) placed in front of the ego motorcycle facing forward, configured to measure the distance between the ego motorcycle and the vehicles driving in the ego motorcycle's field of view and configured to send the measurements to an ego motorcycle on-board detection processing unit (DPU);
    - a gyroscope (GYRO) placed within the smart helmet (SH) configured to determine the direction and inclination of the field of view of the smart helmet (SH) depending on the direction in which the rider looks and configured to send the measurements to the ego motorcycle on-board detection processing unit (DPU);
    - a GPS sensor (GPSS) placed in the ego motorcycle configured to determine the geographical position of the ego motorcycle and configured to send the measurements to the ego motorcycle on-board detection processing unit (DPU);
    - an acceleration sensor (ACC) placed in the ego motorcycle, configured to measure the acceleration signal of the ego motorcycle on three axes (X, Y, Z) and configured to send the measurements to the ego motorcycle on-board detection processing unit (DPU);
    - a speedometer (SPEEDO) placed in the ego motorcycle, configured to measure the speed of the ego motorcycle and configured to send the measurements to the ego motorcycle on-board detection processing unit (DPU);
    - an ego motorcycle on-board unit (MOBU) configured:
    - to communicate via the Vehicle-to-Everything V2X communication channel with other vehicles provided with a corresponding vehicle on-board unit (VOBU) for the purpose of receiving the geographical position, speed and acceleration signal on three axes (X, Y, Z) of each said other vehicle provided with a corresponding vehicle on-board unit (VOBU);
    - to receive and read an autonomous vehicle identifier (AVI) sent via the Vehicle-to-Everything V2X communication channel by the corresponding vehicle on-board unit (VOBU) of each respective autonomous vehicle together with the targeted path and estimation of position, speed and acceleration on three axes (X, Y, Z) in a subsequent pre-determined prediction interval for each autonomous vehicle for which the autonomous vehicle identifier was received and read;
    - to send the measurements to the ego motorcycle on-board detection processing unit (DPU);
    - the ego motorcycle on-board detection processing unit (DPU) configured to carry out detection of the autonomous vehicles driving in the ego motorcycle's field of view and further configured to send said result of detection to the smart helmet (SH);
    - at least one ego motorcycle bus system (BS) configured to interconnect the advanced driver assistance camera (ADASC), the radar module (RM), the gyroscope (GYRO), the GPS sensor (GPSS), the acceleration sensor (ACC), the speedometer (SPEEDO), the ego motorcycle on-board unit (MOBU), the ego motorcycle on-board detection processing unit (DPU) and the smart helmet (SH) visor.
  2. Ego motorcycle on-board awareness raising system of claim 1, wherein the GPS sensor (GPSS) may be placed in the ego motorcycle as a component of said motorcycle.
  3. Ego motorcycle on-board awareness raising system of claim 1, wherein the GPS sensor (GPSS) is a component of the ego motorcycle on-board unit (MOBU).
  4. Ego motorcycle on-board awareness raising system of claim 1, wherein the GPS sensor (GPSS) is a component of a motorcycle rider's smartphone.
  5. The ego motorcycle on-board awareness raising system of any of the claims 1 to 4, wherein the acceleration sensor (ACC) and the speedometer (SPEEDO) are combined in a single sensor forming a motorcycle dynamics unit (MDU).
  6. Method for detecting and displaying presence of autonomous vehicles driving in the field of view of an ego motorcycle rider wearing an ego motorcycle rider smart helmet (SH) provided with a smart helmet (SH) visor, using the ego motorcycle on-board awareness raising system, characterized in that the following repetitive sequence of steps of the method is carried out at regular intervals of time tn:
    Step 1 Broadcasting by an ego motorcycle on-board unit (MOBU) of messages through Vehicle-to-Everything V2X communication channel, said messages having the purpose to send to other vehicles provided with corresponding vehicle on-board units (VOBU) data about the position of the ego motorcycle and having the purpose to gather data about the presence of said other vehicles provided with corresponding vehicle on-board units (VOBU) and to gather data about which ones of said other vehicles are provided with a corresponding autonomous vehicle identifier (AVI).
    Step 2 Receiving by an ego motorcycle on-board detection processing unit (DPU) of input data from the sensors via the at least one ego motorcycle bus system (BS):
    - video streams acquired from an advanced driver assistance systems camera (ADASC);
    - distances to all vehicles situated in front of the ego vehicle from a radar module (RM) provided that said all vehicles are driving within the range of the radar module (RM);
    - speed of the ego motorcycle from a speedometer (SPEEDO);
    - acceleration signals from X, Y and Z axes of the ego motorcycle from an accelerometer (ACC);
    - orientation of the smart helmet (SH) of the ego motorcycle from a gyroscope (GYRO);
    - ego motorcycle geographical position from a GPS sensor (GPSS);
    - data from the ego motorcycle on-board unit (MOBU) regarding the geographical position, speed and acceleration signals from X,Y and Z axes of other vehicles provided with corresponding vehicle on-board units (VOBU) including autonomous vehicles;
    - data from the ego motorcycle on-board unit (MOBU) for each autonomous vehicle: the autonomous vehicle identifier (AVI), the targeted path and the estimation of position, speed and acceleration signals from X,Y and Z axes in a subsequent pre-determined prediction interval.
    Step 3 Performing by the ego motorcycle on-board detection processing unit (DPU) the processing of video stream acquired in step 2 from the advanced driver assistance systems camera (ADASC):
    - Applying image segmentation to the video stream for the purpose of identifying in it all the relevant objects;
    - Labelling of the relevant objects in the segmented image, resulting in a processed video stream with the relevant objects labelled.
    Step 4 Creating by the ego motorcycle on-board detection processing unit (DPU) of a fused environment road model based on the previous processed video stream and on the data received from the radar module (RM).
    Step 5 Based on the fused road environment model of step 4 and based on part of the input data received in step 2:
    - inclination of the helmet received from the gyroscope (GYRO);
    - geographical position from the GPS sensor (GPSS);
    - the lateral, longitudinal acceleration signal and the yaw rate from the accelerometer (ACC) and
    - speed from the speedometer (SPEEDO),
    applying by the ego motorcycle on-board detection processing unit (DPU) of simultaneous localization and mapping (SLAM) algorithm with the purpose of localizing the ego motorcycle in the fused road environment model and localizing the corresponding orientation of the smart helmet (SH), resulting in the simultaneous localization and mapping of the motorcycle rider and its smart helmet (SH) in the fused environment road model.
    Step 6 Based on:
    - the processed video stream with the relevant objects labelled of step 3;
    - the simultaneous localization and mapping of the motorcycle rider and its smart helmet SH in the fused environment road model of step 5 and
    - on the data from the ego motorcycle on-board unit (MOBU) received in step 2 regarding the other vehicles provided with corresponding vehicle on-board unit (VOBU) including autonomous vehicles;
    - comparing and correlating the data regarding simultaneous localization and mapping of the motorcycle rider and its smart helmet (SH) in the fused environment road model as resulted from step 5 with the data received in step 2 from the ego motorcycle on-board unit (MOBU) regarding the other vehicles provided with corresponding vehicle on-board unit (VOBU) including autonomous vehicles,
    - detecting in said processed video stream of each autonomous vehicle provided with the corresponding autonomous vehicle identifier (AVI), and
    - marking each detected autonomous vehicle in the processed video stream,
    resulting in a processed video stream with detected and marked autonomous vehicles driving in the field of view of the motorcycle rider.
    Step 7 Applying correction to the marking of each detected autonomous vehicle in the processed video stream for ensuring a greater accuracy of marking on the processed video stream of the detected autonomous vehicles and sending via the ego motorcycle bus system (BS) data regarding said marked autonomous vehicles to the smart helmet (SH) visor,
    resulting in a corrected video stream with detected and marked autonomous vehicles driving in the field of view of the motorcycle rider.
    Step 8 Displaying on the smart helmet (SH) visor of the marked autonomous vehicles in an understandable manner by the motorcycle rider.
  7. A computer program comprising instructions which, when the program is executed on an ego motorcycle on-board detection processing unit (DPU) of any of the claims 1 to 5, cause said ego motorcycle on-board detection processing unit (DPU) to execute the steps 2 to 7 of the method according to claim 6.
EP19465569.2A 2019-10-02 2019-10-02 Ego motorcycle on-board awareness raising system, method for detecting and displaying presence of autonomous vehicles Active EP3799752B1 (en)


Publications (2)

Publication Number Publication Date
EP3799752A1 EP3799752A1 (en) 2021-04-07
EP3799752B1 true EP3799752B1 (en) 2022-07-06




