WO2024013760A1 - A system and method for assisting a rider of a vehicle - Google Patents

A system and method for assisting a rider of a vehicle

Info

Publication number
WO2024013760A1
Authority
WO
WIPO (PCT)
Prior art keywords
rider
vehicle
posture
control unit
speed
Prior art date
Application number
PCT/IN2023/050219
Other languages
French (fr)
Inventor
Sumeet Shekhar
Naresh Adepu
Narra ANITHA
S N Prashanth
Original Assignee
Tvs Motor Company Limited
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tvs Motor Company Limited filed Critical Tvs Motor Company Limited
Publication of WO2024013760A1 publication Critical patent/WO2024013760A1/en

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W30/00 Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
    • B60W30/14 Adaptive cruise control
    • B60W30/143 Speed control
    • B60W30/146 Speed limiting
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W40/00 Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
    • B60W40/08 Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to drivers or passengers
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00 Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W50/08 Interaction between the driver and the control system
    • B60W50/14 Means for informing the driver, warning the driver or prompting a driver intervention
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W40/00 Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
    • B60W40/08 Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to drivers or passengers
    • B60W2040/0818 Inactivity or incapacity of driver
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00 Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W50/08 Interaction between the driver and the control system
    • B60W50/14 Means for informing the driver, warning the driver or prompting a driver intervention
    • B60W2050/143 Alarm means
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00 Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W50/08 Interaction between the driver and the control system
    • B60W50/14 Means for informing the driver, warning the driver or prompting a driver intervention
    • B60W2050/146 Display means
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2300/00 Indexing codes relating to the type of vehicle
    • B60W2300/36 Cycles; Motorcycles; Scooters
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2520/00 Input parameters relating to overall vehicle dynamics
    • B60W2520/18 Roll
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2540/00 Input parameters relating to occupants
    • B60W2540/223 Posture, e.g. hand, foot, or seat position, turned or inclined
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60Y INDEXING SCHEME RELATING TO ASPECTS CROSS-CUTTING VEHICLE TECHNOLOGY
    • B60Y2200/00 Type of vehicle
    • B60Y2200/10 Road Vehicles
    • B60Y2200/12 Motorcycles, Trikes; Quads; Scooters

Definitions

  • the present invention relates to a system and a method for assisting a rider of a vehicle. More particularly, the present invention relates to the system and method for assisting the rider based on the posture of the rider.
  • vehicles do not have simple and economical means to monitor the posture of a rider while riding and to assist the rider in case his or her posture is abnormal/incorrect. Also, vehicles do not have simple and economical means to determine the number of riders on the vehicle for adjustment of the suspension systems of the vehicle. Also, vehicles do not have means to control the vehicle, i.e. to decelerate the vehicle or perform an emergency braking operation, in case the posture of the rider is incorrect or indicates non-alertness of the rider.
  • one or more sensors are arranged on the vehicle and/or on the rider of the vehicle.
  • one or more load sensors are provided on a handlebar of the vehicle to determine whether the rider is holding the handlebar.
  • one or more load sensors are provided under the seat of the vehicle to indicate the number of riders on the vehicle/load on the vehicle for adjustment of the suspension system of the vehicle.
  • one or more sensors such as inertial measurement units, accelerometers and gyroscopes are used for determining the inclination of the rider, from which the inclination of the vehicle is determined.
  • the one or more sensors are provided on joints of the rider and/or wearables of the rider including jackets, helmets etc.
  • Use of multiple ad hoc sensors to detect or determine the posture of the rider or the number of riders on the vehicle increases the overall cost of the vehicle, causes discomfort to the rider and requires additional hardware components, such as wiring and controllers, to communicate the posture of the rider or the number of riders to the vehicle, which is undesirable.
  • alerts are generated based on the position and/or parameters of the vehicle resulting from an incorrect or abnormal posture of the rider, rather than on the root cause of mishaps or accidents, i.e., the incorrect posture or non-alertness of the rider.
  • for example, an alert will be provided when the vehicle deviates from a lane by more than a pre-defined distance, and not on the cause of such deviation, which may be the rider looking in the left or right direction, the rider driving with only one hand on the handlebar, etc.
  • a system for assisting a rider of a vehicle comprises one or more speed sensors, one or more image capturing units and a control unit.
  • the one or more speed sensors are mounted on the vehicle.
  • the one or more speed sensors are configured to detect a speed of the vehicle.
  • the one or more image capturing units are mounted on the vehicle.
  • the one or more image capturing units are configured to capture one or more images and/or one or more videos of the rider of the vehicle in real time.
  • the one or more image capturing units are mounted on the vehicle such that the image(s) and/or video(s) of both the rider and a pillion rider can be captured by the one or more image capturing units.
  • the present invention can therefore determine a load on the vehicle without using one or more load sensors and can also determine a posture of the pillion rider.
  • the one or more image capturing units may be mounted on a front portion of the vehicle. In another embodiment, the one or more image capturing units may be mounted on both front portion of the vehicle and rear portion of the vehicle.
  • the control unit is also mounted on the vehicle and is in communication with the one or more speed sensors and the one or more image capturing units. On detection of the speed of the vehicle being greater than a pre-defined speed, the control unit determines a posture of the rider as a normal posture or an abnormal posture based on the one or more captured image(s) and/or video(s). In a scenario, where the posture of the rider is determined to be the normal posture, the control unit does not take any further actions. In a scenario, when the posture of the rider is determined to be the abnormal posture for a time period greater than a pre-defined time period, one or more pre-defined operations are performed by the control unit to assist the rider of the vehicle.
  • the one or more image capturing units are mounted on a dashboard of the vehicle or an instrument cluster of the vehicle.
  • the one or more pre-defined operations comprise instructing an audio alert device to generate an audio alert, instructing a visual alert device to generate a visual alert, instructing a haptic alert device to generate a haptic alert, and/or instructing a speed control device mounted on the vehicle to control the speed of the vehicle.
  • the control unit comprises a key feature identification unit and a pose determination unit.
  • the key feature identification unit is in communication with the one or more image capturing units.
  • the key feature identification unit is configured to identify an initial set of key features of the rider in at least one of the one or more captured images and one or more captured videos.
  • the key feature identification unit is further configured to assign a confidence score to each of the identified key features in the initial set of key features.
  • the key feature identification unit is further configured to determine a final set of key features.
  • the final set of key features depends on the confidence score assigned to each of the key features in the initial set of key features. The higher the confidence score, the greater the probability of finding the key feature in the captured image(s) and/or video(s).
  • the key features having a confidence score of 0.6 or more are included in the final set of key features. However, this value should not be construed as limiting and may be set to a different value by the manufacturer.
  • the pose determination unit is in communication with the key feature identification unit.
  • the pose determination unit is configured to determine angles and distance between the key features in the final set of key features to estimate a posture of the rider of the vehicle.
  • the pose determination unit is further configured to compare the estimated pose of the rider with at least one of one or more pre-defined normal postures and abnormal postures. Based on the comparison, the pose determination unit determines the posture of the rider as the normal posture or the abnormal posture.
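  The confidence-score filtering step can be sketched in a few lines. The detector output format and feature names below are hypothetical; the 0.6 threshold is the example value from the description.

```python
# Hypothetical detector output: (key feature name, confidence score) pairs.
# The 0.6 threshold is the example value from the description; the patent
# notes the manufacturer may set a different value.
CONFIDENCE_THRESHOLD = 0.6

def final_key_features(initial: list[tuple[str, float]]) -> list[str]:
    """Keep only the key features whose confidence score meets the threshold."""
    return [name for name, score in initial if score >= CONFIDENCE_THRESHOLD]

detected = [("nose", 0.95), ("left_wrist", 0.42), ("right_shoulder", 0.71)]
print(final_key_features(detected))  # ['nose', 'right_shoulder']
```

  Features that cannot be located reliably (here, the occluded left wrist) are thus excluded before any angles or distances are computed from them.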
  • the one or more key features of the rider comprises a nose of the rider, an upper lip of the rider, a lower lip of the rider, a chin of the rider, a jaw of the rider, a torso of the rider, a neck of the rider, left finger joints of the rider, right finger joints of the rider, a left eye of the rider, a right eye of the rider, a right ear of the rider, a left ear of the rider, a right shoulder of the rider, a left shoulder of the rider, a left elbow of the rider, a right elbow of the rider, a left wrist of the rider, a right wrist of the rider, a waist of the rider and/or a back of the rider.
  • the one or more pre-defined abnormal postures of the rider comprise one hand on a handlebar of the vehicle and another hand away from the handlebar of the vehicle, both hands away from the handlebar of the vehicle, head turned partially or fully in a right direction or left direction of the vehicle, head turned partially or fully in an upward direction or downward direction of the vehicle, the rider standing on the vehicle, the rider leaning in the left or right direction of the vehicle and the rider leaning in a front direction or rear direction of the vehicle.
  • the control unit is configured to receive a lean angle of the vehicle with respect to a ground on which the vehicle is travelling.
  • One or more lean angle sensors are configured to detect a lean angle of the vehicle with respect to a ground on which the vehicle is moving.
  • the one or more lean angle sensors are in communication with the control unit.
  • the lean angle of the rider with respect to the vehicle is determined by the control unit based on the one or more captured images and/or videos.
  • the lean angle of the rider with respect to the ground is determined based on the lean angle of the rider with respect to the vehicle and the lean angle of the vehicle with respect to the ground.
  • the posture of the rider is determined as the normal posture or the abnormal posture. It is to be understood that too much inclination by the rider while the vehicle is already leaning will cause slippage of the vehicle. Accordingly, too much inclination by the rider when the vehicle is already leaning is an abnormal posture.
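  One way to realise this lean-angle check is sketched below. The additive combination of the two angles and the 45 degree limit are illustrative assumptions; the patent only states that the rider's lean with respect to the ground follows from the two measured angles and is compared against manufacturer-configured values.

```python
MAX_LEAN_WRT_GROUND_DEG = 45.0  # illustrative pre-defined lean angle limit

def rider_lean_wrt_ground(rider_wrt_vehicle_deg: float,
                          vehicle_wrt_ground_deg: float) -> float:
    """Combine the two lean angles; a simple additive model is assumed here."""
    return rider_wrt_vehicle_deg + vehicle_wrt_ground_deg

def posture_from_lean(rider_wrt_vehicle_deg: float,
                      vehicle_wrt_ground_deg: float) -> str:
    """Classify the posture from the rider's total lean with respect to ground."""
    total = rider_lean_wrt_ground(rider_wrt_vehicle_deg, vehicle_wrt_ground_deg)
    return "abnormal" if abs(total) > MAX_LEAN_WRT_GROUND_DEG else "normal"

# Vehicle already leaning 30 degrees in a corner: a further 20 degree rider
# lean risks slippage, whereas a 5 degree rider lean stays within the limit.
print(posture_from_lean(20.0, 30.0))  # abnormal
print(posture_from_lean(5.0, 30.0))   # normal
```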
  • a method for assisting a rider of a vehicle comprises detecting a speed of the vehicle.
  • the step of detecting the speed of the vehicle is performed by one or more speed sensors mounted on the vehicle.
  • the method further comprises capturing at least one of one or more images and one or more videos of the rider in real time.
  • the step of capturing the one or more images and/or videos is performed by one or more image capturing units mounted on the vehicle.
  • the method further comprises determining a posture of the rider as a normal posture or an abnormal posture based on at least one of the one or more captured images and videos when the speed of the vehicle is greater than a pre-defined speed.
  • the step of determining the posture is performed by the control unit which is in communication with the one or more speed sensors and the one or more image capturing units.
  • the method further comprises performing one or more pre-defined operations when the posture of the rider is determined as abnormal posture for a time period greater than a pre-defined time period.
  • the step of performing the one or more pre-defined operations is performed by the control unit.
  • the step of performing one or more pre-defined operations comprises instructing an audio alert device to generate an audio alert, instructing a visual alert device to generate a visual alert, instructing a haptic alert device to generate a haptic alert and/or instructing a speed control device to control the speed of the vehicle.
  • “control” here refers to operations performed by the control unit to limit the speed of the vehicle, such as adaptive cruise control, or to arrest the speed of the vehicle, such as emergency braking operations.
  • the step of determining the posture of the rider further comprises the steps of: (i) identifying an initial set of key features of the rider in at least one of the one or more captured images and videos, (ii) assigning a confidence score to each of the identified key features in the initial set of key features, (iii) determining a final set of key features based on the confidence score, (iv) determining angles and distances between the key features in the final set of key features to estimate a posture of the rider, (v) comparing the estimated posture of the rider with one or more pre-defined normal postures and abnormal postures, (vi) determining, based on the comparison, the estimated posture of the rider as normal posture or abnormal posture.
  • the steps (i), (ii) and (iii) are performed by a key feature identification unit of the control unit.
  • the key feature identification unit is in communication with one or more image capturing units.
  • the steps (iv), (v) and (vi) are performed by a pose determination unit of the control unit.
  • the pose determination unit is in communication with the key feature identification unit.
  • the step of determining the posture of the rider further comprises the steps of: (i) determining a lean angle of the rider with respect to the vehicle based on at least one of the one or more captured images and videos, (ii) receiving a lean angle of the vehicle with respect to a ground from one or more lean angle sensors mounted on the vehicle, (iii) determining the lean angle of the rider with respect to the ground based on the lean angle of the rider with respect to the vehicle and lean angle of the vehicle with respect to the ground, and (iv) determining, based on the lean angle of the rider with respect to the ground, the posture of the rider as the normal posture or the abnormal posture.
  • the steps (i) to (iv) are performed by the control unit of the vehicle.
  • the control unit is in communication with the one or more lean angle sensors mounted on the vehicle.
  • Figure 1 is a block diagram of a system for assisting a rider of a vehicle, in accordance with an embodiment of the present invention.
  • Figure 2 is a block diagram of a system for assisting a rider of the vehicle, in accordance with another embodiment of the present invention.
  • Figure 3 is a block diagram of a system for assisting a rider of the vehicle, in accordance with another embodiment of the present invention.
  • Figure 4 is a block diagram of a system for assisting a rider of the vehicle, in accordance with another embodiment of the present invention.
  • Figure 5 is a flow chart illustrating a method for assisting a rider of the vehicle, in accordance with an embodiment of the present invention.
  • Figure 6a, Figure 6b and Figure 6c together illustrate a flow chart of a method for assisting a rider of the vehicle, in accordance with another embodiment of the present invention.
  • Figure 7a and Figure 7b together illustrate a flow chart of a method for assisting a rider of the vehicle, in accordance with another embodiment of the present invention.
  • FIG. 1 is a block diagram of a system 102 for assisting a rider of the vehicle 100, in accordance with an embodiment of the present invention.
  • the term “vehicle” includes saddle-type vehicles as well as passenger-type vehicles.
  • the saddle-type vehicles comprise two-wheelers, three-wheelers and four-wheelers such as bicycles, scooters, motorcycles and the like.
  • the passenger vehicles comprise three-wheelers and four-wheelers such as auto rickshaws, cars, lorries, trucks and the like.
  • vehicle also includes conventional internal combustion engine vehicles, electric vehicles and hybrid vehicles.
  • the system 102 comprises one or more speed sensors 104, one or more image capturing units 106 and a control unit 108.
  • the one or more speed sensors 104 are mounted on the vehicle 100 and configured to detect a speed of the vehicle 100.
  • the one or more image capturing units 106 are mounted on the vehicle 100 and configured to capture one or more images and/or videos of a rider in real time.
  • the control unit 108 is also mounted on the vehicle 100 and is in communication with the one or more speed sensors 104 and the one or more image capturing units 106. The speed of the vehicle 100 is received by the control unit 108 from the one or more speed sensors 104.
  • On detection of the speed of the vehicle 100 being greater than a pre-defined speed, the control unit 108 is configured to determine a posture of the rider as a normal posture or an abnormal posture. In case the abnormal posture is maintained by the rider for a time period greater than the pre-defined duration, the control unit 108 is configured to perform one or more pre-defined operations.
  • the one or more pre-defined operations include any actions performed by the control unit 108 to alert the rider and/or control the speed of the vehicle.
  • “control” refers to operations performed by the control unit 108 to limit the speed of the vehicle 100, such as adaptive cruise control, or to arrest the speed of the vehicle 100, such as emergency braking operations.
  • the image capturing unit 106 is a camera.
  • the same camera may be used for capturing one or more images and/or videos of the rider of the vehicle 100.
  • the image capturing unit 106 is capable of working both in a day mode and a night mode.
  • the one or more image capturing units 106 are mounted on a front portion of the vehicle 100 such as dashboard and/or instrument cluster of the vehicle 100 and/or a rear portion of the vehicle 100 such as grab rail of the vehicle 100.
  • the pre-defined speed is 5 km per hour. This value, however, should not be construed as limiting and may include any value pre-configured in the control unit 108 by the manufacturer of the vehicle 100.
  • the pre-defined duration is 30 seconds. This value, however, should not be construed as limiting and may include any value pre-configured in the control unit 108 by the manufacturer of the vehicle 100.
  • the term “rider” may include both the rider of the vehicle and a pillion rider seated behind the rider of the vehicle 100.
  • the control unit 108 may be configured to determine posture of both the rider and pillion rider. It is to be understood that abnormal posture of the pillion rider in a saddle type vehicle may also result in accident of the vehicle.
  • the one or more image capturing units 106 are mounted such that the one or more image(s) and/or video(s) of the rider and the pillion rider can be captured. In one non-limiting example, the one or more image capturing units 106 may be mounted on a front portion of the vehicle 100 and a rear portion of the vehicle 100.
  • Figure 2 is a block diagram of a system 202 for assisting a rider of the vehicle 200, in accordance with another embodiment of the present invention.
  • the system 202 comprises one or more speed sensors 204, one or more image capturing units 206 and a control unit 208.
  • the construction, interrelation and function of the one or more speed sensors 204, the one or more image capturing units 206 and the control unit 208 are the same as defined for the one or more speed sensors 104, the one or more image capturing units 106 and the control unit 108 in Figure 1 of the present invention.
  • Figure 2 additionally illustrates one or more devices to perform the one or more predefined operations by the control unit 208.
  • the control unit 208 is in communication with an audio alert device 210, a visual alert device 212, a haptic alert device 214 and a speed control device 216.
  • the audio alert device 210 such as a buzzer or a horn is mounted on the vehicle 200 and is configured to generate an audio alert.
  • the visual alert device 212 such as light emitting diodes is mounted on the vehicle 200 and is configured to generate a visual alert.
  • the haptic alert device 214 such as a vibrator capable of producing vibration is mounted on the vehicle 200 and is configured to generate a haptic alert.
  • the audio alert device 210, the visual alert device 212 and the haptic alert device 214 alert the rider to correct the abnormal posture to avoid accidents/collisions.
  • the control unit 208 controls the speed of the vehicle 200 to prevent accidents/collisions.
  • the term “control” here refers to operation performed by the control unit 208 to limit the speed of the vehicle 200 such as adaptive cruise control or arrest the speed of the vehicle 200 such as emergency braking operations.
  • the audio alert device 210, the visual alert device 212, the haptic alert device 214 and the speed control device 216 may be operated independently or in combination with each other.
  • the audio alert device 210, the visual alert device 212, the haptic alert device 214 and the speed control device 216 includes already known or later developed devices.
  • FIG. 3 is a block diagram of a system 302 for assisting a rider of the vehicle 300, in accordance with another embodiment of the present invention.
  • the system 302 comprises one or more speed sensors 304, one or more image capturing units 306 and a control unit 308.
  • the construction, interrelation and function of the one or more speed sensors 304, the one or more image capturing units 306 and the control unit 308 are the same as defined for the one or more speed sensors 104, the one or more image capturing units 106 and the control unit 108 in Figure 1 of the present invention.
  • the construction, interrelation and function of the audio alert device 310, visual alert device 312, haptic alert device 314 and speed control device 316 are the same as defined for the audio alert device 210, visual alert device 212, haptic alert device 214 and speed control device 216 in Figure 2 of the present invention.
  • Figure 3 additionally illustrates one or more lean angle sensors 318 mounted on the vehicle 300 and in communication with the control unit 308 to determine the posture of the rider.
  • the one or more lean angle sensors 318 are configured to detect a lean angle of the vehicle 300 with respect to a ground on which the vehicle 300 is travelling.
  • the lean angle of the vehicle 300 with respect to a ground, detected by the one or more lean angle sensors 318, is transmitted to the control unit 308.
  • the control unit 308 is configured to determine a lean angle of the rider with respect to the vehicle 300.
  • the control unit 308 determines the lean angle of the rider with respect to the vehicle 300 by processing one or more images and/or videos captured by the one or more image capturing units 306.
  • the lean angle of the rider with respect to the vehicle 300 can be determined using the system 402 illustrated in Figure 4 and method 600 illustrated in Figure 6a, 6b and 6c of the present invention.
  • the system 402 and the method 600 to determine the lean angle of the rider with respect to the vehicle should not be construed as limiting and other now known or later developed systems and methods may also be used by the present invention to determine the lean angle of the rider with respect to the vehicle 300.
  • the control unit 308 determines the lean angle of the rider with respect to the ground using now known or later developed techniques. Based on the lean angle of the rider with respect to the ground, the control unit 308 determines the posture of the rider as normal or abnormal. In an embodiment, the lean angle of the rider is compared with one or more pre-defined lean angles to determine the posture of the rider as normal or abnormal. The one or more pre-defined lean angles are pre-configured in the control unit 308 by the manufacturer of the vehicle 300.
  • the lean angle sensors may be inertial measurement units (IMUs), accelerometers, gyroscopes and other now known or later developed sensors.
  • the lean angle sensor sends roll and yaw rates of the vehicle to the control unit 308.
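  As one concrete possibility for such a sensor, vehicle roll (lean) can be approximated from accelerometer readings alone when the IMU is quasi-static. The sketch below is an assumption for illustration; a production system would fuse this estimate with the gyroscope roll rate to reject vibration and cornering acceleration.

```python
import math

def roll_from_accelerometer(ay_mps2: float, az_mps2: float) -> float:
    """Approximate vehicle roll (lean) in degrees from the lateral (ay) and
    vertical (az) acceleration components reported by an IMU at rest."""
    return math.degrees(math.atan2(ay_mps2, az_mps2))

# Upright vehicle: gravity acts only on the vertical axis.
print(round(roll_from_accelerometer(0.0, 9.81), 1))   # 0.0
# Gravity split equally between lateral and vertical axes: 45 degree lean.
print(round(roll_from_accelerometer(6.94, 6.94), 1))  # 45.0
```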
  • FIG. 4 is a block diagram of a system 402 for assisting a rider of the vehicle 400, in accordance with another embodiment of the present invention.
  • the system 402 comprises one or more speed sensors 404, one or more image capturing units 406 and a control unit 408.
  • the control unit 408 comprises a key feature identification unit 408a and a pose determination unit 408b.
  • the one or more speed sensors 404 are mounted on the vehicle 400 and configured to detect a speed of the vehicle 400 and transmit the same to the control unit 408.
  • the one or more image capturing units 406 are configured to capture one or more images and/or videos of the rider of the vehicle 400 and are in communication with the key feature identification unit 408a.
  • the key feature identification unit 408a is configured to identify an initial set of key features of the rider in at least one of the one or more captured images and videos.
  • the key feature identification unit 408a is further configured to assign a confidence score to each of the identified key features in the initial set of key features and determine, based on the confidence score, a final set of key features.
  • the confidence score indicates the probability that a key feature exists in the one or more captured images and videos. The higher the confidence score, the greater the probability of finding the key feature in the one or more captured images and videos.
  • the key features having a confidence score of 0.6 or more are included in the final set of key features. However, this value should not be construed as limiting and may be set to a different value by the manufacturer.
  • the one or more key features of the rider comprise a nose of the rider, an upper lip of the rider, a lower lip of the rider, a chin of the rider, a jaw of the rider, a torso of the rider, a neck of the rider, left finger joints of the rider, right finger joints of the rider, a left eye of the rider, a right eye of the rider, a right ear of the rider, a left ear of the rider, a right shoulder of the rider, a left shoulder of the rider, a left elbow of the rider, a right elbow of the rider, a left wrist of the rider, a right wrist of the rider, a waist of the rider and/or a back of the rider.
  • at least 22 such key features may be identified by key feature identification unit 408a.
  • the pose determination unit 408b is in communication with the key feature identification unit 408a.
  • the pose determination unit 408b is configured to determine angles and distances between the key features in the final set of key features to estimate a posture of the rider. For example, distance between eyes, distance between left shoulder and chin, angle of chin from left shoulder etc., are determined by the pose determination unit 408b.
  • the angle and distance of the chin from the left shoulder will change/increase, which indicates that the rider is looking in the right direction while riding the vehicle 400.
  • the estimated posture of the rider indicates that he/she is looking in the right direction while riding the vehicle 400.
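The angle and distance computations between key features described above can be sketched with basic planar geometry. The coordinates below are illustrative assumptions in image-pixel space, not values from the disclosure.

```python
# Sketch of the angle/distance measurements between key features used by the
# pose determination unit to estimate posture. Coordinates are (x, y) pixels.
import math

def distance(p, q):
    """Euclidean distance between two keypoints."""
    return math.hypot(q[0] - p[0], q[1] - p[1])

def angle_deg(p, q):
    """Angle of the vector p -> q, in degrees, relative to the image x-axis."""
    return math.degrees(math.atan2(q[1] - p[1], q[0] - p[0]))

# Example: chin relative to left shoulder (illustrative positions).
left_shoulder = (160.0, 240.0)
chin = (210.0, 150.0)

chin_shoulder_distance = distance(left_shoulder, chin)
chin_shoulder_angle = angle_deg(left_shoulder, chin)
```

A head turn changes these measurements frame to frame, which is the cue the description above relies on.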
  • the pose determination unit 408b is further configured to compare the estimated posture of the rider with at least one of one or more pre-defined normal postures and abnormal postures.
  • the one or more pre-defined abnormal postures of the rider comprise one hand on a handlebar of the vehicle 400 and another hand away from the handlebar of the vehicle 400, both the hands away from the handlebar of the vehicle 400, head turned partially or fully in a right direction or left direction of the vehicle 400, head turned partially or fully in an upward direction or downward direction of the vehicle 400, the rider standing on the vehicle 400, the rider leaning in the left or right direction of the vehicle 400 and/or the rider leaning in a front direction or rear direction of the vehicle 400.
  • the pose determination unit 408b determines the estimated posture of the rider as normal posture or abnormal posture.
  • the posture of the rider looking in the right direction may be determined as the abnormal posture by the control unit 408.
  • the pose determination unit 408b and the key feature identification unit 408a may be data-driven models such as trained neural networks, genetic algorithms and other now known or later developed algorithms. During the training phase of the data-driven models, the ranges of angles and the polygons formed between different key features, which define the different postures (normal and abnormal), are learned. Once trained and tested, in real time, the angles and joint parameters between the identified key points are obtained and classified by the key feature identification unit 408a and the pose determination unit 408b into different poses, and normal and abnormal poses are identified.
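As a highly simplified, rule-based stand-in for the trained classifier described above, a sketch might map a single joint angle (e.g. chin relative to the left shoulder) into coarse pose classes. The angle bands and class names below are illustrative assumptions, not values from the disclosure; a real data-driven model would learn these boundaries.

```python
# Simplified stand-in for the trained pose classifier: classify head
# orientation from one joint angle. Band limits are made-up example values.

def classify_head_pose(chin_shoulder_angle_deg):
    """Map a chin-to-shoulder angle (degrees) to a coarse head-pose class."""
    if -15.0 <= chin_shoulder_angle_deg <= 15.0:
        return "forward"       # treated as a normal posture
    if chin_shoulder_angle_deg > 15.0:
        return "turned_right"  # treated as an abnormal posture
    return "turned_left"       # treated as an abnormal posture

def is_abnormal(pose):
    """Check the pose class against the pre-defined abnormal poses."""
    return pose in {"turned_right", "turned_left"}
```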
  • the control unit 408 determines the time period for which the abnormal posture is maintained by the rider of the vehicle 400. In case such duration of time is greater than a pre-defined time period, the control unit 408 performs one or more pre-defined operations.
  • the one or more pre-defined operations comprise instructing an audio alert device 410 to generate an audio alert, instructing a visual alert device 412 to generate a visual alert, instructing a haptic alert device 414 to generate a haptic alert and/or instructing a speed control device 416 to control the speed of the vehicle 400.
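The abnormal-posture timing and the dispatch of pre-defined operations described in the points above can be sketched as follows. The 3-second threshold and the device interfaces (`alert()`, `limit_speed()`) are illustrative assumptions, not part of the disclosure.

```python
# Minimal sketch of the control-unit logic: track how long the abnormal
# posture persists and trigger the pre-defined operations once the
# pre-defined time period is exceeded.

PRE_DEFINED_TIME_PERIOD = 3.0  # seconds; illustrative value

class ControlUnit:
    def __init__(self, alert_devices, speed_control_device):
        self.alert_devices = alert_devices          # e.g. audio/visual/haptic
        self.speed_control_device = speed_control_device
        self.abnormal_since = None                  # timestamp or None

    def update(self, posture_is_abnormal, now):
        """Call once per processed frame; returns True if operations fired."""
        if not posture_is_abnormal:
            self.abnormal_since = None              # posture corrected: reset
            return False
        if self.abnormal_since is None:
            self.abnormal_since = now               # abnormal posture begins
        if now - self.abnormal_since > PRE_DEFINED_TIME_PERIOD:
            for device in self.alert_devices:       # audio/visual/haptic alerts
                device.alert()
            self.speed_control_device.limit_speed() # e.g. cruise limit/braking
            return True
        return False
```

In this sketch the operations repeat each frame until the rider corrects the posture, consistent with the alerts-then-speed-control escalation described later in the text.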
  • the audio alert device 410, visual alert device 412, haptic alert device 414 and speed control device 416 are mounted on the vehicle 400 and include now known or later developed devices.
  • Figure 5 is a flow chart illustrating a method 500 for assisting a rider of the vehicle, in accordance with an embodiment of the present invention.
  • the method comprises detecting a speed of the vehicle 100, 200.
  • the step of detecting the speed of the vehicle 100, 200 is performed by one or more speed sensors 104, 204 mounted on the vehicle 100, 200.
  • the method comprises capturing at least one of one or more images and one or more videos of the rider in real time.
  • the step of capturing one or more image(s) and/or video(s) is performed by one or more image capturing units 106, 206 mounted on the vehicle 100.
  • the method comprises determining, on detection of the speed being greater than a pre-defined speed, a posture of the rider as a normal posture or an abnormal posture based on at least one of the one or more captured images and videos.
  • the step of determining the posture of the rider as normal posture or abnormal posture is performed by a control unit 108, 208 which is in communication with the one or more speed sensors 104, 204 and the one or more image capturing units 106, 206.
  • the method further comprises performing, on determination of the posture of the rider being the abnormal posture for a time-period greater than a pre-defined time period, one or more pre-defined operations.
  • the step of performing the one or more pre-defined operation is performed by the control unit 108, 208.
  • the step 504 of performing comprises instructing an audio alert device 210 to generate an audio alert, instructing a visual alert device 212 to generate a visual alert, instructing a haptic alert device 214 to generate a haptic alert and/or instructing a speed control device 216 to control the speed of the vehicle.
  • control here refers to an operation performed by the control unit to limit the speed of the vehicle, such as adaptive cruise control, or to arrest the speed of the vehicle, such as emergency braking operations.
  • Figures 6a, 6b and 6c are a flow chart illustrating a method 600 for assisting a rider of the vehicle 400, in accordance with another embodiment of the present invention.
  • the method comprises detecting a speed of the vehicle 400.
  • the step of detecting the speed of the vehicle 400 is performed by one or more speed sensors 404 mounted on the vehicle 400.
  • the method comprises capturing at least one of one or more images and one or more videos of the rider in real time.
  • the step of capturing one or more image(s) and/or video(s) is performed by one or more image capturing units 406 mounted on the vehicle 400.
  • the method comprises steps 603a, 603b, 603c, 603d, 603e and 603f for determining a posture of the rider as normal posture or abnormal posture.
  • the step 603 is performed by a control unit 408 mounted on the vehicle 400 and in communication with the one or more speed sensors 404 and the one or more image capturing units 406.
  • the method comprises identifying an initial set of key features of the rider in at least one of the one or more captured images and videos.
  • the step of identifying the initial set of key features is performed by a key feature identification unit 408a of the control unit 408.
  • the one or more key features of the rider may comprise a nose of the rider, an upper lip of the rider, a lower lip of the rider, a chin of the rider, a jaw of the rider, a torso of the rider, a neck of the rider, left finger joints of the rider, right finger joints of the rider, a left eye of the rider, a right eye of the rider, a right ear of the rider, a left ear of the rider, a right shoulder of the rider, a left shoulder of the rider, a left elbow of the rider, a right elbow of the rider, a left wrist of the rider, a right wrist of the rider, a waist of the rider and/or a back of the rider.
  • the method comprises assigning a confidence score to each of the identified key features in the initial set of key features.
  • the step of assigning is performed by the key feature identification unit 408a of the control unit 408.
  • the higher the confidence score the greater is the probability of finding the key feature in the captured image(s) and/or video(s).
  • the key features having a confidence score of 0.6 or more are included in the final list of the key features. However, this value should not be construed as limiting and may be set to a different value by the manufacturer.
  • the method comprises determining, based on the confidence score, a final set of key features.
  • the step of determining final set of key features is performed by the key feature identification unit 408a of the control unit 408.
  • the method comprises determining angles and distances between the key features in the final set of key features to estimate a posture of the rider.
  • the step of determining angles and distance is performed by a pose determination unit 408b of the control unit 408.
  • the method comprises comparing the estimated posture of the rider with one or more pre-defined normal postures and abnormal postures.
  • the step of comparing the estimated posture with the one or more predefined normal posture and abnormal posture is performed by the pose determination unit 408b of the control unit 408.
  • the one or more pre-defined abnormal postures of the rider comprise one hand on a handlebar of the vehicle 400 and another hand away from the handlebar of the vehicle 400, both the hands away from the handlebar of the vehicle 400, head turned partially or fully in a right direction or left direction of the vehicle 400, head turned partially or fully in an upward direction or downward direction of the vehicle 400, the rider standing on the vehicle 400, the rider leaning in the left or right direction of the vehicle 400 and/or the rider leaning in a front direction or rear direction of the vehicle 400.
  • the method comprises determining, based on the comparison in step 603e, the estimated posture of the rider as normal posture or abnormal posture.
  • the step of determining the estimated posture as normal posture or abnormal posture is performed by the pose determination unit 408b of the control unit 408.
  • the method comprises performing, on determination of the posture of the rider being the abnormal posture for a time-period greater than a pre-defined time period, one or more pre-defined operations.
  • the step of performing the one or more predefined operation is performed by the control unit 408.
  • the step 604 of performing comprises instructing an audio alert device 410 to generate an audio alert, instructing a visual alert device 412 to generate a visual alert, instructing a haptic alert device 416 to generate a haptic alert and/or instructing a speed control device 418 to limit the speed of the vehicle 400.
  • Figures 7a and 7b are a flow chart illustrating a method 700 for assisting a rider of the vehicle 300, in accordance with another embodiment of the present invention.
  • the method comprises detecting a speed of the vehicle 300.
  • the step of detecting the speed of the vehicle 300 is performed by one or more speed sensors 304 mounted on the vehicle 300.
  • the method comprises capturing at least one of one or more images and one or more videos of the rider in real time.
  • the step of capturing the one or more image(s) and/or video(s) is performed by one or more image capturing units 306 mounted on the vehicle 300.
  • the method comprises steps 703a, 703b, 703c and 703d for determining a posture of the rider as normal posture or abnormal posture.
  • the step 703 is performed by a control unit 308 mounted on the vehicle 300 and in communication with the one or more speed sensors 304 and the one or more image capturing units 306.
  • the method comprises determining a lean angle of the rider with respect to the vehicle 300 based on at least one of the one or more captured images and videos.
  • the method comprises receiving a lean angle of the vehicle 300 with respect to a ground on which the vehicle is travelling.
  • the lean angle of the vehicle 300 with respect to the ground is detected by one or more lean angle sensors 318 mounted on the vehicle 300 and transmitted to the control unit 308.
  • the method comprises determining the lean angle of the rider with respect to the ground based on the lean angle of the rider with respect to the vehicle 300 and the lean angle of the vehicle 300 with respect to the ground.
  • the method comprises determining, based on the lean angle of the rider with respect to the ground, the posture of the rider as the normal posture or the abnormal posture.
  • the lean angle of the rider with respect to the ground is compared with the one or more pre-defined lean angle values indicating a normal posture or an abnormal posture. It is to be understood that too much inclination by the rider while the vehicle is already leaning will cause slippage of the vehicle. Accordingly, too much inclination by the rider when the vehicle is already leaning is an abnormal posture.
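The lean-angle composition and comparison described in the points above can be sketched as follows: the rider's lean relative to the ground combines the rider-versus-vehicle lean with the vehicle-versus-ground lean, and the result is checked against a pre-defined limit. The 35-degree threshold and the simple additive model are illustrative assumptions, not values from the disclosure.

```python
# Sketch of determining the rider's lean angle with respect to the ground
# and classifying the posture. Angles are in degrees; positive = right lean.

ABNORMAL_LEAN_DEG = 35.0  # illustrative pre-defined lean angle value

def rider_lean_wrt_ground(rider_vs_vehicle_deg, vehicle_vs_ground_deg):
    """Combine the two measured lean angles (simple additive assumption)."""
    return rider_vs_vehicle_deg + vehicle_vs_ground_deg

def classify_lean(rider_vs_vehicle_deg, vehicle_vs_ground_deg):
    """Classify posture: excessive total lean counts as abnormal."""
    total = rider_lean_wrt_ground(rider_vs_vehicle_deg, vehicle_vs_ground_deg)
    return "abnormal" if abs(total) > ABNORMAL_LEAN_DEG else "normal"
```

This captures the point made above: a moderate rider lean may be normal on an upright vehicle, but the same lean while the vehicle is already leaning pushes the combined angle past the limit.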
  • the method comprises performing, on determination of the posture of the rider being the abnormal posture for a time-period greater than a pre-defined time period, one or more pre-defined operations.
  • the step of performing the one or more predefined operation is performed by the control unit 308.
  • the step of performing comprises instructing an audio alert device 310 to generate an audio alert, instructing a visual alert device 312 to generate a visual alert, instructing a haptic alert device 314 to generate a haptic alert and/or instructing a speed control device 316 to control the speed of the vehicle.
  • control here refers to an operation performed by the control unit to limit the speed of the vehicle, such as adaptive cruise control, or to arrest the speed of the vehicle, such as emergency braking operations.
  • control unit 108, 208, 308, 408 can include a set of instructions that can be executed to cause the control unit 108, 208, 308, 408 to perform the above-disclosed method.
  • the control unit 108, 208, 308, 408 may include a processor which may be a central processing unit (CPU), a graphics processing unit (GPU), or both.
  • the processor may be one or more general processors, digital signal processors, application specific integrated circuits, field programmable gate arrays, servers, networks, digital circuits, analog circuits, combinations thereof, or other now known or later developed devices for analysing and processing data.
  • the processor may implement a software program, such as code generated manually i.e. programmed.
  • the control unit 108, 208, 308, 408 may include a memory.
  • the memory may be a main memory, a static memory, or a dynamic memory.
  • the memory may include, but is not limited to computer readable storage media such as various types of volatile and nonvolatile storage media, including but not limited to random access memory, read-only memory, programmable read-only memory, electrically programmable read-only memory, electrically erasable read-only memory, flash memory, magnetic tape or disk, optical media and the like.
  • the memory is operable to store instructions executable by the processor. The functions, acts or tasks illustrated in the figures or described may be performed by the programmed processor executing the instructions stored in the memory.
  • the control unit 108, 208, 308, 408 may further include a display unit such as a liquid crystal display (LCD), an organic light emitting diode (OLED), a flat panel display, a solid state display, a cathode ray tube or other now known or later developed display device for outputting determined information.
  • the display may act as an interface for the user to see the functioning of the processor, or specifically as an interface with the software stored in the memory.
  • control unit 108, 208, 308, 408 may include an input device configured to allow a user to interact with any of the components of the control unit 108, 208, 308, 408.
  • the input device may be a number pad, a keyboard, or a cursor control device, such as a mouse, or a joystick, touch screen display, remote control or any other device operative to interact with the control unit 108, 208, 308, 408.
  • the control unit 108, 208, 308, 408 may also include a disk or optical drive unit.
  • the disk drive unit may include a computer-readable medium in which one or more sets of instructions, e.g. software, can be embedded. Further, the instructions may embody one or more of the methods or logic as described. In a particular example, the instructions may reside completely, or at least partially, within the memory or within the processor during execution by the control unit 108, 208, 308, 408.
  • the memory and the processor also may include computer-readable media as discussed above.
  • the present invention contemplates a computer-readable medium that includes instructions or receives and executes instructions responsive to a propagated signal so that a device connected to a network can communicate data over the network.
  • the instructions may be transmitted or received over the network.
  • the network may include wired networks, wireless networks, Ethernet AVB networks, or combinations thereof.
  • the wireless network may be a cellular telephone network.
  • the network may be a public network, such as the Internet, a private network, such as an intranet, or combinations thereof, and may utilize a variety of networking protocols now available or later developed.
  • the present invention increases safety of the rider of the vehicle 100, 200, 300, 400.
  • the rider of the vehicle 100, 200, 300, 400 is alerted when it is determined by the present invention that the posture of the rider is abnormal for a time period greater than a predefined time period.
  • the present invention can also perform one or more controlling operation on the vehicle 100, 200, 300, 400 such as reduction in speed of the vehicle 100, 200, 300, 400 or emergency braking operations to prevent accident of the vehicle 100, 200, 300, 400 when the posture of the rider is determined to be abnormal.
  • the present invention increases awareness of riding pattern of the vehicle 100, 200, 300, 400.
  • the rider of the vehicle may correct his/her posture while riding the vehicle 100, 200, 300, 400.
  • the present invention may monitor the postures of the rider and recommend the best riding postures in place of the bad posture of the rider, for less fatigue and increased safety.
  • the present invention enables the rider to understand the posture in which he/she drives most of the time and could be used to provide the rider a suggestion on how to improve the riding posture and thus feel less fatigued.
  • the present invention provides safety critical functions such as controlling or arresting the speed of the vehicle on determination of abnormal posture being maintained by the rider for a time interval greater than a pre-defined time interval and also when the rider does not correct his posture after one or more alerts.
  • the present invention does not require multiple sensors on the body of the rider to determine the posture of the rider which, in turn, decreases the overall cost of the vehicle.
  • the present invention may also determine the presence and posture of the pillion rider in case of saddle type vehicles.
  • the present invention can also be used to identify the load on the vehicle, i.e. the number of riders on the vehicle, without the use of sensors.
  • the number of riders on the vehicle can be determined by the one or more image capturing units 106, 206, 306, 406 of the present invention.

Landscapes

  • Engineering & Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • Physics & Mathematics (AREA)
  • Mathematical Physics (AREA)
  • Human Computer Interaction (AREA)
  • Fittings On The Vehicle Exterior For Carrying Loads, And Devices For Holding Or Mounting Articles (AREA)

Abstract

A system (102) and method (500) for assisting a rider of a vehicle (100). The system (102) comprises one or more speed sensors (104) configured to detect a speed of the vehicle (100), one or more image capturing units (106) configured to capture at least one of one or more images and one or more videos of the rider in real time and a control unit (108) configured to determine, on detection of the speed being greater than a pre-defined speed, a posture of the rider as a normal posture or an abnormal posture based on at least one of the one or more captured images and videos and perform, on determination of the posture of the rider being the abnormal posture for a time-period greater than a pre-defined time period, one or more pre-defined operations.

Description

TITLE OF INVENTION
A SYSTEM AND METHOD FOR ASSISTING A RIDER OF A VEHICLE
FIELD OF THE INVENTION
[001] The present invention relates to a system and a method for assisting a rider of a vehicle. More particularly, the present invention relates to the system and method for assisting the rider based on the posture of the rider.
BACKGROUND OF THE INVENTION
[002] In the current scenario, vehicles do not have simple and economical means to monitor a posture of a rider while riding a vehicle and to assist the rider in case his or her posture is abnormal/incorrect. Also, vehicles do not have simple and economical means to determine the number of riders on the vehicle for adjustment of suspension systems of the vehicle. Also, vehicles do not have means to control the vehicle, i.e. to decelerate the vehicle or perform an emergency braking operation, in case the posture of the rider is incorrect or indicates non-alertness of the rider.
[003] In some vehicles, to detect the number of riders on the vehicle and/or the posture of the rider, one or more sensors are arranged on the vehicle and/or on the rider of the vehicle. In one example, one or more load sensors are provided on a handlebar of the vehicle to determine whether the rider is holding the handlebar. In another example, one or more load sensors are provided under the seat of the vehicle to indicate the number of riders on the vehicle/load on the vehicle for adjustment of the suspension system of the vehicle. In another example, one or more sensors such as inertial measurement units, accelerometers and gyroscopes are used for determining the inclination of the rider, from which the inclination of the vehicle is determined. In another example, the one or more sensors are provided on joints of the rider and/or wearables of the rider, including jackets, helmets, etc. Use of multiple ad hoc sensors to detect or determine the posture of the rider or the number of riders on the vehicle increases the overall cost of the vehicle, causes discomfort to the rider and requires additional hardware components such as wiring and controllers to communicate the posture of the rider or the number of riders to the vehicle, which is undesirable.
[004] Also, in the current scenario, alerts are generated on the position and/or parameters of the vehicle resulting from an incorrect or abnormal posture of the rider, rather than on the root cause of mishaps or accidents, i.e., the incorrect posture or non-alertness of the rider. For example, in the current scenario, an alert will be provided when the vehicle deviates from a lane by more than a pre-defined distance, and not on the cause of such deviation, which may be the rider looking in the left or right direction, the rider driving with only one hand on a handlebar, etc.
[005] In view thereof, there is a felt need to overcome at least the above-mentioned disadvantages of the prior art.
SUMMARY OF THE INVENTION
[006] In one aspect of the present invention, a system for assisting a rider of a vehicle is disclosed. The system comprises one or more speed sensors, one or more image capturing units and a control unit. The one or more speed sensors are mounted on the vehicle. The one or more speed sensors are configured to detect a speed of the vehicle. The one or more image capturing units are mounted on the vehicle. The one or more image capturing units are configured to capture one or more images and/or one or more videos of the rider of the vehicle in real time. In a saddle type vehicle, the one or more image capturing units are mounted on the vehicle such that the image(s) and/or video(s) of both the rider and a pillion rider can be captured by the one or more image capturing units. The present invention can therefore determine a load on the vehicle without using one or more sensors and also determine a posture of the pillion rider. In an embodiment, the one or more image capturing units may be mounted on a front portion of the vehicle. In another embodiment, the one or more image capturing units may be mounted on both front portion of the vehicle and rear portion of the vehicle.
[007] The control unit is also mounted on the vehicle and is in communication with the one or more speed sensors and the one or more image capturing units. On detection of the speed of the vehicle being greater than a pre-defined speed, the control unit determines a posture of the rider as a normal posture or an abnormal posture based on the one or more captured image(s) and/or video(s). In a scenario, where the posture of the rider is determined to be the normal posture, the control unit does not take any further actions. In a scenario, when the posture of the rider is determined to be the abnormal posture for a time period greater than a pre-defined time period, one or more pre-defined operations are performed by the control unit to assist the rider of the vehicle.
[008] In an embodiment, the one or more image capturing units are mounted on a dashboard of the vehicle or an instrument cluster of the vehicle.
[009] In an embodiment, the one or more pre-defined operations comprise instructing an audio alert device to generate an audio alert, instructing a visual alert device to generate a visual alert, instructing a haptic alert device to generate a haptic alert, and/or instructing a speed control device mounted on the vehicle to control the speed of the vehicle.
[010] In an embodiment, the control unit comprises a key feature identification unit and a pose determination unit. The key feature identification unit is in communication with the one or more image capturing units. The key feature identification unit is configured to identify an initial set of key features of the rider in at least one of the one or more captured images and one or more captured videos. The key feature identification unit is further configured to assign a confidence score to each of the identified key features in the initial set of key features. The key feature identification unit is further configured to determine a final set of key features. The final set of key features depends on the confidence score assigned to each of the key features in the initial set of key features. The higher the confidence score, the greater is the probability of finding the key feature in the captured image(s) and/or video(s). In an embodiment, the key features having a confidence score of 0.6 or more are included in the final set of the key features. However, this value should not be construed as limiting and may be set to a different value by the manufacturer. The pose determination unit is in communication with the key feature identification unit. The pose determination unit is configured to determine angles and distances between the key features in the final set of key features to estimate a posture of the rider of the vehicle. The pose determination unit is further configured to compare the estimated posture of the rider with at least one of one or more pre-defined normal postures and abnormal postures.
Based on the comparison, the pose determination unit determines the posture of the rider as the normal posture or the abnormal posture.
[011] The one or more key features of the rider comprises a nose of the rider, an upper lip of the rider, a lower lip of the rider, a chin of the rider, a jaw of the rider, a torso of the rider, a neck of the rider, left finger joints of the rider, right finger joints of the rider, a left eye of the rider, a right eye of the rider, a right ear of the rider, a left ear of the rider, a right shoulder of the rider, a left shoulder of the rider, a left elbow of the rider, a right elbow of the rider, a left wrist of the rider, a right wrist of the rider, a waist of the rider and/or a back of the rider.
[012] The one or more pre-defined abnormal postures of the rider comprises one hand on a handlebar of the vehicle and another hand away from the handle bar of the vehicle, both the hands away from the handlebar of the vehicle, head turned partially or fully in a right direction or left direction of the vehicle, head turned partially or fully in an upward direction or downward direction of the vehicle, the rider standing on the vehicle, the rider leaning in the left or right direction of the vehicle and the rider leaning in a front direction or rear direction of the vehicle.
[013] In an embodiment, the control unit is configured to receive a lean angle of the vehicle with respect to a ground on which the vehicle is travelling. One or more lean angle sensors are configured to detect the lean angle of the vehicle with respect to the ground on which the vehicle is moving. The one or more lean angle sensors are in communication with the control unit. The lean angle of the rider with respect to the vehicle is determined by the control unit based on the one or more captured images and/or videos. The lean angle of the rider with respect to the ground is determined based on the lean angle of the rider with respect to the vehicle and the lean angle of the vehicle with respect to the ground. Based on the lean angle of the rider with respect to the ground, the posture of the rider is determined as the normal posture or the abnormal posture. It is to be understood that too much inclination by the rider while the vehicle is already leaning will cause slippage of the vehicle. Accordingly, too much inclination by the rider when the vehicle is already leaning is an abnormal posture.
[014] In another aspect of the invention, a method for assisting a rider of a vehicle is disclosed. The method comprises detecting a speed of the vehicle. The step of detecting the speed of the vehicle is performed by one or more speed sensors mounted on the vehicle. The method further comprises capturing at least one of one or more images and one or more videos of the rider in real time. The step of capturing the one or more images and/or videos is performed by one or more image capturing units mounted on the vehicle. The method further comprises determining a posture of the rider as a normal posture or an abnormal posture based on at least one of the one or more captured images and videos when the speed of the vehicle is greater than a pre-defined speed. The step of determining the posture is performed by the control unit which is in communication with the one or more speed sensors and the one or more image capturing units. The method further comprises performing one or more pre-defined operations when the posture of the rider is determined as abnormal posture for a time period greater than a pre-defined time period. The step of performing the one or more pre-defined operations is performed by the control unit.
[015] In an embodiment, the step of performing one or more pre-defined operations comprises instructing an audio alert device to generate an audio alert, instructing a visual alert device to generate a visual alert, instructing a haptic alert device to generate a haptic alert and/or instructing a speed control device to control the speed of the vehicle. The term “control” here refers to an operation performed by the control unit to limit the speed of the vehicle, such as adaptive cruise control, or to arrest the speed of the vehicle, such as emergency braking operations.
[016] In an embodiment, the step of determining the posture of the rider further comprises the steps of: (i) identifying an initial set of key features of the rider in at least one of the one or more captured images and videos, (ii) assigning a confidence score to each of the identified key features in the initial set of key features, (iii) determining a final set of key features based on the confidence score, (iv) determining angles and distances between the key features in the final set of key features to estimate a posture of the rider, (v) comparing the estimated posture of the rider with one or more pre-defined normal postures and abnormal postures, and (vi) determining, based on the comparison, the estimated posture of the rider as normal posture or abnormal posture. The steps (i), (ii) and (iii) are performed by a key feature identification unit of the control unit. The key feature identification unit is in communication with one or more image capturing units. The steps (iv), (v) and (vi) are performed by a pose determination unit of the control unit. The pose determination unit is in communication with the key feature identification unit.
[017] In an embodiment, the step of determining the posture of the rider further comprises the steps of: (i) determining a lean angle of the rider with respect to the vehicle based on at least one of the one or more captured images and videos, (ii) receiving a lean angle of the vehicle with respect to a ground from one or more lean angle sensors mounted on the vehicle, (iii) determining the lean angle of the rider with respect to the ground based on the lean angle of the rider with respect to the vehicle and the lean angle of the vehicle with respect to the ground, and (iv) determining, based on the lean angle of the rider with respect to the ground, the posture of the rider as the normal posture or the abnormal posture. The steps (i) to (iv) are performed by the control unit of the vehicle. The control unit is in communication with the one or more lean angle sensors mounted on the vehicle.

BRIEF DESCRIPTION OF THE DRAWINGS
[018] Reference will be made to embodiments of the invention, examples of which may be illustrated in accompanying figures. These figures are intended to be illustrative, not limiting. Although the invention is generally described in context of these embodiments, it should be understood that it is not intended to limit the scope of the invention to these particular embodiments.
Figure 1 is a block diagram of a system for assisting a rider of a vehicle, in accordance with an embodiment of the present invention.
Figure 2 is a block diagram of a system for assisting a rider of the vehicle, in accordance with another embodiment of the present invention.
Figure 3 is a block diagram of a system for assisting a rider of the vehicle, in accordance with another embodiment of the present invention.
Figure 4 is a block diagram of a system for assisting a rider of the vehicle, in accordance with another embodiment of the present invention.
Figure 5 is a flow chart illustrating a method for assisting a rider of the vehicle, in accordance with an embodiment of the present invention.
Figure 6a, Figure 6b and Figure 6c together form a flow chart illustrating a method for assisting a rider of the vehicle, in accordance with another embodiment of the present invention.
Figure 7a and Figure 7b together form a flow chart illustrating a method for assisting a rider of the vehicle, in accordance with another embodiment of the present invention.
DETAILED DESCRIPTION OF THE INVENTION
[019] Various features and embodiments of the present invention will be discernible from the following further description thereof, set out hereunder.

[020] Figure 1 is a block diagram of a system 102 for assisting a rider of the vehicle 100, in accordance with an embodiment of the present invention. The term “vehicle” includes saddle type vehicles as well as passenger type vehicles. The saddle type vehicle comprises two-wheelers, three-wheelers and four-wheelers such as bicycles, scooters, motorcycles and the like. The passenger vehicle comprises three-wheelers and four-wheelers such as auto rickshaws, cars, lorries, trucks and the like. The term “vehicle” also includes conventional internal combustion engine vehicles, electric vehicles and hybrid vehicles.
[021] As shown, the system 102 comprises one or more speed sensors 104, one or more image capturing units 106 and a control unit 108. The one or more speed sensors 104 are mounted on the vehicle 100 and configured to detect a speed of the vehicle 100. The one or more image capturing units 106 are mounted on the vehicle 100 and configured to capture one or more images and/or videos of a rider in real time. The control unit 108 is also mounted on the vehicle 100 and is in communication with the one or more speed sensors 104 and the one or more image capturing units 106. The speed of the vehicle 100 is received by the control unit 108 from the one or more speed sensors 104. On detection of speed of the vehicle 100 being greater than a pre-defined speed, the control unit 108 is configured to determine a posture of the rider as a normal posture or an abnormal posture. In case the abnormal posture is maintained by the rider for a time period greater than the pre-defined duration, the control unit 108 is configured to perform one or more pre-defined operations. The one or more pre-defined operations include any actions performed by the control unit 108 to alert the rider and/or control the speed of the vehicle. The term “control” refers to operation performed by the control unit 108 to limit the speed of the vehicle 100 such as adaptive cruise control or arrest the speed of the vehicle 100 such as emergency braking operations.
[022] In an embodiment, the image capturing unit 106 is a camera. In vehicles 100 having a camera for authentication of the rider of the vehicle, the same camera may be used for capturing one or more images and/or videos of the rider of the vehicle 100. The image capturing unit 106 is capable of working both in a day mode and a night mode.
[023] In an embodiment, the one or more image capturing units 106 are mounted on a front portion of the vehicle 100 such as dashboard and/or instrument cluster of the vehicle 100 and/or a rear portion of the vehicle 100 such as grab rail of the vehicle 100.
[024] In an embodiment, the pre-defined speed is 5 km per hour. This value, however, should not be construed as limiting and may include any value pre-configured in the control unit 108 by the manufacturer of the vehicle 100.
[025] In an embodiment, the pre-defined duration is 30 seconds. This value, however, should not be construed as limiting and may include any value pre-configured in the control unit 108 by the manufacturer of the vehicle 100.
[026] In a saddle type vehicle, the term “rider” may include both the rider of the vehicle and a pillion rider seated behind the rider of the vehicle 100. The control unit 108 may be configured to determine posture of both the rider and pillion rider. It is to be understood that abnormal posture of the pillion rider in a saddle type vehicle may also result in accident of the vehicle. In such a scenario, the one or more image capturing units 106 are mounted such that the one or more image(s) and/or video(s) of the rider and the pillion rider can be captured. In one non-limiting example, the one or more image capturing units 106 may be mounted on a front portion of the vehicle 100 and a rear portion of the vehicle 100.

[027] Figure 2 is a block diagram of a system 202 for assisting a rider of the vehicle 200, in accordance with another embodiment of the present invention.
[028] As shown, the system 202 comprises one or more speed sensors 204, one or more image capturing units 206 and a control unit 208. The construction, interrelation and function of the one or more speed sensors 204, the one or more image capturing units 206 and the control unit 208 is same as defined for the one or more speed sensors 104, the one or more image capturing units 106 and the control unit 108 in Figure 1 of the present invention.
[029] Figure 2 additionally illustrates one or more devices to perform the one or more pre-defined operations by the control unit 208. As shown, the control unit 208 is in communication with an audio alert device 210, a visual alert device 212, a haptic alert device 214 and a speed control device 216. The audio alert device 210, such as a buzzer or a horn, is mounted on the vehicle 200 and is configured to generate an audio alert. The visual alert device 212, such as light emitting diodes, is mounted on the vehicle 200 and is configured to generate a visual alert. The haptic alert device 214, such as a vibrator capable of producing vibration, is mounted on the vehicle 200 and is configured to generate a haptic alert. The audio alert device 210, the visual alert device 212 and the haptic alert device 214 alert the rider to correct the abnormal posture to avoid accidents/collisions. The control unit 208, through the speed control device 216, controls the speed of the vehicle 200 to prevent accidents/collisions. The term “control” here refers to an operation performed by the control unit 208 to limit the speed of the vehicle 200, such as adaptive cruise control, or to arrest the speed of the vehicle 200, such as emergency braking operations. It is to be understood that the audio alert device 210, the visual alert device 212, the haptic alert device 214 and the speed control device 216 may be operated independently or in combination with each other. Also, the audio alert device 210, the visual alert device 212, the haptic alert device 214 and the speed control device 216 include already known or later developed devices.
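The independent-or-combined operation of the four devices can be illustrated with a small dispatch sketch. The device stand-ins below are hypothetical names chosen for illustration; they merely record which alerts were triggered, whereas the real devices 210–216 would drive a buzzer, LEDs, a vibrator and the speed control hardware.

```python
from dataclasses import dataclass, field

@dataclass
class AlertDevices:
    """Hypothetical stand-ins for devices 210-216; each call is recorded."""
    log: list = field(default_factory=list)

    def audio(self):
        self.log.append("audio")   # e.g. buzzer or horn (device 210)

    def visual(self):
        self.log.append("visual")  # e.g. light emitting diodes (device 212)

    def haptic(self):
        self.log.append("haptic")  # e.g. vibrator (device 214)

    def speed(self):
        self.log.append("speed")   # e.g. cruise limit or emergency braking (device 216)

def perform_operations(devices, selected):
    """Operate any subset of the devices, independently or in combination."""
    for name in selected:
        getattr(devices, name)()
```

For example, `perform_operations(devices, ("audio", "speed"))` would sound the buzzer and limit the speed while leaving the visual and haptic devices idle.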
[030] Figure 3 is a block diagram of a system 302 for assisting a rider of the vehicle 300, in accordance with another embodiment of the present invention.
[031] As shown, the system 302 comprises one or more speed sensors 304, one or more image capturing units 306 and a control unit 308. The construction, interrelation and function of the one or more speed sensors 304, the one or more image capturing units 306 and the control unit 308 is same as defined for the one or more speed sensors 104, the one or more image capturing units 106 and the control unit 108 in Figure 1 of the present invention. Also, the construction, interrelation and function of the audio alert device 310, visual alert device 312, haptic alert device 314 and speed control device 316 is same as defined for the audio alert device 210, visual alert device 212, haptic alert device 214 and speed control device 216 in Figure 2 of the present invention.
[032] Figure 3 additionally illustrates one or more lean angle sensors 318 mounted on the vehicle 300 and in communication with the control unit 308 to determine the posture of the rider. The one or more lean angle sensors 318 are configured to detect a lean angle of the vehicle 300 with respect to a ground on which the vehicle 300 is travelling. The lean angle of the vehicle 300 with respect to the ground, detected by the one or more lean angle sensors 318, is transmitted to the control unit 308. The control unit 308 is configured to determine a lean angle of the rider with respect to the vehicle 300. The control unit 308 determines the lean angle of the rider with respect to the vehicle 300 by processing one or more images and/or videos captured by the one or more image capturing units 306. For the purpose of the present invention, the lean angle of the rider with respect to the vehicle 300 can be determined using the system 402 illustrated in Figure 4 and the method 600 illustrated in Figures 6a, 6b and 6c of the present invention. However, the system 402 and the method 600 to determine the lean angle of the rider with respect to the vehicle should not be construed as limiting and other now known or later developed systems and methods may also be used by the present invention to determine the lean angle of the rider with respect to the vehicle 300.
[033] On receiving the lean angle of the vehicle 300 from the one or more lean angle sensors 318 and on determining the lean angle of the rider with respect to the vehicle 300, the control unit 308 determines the lean angle of the rider with respect to the ground using now known or later developed techniques. Based on the lean angle of the rider with respect to the ground, the control unit 308 determines the posture of the rider as normal or abnormal. In an embodiment, the lean angle of the rider is compared with one or more pre-defined lean angles to determine the posture of the rider as normal or abnormal. The one or more pre-defined lean angles are pre-configured in the control unit 308 by the manufacturer of the vehicle 300. It is to be understood that too much inclination by the rider while the vehicle 300 is already leaning will cause slippage of the vehicle 300. Accordingly, too much inclination by the rider when the vehicle 300 is already leaning is an abnormal posture. The lean angle sensors may be inertial measurement units (IMUs), accelerometers, gyroscopes and other now known or later developed sensors. The lean angle sensors send roll and yaw rates of the vehicle to the control unit 308.
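Under a common sign convention, the rider's lean with respect to the ground is simply the sum of the two measured angles; the sketch below illustrates that combination and the comparison against a pre-defined limit. The function names, the sign convention (positive = leaning right) and the 40° limit are all illustrative assumptions, not values taken from the description.

```python
def rider_lean_wrt_ground(rider_wrt_vehicle_deg, vehicle_wrt_ground_deg):
    """Combine the rider-vs-vehicle lean (from the images/videos) with the
    vehicle-vs-ground lean (from the lean angle sensors 318).
    Assumes both angles share the same sign convention; degrees."""
    return rider_wrt_vehicle_deg + vehicle_wrt_ground_deg

def classify_lean(rider_wrt_ground_deg, max_normal_lean_deg=40.0):
    """Compare against a pre-defined lean angle (40 degrees is an assumed,
    illustrative limit): too much total inclination is an abnormal posture."""
    if abs(rider_wrt_ground_deg) > max_normal_lean_deg:
        return "abnormal"
    return "normal"
```

For instance, a rider leaning 20° beyond a vehicle already banked 25° totals 45° with respect to the ground, which this sketch would flag as abnormal.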
[034] Figure 4 is a block diagram of a system 402 for assisting a rider of the vehicle 400, in accordance with another embodiment of the present invention.

[035] As shown, the system 402 comprises one or more speed sensors 404, one or more image capturing units 406 and a control unit 408. The control unit 408 comprises a key feature identification unit 408a and a pose determination unit 408b. The one or more speed sensors 404 are mounted on the vehicle 400 and configured to detect a speed of the vehicle 400 and transmit the same to the control unit 408. The one or more image capturing units 406 are configured to capture one or more images and/or videos of the rider of the vehicle 400 and are in communication with the key feature identification unit 408a. The key feature identification unit 408a is configured to identify an initial set of key features of the rider in at least one of the one or more captured images and videos. The key feature identification unit 408a is further configured to assign a confidence score to each of the identified key features in the initial set of key features and determine, based on the confidence score, a final set of key features. The confidence score indicates the probability that a key feature exists in the one or more captured images and videos. The higher the confidence score, the greater the probability of finding the key feature in the one or more captured images and videos. In an embodiment, the key features having a confidence score of 0.6 or more are included in the final set of key features. However, this value should not be construed as limiting and may be set to a different value by the manufacturer.
The one or more key features of the rider comprise a nose of the rider, an upper lip of the rider, a lower lip of the rider, a chin of the rider, a jaw of the rider, a torso of the rider, a neck of the rider, left finger joints of the rider, right finger joints of the rider, a left eye of the rider, a right eye of the rider, a right ear of the rider, a left ear of the rider, a right shoulder of the rider, a left shoulder of the rider, a left elbow of the rider, a right elbow of the rider, a left wrist of the rider, a right wrist of the rider, a waist of the rider and/or a back of the rider. In one non-limiting example, at least 22 such key features may be identified by the key feature identification unit 408a.
[036] The pose determination unit 408b is in communication with the key feature identification unit 408a. The pose determination unit 408b is configured to determine angles and distances between the key features in the final set of key features to estimate a posture of the rider. For example, the distance between the eyes, the distance between the left shoulder and the chin, the angle of the chin from the left shoulder etc. are determined by the pose determination unit 408b. When the head is rotated to the right, the angle and distance of the chin from the left shoulder will change/increase, which gives an indication that the rider is looking in the right direction while riding the vehicle 400. In other words, the estimated posture of the rider indicates that he/she is looking in the right direction while riding the vehicle 400. The pose determination unit 408b is further configured to compare the estimated posture of the rider with at least one of one or more pre-defined normal postures and abnormal postures. The one or more pre-defined abnormal postures of the rider comprise one hand on a handlebar of the vehicle 400 and another hand away from the handlebar of the vehicle 400, both hands away from the handlebar of the vehicle 400, head turned partially or fully in a right direction or left direction of the vehicle 400, head turned partially or fully in an upward direction or downward direction of the vehicle 400, the rider standing on the vehicle 400, the rider leaning in the left or right direction of the vehicle 400 and/or the rider leaning in a front direction or rear direction of the vehicle 400. Based on said comparison, the pose determination unit 408b determines the estimated posture of the rider as normal posture or abnormal posture. In the above example, the posture of the rider looking in the right direction may be determined as the abnormal posture by the control unit 408.
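The confidence-score filtering and the angle/distance computation described above can be sketched as below. The keypoint names and the (x, y) pixel representation are assumptions for illustration; the 0.6 threshold is the example value given in the description, and the manufacturer may set another.

```python
import math

def final_keypoints(initial, threshold=0.6):
    """Steps of the key feature identification unit 408a (sketch): keep only
    the key features whose confidence score meets the threshold.
    initial maps a feature name to ((x, y), confidence_score)."""
    return {name: xy for name, (xy, score) in initial.items() if score >= threshold}

def angle_and_distance(p, q):
    """Step of the pose determination unit 408b (sketch): angle (degrees,
    measured from the horizontal) and Euclidean distance between two
    keypoints given as (x, y) pixel coordinates."""
    dx, dy = q[0] - p[0], q[1] - p[1]
    return math.degrees(math.atan2(dy, dx)), math.hypot(dx, dy)
```

For example, tracking the chin-to-left-shoulder angle over successive frames would reveal the change/increase that indicates the head rotating to the right.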
[037] The pose determination unit 408b and the key feature identification unit 408a may be data driven models such as trained neural networks, genetic algorithms and other now known or later developed algorithms. During the training phase of the data driven models, ranges of angles and polygons formed between different key features, defining different postures (normal and abnormal), are learned. Once trained and tested, in real time, the angles and joint parameters between the identified key points are obtained and classified by the key feature identification unit 408a and the pose determination unit 408b into different poses, and normal and abnormal poses are identified.
[038] On determination of the posture as the abnormal posture, the control unit 408 determines the time period for which the abnormal posture is maintained by the rider of the vehicle 400. In case such duration of time is greater than a pre-defined time period, the control unit performs one or more pre-defined operations. The one or more pre-defined operations comprise instructing an audio alert device 410 to generate an audio alert, instructing a visual alert device 412 to generate a visual alert, instructing a haptic alert device 414 to generate a haptic alert and/or instructing a speed control device 416 to control the speed of the vehicle 400. The audio alert device 410, visual alert device 412, haptic alert device 414 and speed control device 416 are mounted on the vehicle 400 and include now known or later developed devices.
[039] Figure 5 is a flow chart illustrating a method 500 for assisting a rider of the vehicle, in accordance with an embodiment of the present invention.
[040] At step 501, the method comprises detecting a speed of the vehicle 100, 200. The step of detecting the speed of the vehicle 100, 200 is performed by one or more speed sensors 104, 204 mounted on the vehicle 100, 200.
[041] At step 502, the method comprises capturing at least one of one or more images and one or more videos of the rider in real time. The step of capturing one or more image(s) and/or video(s) is performed by one or more image capturing units 106, 206 mounted on the vehicle 100, 200.
[042] At step 503, the method comprises determining, on detection of the speed being greater than a pre-defined speed, a posture of the rider as a normal posture or an abnormal posture based on at least one of the one or more captured images and videos. The step of determining the posture of the rider as normal posture or abnormal posture is performed by a control unit 108, 208 which is in communication with the one or more speed sensors 104, 204 and the one or more image capturing units 106, 206.
[043] At step 504, the method further comprises performing, on determination of the posture of the rider being the abnormal posture for a time-period greater than a pre-defined time period, one or more pre-defined operations. The step of performing the one or more pre-defined operation is performed by the control unit 108, 208.
[044] In an embodiment, the step 504 of performing comprises instructing an audio alert device 210 to generate an audio alert, instructing a visual alert device 212 to generate a visual alert, instructing a haptic alert device 214 to generate a haptic alert and/or instructing a speed control device 216 to control the speed of the vehicle. The term “control” here refers to an operation performed by the control unit to limit the speed of the vehicle, such as adaptive cruise control, or to arrest the speed of the vehicle, such as emergency braking operations.
[045] Figures 6a, 6b and 6c together form a flow chart illustrating a method 600 for assisting a rider of the vehicle 400, in accordance with another embodiment of the present invention.

[046] At step 601, the method comprises detecting a speed of the vehicle 400. The step of detecting the speed of the vehicle 400 is performed by one or more speed sensors 404 mounted on the vehicle 400.
[047] At step 602, the method comprises capturing at least one of one or more images and one or more videos of the rider in real time. The step of capturing one or more image(s) and/or video(s) is performed by one or more image capturing units 406 mounted on the vehicle 400.
[048] At step 603, the method comprises steps 603a, 603b, 603c, 603d, 603e and 603f for determining a posture of the rider as normal posture or abnormal posture. The step 603 is performed by a control unit 408 mounted on the vehicle 400 and in communication with the one or more speed sensors 404 and the one or more image capturing units 406.
[049] At step 603a, the method comprises identifying an initial set of key features of the rider in at least one of the one or more captured images and videos. The step of identifying the initial set of key features is performed by a key feature identification unit 408a of the control unit 408. Each frame of the image or video is fed to the key feature identification unit 408a. The one or more key features of the rider may comprise a nose of the rider, an upper lip of the rider, a lower lip of the rider, a chin of the rider, a jaw of the rider, a torso of the rider, a neck of the rider, left finger joints of the rider, right finger joints of the rider, a left eye of the rider, a right eye of the rider, a right ear of the rider, a left ear of the rider, a right shoulder of the rider, a left shoulder of the rider, a left elbow of the rider, a right elbow of the rider, a left wrist of the rider, a right wrist of the rider, a waist of the rider and/or a back of the rider.

[050] At step 603b, the method comprises assigning a confidence score to each of the identified key features in the initial set of key features. The step of assigning is performed by the key feature identification unit 408a of the control unit 408. The higher the confidence score, the greater the probability of finding the key feature in the captured image(s) and/or video(s). In an embodiment, the key features having a confidence score of 0.6 or more are included in the final set of key features. However, this value should not be construed as limiting and may be set to a different value by the manufacturer.
[051] At step 603c, the method comprises determining, based on the confidence score, a final set of key features. The step of determining the final set of key features is performed by the key feature identification unit 408a of the control unit 408.
[052] At step 603d, the method comprises determining angles and distances between the key features in the final set of key features to estimate a posture of the rider. The step of determining angles and distance is performed by a pose determination unit 408b of the control unit 408.
[053] At step 603e, the method comprises comparing the estimated posture of the rider with one or more pre-defined normal postures and abnormal postures. The step of comparing the estimated posture with the one or more pre-defined normal postures and abnormal postures is performed by the pose determination unit 408b of the control unit 408. The one or more pre-defined abnormal postures of the rider comprise one hand on a handlebar of the vehicle and another hand away from the handlebar of the vehicle 400, both hands away from the handlebar of the vehicle 400, head turned partially or fully in a right direction or left direction of the vehicle 400, head turned partially or fully in an upward direction or downward direction of the vehicle, the rider standing on the vehicle 400, the rider leaning in the left or right direction of the vehicle 400 and/or the rider leaning in a front direction or rear direction of the vehicle 400.
[054] At step 603f, the method comprises determining, based on the comparison in step 603e, the estimated posture of the rider as normal posture or abnormal posture. The step of determining the estimated posture as normal posture or abnormal posture is performed by the pose determination unit 408b of the control unit 408.
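Steps 603e and 603f can be sketched as a comparison of the measured angles/distances against stored posture templates. The representation below — a dict of named measurements checked against per-posture ranges — is an assumed illustration; in the described embodiment these ranges would come from the trained data driven models, not from hand-written constants.

```python
# Illustrative templates only; real ranges are learned during training.
PREDEFINED_POSTURES = {
    "normal":   {"chin_left_shoulder_angle": (-30.0, 30.0)},
    "abnormal": {"chin_left_shoulder_angle": (30.0, 90.0)},  # e.g. head turned right
}

def classify_estimated_posture(measurements, templates=PREDEFINED_POSTURES):
    """Sketch of steps 603e/603f: label the estimated posture by the first
    template whose ranges all contain the measured values.

    measurements maps a measurement name (angle in degrees or distance in
    pixels, as produced at step 603d) to its value."""
    for label, ranges in templates.items():
        if all(lo <= measurements[k] <= hi for k, (lo, hi) in ranges.items()):
            return label
    return "abnormal"  # unmatched estimates treated conservatively
```

A measurement of 10° for the chin-to-left-shoulder angle would classify as normal here, while 55° (head well rotated to the right) would classify as abnormal.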
[055] At step 604, the method comprises performing, on determination of the posture of the rider being the abnormal posture for a time-period greater than a pre-defined time period, one or more pre-defined operations. The step of performing the one or more predefined operation is performed by the control unit 408.
[056] In an embodiment, the step 604 of performing comprises instructing an audio alert device 410 to generate an audio alert, instructing a visual alert device 412 to generate a visual alert, instructing a haptic alert device 414 to generate a haptic alert and/or instructing a speed control device 416 to limit the speed of the vehicle 400.
[057] Figures 7a and 7b together form a flow chart illustrating a method 700 for assisting a rider of the vehicle 300, in accordance with another embodiment of the present invention.
[058] At step 701, the method comprises detecting a speed of the vehicle 300. The step of detecting the speed of the vehicle 300 is performed by one or more speed sensors 304 mounted on the vehicle 300.
[059] At step 702, the method comprises capturing at least one of one or more images and one or more videos of the rider in real time. The step of capturing the one or more image(s) and/or video(s) is performed by one or more image capturing units 306 mounted on the vehicle 300.

[060] At step 703, the method comprises steps 703a, 703b, 703c and 703d for determining a posture of the rider as normal posture or abnormal posture. The step 703 is performed by a control unit 308 mounted on the vehicle 300 and in communication with the one or more speed sensors 304 and the one or more image capturing units 306.
[061] At step 703a, the method comprises determining a lean angle of the rider with respect to the vehicle 300 based on at least one of the one or more captured images and videos.
[062] At step 703b, the method comprises receiving a lean angle of the vehicle 300 with respect to a ground on which the vehicle is travelling. The lean angle of the vehicle 300 with respect to the ground is detected by one or more lean angle sensors 318 mounted on the vehicle 300 and transmitted to the control unit 308.
[063] At step 703c, the method comprises determining the lean angle of the rider with respect to the ground based on the lean angle of the rider with respect to the vehicle 300 and the lean angle of the vehicle 300 with respect to the ground.
[064] At step 703d, the method comprises determining, based on the lean angle of the rider with respect to the ground, the posture of the rider as the normal posture or the abnormal posture. In an embodiment, the lean angle of the rider with respect to the ground is compared with one or more pre-defined lean angle values indicating a normal posture or an abnormal posture. It is to be understood that too much inclination by the rider while the vehicle is already leaning will cause slippage of the vehicle. Accordingly, too much inclination by the rider when the vehicle is already leaning is an abnormal posture.
[065] At step 704, the method comprises performing, on determination of the posture of the rider being the abnormal posture for a time-period greater than a pre-defined time period, one or more pre-defined operations. The step of performing the one or more predefined operation is performed by the control unit 308.
[066] In an embodiment, the step of performing comprises instructing an audio alert device 310 to generate an audio alert, instructing a visual alert device 312 to generate a visual alert, instructing a haptic alert device 314 to generate a haptic alert and/or instructing a speed control device 316 to control the speed of the vehicle. The term “control” here refers to an operation performed by the control unit to limit the speed of the vehicle, such as adaptive cruise control, or to arrest the speed of the vehicle, such as emergency braking operations.
[067] It is to be understood that typical hardware configuration of the control unit 108, 208, 308, 408 can include a set of instructions that can be executed to cause the control unit 108, 208, 308, 408 to perform the above-disclosed method.
[068] The control unit 108, 208, 308, 408 may include a processor which may be a central processing unit (CPU), a graphics processing unit (GPU), or both. The processor may be one or more general processors, digital signal processors, application specific integrated circuits, field programmable gate arrays, servers, networks, digital circuits, analog circuits, combinations thereof, or other now known or later developed devices for analysing and processing data. The processor may implement a software program, such as code generated manually i.e. programmed.
[069] The control unit 108, 208, 308, 408 may include a memory. The memory may be a main memory, a static memory, or a dynamic memory. The memory may include, but is not limited to computer readable storage media such as various types of volatile and nonvolatile storage media, including but not limited to random access memory, read-only memory, programmable read-only memory, electrically programmable read-only memory, electrically erasable read-only memory, flash memory, magnetic tape or disk, optical media and the like. The memory is operable to store instructions executable by the processor. The functions, acts or tasks illustrated in the figures or described may be performed by the programmed processor executing the instructions stored in the memory.
[070] The control unit 108, 208, 308, 408 may further include a display unit such as a liquid crystal display (LCD), an organic light emitting diode (OLED), a flat panel display, a solid state display, a cathode ray tube or other now known or later developed display device for outputting determined information. The display may act as an interface for the user to see the functioning of the processor, or specifically as an interface with the software stored in the memory.
[071] Additionally, the control unit 108, 208, 308, 408 may include an input device configured to allow a user to interact with any of the components of the control unit 108, 208, 308, 408. The input device may be a number pad, a keyboard, or a cursor control device, such as a mouse, or a joystick, touch screen display, remote control or any other device operative to interact with the control unit 108, 208, 308, 408.
[072] The control unit 108, 208, 308, 408 may also include a disk or optical drive unit. The disk drive unit may include a computer-readable medium in which one or more sets of instructions, e.g. software, can be embedded. Further, the instructions may embody one or more of the methods or logic as described. In a particular example, the instructions may reside completely, or at least partially, within the memory or within the processor during execution by the control unit 108, 208, 308, 408. The memory and the processor also may include computer-readable media as discussed above. The present invention contemplates a computer-readable medium that includes instructions or receives and executes instructions responsive to a propagated signal so that a device connected to a network can communicate data over the network. Further, the instructions may be transmitted or received over the network. The network may include wired networks, wireless networks, Ethernet AVB networks, or combinations thereof. The wireless network may be a cellular telephone network. Further, the network may be a public network, such as the Internet, a private network, such as an intranet, or combinations thereof, and may utilize a variety of networking protocols now available or later developed.
[073] The claimed features/method steps of the present invention as discussed above are not routine, conventional, or well understood in the art, as the claimed steps enable the following solutions to the existing problems in conventional technologies. Specifically, the technical problem of the rider having an abnormal posture while riding the vehicle 100, 200, 300, 400 is solved by the present invention.
[074] The present invention increases the safety of the rider of the vehicle 100, 200, 300, 400. The rider is alerted when the present invention determines that the posture of the rider has been abnormal for a time period greater than a predefined time period. The present invention can also perform one or more controlling operations on the vehicle 100, 200, 300, 400, such as reducing the speed of the vehicle 100, 200, 300, 400 or performing emergency braking operations, to prevent an accident when the posture of the rider is determined to be abnormal.
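The escalation described above — alert first, then control the speed if the abnormal posture persists — can be sketched as a small state machine. This is a minimal illustration only: the threshold values and operation names are assumptions, since the claims specify only a generic pre-defined time period and pre-defined operations.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class PostureMonitor:
    """Escalating response to a persistently abnormal rider posture.

    The thresholds below are illustrative assumptions, not values taken
    from the specification.
    """
    alert_after_s: float = 3.0        # alert the rider after this much abnormal time
    limit_speed_after_s: float = 8.0  # arrest vehicle speed after this much
    abnormal_since: Optional[float] = None

    def update(self, posture_abnormal: bool, now_s: float) -> str:
        """Return the pre-defined operation to perform at time `now_s`."""
        if not posture_abnormal:
            self.abnormal_since = None   # posture corrected: reset the timer
            return "none"
        if self.abnormal_since is None:
            self.abnormal_since = now_s  # abnormal posture first observed
        elapsed = now_s - self.abnormal_since
        if elapsed >= self.limit_speed_after_s:
            return "limit_speed"         # e.g. instruct the speed control device
        if elapsed >= self.alert_after_s:
            return "alert"               # e.g. audio, visual or haptic alert
        return "none"
```

Note that returning to a normal posture resets the timer, so only a continuously abnormal posture triggers the speed-control operation.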
[075] The present invention increases the rider's awareness of his/her riding pattern on the vehicle 100, 200, 300, 400. When an alert is provided on determination of the abnormal posture, the rider may correct his/her posture while riding the vehicle 100, 200, 300, 400. The present invention may also monitor the postures of the rider and recommend better riding postures in place of a bad posture, for less fatigue and increased safety. The present invention enables the rider to understand the posture in which he/she rides most of the time, and this information can be used to suggest to the rider how to improve the riding posture and thus feel less fatigued.
[076] The present invention provides safety-critical functions, such as controlling or arresting the speed of the vehicle, on determination that an abnormal posture has been maintained by the rider for a time interval greater than a pre-defined time interval, and also when the rider does not correct his/her posture after one or more alerts.
[077] The present invention does not require multiple sensors on the body of the rider to determine the posture of the rider which, in turn, decreases the overall cost of the vehicle.
[078] The present invention may also determine the presence and posture of the pillion rider in case of saddle type vehicles.
[079] The present invention can also be used to identify the load on the vehicle, i.e. the number of riders on the vehicle, without the use of additional sensors. The number of riders on the vehicle can be determined by the one or more image capturing units 106, 206, 306, 406 of the present invention.
[080] While the present invention has been described with respect to certain embodiments, it will be apparent to those skilled in the art that various changes and modifications may be made without departing from the scope of the invention as defined in the following claims.

CLAIMS:
1. A system (102, 202, 302, 402) for assisting a rider of a vehicle (100, 200, 300, 400), the system comprising:
- one or more speed sensors (104, 204, 304, 404) mounted on the vehicle (100, 200, 300, 400), the one or more speed sensors (104, 204, 304, 404) configured to detect a speed of the vehicle (100, 200, 300, 400);
- one or more image capturing units (106, 206, 306, 406) mounted on the vehicle (100, 200, 300, 400), the one or more image capturing units (106, 206, 306, 406) configured to capture at least one of one or more images and one or more videos of the rider in real time;
- a control unit (108, 208, 308, 408) mounted on the vehicle (100, 200, 300, 400) and in communication with the one or more speed sensors (104, 204, 304, 404) and the one or more image capturing units (106, 206, 306, 406), the control unit (108, 208, 308, 408) configured to:
• determine, on detection of the speed being greater than a pre-defined speed, a posture of the rider as a normal posture or an abnormal posture based on at least one of the one or more captured images and videos; and
• perform, on determination of the posture of the rider being the abnormal posture for a time-period greater than a pre-defined time period, one or more pre-defined operations.
2. The system (402) as claimed in claim 1, wherein the control unit (408) comprises:
- a key feature identification unit (408a), in communication with the one or more image capturing units (406), configured to:
• identify an initial set of key features of the rider in at least one of the one or more captured images and videos;
• assign a confidence score to each of the identified key features in the initial set of key features; and
• determine, based on the confidence score, a final set of key features;
- a pose determination unit (408b), in communication with the key feature identification unit (408a), configured to:
• determine angles and distances between the key features in the final set of key features to estimate a posture of the rider;
• compare the estimated posture of the rider with at least one of one or more pre-defined normal postures and abnormal postures; and
• determine, based on said comparison, the estimated posture of the rider as normal posture or abnormal posture.
3. The system (202, 302, 402) as claimed in claim 1, wherein the one or more pre-defined operations comprise at least one of: instructing an audio alert device (210, 310, 410) to generate an audio alert; instructing a visual alert device (212, 312, 412) to generate a visual alert; instructing a haptic alert device (214, 314, 414) to generate a haptic alert; and instructing a speed control device (216, 316, 416) mounted on the vehicle to control the speed of the vehicle (100, 200, 300, 400).
4. The system (102, 202, 302, 402) as claimed in claim 1, wherein the one or more image capturing units (106, 206, 306, 406) are mounted on at least one of: a dashboard of the vehicle (100, 200, 300, 400) and an instrument cluster of the vehicle (100, 200, 300, 400).
5. The system (402) as claimed in claim 2, wherein the one or more key features of the rider comprise at least one of: a nose of the rider; an upper lip of the rider; a lower lip of the rider; a chin of the rider; a jaw of the rider; a torso of the rider; a neck of the rider; left finger joints of the rider; right finger joints of the rider; a left eye of the rider; a right eye of the rider; a right ear of the rider; a left ear of the rider; a right shoulder of the rider; a left shoulder of the rider; a left elbow of the rider; a right elbow of the rider; a left wrist of the rider; a right wrist of the rider; a waist of the rider; and a back of the rider.
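Claims 2 and 5 above describe a pipeline that filters detected key features by a confidence score and classifies the posture from angles and distances between the surviving features. The following sketch is a minimal illustration under stated assumptions: the keypoint names, the 0.5 confidence cutoff, the single elbow-angle rule, and the 140–180 degree "normal" range are all hypothetical, not values from the specification.

```python
import math

def filter_keypoints(keypoints, min_confidence=0.5):
    """Keep only key features whose detection confidence clears a threshold.

    `keypoints` maps a feature name (e.g. "left_shoulder") to
    ((x, y), confidence); the 0.5 cutoff is an illustrative assumption.
    """
    return {name: xy for name, (xy, conf) in keypoints.items()
            if conf >= min_confidence}

def joint_angle(a, b, c):
    """Angle in degrees at point b, formed by segments b->a and b->c."""
    ang = math.degrees(
        math.atan2(c[1] - b[1], c[0] - b[0]) -
        math.atan2(a[1] - b[1], a[0] - b[0])
    )
    ang = abs(ang)
    return 360.0 - ang if ang > 180.0 else ang

def classify_posture(points, normal_elbow_range=(140.0, 180.0)):
    """Toy rule: a nearly straight left arm (shoulder-elbow-wrist) is normal."""
    angle = joint_angle(points["left_shoulder"],
                        points["left_elbow"],
                        points["left_wrist"])
    lo, hi = normal_elbow_range
    return "normal" if lo <= angle <= hi else "abnormal"
```

A real implementation would compare many such angles and distances against the pre-defined normal and abnormal postures rather than a single joint.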
6. The system (402) as claimed in claim 2, wherein the pre-defined abnormal postures of the rider comprise at least one of: one hand on a handlebar of the vehicle (400) and another hand away from the handlebar of the vehicle (400); both hands away from the handlebar of the vehicle (400); head turned partially or fully in a right direction or left direction of the vehicle (400); head turned partially or fully in an upward direction or downward direction of the vehicle (400); the rider standing on the vehicle (400); the rider leaning in the left or right direction of the vehicle (400); and the rider leaning in a front direction or rear direction of the vehicle (400).
7. The system (302) as claimed in claim 1, wherein the control unit (308) is configured to: receive a lean angle of the vehicle (300) with respect to the ground from one or more lean angle sensors (318) mounted on the vehicle (300); determine a lean angle of the rider with respect to the vehicle (300) based on at least one of the one or more captured images and videos; determine a lean angle of the rider with respect to the ground based on the lean angle of the vehicle (300) with respect to the ground and the lean angle of the rider with respect to the vehicle (300); and determine the posture of the rider as the normal posture or the abnormal posture based on the lean angle of the rider with respect to the ground.
8. A method (500, 600, 700) for assisting a rider of a vehicle (100, 200, 300, 400), the method comprising: detecting (501, 601, 701), by one or more speed sensors (104, 204, 304, 404) mounted on the vehicle (100, 200, 300, 400), a speed of the vehicle (100, 200, 300, 400); capturing (502, 602, 702), by one or more image capturing units (106, 206, 306, 406) mounted on the vehicle (100, 200, 300, 400), at least one of one or more images and one or more videos of the rider in real time; determining (503, 603, 703), by a control unit (108, 208, 308, 408), on detection of the speed being greater than a pre-defined speed, a posture of the rider as a normal posture or an abnormal posture based on at least one of the one or more captured images and videos; and performing (503, 603, 703), by the control unit (108, 208, 308, 408), on determination of the posture of the rider being the abnormal posture for a time-period greater than a pre-defined time period, one or more pre-defined operations.
9. The method (500, 600, 700) as claimed in claim 8, wherein the step of performing comprises at least one of: instructing, by the control unit (108, 208, 308, 408), an audio alert device (210, 310, 410) to generate an audio alert; instructing, by the control unit (108, 208, 308, 408), a visual alert device (212, 312, 412) to generate a visual alert; instructing, by the control unit (108, 208, 308, 408), a haptic alert device (214, 314, 414) to generate a haptic alert; and instructing, by the control unit (108, 208, 308, 408), a speed control device (216, 316, 416) to control the speed of the vehicle (100, 200, 300, 400).
10. The method (600) as claimed in claim 8, wherein the step of determining the posture of the rider comprises: identifying (603a), by a key feature identification unit (408a), an initial set of key features of the rider in at least one of the one or more captured images and videos; assigning (603b), by the key feature identification unit (408a), a confidence score to each of the identified key features in the initial set of key features; determining (603c), by the key feature identification unit (408a), based on the confidence score, a final set of key features; determining (603d), by a pose determination unit (408b), angles and distances between the key features in the final set of key features to estimate a posture of the rider; comparing (603e), by the pose determination unit (408b), the estimated posture of the rider with one or more pre-defined normal postures and abnormal postures; and determining (603f), by the pose determination unit (408b), based on the comparison, the estimated posture of the rider as normal posture or abnormal posture.
11. The method (700) as claimed in claim 8, wherein the step of determining the posture of the rider comprises:
- determining (703a), by the control unit (308), a lean angle of the rider with respect to the vehicle (300) based on at least one of the one or more captured images and videos;
- receiving (703b), by the control unit (308) in communication with one or more lean angle sensors (318) mounted on the vehicle (300), a lean angle of the vehicle (300) with respect to a ground;
- determining (703c), by the control unit (308), the lean angle of the rider with respect to the ground based on the lean angle of the rider with respect to the vehicle (300) and the lean angle of the vehicle (300) with respect to the ground; and
- determining (703d), by the control unit (308), based on the lean angle of the rider with respect to the ground, the posture of the rider as the normal posture or the abnormal posture.
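The composition of lean angles in steps 703a–703d above can be sketched as follows. The simple additive combination, the sign convention (positive values for leaning right), and the 40 degree limit are all assumptions for illustration; the claims do not fix any of these.

```python
def rider_lean_wrt_ground(vehicle_lean_deg: float,
                          rider_lean_wrt_vehicle_deg: float) -> float:
    """Combine the vehicle's lean with respect to the ground (from the lean
    angle sensor) with the rider's lean relative to the vehicle (estimated
    from captured images).

    An additive composition with positive = leaning right is assumed here.
    """
    return vehicle_lean_deg + rider_lean_wrt_vehicle_deg

def posture_from_lean(lean_deg: float,
                      max_normal_lean_deg: float = 40.0) -> str:
    """Classify the posture from the rider's lean with respect to the ground.

    The 40 degree threshold is an illustrative value, not from the claims.
    """
    return "normal" if abs(lean_deg) <= max_normal_lean_deg else "abnormal"
```

For example, a vehicle leaned 30 degrees in a corner with the rider hanging a further 15 degrees off it would exceed the assumed 40 degree limit and be classified as abnormal.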
PCT/IN2023/050219 2022-07-12 2023-03-08 A system and method for assisting a rider of a vehicle WO2024013760A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
IN202241040055 2022-07-12
IN202241040055 2022-07-12

Publications (1)

Publication Number Publication Date
WO2024013760A1 true WO2024013760A1 (en) 2024-01-18

Family

ID=85985030

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IN2023/050219 WO2024013760A1 (en) 2022-07-12 2023-03-08 A system and method for assisting a rider of a vehicle

Country Status (1)

Country Link
WO (1) WO2024013760A1 (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102011002920A1 (en) * 2011-01-20 2012-07-26 Robert Bosch Gmbh Method for monitoring the posture of a motorcyclist
US20200104617A1 (en) * 2017-06-11 2020-04-02 Jungo Connectivity Ltd. System and method for remote monitoring of a human
EP3604097B1 (en) * 2017-04-07 2022-05-11 Yamaha Hatsudoki Kabushiki Kaisha Steering input information acquisition device


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
CHELI F. ET AL: "Vision-based measuring system for rider's pose estimation during motorcycle riding", MECHANICAL SYSTEMS AND SIGNAL PROCESSING, vol. 38, no. 2, 1 July 2013 (2013-07-01), AMSTERDAM, NL, pages 399-410, XP093055895, ISSN: 0888-3270, DOI: 10.1016/j.ymssp.2013.01.009 *

Similar Documents

Publication Publication Date Title
JP7005933B2 (en) Driver monitoring device and driver monitoring method
US10936888B2 (en) Apparatus detecting driving incapability state of driver
US10909399B2 (en) Apparatus detecting driving incapability state of driver
US10572746B2 (en) Apparatus detecting driving incapability state of driver
US20210394751A1 (en) Information processing apparatus, information processing method, and program
US10501051B2 (en) Control device, control method, program, and control system
JP6848912B2 (en) Status determination device, status determination program and computer readable continuous tangible recording medium
US20210195981A1 (en) System and method for monitoring a cognitive state of a rider of a vehicle
JP2021155032A (en) Automatically estimating skill levels and confidence levels of drivers
WO2019198179A1 (en) Passenger state determination device, alarm output control device, and passenger state determination method
JP2020091672A (en) Processing apparatus and processing method for system for supporting rider of saddle-riding type vehicle, system for supporting rider of saddle-riding type vehicle, and saddle-riding type vehicle
US20230008012A1 (en) Rider-assistance system and control method for rider-assistance system
US11500470B2 (en) System and method of vehicle aware gesture recognition in vehicles with smart helmets
US11685372B2 (en) Processing unit and processing method for collision warning system, collision warning system, and motorcycle
US11904907B2 (en) Autonomous driving vehicle, method for controlling autonomous driving vehicle, and program
JP6631545B2 (en) Dependency estimator
WO2024013760A1 (en) A system and method for assisting a rider of a vehicle
JP2008105511A (en) Driving support apparatus
JP7348734B2 (en) Vehicle driving information providing device, vehicle driving information providing method and program
JP6617602B2 (en) Maneuvering detection system and maneuvering detection method
US11142215B2 (en) Processing unit and processing method for inter-vehicular distance warning system, inter-vehicular distance warning system, and motorcycle
US20200104590A1 (en) Eyeball information detection device, eyeball information detection method, and occupant monitoring device
US20240199161A1 (en) Controller and control method for assistance system
TWI843514B (en) Personalized driving early warning method and system
WO2018163536A1 (en) Driver body condition recovery support device, method and program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23716693

Country of ref document: EP

Kind code of ref document: A1