WO2024038076A1 - Method and system to alert a rider on a vehicle based on rider posture for safe riding - Google Patents

Method and system to alert a rider on a vehicle based on rider posture for safe riding Download PDF

Info

Publication number
WO2024038076A1
Authority
WO
WIPO (PCT)
Prior art keywords
rider
vehicle
sensor data
data
computing unit
Prior art date
Application number
PCT/EP2023/072529
Other languages
French (fr)
Inventor
Aditya RAO
Original Assignee
Continental Automotive Technologies GmbH
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Continental Automotive Technologies GmbH filed Critical Continental Automotive Technologies GmbH
Publication of WO2024038076A1 publication Critical patent/WO2024038076A1/en

Links

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B62LAND VEHICLES FOR TRAVELLING OTHERWISE THAN ON RAILS
    • B62JCYCLE SADDLES OR SEATS; AUXILIARY DEVICES OR ACCESSORIES SPECIALLY ADAPTED TO CYCLES AND NOT OTHERWISE PROVIDED FOR, e.g. ARTICLE CARRIERS OR CYCLE PROTECTORS
    • B62J6/00Arrangement of optical signalling or lighting devices on cycles; Mounting or supporting thereof; Circuits therefor
    • B62J6/22Warning or information lights
    • B62J6/24Warning or information lights warning or informing the rider, e.g. low fuel warning lights
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B62LAND VEHICLES FOR TRAVELLING OTHERWISE THAN ON RAILS
    • B62JCYCLE SADDLES OR SEATS; AUXILIARY DEVICES OR ACCESSORIES SPECIALLY ADAPTED TO CYCLES AND NOT OTHERWISE PROVIDED FOR, e.g. ARTICLE CARRIERS OR CYCLE PROTECTORS
    • B62J45/00Electrical equipment arrangements specially adapted for use as accessories on cycles, not otherwise provided for
    • B62J45/40Sensor arrangements; Mounting thereof
    • B62J45/41Sensor arrangements; Mounting thereof characterised by the type of sensor
    • B62J45/415Inclination sensors
    • B62J45/4151Inclination sensors for sensing lateral inclination of the cycle
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B62LAND VEHICLES FOR TRAVELLING OTHERWISE THAN ON RAILS
    • B62JCYCLE SADDLES OR SEATS; AUXILIARY DEVICES OR ACCESSORIES SPECIALLY ADAPTED TO CYCLES AND NOT OTHERWISE PROVIDED FOR, e.g. ARTICLE CARRIERS OR CYCLE PROTECTORS
    • B62J50/00Arrangements specially adapted for use on cycles not provided for in main groups B62J1/00 - B62J45/00
    • B62J50/20Information-providing devices
    • B62J50/21Information-providing devices intended to provide information to rider or passenger
    • B62J50/22Information-providing devices intended to provide information to rider or passenger electronic, e.g. displays

Definitions

  • the alerting module 217 alerts the rider for safe riding upon determination of the weight imbalance.
  • the alerting module 217 alerts the rider using at least one of an audio signal, a haptic signal and a visual signal. For instance, the alerting module 217 alerts the rider if the rider is entering a corner with too much speed and/or a bad posture that can affect the centre of gravity of the vehicle, which can in turn lead to skidding.
  • the alerting module 217 alerts the rider for safe riding upon determination of the rider’s eye focus imbalance.
  • the alerting module 217 alerts the rider using at least one of an audio signal, a haptic signal and a visual signal.
  • Figure 3 illustrates a flowchart showing a method for alerting a rider of a vehicle for safe riding in accordance with some embodiments of present disclosure.
  • the method 300 includes one or more blocks for alerting a rider of a vehicle for safe riding.
  • the method 300 may be described in the general context of computer executable instructions.
  • computer executable instructions can include routines, programs, objects, components, data structures, procedures, modules, and functions, which perform particular functions or implement particular abstract data types.
  • the order in which the method 300 is described is not intended to be construed as a limitation, and any number of the described method blocks can be combined in any order to implement the method. Additionally, individual blocks may be deleted from the methods without departing from the scope of the subject matter described herein. Furthermore, the method can be implemented in any suitable hardware, software, firmware, or combination thereof.
  • the receiving module 213 of the computing unit 101 receives at least one of sensor data and visual data related to a rider from the imaging and sensing unit (111).
  • the imaging and sensing unit (111) comprises pressure sensor arrays coupled to a seating unit of the vehicle and handlebar grips (i.e., the left handlebar grip and the right handlebar grip), at least one camera (121) arranged on a dashboard of the vehicle, wheel speed sensors coupled to wheels of the vehicle and an accelerometer and a gyroscope coupled to an inertial measurement unit of the vehicle.
  • the at least one camera (121) comprises at least one of a colour camera and an infrared camera.
  • the sensor data comprises data from at least one of pressure sensor arrays, wheel speed sensors, an accelerometer and a gyroscope.
  • the vehicle is a two wheeler or a three wheeler.
  • the determining module 215 of the computing unit 101 determines an absolute lean angle of a posture of the rider based on at least one of the sensor data and the visual data.
  • the determining module 215 of the computing unit 101 determines weight imbalance of the rider using the absolute lean angle and the sensor data when the absolute lean angle exceeds a predetermined threshold angle.
  • the alerting module 217 of the computing unit 101 alerts the rider for safe riding upon determination of the weight imbalance.
  • the alerting of the rider is performed using at least one of an audio signal, a haptic signal and a visual signal.
  • Figure 4 illustrates a block diagram of an exemplary computer system for implementing embodiments consistent with the present disclosure.
  • the computer system 400 is used to implement the computing unit 101.
  • the computer system 400 includes a central processing unit (“CPU” or “processor”) 402.
  • the processor 402 includes at least one data processor for alerting a rider of a vehicle for safe riding.
  • the processor 402 includes specialized processing units such as, integrated system (bus) controllers, memory management control units, floating point units, graphics processing units, digital signal processing units, and the like.
  • the processor 402 is disposed in communication with one or more input/output (I/O) devices (not shown in Figure 4) via I/O interface 401 .
  • the I/O interface 401 employs communication protocols/methods such as, without limitation, audio, analog, digital, monaural, Radio Corporation of America (RCA) connector, stereo, IEEE® 1394 high speed serial bus, serial bus, Universal Serial Bus (USB), infrared, Personal System/2 (PS/2) port, Bayonet Neill Concelman (BNC) connector, coaxial, component, composite, Digital Visual Interface (DVI), High Definition Multimedia Interface (HDMI®), Radio Frequency (RF) antennas, S Video, Video Graphics Array (VGA), IEEE® 802.11 b/g/n/x, Bluetooth, cellular e.g., Code Division Multiple Access (CDMA), High Speed Packet Access (HSPA+), Global System for Mobile communications (GSM®), Long Term Evolution (LTE®), Worldwide interoperability for Microwave access (WiMax®), or the like.
  • the computer system 400 uses the I/O interface 401 to communicate with one or more I/O devices such as input devices 412 and output devices 413.
  • the input devices 412 may be an antenna, keyboard, mouse, joystick, (infrared) remote control, camera, card reader, fax machine, dongle, biometric reader, microphone, touch screen, touchpad, trackball, stylus, scanner, storage device, transceiver, video device/source, etc.
  • the output devices 413 may be a printer, fax machine, video display (e.g., Cathode Ray Tube (CRT), Liquid Crystal Display (LCD), Light Emitting Diode (LED), plasma, Plasma Display Panel (PDP), Organic Light Emitting Diode display (OLED) or the like), audio speaker and the like.
  • the computer system 400 consists of the computing unit 101 .
  • the processor 402 is disposed in communication with the communication network 109 via a network interface 403.
  • the network interface 403 communicates with the communication network 109.
  • the network interface 403 employs connection protocols including, without limitation, direct connect, Ethernet (e.g., twisted pair 10/100/1000 Base T), Transmission Control Protocol/lnternet Protocol (TCP/IP), token ring, IEEE® 802.11 a/b/g/n/x and the like.
  • the communication network 109 includes, without limitation, a direct interconnection, Local Area Network (LAN), Wide Area Network (WAN), wireless network (e.g., using Wireless Application Protocol), the Internet and the like.
  • the computer system 400 uses the network interface 403 and the communication network 109 to communicate with the imaging and sensing unit 111.
  • the network interface 403 employs connection protocols that include, but not limited to, direct connect, Ethernet (e.g., twisted pair 10/100/1000 Base T), Transmission Control Protocol/lnternet Protocol (TCP/IP), token ring, IEEE® 802.11 a/b/g/n/x and the like.
  • the communication network 109 includes, but is not limited to, a direct interconnection, a Peer to Peer (P2P) network, Local Area Network (LAN), Wide Area Network (WAN), wireless network (e.g., using Wireless Application Protocol), the Internet, Wi Fi and the like.
  • the processor 402 is disposed in communication with a memory 405 (e.g., RAM, ROM, etc. not shown in Figure 4) via a storage interface 404.
  • the storage interface 404 connects to memory 405 including, without limitation, memory drives, removable disc drives and the like, employing connection protocols such as, Serial Advanced Technology Attachment (SATA), Integrated Drive Electronics (IDE), IEEE® 1394, Universal Serial Bus (USB), fiber channel, Small Computer Systems Interface (SCSI) and the like.
  • the memory drives further include a drum, magnetic disc drive, magneto-optical drive, optical drive, Redundant Array of Independent Discs (RAID), solid state memory devices, solid state drives, and the like.
  • the memory 405 stores a collection of program or database components, including, without limitation, user interface 406, an operating system 407 and the like.
  • computer system 400 stores user/application data, such as the data, variables, records, etc., as described in this disclosure.
  • databases may be implemented as fault tolerant, relational, scalable, secure databases such as Oracle or Sybase.
  • the operating system 407 facilitates resource management and operation of the computer system 400.
  • Examples of operating systems include, without limitation, APPLE® MACINTOSH® OS X®, UNIX®, UNIX-like system distributions (E.G., BERKELEY SOFTWARE DISTRIBUTION® (BSD), FREEBSD®, NETBSD®, OPENBSD and the like), LINUX® DISTRIBUTIONS (E.G., RED HAT®, UBUNTU®, KUBUNTU® and the like), IBM® OS/2®, MICROSOFT® WINDOWS® (XP®, VISTA®/7/8, 10 and the like), APPLE® IOS®, GOOGLE™ ANDROID™, BLACKBERRY® OS, or the like.
  • the computer system 400 implements web browser 408 stored program components.
  • Web browser 408 is a hypertext viewing application, such as MICROSOFT® INTERNET EXPLORER®, GOOGLE™ CHROME™, MOZILLA® FIREFOX®, APPLE® SAFARI® and the like. Secure web browsing is provided using Secure Hypertext Transport Protocol (HTTPS), Secure Sockets Layer (SSL), Transport Layer Security (TLS) and the like. The web browser 408 utilizes facilities such as AJAX, DHTML, ADOBE® FLASH®, JAVASCRIPT®, JAVA®, Application Programming Interfaces (APIs) and the like.
  • the computer system 400 implements a mail server (not shown in Figure 4) stored program component.
  • the mail server is an Internet mail server such as Microsoft Exchange, or the like.
  • the mail server utilizes facilities such as ASP, ACTIVEX®, ANSI® C++/C#, MICROSOFT® .NET, CGI SCRIPTS, JAVA®, JAVASCRIPT®, PERL®, PHP, PYTHON®, WEBOBJECTS® and the like.
  • the mail server utilizes communication protocols such as Internet Message Access Protocol (IMAP), Messaging Application Programming Interface (MAPI), MICROSOFT® exchange, Post Office Protocol (POP), Simple Mail Transfer Protocol (SMTP), or the like.
  • the computer system 400 implements a mail client (not shown in Figure 4) stored program component.
  • the mail client is a mail viewing application, such as APPLE® MAIL, MICROSOFT® ENTOURAGE®, MICROSOFT® OUTLOOK®, MOZILLA® THUNDERBIRD® and the like.
  • a computer readable storage medium refers to any type of physical memory on which information or data readable by a processor may be stored.
  • a computer readable storage medium stores instructions for execution by one or more processors, including instructions for causing the processor(s) to perform steps or stages consistent with the embodiments described herein.
  • the term “computer readable medium” should be understood to include tangible items and exclude carrier waves and transient signals, i.e., be non-transitory. Examples include Random Access Memory (RAM), Read Only Memory (ROM), volatile memory, non-volatile memory, hard drives, CD-ROMs, DVDs, flash drives, disks, and any other known physical storage media.
  • the described operations may be implemented as a method, an individual unit, system or article of manufacture using standard programming and/or engineering techniques to produce software, firmware, hardware, or any combination thereof.
  • the described operations may be implemented as code maintained in a “non transitory computer readable medium”, where a processor may read and execute the code from the computer readable medium.
  • the processor is at least one of a microprocessor and a processor capable of processing and executing the queries.
  • a non transitory computer readable medium may include media such as magnetic storage medium (e.g., hard disk drives, floppy disks, tape and the like), optical storage (CD ROMs, DVDs, optical disks and the like), volatile and non volatile memory devices (e.g., EEPROMs, ROMs, PROMs, RAMs, DRAMs, SRAMs, Flash Memory, firmware, programmable logic and the like) and the like.
  • non-transitory computer readable media include all computer readable media except for a transitory, propagating signal.
  • the code implementing the described operations may further be implemented in hardware logic (e.g., an integrated circuit chip, Programmable Gate Array (PGA), Application Specific Integrated Circuit (ASIC) and the like).
  • the term “an embodiment” means “one or more (but not all) embodiments of the invention(s)” unless expressly specified otherwise.

Landscapes

  • Engineering & Computer Science (AREA)
  • Mechanical Engineering (AREA)
  • Fittings On The Vehicle Exterior For Carrying Loads, And Devices For Holding Or Mounting Articles (AREA)

Abstract

The present invention discloses a method, a computing unit (101), and a system for alerting a rider of a vehicle based on a posture of the rider for safe riding. The method comprises receiving (301) at least one of sensor data and visual data related to the rider from an imaging and sensing unit (111). Thereafter, the method comprises determining (303) an absolute lean angle of the posture of the rider based on at least one of the sensor data and the visual data. Subsequently, the method comprises determining (305) weight imbalance of the rider using the absolute lean angle and the sensor data when the absolute lean angle exceeds a predetermined threshold angle. Lastly, the method comprises alerting (307) the rider for safe riding upon determination of the weight imbalance. The vehicle is a two-wheeler or a three-wheeler.

Description

METHOD AND SYSTEM TO ALERT A RIDER ON A VEHICLE BASED ON RIDER
POSTURE FOR SAFE RIDING
TECHNICAL FIELD
The present subject matter is generally related to the field of rider safety systems, more particularly, but not exclusively, to a method, a computing unit, and a system for alerting a rider of a vehicle based on rider posture for safe riding.
BACKGROUND
With the advancement in vehicle technology in two- or three-wheelers, rider safety systems have also improved. There are rider safety systems to maintain traction control during cornering or sudden braking, and to prevent skidding using antilock braking mechanisms during hard braking. However, the existing rider safety systems do not consider rider posture during normal driving or cornering. As a consequence, a rider of a vehicle with a bad cornering speed and/or a bad rider posture may suffer imbalance and/or be thrown out of the vehicle.
The information disclosed in this background of the disclosure section is for enhancement of understanding of the general background of the invention and should not be taken as an acknowledgement or any form of suggestion that this information forms the prior art already known to a person skilled in the art.
SUMMARY
There is a need to overcome the above mentioned problems related to the existing rider safety systems.
In an embodiment, the present disclosure relates to a method for alerting a rider of a vehicle for safe riding. The method comprises receiving at least one of sensor data and visual data related to the rider from an imaging and sensing unit. Thereafter, the method comprises determining an absolute lean angle of a posture of the rider based on at least one of the sensor data and the visual data. Subsequently, the method comprises determining weight imbalance of the rider using the absolute lean angle and the sensor data when the absolute lean angle exceeds a predetermined threshold angle. Lastly, the method comprises alerting the rider for safe riding upon determination of the weight imbalance.
In another embodiment, the present disclosure relates to a computing unit for alerting a rider of a vehicle for safe riding. The computing unit comprises a processor and a memory communicatively coupled to the processor, wherein the memory stores processor executable instructions, which on execution, cause the processor to receive at least one of sensor data and visual data related to the rider from an imaging and sensing unit. Thereafter, the processor is configured to determine an absolute lean angle of a posture of the rider based on at least one of the sensor data and the visual data. In the subsequent step, the processor is configured to determine weight imbalance of the rider using the absolute lean angle and the sensor data when the absolute lean angle exceeds a predetermined threshold angle. Lastly, the processor is configured to alert the rider for safe riding upon determination of the weight imbalance.
In yet another embodiment, the present disclosure relates to a system for alerting a rider of a vehicle for safe riding. The system comprises an imaging and sensing unit, and a computing unit communicatively coupled to the imaging and sensing unit. The computing unit is configured to receive at least one of sensor data and visual data related to the rider from the imaging and sensing unit. Thereafter, the computing unit is configured to determine an absolute lean angle of a posture of the rider based on at least one of the sensor data and the visual data. In the subsequent step, the computing unit is configured to determine weight imbalance of the rider using the absolute lean angle and the sensor data when the absolute lean angle exceeds a predetermined threshold angle. Lastly, the computing unit is configured to alert the rider for safe riding upon determination of the weight imbalance.
Embodiments of the disclosure according to the above-mentioned method, computing unit and system bring about several technical advantages. In the present disclosure, the rider’s posture on a vehicle is monitored continuously and the rider is alerted in case of weight imbalance. This approach ensures safety of the rider and prevents potential accidents that might occur due to weight imbalance.
The use of sensor data and visual data provides comprehensive information for determining the absolute lean angle and, thereafter, the weight imbalance. This approach of using sensor data and visual data minimizes the number of non-contact accidents caused by amateur riding styles of riders and/or an improper centre of gravity of the vehicle.
The alert mechanism (i.e., the method) of the present disclosure makes the rider aware of a situation such as weight imbalance or incorrect rider posture so that corrective action can be taken, thereby preventing the rider from being thrown out of the vehicle due to bad cornering speeds and/or bad rider postures.
Furthermore, the use of the infrared camera in the present disclosure allows the rider of the vehicle to be alerted when signs of drowsiness or fatigue are detected. This approach further prevents accidents due to lack of awareness of the rider while driving. Also, the use of the infrared camera helps to detect movements of the rider at night.
The foregoing summary is illustrative only and is not intended to be in any way limiting. In addition to the illustrative aspects, embodiments, and features described above, further aspects, embodiments, and features will become apparent by reference to the drawings and the following detailed description.
BRIEF DESCRIPTION OF THE DRAWINGS
The accompanying drawings, which are incorporated in and constitute a part of this disclosure, illustrate exemplary embodiments and together with the description, serve to explain the disclosed principles. In the figures, the left most digit(s) of a reference number identifies the figure in which the reference number first appears. The same numbers are used throughout the figures to reference like features and components. Some embodiments of system and/or methods in accordance with embodiments of the present subject matter are now described below, by way of example only, and with reference to the accompanying figures.
Figure 1a illustrates an exemplary environment for alerting a rider of a vehicle for safe riding in accordance with some embodiments of the present disclosure.
Figures 1b and 1c illustrate at least one camera arranged on a dashboard of a vehicle in accordance with some embodiments of the present disclosure.
Figure 2 shows a detailed block diagram of a computing unit in accordance with some embodiments of the present disclosure.
Figure 3 illustrates a flowchart showing a method for alerting a rider of a vehicle for safe riding in accordance with some embodiments of the present disclosure.
Figure 4 illustrates a block diagram of an exemplary computer system for implementing embodiments consistent with the present disclosure.
It should be appreciated by those skilled in the art that any block diagrams herein represent conceptual views of illustrative systems embodying the principles of the present subject matter. Similarly, it will be appreciated that any flowcharts, flow diagrams, state transition diagrams, pseudo code, and the like represent various processes which may be substantially represented in computer readable medium and executed by a computer or processor, whether or not such computer or processor is explicitly shown.
DETAILED DESCRIPTION
In the present document, the word "exemplary" is used herein to mean "serving as an example, instance, or illustration." Any embodiment or implementation of the present subject matter described herein as "exemplary" is not necessarily to be construed as preferred or advantageous over other embodiments. While the disclosure is susceptible to various modifications and alternative forms, specific embodiment thereof has been shown by way of example in the drawings and will be described in detail below. It should be understood, however that it is not intended to limit the disclosure to the particular forms disclosed, but on the contrary, the disclosure is to cover all modifications, equivalents, and alternatives falling within the scope of the disclosure.
The terms “comprises”, “comprising”, or any other variations thereof, are intended to cover a non-exclusive inclusion, such that a setup, device or method that comprises a list of components or steps does not include only those components or steps but may include other components or steps not expressly listed or inherent to such setup or device or method. In other words, one or more elements in a system or apparatus preceded by “comprises... a” does not, without more constraints, preclude the existence of other elements or additional elements in the system or method.
In the following detailed description of the embodiments of the disclosure, reference is made to the accompanying drawings that form a part hereof, and in which are shown by way of illustration specific embodiments in which the disclosure may be practiced. These embodiments are described in sufficient detail to enable those skilled in the art to practice the disclosure, and it is to be understood that other embodiments may be utilized and that changes may be made without departing from the scope of the present disclosure. The following description is, therefore, not to be taken in a limiting sense.
Embodiments of the present disclosure provide a solution for alerting a rider of a vehicle based on rider posture for safe riding. In the present disclosure, the rider refers to the driver of the vehicle. The vehicle is a two-wheeler or a three-wheeler. The present disclosure can also be extended to any vehicle other than the two-wheeler or the three-wheeler where the rider alert solution of the present disclosure can be implemented. The present disclosure discloses a method, a computing unit, and a system for alerting a rider of a vehicle for safe riding. The method receives at least one of sensor data and visual data related to the rider from an imaging and sensing unit. Based on at least one of the sensor data and the visual data, the method determines an absolute lean angle of a posture of the rider. When the absolute lean angle exceeds a predetermined threshold angle, the method determines weight imbalance of the rider using the absolute lean angle and the sensor data. Upon determination of the weight imbalance, the method alerts the rider for safe riding. Alerting the rider is performed using at least one of an audio signal, a haptic signal and a visual signal. The present disclosure improves rider safety.
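The disclosure does not specify how the absolute lean angle is computed from the sensor data. Purely as an illustrative sketch, and not as the method defined in this disclosure, one conventional way to obtain a lateral lean (roll) angle from the accelerometer and gyroscope described below is a complementary filter; the function name estimate_lean_angle, the blending factor alpha and the sign conventions are assumptions introduced here for illustration only.

    import math

    def estimate_lean_angle(prev_angle_deg: float,
                            gyro_roll_rate_dps: float,
                            accel_y_mps2: float,
                            accel_z_mps2: float,
                            dt_s: float,
                            alpha: float = 0.98) -> float:
        """Illustrative complementary filter for a lateral (roll) lean angle.

        Blends the integrated gyroscope roll rate (accurate over short
        intervals) with the accelerometer-derived roll angle (stable over
        long intervals). This is an assumed, conventional estimator, not
        the computation prescribed by the disclosure.
        """
        # Roll angle implied by gravity as seen by the accelerometer.
        accel_angle_deg = math.degrees(math.atan2(accel_y_mps2, accel_z_mps2))
        # Roll angle propagated from the previous estimate using the gyroscope rate.
        gyro_angle_deg = prev_angle_deg + gyro_roll_rate_dps * dt_s
        # Weighted blend of the two sources.
        return alpha * gyro_angle_deg + (1.0 - alpha) * accel_angle_deg

Under these assumptions, sampling at 100 Hz (dt_s = 0.01) and taking the magnitude of the returned value would yield an absolute lean angle that can be compared against the predetermined threshold angle described later in this description.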
Figure 1a illustrates an exemplary environment for alerting a rider of a vehicle for safe riding in accordance with some embodiments of the present disclosure.
As shown in the Figure 1a, the environment 100 includes a computing unit 101, a communication network 109 and an imaging and sensing unit 111. The imaging and sensing unit 111 captures at least one of sensor data and visual data related to the rider of a vehicle. The vehicle is a two-wheeler or a three-wheeler such as, but not limited to, a bicycle, an Electrically Power Assisted Cycle (EPAC), a scooter, an all-terrain vehicle and any type of motorcycle. The present disclosure can also be extended to any vehicle other than the two-wheeler or the three-wheeler where the rider alert solution of the present disclosure can be implemented.
The imaging and sensing unit 111 comprises pressure sensor arrays coupled to a seating unit of the vehicle and handlebar grips (i.e., a left handlebar grip and a right handlebar grip), at least one camera 121 arranged on a dashboard of the vehicle, wheel speed sensors coupled to wheels of the vehicle, and an accelerometer and a gyroscope coupled to an inertial measurement unit of the vehicle. The at least one camera 121 arranged on the dashboard of the vehicle is shown in Figures 1b and 1c. The at least one camera 121 is positioned to face the rider and to cover a field of view greater than or equal to 120° to monitor all actions or activities of the rider. The at least one camera comprises at least one of a colour camera and an infrared camera. The colour camera may be any type of colour camera capable of capturing the visual data. The infrared camera is used for measuring or tracking eye movement of the rider when the rider is wearing a helmet with no visor or a helmet with a clear visor. The visual data from the infrared camera helps the computing unit 101 to alert the rider in case signs of rider drowsiness or fatigue are detected. Furthermore, the visual data from the infrared camera helps the computing unit 101 to detect movements of the rider at night. The visual data from the at least one camera 121 is a continuous stream of video. In one embodiment, the at least one camera 121 processes the continuous stream of video to provide the visual data in the form of a continuous stream of images.
The pressure sensor arrays coupled to the seating unit of the vehicle actively monitor whether the rider is seated appropriately on the seating unit or leaning away from a normal seating pose. The pressure sensor arrays coupled to the handlebar grips of the vehicle include the pressure sensor arrays coupled to the left handlebar grip and to the right handlebar grip. The pressure sensor arrays coupled to the handlebar grips actively monitor engagement of the rider with the vehicle, which includes appropriate handlebar grip and turn detection. The wheel speed sensors provide data related to the speed at which the rider of the vehicle is driving the vehicle or entering a corner. The gyroscope provides data on how the rider’s tilt affects the vehicle’s tilt. The accelerometer provides data related to acceleration of the vehicle. The sensor data comprises data from at least one of the pressure sensor arrays, the wheel speed sensors, the accelerometer and the gyroscope. The imaging and sensing unit 111 transmits at least one of the sensor data and the visual data related to the rider to the computing unit 101 using the communication network 109.
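To make the composition of the sensor data and the visual data concrete, the following minimal sketch models the outputs of the imaging and sensing unit 111 as plain data containers. The class and field names (SensorData, VisualData, seat_pressure, and so on) are assumptions introduced for illustration; the disclosure specifies which sensors contribute, not how their readings are packaged.

    from dataclasses import dataclass
    from typing import List, Optional, Tuple

    @dataclass
    class SensorData:
        """Readings from the non-camera sensors of the imaging and sensing unit 111."""
        seat_pressure: List[float]        # pressure sensor array coupled to the seating unit
        left_grip_pressure: List[float]   # pressure sensor array on the left handlebar grip
        right_grip_pressure: List[float]  # pressure sensor array on the right handlebar grip
        wheel_speeds_kph: List[float]     # one value per wheel from the wheel speed sensors
        accel_mps2: Tuple[float, float, float]  # (x, y, z) from the accelerometer of the IMU
        gyro_dps: Tuple[float, float, float]    # (roll, pitch, yaw) rates from the gyroscope

    @dataclass
    class VisualData:
        """Frames from the at least one camera 121 arranged on the dashboard."""
        colour_frame: Optional[bytes] = None    # encoded image from the colour camera
        infrared_frame: Optional[bytes] = None  # encoded image from the infrared camera
        timestamp_s: float = 0.0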
The communication network 109 may include, but is not limited to, a direct interconnection, an e-commerce network, a Peer-to-Peer (P2P) network, Local Area Network (LAN), Wide Area Network (WAN), wireless network (for example, using Wireless Application Protocol), Internet, Wi-Fi, Bluetooth, and the like.
The computing unit 101 receives at least one of the sensor data and the visual data related to the rider from the imaging and sensing unit 111. Based on the at least one of the sensor data and the visual data, the computing unit 101 alerts the rider for safe riding. The computing unit 101 may be present on a server or in a navigation device of a vehicle or as an independent unit (or a device) coupled to the dashboard of the vehicle. The server may be a local server or a cloud server. The computing unit 101 includes an I/O interface 103, a memory 105 and a processor 107. The I/O interface 103 is configured to communicate with the imaging and sensing unit 111. The I/O interface 103 may employ communication protocols/methods such as, without limitation, audio, analog, digital, monaural, Radio Corporation of America (RCA) connector, stereo, IEEE® 1394 high speed serial bus, serial bus, Universal Serial Bus (USB), infrared, Personal System/2 (PS/2) port, Bayonet Neill Concelman (BNC) connector, coaxial, component, composite, Digital Visual Interface (DVI), High Definition Multimedia Interface (HDMI®), Radio Frequency (RF) antennas, S Video, Video Graphics Array (VGA), IEEE® 802.11 b/g/n/x, Bluetooth, cellular e.g., Code Division Multiple Access (CDMA), High Speed Packet Access (HSPA+), Global System for Mobile communications (GSM®), Long Term Evolution (LTE®), Worldwide interoperability for Microwave access (WiMax®), or the like.
The memory 105 is communicatively coupled to the processor 107 of the computing unit 101. The memory 105 also stores processor instructions which cause the processor 107 to execute the instructions for alerting the rider of the vehicle for safe riding.
The processor 107 may include at least one data processor for alerting the rider of the vehicle for safe riding.
In one embodiment, the imaging and sensing unit 111 together with the computing unit 101 form a system for alerting a rider of a vehicle for safe riding.
Hereafter, the operation of the computing unit 101 for alerting the rider of the vehicle for safe riding is described.
Situation 1, i.e., the rider is taking a turn at a corner: when the rider is taking a turn, the computing unit 101 receives at least one of the sensor data and the visual data related to the rider from the imaging and sensing unit 111. The computing unit 101 determines an absolute lean angle of a posture of the rider based on at least one of the sensor data and the visual data. When the absolute lean angle exceeds the predetermined threshold angle, the computing unit 101 determines weight imbalance of the rider using the absolute lean angle and the sensor data. Thereafter, based on the determination of the weight imbalance, the computing unit 101 alerts the rider using at least one of an audio signal, a haptic signal and a visual signal for safe riding.
Situation 2, i.e., the rider’s eyes show signs of drowsiness or fatigue: when the rider is driving the vehicle, the computing unit 101 receives at least one of the sensor data and the visual data related to the rider from the imaging and sensing unit 111. The computing unit 101 determines an absolute lean angle of a posture of the rider based on at least one of the sensor data and the visual data. When the absolute lean angle does not exceed the predetermined threshold angle, the computing unit 101 continues processing the upcoming at least one of the sensor data and the visual data to monitor rider posture. Additionally, the computing unit 101 utilizes the visual data from the infrared camera, received in the at least one of sensor data and visual data related to the rider, to determine whether the rider’s eyes are focused on the road or are showing signs of drowsiness or fatigue. When the rider’s eyes are focused on the road or show no signs of drowsiness or fatigue, the computing unit 101 continues processing the upcoming at least one of the sensor data and the visual data to monitor rider posture. When the rider’s eyes are not focused on the road or are showing signs of drowsiness or fatigue, the computing unit 101 alerts the rider using at least one of an audio signal, a haptic signal and a visual signal for safe riding.
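Read together, the two situations amount to one monitoring loop: check the absolute lean angle against the predetermined threshold, and otherwise fall back to the eye-focus check. The sketch below is a hedged illustration of that control flow only. The names determine_absolute_lean_angle, determine_weight_imbalance and rider_is_attentive, as well as the read() and alert() interfaces, are assumptions standing in for the processing described in this disclosure; the first corresponds to the lean-angle estimation discussed above (for which the earlier complementary-filter sketch is one assumed option), and the latter two are sketched after the corresponding passages further below.

    PREDETERMINED_THRESHOLD_ANGLE_DEG = 35.0  # example value; the description mentions 35 degrees in one non-limiting embodiment

    def monitor_rider(imaging_and_sensing_unit, alerting_module):
        """One iteration of the rider-monitoring loop covering Situations 1 and 2."""
        sensor_data, visual_data = imaging_and_sensing_unit.read()

        # Situation 1: cornering with a lean angle beyond the predetermined threshold.
        lean_angle_deg = determine_absolute_lean_angle(sensor_data, visual_data)
        if abs(lean_angle_deg) > PREDETERMINED_THRESHOLD_ANGLE_DEG:
            if determine_weight_imbalance(lean_angle_deg, sensor_data, visual_data):
                alerting_module.alert(audio=True, haptic=True, visual=True)
            return

        # Situation 2: lean angle within limits; check eye focus / drowsiness
        # from the infrared frames instead.
        if not rider_is_attentive(visual_data):
            alerting_module.alert(audio=True, haptic=True, visual=True)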
Figure 2 shows a detailed block diagram of a computing unit in accordance with some embodiments of the present disclosure.
The computing unit 101, in addition to the I/O interface 103 and the processor 107 described above, includes data 201 and one or more modules 211, which are described herein in detail. In the embodiment, the data 201 is stored within the memory 105. The data 201 includes, for example, sensor data and visual data 203 and other data 205.
The sensor data and visual data 203 includes data from at least one of pressure sensor arrays, wheel speed sensors, an accelerometer and a gyroscope. The sensor data and visual data 203 is received from the imaging and sensing unit 111.
The other data 205 may store data, including temporary data and temporary files, generated by the one or more modules 211 for performing the various functions of the computing unit 101. In the embodiment, the data 201 in the memory 105 is processed by the one or more modules 211 present within the memory 105 of the computing unit 101. In the embodiment, the one or more modules 211 may be implemented as dedicated hardware units. As used herein, the term module refers to an Application Specific Integrated Circuit (ASIC), an electronic circuit, a Field Programmable Gate Array (FPGA), a Programmable System on Chip (PSoC), a combinational logic circuit, and/or other suitable components that provide the described functionality. In some implementations, the one or more modules 211 are communicatively coupled to the processor 107 for performing one or more functions of the computing unit 101. The one or more modules 211, when configured with the functionality defined in the present disclosure, result in novel hardware.
In one implementation, the one or more modules 211 include, but are not limited to, a receiving module 213, a determining module 215 and an alerting module 217. The one or more modules 211 also include other modules 219 to perform various miscellaneous functionalities of the computing unit 101.
The receiving module 213 receives at least one of the sensor data and the visual data related to the rider from the imaging and sensing unit 111 through the I/O interface 103. The receiving module 213 further sends the at least one of the sensor data and the visual data to the determining module 215.
The determining module 215 determines an absolute lean angle of a posture of the rider based on the at least one of the sensor data and the visual data. In detail, the determining module 215 detects rider tilt movement using the visual data received in the at least one of the sensor data and the visual data. In one embodiment, the determining module 215, in addition to the visual data, utilizes the sensor data from the wheel speed sensors, the accelerometer, and the gyroscope to decide on the rider tilt movement. The rider tilt movement is correlated and compared by the determining module 215 with at least one of the sensor data from the pressure sensor arrays coupled to a seating unit of the vehicle and the handlebar grips, received in the at least one of the sensor data and the visual data. Based on the correlation and comparison, the determining module 215 determines the absolute lean angle. Subsequently, the determining module 215 determines a weight imbalance of the rider using the absolute lean angle and the sensor data when the absolute lean angle exceeds a predetermined threshold angle. In detail, the determining module 215 compares the absolute lean angle with the predetermined threshold angle. The predetermined threshold angle may be an angle defined as per the industry standard for two-wheeler or three-wheeler vehicles. In one non-limiting embodiment, the predetermined threshold angle is 35°. When the absolute lean angle does not exceed the predetermined threshold angle, the determining module 215 continues processing the upcoming at least one of the sensor data and the visual data to monitor the rider posture. When the absolute lean angle exceeds the predetermined threshold angle, the determining module 215 determines the weight imbalance of the rider. For determining the weight imbalance, the determining module 215 determines, using the visual data, whether the rider is leaning to the left side, to the right side, or not leaning to either side. If there is no weight imbalance, i.e., the rider is not leaning to either side, the determining module 215 continues processing the upcoming at least one of the sensor data and the visual data to monitor the rider posture. When the determining module 215 determines that the rider is leaning to the left side, the determining module 215 correlates the sensor data from the pressure sensor arrays coupled to the seating unit of the vehicle and a left handlebar grip to determine the weight imbalance of the rider. Analogously, when the determining module 215 determines that the rider is leaning to the right side, the determining module 215 correlates the sensor data from the pressure sensor arrays coupled to the seating unit of the vehicle and a right handlebar grip to determine the weight imbalance of the rider. The determining module 215 sends the determined weight imbalance of the rider to the alerting module 217.
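The following sketch illustrates one possible reading of the determining module 215, under stated assumptions: the camera-derived tilt, IMU roll and pressure readings are already available as plain numbers, and the fusion is a simple weighted blend. The function names, blend weight and ratio limit are illustrative and are not the disclosed implementation.

```python
# Illustrative sketch of lean-angle estimation and left/right weight-imbalance correlation.
# Constants and helper names are assumptions introduced for this example.
def absolute_lean_angle(visual_tilt_deg: float,
                        gyro_roll_deg: float,
                        camera_weight: float = 0.6) -> float:
    """Blend the camera-derived rider tilt with the IMU roll estimate."""
    return camera_weight * visual_tilt_deg + (1.0 - camera_weight) * gyro_roll_deg


def weight_imbalance(lean_side: str,
                     seat_pressure: dict,
                     grip_pressure: dict,
                     ratio_limit: float = 1.5) -> bool:
    """Correlate seat and handlebar-grip pressure on the leaning side.

    lean_side is 'left', 'right' or 'none' (derived from the visual data).
    seat_pressure / grip_pressure map 'left'/'right' to pressure readings.
    """
    if lean_side == "none":
        return False
    other = "right" if lean_side == "left" else "left"
    seat_ratio = seat_pressure[lean_side] / max(seat_pressure[other], 1e-6)
    grip_ratio = grip_pressure[lean_side] / max(grip_pressure[other], 1e-6)
    # Report imbalance only when the seat and the same-side grip both confirm it.
    return seat_ratio > ratio_limit and grip_ratio > ratio_limit


angle = absolute_lean_angle(visual_tilt_deg=38.0, gyro_roll_deg=33.0)
if angle > 35.0:  # assumed predetermined threshold angle
    print(weight_imbalance("left", {"left": 9.0, "right": 4.0}, {"left": 6.0, "right": 3.0}))
```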
In another embodiment, the determining module 215 utilizes the visual data from the infrared camera, received in the at least one of the sensor data and the visual data related to the rider, to determine whether the rider's eye focus is on the road or the rider's eyes show signs of drowsiness or fatigue. When the rider's eye focus is on the road and the rider's eyes show no signs of drowsiness or fatigue, the determining module 215 continues processing the upcoming at least one of the sensor data and the visual data to monitor the rider posture. When the rider's eye focus is not on the road or the rider's eyes show signs of drowsiness or fatigue, the determining module 215 sends the determined rider's eye focus imbalance to the alerting module 217.
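The disclosure does not specify how drowsiness or eye focus is derived from the infrared frames; the sketch below stands in with a common eye-closure-ratio heuristic (a PERCLOS-style measure) purely for illustration. The window length, openness threshold and closed-fraction limit are assumed values.

```python
# Hypothetical drowsiness / eye-focus check fed by per-frame measurements from the IR camera.
from collections import deque


class DrowsinessMonitor:
    def __init__(self, window: int = 90, closed_fraction_limit: float = 0.4):
        self.recent = deque(maxlen=window)            # ~3 s of frames at an assumed 30 fps
        self.closed_fraction_limit = closed_fraction_limit

    def update(self, eye_openness: float, gaze_on_road: bool) -> bool:
        """eye_openness in [0, 1] and gaze_on_road would be extracted from the IR frames."""
        self.recent.append(eye_openness < 0.2)        # treat as 'closed' below 20% openness
        closed_fraction = sum(self.recent) / len(self.recent)
        drowsy = closed_fraction > self.closed_fraction_limit
        # True -> hand the result over to the alerting module.
        return drowsy or not gaze_on_road


monitor = DrowsinessMonitor()
print(monitor.update(eye_openness=0.1, gaze_on_road=False))  # -> True
```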
The alerting module 217 alerts the rider for safe riding upon determination of the weight imbalance. The alerting module 217 alerts the rider using at least one of an audio signal, a haptic signal and a visual signal. For instance, the alerting module 217 alerts the rider if the rider is entering a corner with too much speed and/or a bad posture that can affect the centre of gravity of the vehicle, which can in turn lead to skidding.
In one embodiment, the alerting module 217 alerts the rider for safe riding upon determination of the rider’s eye focus imbalance. The alerting module 217 alerts the rider using at least one of an audio signal, a haptic signal and a visual signal.
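As a hedged illustration of the alerting module 217, the sketch below fans a single alert out to audio, haptic and visual channels. The channel interface is an assumption; an actual unit might drive a buzzer, a handlebar vibration motor and a dashboard warning lamp.

```python
# Illustrative alert dispatcher: each channel decides how to render the alert signal.
from typing import Callable, Iterable


def make_alerting_module(channels: Iterable[Callable[[str], None]]) -> Callable[[str], None]:
    channel_list = list(channels)

    def alert(reason: str) -> None:
        for channel in channel_list:
            channel(reason)  # audio, haptic and/or visual output

    return alert


alert = make_alerting_module([
    lambda r: print(f"[audio]  beep: {r}"),
    lambda r: print(f"[haptic] vibrate grips: {r}"),
    lambda r: print(f"[visual] dashboard warning: {r}"),
])
alert("weight imbalance while cornering")
```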
Figure 3 illustrates a flowchart showing a method for alerting a rider of a vehicle for safe riding in accordance with some embodiments of present disclosure.
As illustrated in Figure 3, the method 300 includes one or more blocks for alerting a rider of a vehicle for safe riding. The method 300 may be described in the general context of computer executable instructions. Generally, computer executable instructions can include routines, programs, objects, components, data structures, procedures, modules, and functions, which perform particular functions or implement particular abstract data types.
The order in which the method 300 is described is not intended to be construed as a limitation, and any number of the described method blocks can be combined in any order to implement the method. Additionally, individual blocks may be deleted from the methods without departing from the scope of the subject matter described herein. Furthermore, the method can be implemented in any suitable hardware, software, firmware, or combination thereof.
At block 301, the receiving module 213 of the computing unit 101 receives at least one of sensor data and visual data related to a rider from the imaging and sensing unit (111). The imaging and sensing unit (111) comprises pressure sensor arrays coupled to a seating unit of the vehicle and handlebar grips (i.e., the left handlebar grip and the right handlebar grip), at least one camera (121) arranged on a dashboard of the vehicle, wheel speed sensors coupled to wheels of the vehicle, and an accelerometer and a gyroscope coupled to an inertial measurement unit of the vehicle. The at least one camera (121) comprises at least one of a colour camera and an infrared camera. The sensor data comprises data from at least one of the pressure sensor arrays, the wheel speed sensors, the accelerometer and the gyroscope. The vehicle is a two-wheeler or a three-wheeler.
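For illustration, the data received at block 301 could be grouped as below. The field names and units are assumptions introduced for the example; the disclosure only specifies the sensor and camera sources listed above.

```python
# Hypothetical grouping of the payload delivered by the imaging and sensing unit (111).
from dataclasses import dataclass
from typing import List, Optional


@dataclass
class SensorData:
    seat_pressure: List[float]             # pressure sensor array on the seating unit
    left_grip_pressure: float              # pressure sensor on the left handlebar grip
    right_grip_pressure: float             # pressure sensor on the right handlebar grip
    wheel_speeds_kmh: List[float]          # one value per wheel (two- or three-wheeler)
    accel_mps2: tuple = (0.0, 0.0, 0.0)    # accelerometer, from the inertial measurement unit
    gyro_dps: tuple = (0.0, 0.0, 0.0)      # gyroscope, from the inertial measurement unit


@dataclass
class VisualData:
    colour_frame: Optional[bytes] = None   # dashboard colour camera
    infrared_frame: Optional[bytes] = None # dashboard infrared camera


@dataclass
class RiderObservation:
    sensor: Optional[SensorData] = None    # "at least one of" sensor data ...
    visual: Optional[VisualData] = None    # ... and visual data
```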
At block 303, the determining module 215 of the computing unit 101 determines an absolute lean angle of a posture of the rider based on the at least one of the sensor data and the visual data.
At block 305, the determining module 215 of the computing unit 101 determines a weight imbalance of the rider using the absolute lean angle and the sensor data when the absolute lean angle exceeds a predetermined threshold angle.
At block 307, the alerting module 217 of the computing unit 101 alerts the rider for safe riding upon determination of the weight imbalance. Alerting the rider is performed using at least one of an audio signal, a haptic signal and a visual signal.
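Blocks 301 to 307 can be read as one pass of method 300, sketched below with placeholder callables standing in for the determining and alerting modules; the 35° default threshold is an assumed value.

```python
# One pass of method 300 under the assumptions stated above.
def method_300(observation,
               estimate_lean_angle,
               detect_imbalance,
               alert_rider,
               threshold_deg: float = 35.0) -> None:
    # Block 301: receive sensor and/or visual data related to the rider.
    sensor, visual = observation.sensor, observation.visual
    # Block 303: determine the absolute lean angle of the rider's posture.
    lean_angle = estimate_lean_angle(sensor, visual)
    # Block 305: determine weight imbalance only when the threshold is exceeded.
    if lean_angle > threshold_deg and detect_imbalance(lean_angle, sensor):
        # Block 307: alert the rider (audio, haptic and/or visual signal).
        alert_rider("weight imbalance detected")


# Usage with trivial stand-ins:
method_300(
    observation=type("Obs", (), {"sensor": None, "visual": None})(),
    estimate_lean_angle=lambda s, v: 40.0,
    detect_imbalance=lambda angle, s: True,
    alert_rider=print,
)
```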
Some of the technical advantages of the present disclosure are listed below.
In the present disclosure, the rider's posture on the vehicle is monitored continuously, and the rider is alerted in case of a weight imbalance. This approach ensures the safety of the rider and helps prevent accidents that might occur due to weight imbalance.
The use of sensor data and visual data provides comprehensive data to determine the absolute lean angle and, thereafter, the weight imbalance. This approach of using sensor data and visual data reduces the number of non-contact accidents caused by amateur riding styles and/or an improper centre of gravity of the vehicle. The alert mechanism (i.e., the method) of the present disclosure makes the rider aware of a situation such as weight imbalance or incorrect rider posture so that corrective action can be taken, thereby preventing the rider from being thrown off the vehicle due to bad cornering speeds and/or bad rider postures.
Furthermore, the use of the infrared camera in the present disclosure allows the rider of the vehicle to be alerted when signs of drowsiness or fatigue are detected. This approach further helps prevent accidents caused by a lack of rider awareness while driving. Also, the use of the infrared camera helps detect movements of the rider at night.
Figure 4 illustrates a block diagram of an exemplary computer system for implementing embodiments consistent with the present disclosure.
In an embodiment, the computer system 400 is used to implement the computing unit 101. The computer system 400 includes a central processing unit (“CPU” or “processor”) 402. The processor 402 includes at least one data processor for alerting a rider of a vehicle for safe riding. The processor 402 includes specialized processing units such as integrated system (bus) controllers, memory management control units, floating point units, graphics processing units, digital signal processing units, and the like.
The processor 402 is disposed in communication with one or more input/output (I/O) devices (not shown in Figure 4) via the I/O interface 401. The I/O interface 401 employs communication protocols/methods such as, without limitation, audio, analog, digital, monaural, Radio Corporation of America (RCA) connector, stereo, IEEE® 1394 high-speed serial bus, serial bus, Universal Serial Bus (USB), infrared, Personal System/2 (PS/2) port, Bayonet Neill-Concelman (BNC) connector, coaxial, component, composite, Digital Visual Interface (DVI), High-Definition Multimedia Interface (HDMI®), Radio Frequency (RF) antennas, S-Video, Video Graphics Array (VGA), IEEE® 802.11b/g/n/x, Bluetooth, cellular (e.g., Code Division Multiple Access (CDMA), High-Speed Packet Access (HSPA+), Global System for Mobile communications (GSM®), Long Term Evolution (LTE®), Worldwide Interoperability for Microwave Access (WiMax®)), or the like.
Using the I/O interface 401 , the computer system 400 communicates with one or more I/O devices such as input devices 412 and output devices 413. For example, the input devices 412 may be an antenna, keyboard, mouse, joystick, (infrared) remote control, camera, card reader, fax machine, dongle, biometric reader, microphone, touch screen, touchpad, trackball, stylus, scanner, storage device, transceiver, video device/source, etc. The output devices 413 may be a printer, fax machine, video display (e.g., Cathode Ray Tube (CRT), Liquid Crystal Display (LCD), Light Emitting Diode (LED), plasma, Plasma Display Panel (PDP), Organic Light Emitting Diode display (OLED) or the like), audio speaker and the like.
In some embodiments, the computer system 400 comprises the computing unit 101. The processor 402 is disposed in communication with the communication network 109 via a network interface 403. The network interface 403 communicates with the communication network 109. The network interface 403 employs connection protocols including, without limitation, direct connect, Ethernet (e.g., twisted-pair 10/100/1000 Base-T), Transmission Control Protocol/Internet Protocol (TCP/IP), token ring, IEEE® 802.11a/b/g/n/x and the like. The communication network 109 includes, without limitation, a direct interconnection, Local Area Network (LAN), Wide Area Network (WAN), wireless network (e.g., using Wireless Application Protocol), the Internet and the like. Using the network interface 403 and the communication network 109, the computer system 400 communicates with the imaging and sensing unit 111.
The communication network 109 includes, but is not limited to, a direct interconnection, a Peer-to-Peer (P2P) network, Local Area Network (LAN), Wide Area Network (WAN), wireless network (e.g., using Wireless Application Protocol), the Internet, Wi-Fi and the like. In some embodiments, the processor 402 is disposed in communication with a memory 405 (e.g., RAM, ROM, etc., not shown in Figure 4) via a storage interface 404. The storage interface 404 connects to the memory 405 including, without limitation, memory drives, removable disc drives and the like, employing connection protocols such as Serial Advanced Technology Attachment (SATA), Integrated Drive Electronics (IDE), IEEE® 1394, Universal Serial Bus (USB), Fibre Channel, Small Computer Systems Interface (SCSI) and the like. The memory drives further include a drum, magnetic disc drive, magneto-optical drive, optical drive, Redundant Array of Independent Discs (RAID), solid-state memory devices, solid-state drives, and the like.
The memory 405 stores a collection of program or database components, including, without limitation, a user interface 406, an operating system 407 and the like. In some embodiments, the computer system 400 stores user/application data, such as the data, variables, records, etc., as described in this disclosure. Such databases may be implemented as fault-tolerant, relational, scalable, secure databases such as Oracle or Sybase.
The operating system 407 facilitates resource management and operation of the computer system 400. Examples of operating systems include, without limitation, APPLE® MACINTOSH® OS X®, UNIX®, UNIX-like system distributions (e.g., BERKELEY SOFTWARE DISTRIBUTION® (BSD), FREEBSD®, NETBSD®, OPENBSD and the like), LINUX® distributions (e.g., RED HAT®, UBUNTU®, KUBUNTU® and the like), IBM® OS/2®, MICROSOFT® WINDOWS® (XP®, VISTA®, 7, 8, 10 and the like), APPLE® IOS®, GOOGLE™ ANDROID™, BLACKBERRY® OS, or the like.
In some embodiments, the computer system 400 implements a web browser 408 stored program component. The web browser 408 is a hypertext viewing application, such as MICROSOFT® INTERNET EXPLORER®, GOOGLE™ CHROME™, MOZILLA® FIREFOX®, APPLE® SAFARI® and the like. Secure web browsing is provided using Secure Hypertext Transport Protocol (HTTPS), Secure Sockets Layer (SSL), Transport Layer Security (TLS) and the like. The web browser 408 utilizes facilities such as AJAX, DHTML, ADOBE® FLASH®, JAVASCRIPT®, JAVA®, Application Programming Interfaces (APIs) and the like. The computer system 400 implements a mail server (not shown in Figure 4) stored program component. The mail server is an Internet mail server such as Microsoft Exchange, or the like. The mail server utilizes facilities such as ASP, ACTIVEX®, ANSI® C++/C#, MICROSOFT® .NET, CGI scripts, JAVA®, JAVASCRIPT®, PERL®, PHP, PYTHON®, WEBOBJECTS® and the like. The mail server utilizes communication protocols such as Internet Message Access Protocol (IMAP), Messaging Application Programming Interface (MAPI), MICROSOFT® Exchange, Post Office Protocol (POP), Simple Mail Transfer Protocol (SMTP), or the like. The computer system 400 implements a mail client (not shown in Figure 4) stored program component. The mail client is a mail viewing application, such as APPLE® MAIL, MICROSOFT® ENTOURAGE®, MICROSOFT® OUTLOOK®, MOZILLA® THUNDERBIRD® and the like.
Furthermore, one or more computer-readable storage media may be utilized in implementing embodiments consistent with the present disclosure. A computer-readable storage medium refers to any type of physical memory on which information or data readable by a processor may be stored. Thus, a computer-readable storage medium stores instructions for execution by one or more processors, including instructions for causing the processor(s) to perform steps or stages consistent with the embodiments described herein. The term “computer-readable medium” should be understood to include tangible items and exclude carrier waves and transient signals, i.e., be non-transitory. Examples include Random Access Memory (RAM), Read-Only Memory (ROM), volatile memory, non-volatile memory, hard drives, CD-ROMs, DVDs, flash drives, disks, and any other known physical storage media.
The described operations may be implemented as a method, an individual unit, a system or an article of manufacture using standard programming and/or engineering techniques to produce software, firmware, hardware, or any combination thereof. The described operations may be implemented as code maintained in a “non-transitory computer-readable medium”, where a processor may read and execute the code from the computer-readable medium. The processor is at least one of a microprocessor and a processor capable of processing and executing the queries. A non-transitory computer-readable medium may include media such as magnetic storage media (e.g., hard disk drives, floppy disks, tape and the like), optical storage (CD-ROMs, DVDs, optical disks and the like), volatile and non-volatile memory devices (e.g., EEPROMs, ROMs, PROMs, RAMs, DRAMs, SRAMs, flash memory, firmware, programmable logic and the like) and the like. Further, non-transitory computer-readable media include all computer-readable media except for a transitory, propagating signal. The code implementing the described operations may further be implemented in hardware logic (e.g., an integrated circuit chip, a Programmable Gate Array (PGA), an Application Specific Integrated Circuit (ASIC) and the like).
The terms “an embodiment”, “embodiment”, “embodiments”, “the embodiment”, “the embodiments”, “one or more embodiments”, “some embodiments”, and “one embodiment” mean “one or more (but not all) embodiments of the invention(s)” unless expressly specified otherwise.
The terms “including”, “comprising”, “having” and variations thereof mean “including but not limited to”, unless expressly specified otherwise.
The enumerated listing of items does not imply that any or all of the items are mutually exclusive, unless expressly specified otherwise.
The terms “a”, “an” and “the” mean “one or more”, unless expressly specified otherwise.
A description of an embodiment with several components in communication with each other does not imply that all such components are required. On the contrary, a variety of optional components are described to illustrate the wide variety of possible embodiments of the invention.
When a single device or article is described herein, it will be readily apparent that more than one device/article (whether or not they cooperate) may be used in place of a single device/article. Similarly, where more than one device or article is described herein (whether or not they cooperate), it will be readily apparent that a single device/article may be used in place of the more than one device or article, or a different number of devices/articles may be used instead of the shown number of devices or programs. The functionality and/or the features of a device may be alternatively embodied by one or more other devices which are not explicitly described as having such functionality/features. Thus, other embodiments of the invention need not include the device itself.
The illustrated operations of Figure 3 show certain events occurring in a certain order. In alternative embodiments, certain operations may be performed in a different order, modified or removed. Moreover, steps may be added to the above described logic and still conform to the described embodiments. Further, operations described herein may occur sequentially or certain operations may be processed in parallel. Yet further, operations may be performed by a single processing unit or by distributed processing units.
Finally, the language used in the specification has been principally selected for readability and instructional purposes, and it may not have been selected to delineate or circumscribe the inventive subject matter. It is therefore intended that the scope of the invention be limited not by this detailed description, but rather by any claims that issue on an application based hereon. Accordingly, the disclosure of the embodiments of the invention is intended to be illustrative, but not limiting, of the scope of the invention, which is set forth in the following claims.
While various aspects and embodiments have been disclosed herein, other aspects and embodiments will be apparent to those skilled in the art. The various aspects and embodiments disclosed herein are for purposes of illustration and are not intended to be limiting, with the scope being indicated by the following claims.

REFERRAL NUMERALS

[Table of referral numerals not reproduced in this text.]

Claims

1. A method for alerting a rider of a vehicle for safe riding, the method comprising:
receiving (301) at least one of sensor data and visual data related to the rider from an imaging and sensing unit (111);
determining (303) an absolute lean angle of a posture of the rider based on the at least one of the sensor data and the visual data;
determining (305) a weight imbalance of the rider using the absolute lean angle and the sensor data when the absolute lean angle exceeds a predetermined threshold angle; and
alerting (307) the rider for safe riding upon determination of the weight imbalance.

2. The method of claim 1, wherein the imaging and sensing unit (111) comprises pressure sensor arrays coupled to a seating unit of the vehicle and handlebar grips, at least one camera (121) arranged on a dashboard of the vehicle, wheel speed sensors coupled to wheels of the vehicle and an accelerometer and a gyroscope coupled to an inertial measurement unit of the vehicle.

3. The method of claim 2, wherein the at least one camera (121) comprises at least one of a colour camera and an infrared camera.

4. The method of claim 1, wherein the sensor data comprises data from at least one of pressure sensor arrays, wheel speed sensors, an accelerometer and a gyroscope.

5. The method of claim 1, wherein alerting the rider is performed using at least one of an audio signal, a haptic signal and a visual signal.

6. The method of claim 1, wherein the vehicle is a two-wheeler or a three-wheeler.

7. A computing unit (101) for alerting a rider of a vehicle for safe riding, the computing unit (101) comprising:
a processor (107); and
a memory (105) communicatively coupled to the processor (107), wherein the memory (105) stores processor-executable instructions, which on execution, cause the processor (107) to:
receive at least one of sensor data and visual data related to the rider from an imaging and sensing unit (111);
determine an absolute lean angle of a posture of the rider based on the at least one of the sensor data and the visual data;
determine a weight imbalance of the rider using the absolute lean angle and the sensor data when the absolute lean angle exceeds a predetermined threshold angle; and
alert the rider for safe riding upon determination of the weight imbalance.

8. The computing unit (101) of claim 7, wherein the imaging and sensing unit (111) comprises pressure sensor arrays coupled to a seating unit of the vehicle and handlebar grips, at least one camera (121) arranged on a dashboard of the vehicle, wheel speed sensors coupled to wheels of the vehicle and an accelerometer and a gyroscope coupled to an inertial measurement unit of the vehicle.

9. The computing unit (101) of claim 8, wherein the at least one camera (121) comprises at least one of a colour camera and an infrared camera.

10. The computing unit (101) of claim 7, wherein the sensor data comprises data from at least one of pressure sensor arrays, wheel speed sensors, an accelerometer and a gyroscope.

11. The computing unit (101) of claim 7, wherein alerting the rider is performed using at least one of an audio signal, a haptic signal and a visual signal.

12. The computing unit (101) of claim 7, wherein the vehicle is a two-wheeler or a three-wheeler.

13. A system for alerting a rider of a vehicle for safe riding, the system comprising:
an imaging and sensing unit (111); and
a computing unit (101) communicatively coupled to the imaging and sensing unit (111), wherein the computing unit (101) is configured to:
receive at least one of sensor data and visual data related to the rider from the imaging and sensing unit (111);
determine an absolute lean angle of a posture of the rider based on the at least one of the sensor data and the visual data;
determine a weight imbalance of the rider using the absolute lean angle and the sensor data when the absolute lean angle exceeds a predetermined threshold angle; and
alert the rider for safe riding upon determination of the weight imbalance.

14. The system of claim 13, wherein the imaging and sensing unit (111) comprises pressure sensor arrays coupled to a seating unit of the vehicle and handlebar grips, at least one camera (121) arranged on a dashboard of the vehicle, wheel speed sensors coupled to wheels of the vehicle and an accelerometer and a gyroscope coupled to an inertial measurement unit of the vehicle.

15. The system of claim 13, wherein the sensor data comprises data from at least one of pressure sensor arrays, wheel speed sensors, an accelerometer and a gyroscope.
PCT/EP2023/072529 2022-08-17 2023-08-16 Method and system to alert a rider on a vehicle based on rider posture for safe riding WO2024038076A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
GB2211986.1 2022-08-17
GBGB2211986.1A GB202211986D0 (en) 2022-08-17 2022-08-17 Method and system to alert a rider on a vehicle based on rider posture for safe riding

Publications (1)

Publication Number Publication Date
WO2024038076A1 true WO2024038076A1 (en) 2024-02-22

Family

ID=84546366

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2023/072529 WO2024038076A1 (en) 2022-08-17 2023-08-16 Method and system to alert a rider on a vehicle based on rider posture for safe riding

Country Status (2)

Country Link
GB (1) GB202211986D0 (en)
WO (1) WO2024038076A1 (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2026287A2 (en) * 2007-06-18 2009-02-18 Kawasaki Jukogyo Kabushiki Kaisha Event data recorder, motorcycle, and information recording method
EP3640917A1 (en) * 2017-06-12 2020-04-22 Robert Bosch GmbH Processing unit and processing method for intervehicular distance warning system, intervehicular distance warning system, and motorcycle

Also Published As

Publication number Publication date
GB202211986D0 (en) 2022-09-28


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23765442

Country of ref document: EP

Kind code of ref document: A1