US20220406072A1 - Obstacle detection and notification for motorcycles - Google Patents

Obstacle detection and notification for motorcycles

Info

Publication number
US20220406072A1
US20220406072A1 (application US17/822,571)
Authority
US
United States
Prior art keywords
motorcycle
threat
looking camera
rider
tactile
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US17/822,571
Inventor
Christopher L. Oesterling
David H. Clifford
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
GM Global Technology Operations LLC
Original Assignee
GM Global Technology Operations LLC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by GM Global Technology Operations LLC filed Critical GM Global Technology Operations LLC
Priority to US17/822,571
Assigned to GM Global Technology Operations LLC. Assignment of assignors interest (see document for details). Assignors: CLIFFORD, DAVID H.; OESTERLING, CHRISTOPHER L.
Publication of US20220406072A1
Status: Abandoned

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B62LAND VEHICLES FOR TRAVELLING OTHERWISE THAN ON RAILS
    • B62JCYCLE SADDLES OR SEATS; AUXILIARY DEVICES OR ACCESSORIES SPECIALLY ADAPTED TO CYCLES AND NOT OTHERWISE PROVIDED FOR, e.g. ARTICLE CARRIERS OR CYCLE PROTECTORS
    • B62J50/00Arrangements specially adapted for use on cycles not provided for in main groups B62J1/00 - B62J45/00
    • B62J50/20Information-providing devices
    • B62J50/21Information-providing devices intended to provide information to rider or passenger
    • B62J50/22Information-providing devices intended to provide information to rider or passenger electronic, e.g. displays
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R1/00Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • B60R1/20Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • B60R1/22Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R1/00Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R1/00Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • B60R1/20Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • B60R1/22Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle
    • B60R1/23Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle with a predetermined field of view
    • B60R1/24Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle with a predetermined field of view in front of the vehicle
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B62LAND VEHICLES FOR TRAVELLING OTHERWISE THAN ON RAILS
    • B62JCYCLE SADDLES OR SEATS; AUXILIARY DEVICES OR ACCESSORIES SPECIALLY ADAPTED TO CYCLES AND NOT OTHERWISE PROVIDED FOR, e.g. ARTICLE CARRIERS OR CYCLE PROTECTORS
    • B62J45/00Electrical equipment arrangements specially adapted for use as accessories on cycles, not otherwise provided for
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/20Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
    • G06F16/29Geographical information databases
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/70Information retrieval; Database structures therefor; File system structures therefor of video data
    • G06F16/75Clustering; Classification
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/70Information retrieval; Database structures therefor; File system structures therefor of video data
    • G06F16/78Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
    • G06F16/783Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually using metadata automatically derived from the content
    • G06F16/7837Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually using metadata automatically derived from the content using objects detected or recognised in the video content
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/50Context or environment of the image
    • G06V20/56Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06V20/58Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G1/00Traffic control systems for road vehicles
    • G08G1/16Anti-collision systems
    • G08G1/166Anti-collision systems for active traffic, e.g. moving vehicles, pedestrians, bikes
    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G1/00Traffic control systems for road vehicles
    • G08G1/16Anti-collision systems
    • G08G1/167Driving aids for lane monitoring, lane changing, e.g. blind spot detection
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/18Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H04N7/181Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a plurality of remote sources
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04WWIRELESS COMMUNICATION NETWORKS
    • H04W4/00Services specially adapted for wireless communication networks; Facilities therefor
    • H04W4/02Services making use of location information
    • H04W4/025Services making use of location information using location based information parameters
    • H04W4/027Services making use of location information using location based information parameters using movement velocity, acceleration information
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04WWIRELESS COMMUNICATION NETWORKS
    • H04W4/00Services specially adapted for wireless communication networks; Facilities therefor
    • H04W4/02Services making use of location information
    • H04W4/029Location-based management or tracking services
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04WWIRELESS COMMUNICATION NETWORKS
    • H04W4/00Services specially adapted for wireless communication networks; Facilities therefor
    • H04W4/30Services specially adapted for particular environments, situations or purposes
    • H04W4/38Services specially adapted for particular environments, situations or purposes for collecting sensor information
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R2300/00Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
    • B60R2300/20Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of display used
    • B60R2300/202Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of display used displaying a blind spot scene on the vehicle part responsible for the blind spot
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R2300/00Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
    • B60R2300/80Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement
    • B60R2300/8086Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement for vehicle path indication
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R2300/00Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
    • B60R2300/80Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement
    • B60R2300/8093Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement for obstacle warning

Definitions

  • the technical field generally relates to obstacle detection and notification for motorcycles, and more particularly relates to use of computer vision and machine learning to provide feedback of potential obstacles to a rider.
  • ADAS: Advanced Driver Assistance Systems
  • motorcycles have very limited space to place ADAS.
  • Providing alerts to motorcycle riders is also a challenge, as the riders wear a helmet, and operate in a noisy environment that is affected by wind, engine noise, etc.
  • the viewing angle of a motorcycle rider wearing a helmet is limited, and it is challenging to place visual indicators (such as a display for providing visual indications) on the motorcycle at a location that is visible to the rider while riding.
  • motorcycles behave differently than cars: their angles relative to the road (e.g. lean angle) shift much more quickly and dramatically than those of cars, especially when the motorcycle leans, accelerates and brakes.
  • an obstacle detection and notification system for a motorcycle comprises: at least one of a forward looking camera and a backward looking camera mountable to the motorcycle; at least one processor in operable communication with the at least one of the forward looking camera and the backward looking camera, the at least one processor configured to execute program instructions, wherein the program instructions are configured to cause the at least one processor to execute processes including: receiving video from the at least one of the forward looking camera and the backward looking camera; performing a computer vision and machine learning based object detection and tracking process to detect, classify and track obstacles in the video and to output detected object data; and outputting audible, tactile or visual feedback, via an output system of the motorcycle, to a rider of the motorcycle based on the detected object data, wherein the audible, tactile or visual feedback is directional in that a direction of a threat is indicated.
  • visual feedback is output to the rider.
  • the system includes a rider lighting device in the form of a plurality of lights, wherein individual lights are lit to indicate directionality of the threat relative to the motorcycle.
  • the plurality of lights is arranged in a ring.
  • the plurality of lights emit different colors depending on immediacy of a threat.
  • the audible, tactile or visual feedback differentiates direction and severity of the threat.
  • the audible, tactile or visual feedback indicates type of threat including the motorcycle being in a blind spot of a vehicle, a vehicle being in a blind spot of the motorcycle, road objects and fast approaching vehicles.
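The claimed pipeline (detect, classify and track obstacles, then emit directional feedback) can be sketched as follows. This is a minimal illustration, not the patented implementation: the `Detection` fields, the time-to-contact thresholds and the severity labels are assumptions layered onto the claim language, and a real system would take detections from the CV/ML tracker rather than constructing them by hand.

```python
from dataclasses import dataclass

@dataclass
class Detection:
    """One tracked object from an assumed CV/ML detector (illustrative)."""
    bearing_deg: float        # 0 = dead ahead, 90 = right, 180 = behind, 270 = left
    range_m: float            # distance to the object
    closing_speed_mps: float  # positive when the object is approaching

def classify_threat(det: Detection) -> tuple:
    """Map a detection to (direction, severity) for a directional rider alert.

    Thresholds are illustrative placeholders, not values from the patent.
    """
    # Coarse direction from bearing.
    if det.bearing_deg < 45 or det.bearing_deg >= 315:
        direction = "front"
    elif det.bearing_deg < 135:
        direction = "right"
    elif det.bearing_deg < 225:
        direction = "rear"
    else:
        direction = "left"

    # Time-to-contact drives severity; guard against non-approaching objects.
    if det.closing_speed_mps <= 0:
        return direction, "low"
    ttc = det.range_m / det.closing_speed_mps
    if ttc < 2.0:
        severity = "severe"
    elif ttc < 4.0:
        severity = "high"
    elif ttc < 8.0:
        severity = "elevated"
    else:
        severity = "low"
    return direction, severity
```

For example, a vehicle 20 m behind the motorcycle and closing at 10 m/s yields a high-severity rear alert, which could then be routed to the audible, tactile or visual output system.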
  • a method of obstacle detection and notification for a motorcycle includes: receiving video from at least one of a forward looking camera and a backward looking camera; performing a computer vision and machine learning based object detection and tracking process to detect, classify and track obstacles in the video and to output detected object data; and outputting audible, tactile or visual feedback, via an output system of the motorcycle, to a rider of the motorcycle based on the detected object data, wherein the audible, tactile or visual feedback is directional in that a direction of a threat is indicated.
  • visual feedback is output to the rider.
  • the output system comprises a rider lighting device in the form of a plurality of lights, wherein individual lights are lit to indicate directionality of the threat relative to the motorcycle.
  • the plurality of lights is arranged in a ring.
  • the plurality of lights emit different colors depending on immediacy of a threat.
  • the audible, tactile or visual feedback differentiates direction and severity of the threat.
  • the audible, tactile or visual feedback indicates type of threat including the motorcycle being in a blind spot of a vehicle, a vehicle being in a blind spot of the motorcycle, road objects and fast approaching vehicles.
  • a motorcycle in a further aspect, includes: at least one of a forward looking camera and a backward looking camera; at least one processor in operable communication with the at least one of the forward looking camera and the backward looking camera, the at least one processor configured to execute program instructions, wherein the program instructions are configured to cause the at least one processor to execute processes including: receiving video from the at least one of the forward looking camera and the backward looking camera; performing a computer vision and machine learning based object detection and tracking process to detect, classify and track obstacles in the video and to output detected object data; and outputting audible, tactile or visual feedback, via an output system of the motorcycle, to a rider of the motorcycle based on the detected object data, wherein the audible, tactile or visual feedback is directional in that a direction of a threat is indicated.
  • visual feedback is output to the rider.
  • the motorcycle comprises a rider lighting device in the form of a plurality of lights, wherein individual lights are lit to indicate directionality of the threat relative to the motorcycle.
  • the plurality of lights is arranged in a ring.
  • the plurality of lights emit different colors depending on immediacy of a threat.
  • the audible, tactile or visual feedback differentiates direction and severity of the threat.
  • FIG. 1 is a functional block diagram of a motorcycle that includes an obstacle detection and notification system, in accordance with an exemplary embodiment
  • FIG. 2 is a functional block diagram of the obstacle detection and notification system of FIG. 1 , in accordance with an exemplary embodiment
  • FIG. 3 is a rider lighting display device included in the obstacle detection and notification system of FIGS. 1 and 2 , in accordance with an exemplary embodiment
  • FIG. 4 is a flowchart of a method for implementing obstacle detection and notification including reverse blind spot detection, which can be used in connection with the motorcycle of FIG. 1 and the obstacle detection and notification system of FIG. 2 , in accordance with an exemplary embodiment;
  • FIG. 5 is a flowchart of a method for implementing obstacle detection and notification including use of a collision risk heat map, which can be used in connection with the motorcycle of FIG. 1 and the obstacle detection and notification system of FIG. 2 , in accordance with an exemplary embodiment;
  • FIG. 6 is a flowchart of a method for implementing obstacle detection and notification including performing animal detection using thermal imaging, which can be used in connection with the motorcycle of FIG. 1 and the obstacle detection and notification system of FIG. 2 , in accordance with an exemplary embodiment.
  • module refers to an application specific integrated circuit (ASIC), an electronic circuit, a processor (shared, dedicated, or group) and memory that executes one or more software or firmware programs, a combinational logic circuit, and/or other suitable components that provide the described functionality.
  • FIG. 1 illustrates a motorcycle 100 according to an exemplary embodiment.
  • the motorcycle 100 includes an obstacle detection and notification system 200 (ODNS) including dual-camera machine vision (MV) with artificial intelligence (AI) technology to detect and predict obstacles and provide feedback to a rider of the motorcycle 100 .
  • the motorcycle 100 includes, in addition to the above-referenced ODNS 200 , a body 114 and two wheels 116 .
  • the rider wears a helmet 102 , which may be communicatively coupled to the ODNS 200 , as described further below.
  • the body 114 includes an engine (not shown), a braking system (not shown) and handles (not shown) for steering a front wheel.
  • the engine comprises a combustion engine.
  • the engine is an electric motor/generator, instead of, or in addition to, the combustion engine.
  • the engine is coupled to at least one of the wheels 116 through one or more transmission systems.
  • the braking system provides braking for the motorcycle 100 .
  • the braking system receives inputs from the rider via a brake pedal (not depicted) or a brake lever and provides appropriate braking via brake units (also not depicted).
  • the rider also provides inputs via an accelerator handle (not depicted) as to a desired speed or acceleration of the motorcycle 100 .
  • the motorcycle 100 includes one or more cameras 210 , 212 as part of a computer vision system.
  • the one or more cameras 210 , 212 can include a forward-looking camera 210 to capture an external scene ahead of the motorcycle 100 and a backward-looking camera 212 to capture an external scene behind the motorcycle 100 .
  • the forward-looking camera(s) 210 can be positioned above a motorcycle headlight, beneath the motorcycle headlight, within the motorcycle headlight (e.g. if it is integrated thereto during the manufacturing thereof), or in any other manner that provides the forward-looking camera(s) with a clear view to the area in front of the motorcycle 100 .
  • the backward-looking camera(s) 212 can be positioned above a motorcycle rear light, beneath the motorcycle rear light, within the motorcycle rear light (e.g. if it is integrated thereto during the manufacturing thereof), or in any other manner that provides the backward-looking camera(s) 212 with a clear view to the area in the back of the motorcycle 100 .
  • the cameras 210 , 212 may be wide-angle cameras with a field of view above 60° or 90°, or even in the range of 130° to 175° or more, of a forward scene or backward scene.
  • the cameras 210 , 212 may be monocular cameras and may provide at least RGB (Red, Green, Blue) video (made up of frames of image data). In other embodiments, the cameras 210 , 212 are stereoscopic cameras.
  • the cameras 210 , 212 include thermal imaging (or infrared) capabilities.
  • the forward-looking camera(s) 210 and the backward-looking camera(s) 212 can have a resolution of at least two Mega-Pixel (MP), and in some embodiments at least five MP.
  • the forward-looking camera(s) 210 and the backward-looking camera(s) 212 can have a frame rate of at least twenty Frames-Per-Second (FPS), and in some embodiments at least thirty FPS. Additional cameras may be included such as forward-looking and backward looking narrow angle cameras, which may have greater accuracy at larger ranges.
  • FIG. 1 shows the forward-looking camera(s) 210 and the backward-looking camera(s) 212
  • the motorcycle can include additional sensors including forward-looking and/or backward-looking radar device(s) 214 (as shown in FIG. 2 ), a plurality of laser range finders, or any other sensor that can support obstacle detection and prediction.
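The thermal-imaging capability mentioned above supports detecting warm objects (such as animals lying on a warm road at night, per FIG. 6). A toy stand-in for that detection step, assuming a thermal frame is available as a 2-D grid of temperature values, is a threshold plus connected-component pass; the threshold and frame format here are illustrative assumptions, not details from the patent.

```python
def hot_blobs(frame, threshold):
    """Find 4-connected warm regions in a thermal frame (2-D list of
    temperature values). Returns a list of blob sizes (pixel counts)
    for regions at or above the threshold."""
    rows, cols = len(frame), len(frame[0])
    seen = [[False] * cols for _ in range(rows)]
    blobs = []
    for r in range(rows):
        for c in range(cols):
            if frame[r][c] >= threshold and not seen[r][c]:
                # Flood-fill this connected component iteratively.
                size, stack = 0, [(r, c)]
                seen[r][c] = True
                while stack:
                    y, x = stack.pop()
                    size += 1
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and frame[ny][nx] >= threshold
                                and not seen[ny][nx]):
                            seen[ny][nx] = True
                            stack.append((ny, nx))
                blobs.append(size)
    return blobs
```

A production system would instead run a trained classifier on the thermal stream, but the blob pass illustrates why a thermal channel makes nighttime animals stand out against a cooling road surface.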
  • the ODNS 200 includes a controller 204 , an output system 206 , the forward and backward looking cameras 210 , 212 , the radar device 214 , a cellular connectivity device 216 , a GPS device 218 , a local communications device 252 and an enhanced map database 254 , in an exemplary embodiment.
  • the ODNS 200 monitors a surrounding area of the motorcycle 100 for proximity events including at least one of: vehicles in the rider's blind spots, the rider possibly being in other vehicles' blind spots (reverse blind spot), common road objects (e.g. approaching an unexpectedly stopped vehicle, pedestrians, etc.), unusual road objects (e.g. desert animals lying on a warm road at nighttime), and vehicles fast approaching from behind.
  • the ODNS 200 may predict potential collisions (e.g. a vehicle unexpectedly pulling out in front of the motorcycle 100 ) using georeferenced high-risk motorcycle collision locations obtained from previous collision data via a telematics feed. Various outputs may be provided to both the rider and external vehicles such as activating a rear brake light on the motorcycle when a vehicle is fast approaching the motorcycle 100 , for example.
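The georeferenced high-risk lookup described above amounts to checking the current GPS fix against a database of known high-risk collision locations. A minimal sketch, assuming the telematics feed supplies the sites as (latitude, longitude) pairs; the 150 m radius is an illustrative placeholder, not a value from the patent.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two WGS-84 points."""
    r = 6371000.0  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = (math.sin(dp / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

def near_high_risk_location(lat, lon, risk_sites, radius_m=150.0):
    """True if the GPS fix is within radius_m of any georeferenced
    high-risk collision site (a list of (lat, lon) pairs)."""
    return any(haversine_m(lat, lon, s_lat, s_lon) <= radius_m
               for s_lat, s_lon in risk_sites)
```

In practice the site list would be held in the enhanced map database 254 and indexed spatially rather than scanned linearly, but the distance test is the core of the prediction trigger.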
  • the ODNS 200 in one example, provides these functions in accordance with the methods 400 , 500 and 600 described further below in connection with FIGS. 4 to 6 .
  • the ODNS 200 includes hardware to be installed onto the motorcycle and associated software embodied in the controller 204 that controls the functions described herein.
  • the ODNS 200 may be installed on the motorcycle 100 by technicians as a retrofit or during manufacturing of the motorcycle 100 . Some elements of the ODNS 200 may be included in a rider's mobile telecommunications device such as the controller 204 (or part thereof), the rider lighting display device 220 and the video display device 228 .
  • in such a case, the rider lighting display device 220 would be graphically presented on a display of the mobile telecommunications device rather than through LEDs as in the integrated hardware system described further herein.
  • the controller 204 is coupled to the cameras 210 , 212 , the radar device 214 , the cellular connectivity device 216 , the GPS device 218 , the enhanced map database 254 , the local communications device 252 and the output system 206 .
  • the controller 204 receives video data from the cameras 210 , 212 and, based thereon, performs computer vision based and machine learning based object detection and tracking, reverse blind spot detection, collision risk prediction, obstacle proximity detection and other operations described further herein.
  • the controller 204 provides rider feedback concerning detected obstacles or potential obstacles,
  • the controller 204 may additionally provide feedback to other vehicles and potentially also to electronically controlled components of the motorcycle 100 so as to implement, for example, automatic braking, automatic throttle control, automatic gear shifting, etc.
  • the controller 204 can be located under a seat of the motorcycle 100 , but can alternatively be located in other places in a motorcycle such as behind a display panel between the handles of the motorcycle 100 .
  • the controller 204 can be connected to a battery of the motorcycle 100 , or it can have its own power supply.
  • the controller 204 comprises a computer system.
  • the computer system of the controller 204 includes a processor 230 , a memory 232 , a storage device 236 , and a bus 238 .
  • the processor 230 performs the computation and control functions of the controller 204 , and may comprise any type of processor or multiple processors, single integrated circuits such as a microprocessor, or any suitable number of integrated circuit devices and/or circuit boards working in cooperation to accomplish the functions of a processing unit.
  • the processor 230 executes one or more programs 240 contained within the memory 232 and, as such, controls the general operation of the controller 204 and the computer system of the controller 204 , generally in executing the processes described herein, such as the methods 400 , 500 , 600 described further below in connection with FIGS. 4 to 6 .
  • the one or more computer programs 240 include at least an object detection and tracking module 242 , an obstacle proximity detection module 246 , a reverse blind spot detection module 244 and a collision risk prediction module 248 for performing steps of the methods 400 , 500 , 600 described in detail below.
  • the processor 230 is capable of executing one or more programs (i.e., running software) to perform various tasks encoded in the program(s), particularly the object detection and tracking module 242 , the obstacle proximity detection module 246 , the reverse blind spot detection module 244 and the collision risk prediction module 248 .
  • the processor 230 may be a microprocessor, microcontroller, application specific integrated circuit (ASIC) or other suitable device as realized by those skilled in the art.
  • the memory 232 can be any type of suitable memory. This would include the various types of dynamic random access memory (DRAM) such as SDRAM, the various types of static RAM (SRAM), and the various types of non-volatile memory (PROM, EPROM, and flash). In certain examples, the memory 232 is located on and/or co-located on the same computer chip as the processor 230 .
  • the bus(es) 238 serves to transmit programs, data, status and other information or signals between the various components of the computer system of the controller 204 and between the various hardware components including the output system 206 , forward and backward looking cameras 210 , 212 , the cellular connectivity device 216 , the GPS device 218 , the local communications device 252 and the enhanced map database 254 .
  • the bus(es) 238 can be any suitable physical or logical means of connecting computer systems and components. This includes, but is not limited to, direct hard-wired connections, fiber optics, infrared and wireless bus technologies.
  • the storage device 236 can be any suitable type of storage apparatus, including direct access storage devices such as hard disk drives, flash systems, floppy disk drives and optical disk drives.
  • the storage device 236 comprises a program product from which memory 232 can receive a program 240 (including computer modules 242 , 244 , 246 , 248 ) that executes one or more embodiments of one or more processes of the present disclosure, such as the steps of the methods 400 , 500 and 600 (and any sub-processes thereof).
  • the program product may be directly stored in and/or otherwise accessed by the memory 232 and/or a disk, such as that referenced below.
  • the enhanced map database 254 may be stored on the memory 232 .
  • examples of signal-bearing media include recordable media such as floppy disks, hard drives, memory cards and optical disks, and transmission media such as digital and analog communication links. It will similarly be appreciated that the computer system of the controller 204 may also otherwise differ from the embodiment depicted in FIG. 2 , for example in that the computer system of the controller 204 may be coupled to or may otherwise utilize one or more remote computer systems and/or other control systems.
  • the output system 206 includes at least one of: a rider lighting display device 220 , an external lighting system 222 , a rider speaker 224 , a tactile feedback device 226 , a video display device 228 and an external speaker 256 .
  • the output system 206 is responsive to output data from the controller 204 to provide visual, audible or tactile feedback to a rider of the motorcycle 100 or to a driver of an external vehicle.
  • the controller 204 commands an alert to a rider of the motorcycle 100 in order to enable the rider to perform measures to eliminate or reduce any risk.
  • the alerts can be provided in any manner that can be sensed by a rider of the motorcycle 100 .
  • the alert can be visual provided via the rider lighting display device 220 , tactile via the tactile feedback device 226 and/or audible via the rider speaker 224 .
  • Some parts of the output system 206 may be included in the helmet 102 such as the rider speaker 224 , the tactile feedback device 226 and/or the rider lighting display device 220 .
  • the local communications device 252 allows the controller 204 to send data to the helmet 102 through any suitable local communications protocol such as Bluetooth or WiFi. In one embodiment, the local communications device 252 is facilitated through a local communications capability of a rider's mobile telecommunications device.
  • the rider lighting display device 220 of one example embodiment is shown in FIG. 3 .
  • the rider lighting display device 220 includes a plurality of light emitters 302 that can be activated to indicate presence and directionality of an obstacle or a potential obstacle.
  • the rider lighting display device 220 includes a plurality of light emitters 302 arranged in a ring around a motorcycle orientation reference graphic 304 .
  • more or fewer light emitters 302 could be provided.
  • four light emitters 302 could be included to provide four degrees of alert directionality: forward, backward, left and right.
  • the light emitters 302 may be LEDs in one embodiment.
  • the controller 204 may output data indicating directionality and severity of a collision threat and command the rider lighting display device 220 accordingly.
  • the light emitters 302 can be controlled to emit different colors depending on the threat level, such as red for severe alert, orange for high alert, yellow for elevated alert and green for low alert, or some subset of two or three of these alert levels. Other arrangements of light emitters 302 that allow directionality and threat level severity are possible, such as light strips arranged on mirrors of the motorcycle 100.
  • the rider lighting display device 220 may also be controlled to differentiate a type of threat such as having different flashing frequencies for different threat types.
  • the rider lighting display device 220 could be located in a display panel between the handlebars of the motorcycle 100 or could be projected onto or displayed by a shield of the helmet 102.
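As a purely illustrative sketch (the disclosure does not specify an implementation; the emitter count, color names and angle convention below are assumptions), the direction-and-severity mapping for a ring of light emitters might look like:

```python
# Assumed severity-to-color mapping (red = severe ... green = low alert).
SEVERITY_COLORS = {4: "red", 3: "orange", 2: "yellow", 1: "green"}

def select_emitter(threat_bearing_deg: float, num_emitters: int = 8) -> int:
    """Map a threat bearing (0 = straight ahead, increasing clockwise) to
    the index of the nearest of num_emitters evenly spaced ring lights."""
    step = 360.0 / num_emitters
    return int(round((threat_bearing_deg % 360.0) / step)) % num_emitters

def ring_command(threat_bearing_deg: float, severity: int,
                 num_emitters: int = 8):
    """Return (emitter_index, color) for the rider lighting display."""
    return (select_emitter(threat_bearing_deg, num_emitters),
            SEVERITY_COLORS.get(severity, "green"))
```

For example, on an assumed 8-light ring, `ring_command(90.0, 4)` lights emitter 2 (the rider's right) in red for a severe threat approaching from the right.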
  • the external lighting system 222 includes existing or additional lights of the motorcycle 100 to indicate to drivers of external vehicles that they are a collision threat to the motorcycle when such a determination is made by the controller 204 .
  • the external lighting system 222 provides notifications to drivers of other vehicles through a rear brake light (such as an LED) on the motorcycle 100 when, for example, the controller 204 determines that a vehicle is fast approaching the motorcycle from behind.
  • the motorcycle rider may get a notification via the rider lighting display device 220 and the driver of the vehicle receives a notification via the rear brake light or other rear light of the external lighting system 222 .
  • Front and/or side external lights could also be included as additional lights or as part of the existing lights of the motorcycle 100 to alert external drivers in front of the motorcycle 100 and to the sides of the motorcycle 100 .
  • the external speaker 256 which may be an integrated horn of the motorcycle 100 or an additional device, may additionally or alternatively provide a warning to external vehicles or humans of a collision threat with the motorcycle when such a collision threat has been determined by the controller 204 . More than one external speaker 256 could be arranged around the motorcycle 100 to allow for directionality in the audible warning such as front, rear, left side and right side external speakers 256 .
  • the warning notification can be a vibration provided to the rider of the motorcycle 100 via one or more vibrating or other tactile elements included in the tactile feedback device 226 causing vibration felt by the rider of the motorcycle 100 .
  • the vibration can be adjusted in accordance with the threat severity, so that the higher the risk, the stronger the vibration.
  • the vibration elements may additionally or alternatively be integrated into a jacket worn by the rider of the motorcycle 100 , into the seat of the motorcycle 100 , or into a helmet 102 worn by the rider of the motorcycle 100 .
  • the alert is provided through the rider speaker 224 .
  • the alert is provided as a sound notification to the rider of the motorcycle 100 via the one or more rider speakers 224 .
  • the rider speakers 224 can be integrated into the helmet 102 of the rider, or any other speakers that generate sounds that can be heard by the rider of the motorcycle 100 .
  • the sound notification can be a natural language voice notification, providing information of the specific threat type and/or severity identified and/or the direction of the threat.
  • the volume can be adjusted in accordance with the risk severity, so that the higher the risk, the higher the volume.
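The severity-proportional scaling described for both the tactile vibration and the speaker volume can be sketched as follows (an illustrative linear mapping; the level count and output range are assumptions, not taken from the disclosure):

```python
def scaled_intensity(severity: int, max_level: int = 4,
                     floor: float = 0.25, ceiling: float = 1.0) -> float:
    """Scale an alert output (vibration strength or speaker volume, as a
    fraction of full output) linearly with threat severity: the higher
    the risk, the stronger the vibration or the louder the sound.
    Severity is clamped to the range [1, max_level]."""
    s = max(1, min(severity, max_level))
    return floor + (ceiling - floor) * (s - 1) / (max_level - 1)
```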
  • the video display device 228 provides a live video feed from the forward looking camera 210 and/or the backward looking camera 212 .
  • the live video feed may provide a focused area of the total video data based on a direction of the threat.
  • the live video feed may be supplemented with graphical indications of any collision threats distinguishing different types of threats, different threat levels and the direction of the threat as determined by the controller 204 .
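One hedged way to realize the focused area is to crop the frame around the threat's horizontal image position; the window fraction and clamping behavior below are assumptions for illustration:

```python
def focus_region(frame_width: int, threat_x: int,
                 window_frac: float = 0.4):
    """Return (left, right) column bounds of a crop window centered on
    the threat's horizontal image position, clamped so the window stays
    entirely within the frame."""
    half = int(frame_width * window_frac / 2)
    left = max(0, min(threat_x - half, frame_width - 2 * half))
    return left, left + 2 * half
```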
  • video from the forward and backward looking cameras 210 , 212 is recorded in the storage device 236 .
  • the various possible output options of the output system 206 described above may be provided alone or in any combination. Having described the output system 206 and some example audible, visual or tactile feedback mechanisms responsive to the threat severities and directions determined by the controller 204, a more detailed description of the software operations of the controller will be provided.
  • the object detection and tracking module 242 can be implemented through a number of object detection and tracking algorithms.
  • the object detection and tracking module 242 receives video data from the forward and backward looking cameras 210 , 212 and runs the video data, or a derivative thereof, through a machine learning algorithm to classify and localize obstacles that the machine learning algorithm is trained to detect.
  • the machine learning algorithm may include a Convolutional Neural Network (CNN) or other neural network.
  • a You Only Look Once (YOLO) algorithm is one example of such a machine learning algorithm.
  • the object detection part of the object detection and tracking module 242 provides detected object data including bounding box size, location and classification information. Various obstacle classifications are possible including vehicle, pedestrian, cyclist, non-human animal, etc.
  • the object tracking part of the object detection and tracking module 242 tracks a detected object over time (plural frames of video data) in order to derive velocity and acceleration information for tracked objects and to predict the obstacle's path.
  • an extended Kalman filter using a motion model for the tracked object can be included in the object tracking part.
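A minimal sketch of such a tracking filter, assuming a constant-velocity motion model (with which the extended filter reduces to the standard linear form shown here) and assumed noise settings:

```python
import numpy as np

def cv_kalman_step(x, P, z, dt, q=1.0, r=4.0):
    """One predict/update cycle of a constant-velocity Kalman filter
    tracking state (px, py, vx, vy) from a position measurement
    z = (px, py). The process/measurement noise levels q and r are
    illustrative assumptions."""
    F = np.array([[1, 0, dt, 0],
                  [0, 1, 0, dt],
                  [0, 0, 1, 0],
                  [0, 0, 0, 1]], dtype=float)
    H = np.array([[1, 0, 0, 0],
                  [0, 1, 0, 0]], dtype=float)
    Q = q * np.eye(4)
    R = r * np.eye(2)
    # Predict: propagate state and covariance through the motion model.
    x = F @ x
    P = F @ P @ F.T + Q
    # Update: correct with the measured position.
    y = z - H @ x
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ y
    P = (np.eye(4) - K @ H) @ P
    return x, P
```

Feeding the filter a sequence of positions lets the velocity state converge toward the object's true motion, which is what supports the velocity, acceleration and path-projection outputs described above.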
  • the object detection and tracking module 242 uses intrinsic and extrinsic camera parameters and possibly also motion parameters from an Inertial Measurement Unit (not shown) or other motion sensors of the motorcycle 100 to provide detected object data in real world coordinates in a coordinate frame relative to the motorcycle 100 .
  • the object detection and tracking module 242 is thus able to output location, velocity, acceleration, path projection and classification data for each detected object in forward and backward looking scenes.
  • This data is included in detected object data provided to the obstacle proximity detection module 246 and the reverse blind spot detection module 244 .
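The per-object fields described above might be gathered into a structure along the following lines (field names and units are illustrative assumptions, not taken from the disclosure):

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class DetectedObject:
    """Illustrative container for the per-object output of the detection
    and tracking module: bounding box, classification, and motion state
    in a coordinate frame relative to the motorcycle."""
    bbox: Tuple[float, float, float, float]  # x, y, width, height (image)
    classification: str                      # e.g. vehicle, pedestrian, cyclist
    position: Tuple[float, float]            # meters, motorcycle-relative
    velocity: Tuple[float, float]            # m/s
    acceleration: Tuple[float, float]        # m/s^2
    path_projection: List[Tuple[float, float]] = field(default_factory=list)
```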
  • the object detection and tracking module 242 has been described at a relatively high level for the purposes of conciseness. It should be appreciated that a number of object detection and tracking applications are available in the literature that receive video data and use computer vision and machine learning processing to classify and track detected objects.
  • the obstacle proximity detection module 246 receives the detected object data and data on the motion of the motorcycle 100 from the IMU or from other motion sensors such as a wheel speed sensor. In this way, the obstacle proximity detection module 246 is able to project the path of the motorcycle 100 and the projected paths of moving obstacles to determine whether there is any collision threat or any potential spatial overlap with detected stationary obstacles.
  • a collision threat may be determined by a projected collision occurring in less than a first predetermined time threshold. In some embodiments, a plurality of different time thresholds may be used so as to define different threat levels.
  • the mutual motion projections between the motorcycle 100 and the various moving obstacles can be compared to determine a directionality of the threat by determining a direction relative to the motorcycle 100 that an obstacle is travelling.
  • the directionality can be determined based on a relative location between the motorcycle 100 and the detected location of the stationary object.
  • the obstacle proximity detection module 246 can output collision threat data that is indicative of collision threat severity level and directionality, which can be included in output data for the output system 206 to activate various output devices as described above.
  • the obstacle proximity detection module 246 may additionally distinguish detected threat types in the output data.
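The time-threshold logic can be sketched as follows; the threshold values, level names and bearing convention are illustrative assumptions:

```python
import math

# Assumed time-to-collision thresholds (seconds) per threat level.
TTC_THRESHOLDS = [(1.5, "severe"), (3.0, "high"), (5.0, "elevated")]

def time_to_collision(rel_pos, rel_vel):
    """Range divided by closing rate for an obstacle at motorcycle-relative
    position rel_pos moving with relative velocity rel_vel; None when the
    obstacle is not closing."""
    px, py = rel_pos
    vx, vy = rel_vel
    closing = -(px * vx + py * vy)  # positive while the range is shrinking
    if closing <= 0:
        return None
    return (px * px + py * py) / closing

def threat_level(rel_pos, rel_vel):
    """Map time-to-collision onto a threat level and report the obstacle's
    bearing (degrees, 0 = ahead of the motorcycle, clockwise)."""
    ttc = time_to_collision(rel_pos, rel_vel)
    bearing = math.degrees(math.atan2(rel_pos[0], rel_pos[1])) % 360.0
    if ttc is not None:
        for limit, level in TTC_THRESHOLDS:
            if ttc < limit:
                return level, bearing
    return "low", bearing
```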
  • the obstacle proximity detection module 246 detects a plurality of kinds of proximity events, including vehicles in the rider's blind spots, the rider possibly being in other vehicles' blind spots (reverse blind spot) as described further below, common road objects (e.g. approaching an unexpectedly stopped vehicle, pedestrians, etc.), unusual road objects (e.g. desert animals lying on a warm road at nighttime) as described further herein, vehicles fast approaching from behind, potential collisions (e.g. a vehicle may unexpectedly pull out in front of the rider but has not yet) as described further herein, etc.
  • the object detection and tracking module 242 is trained to detect non-human animals based on thermal imaging received from the forward-looking camera 210 , the backward looking camera 212 or other forward or backward looking camera particularly suited to thermal imaging. Such an embodiment is designed to detect non-human animals in low visibility conditions such as fog and nighttime.
  • the obstacle proximity detection module 246 receives the detected non-human animal data and responsively outputs a collision threat based on time to potential collision with the non-human animal, which will determine a threat level, and a direction of the threat.
  • the output system 206 responsively outputs an indication of threat level and directionality and optionally also type of threat (e.g. that the non-human animal is a snake).
  • Snakes and other desert animals can present a particular danger to motorcycles because they often lie on warm roads at nighttime.
  • the reverse blind spot detection module 244 receives the detected object data from the object detection and tracking module 242.
  • the reverse blind spot detection module 244 determines one or more blind spot regions for one or more detected vehicles.
  • a blind spot in a vehicle is an area around the vehicle that cannot be directly observed by the driver while at the controls.
  • a blind spot may occur behind the side window at a location that is also not visible in the side view mirrors.
  • motorcycles are narrower than cars and are more likely to be wholly located within a vehicle's blind spot.
  • the reverse blind spot detection module 244 may retrieve a predetermined blind spot region and connect it to a detected vehicle at a location where a blind spot would be.
  • the predetermined blind spot region may be in image coordinates and scaled and rotated in image space according to a distance and orientation between the vehicle and the motorcycle 100 , which can be performed based on camera intrinsic and extrinsic parameters.
  • the predetermined blind spot region is provided in real world coordinates and the detected vehicles are transformed into real world space so that the predetermined blind spot regions can be connected thereto.
  • the object detection and tracking module 242 is trained to detect vehicles and additionally side view mirrors on the vehicles, which can support more accurate placement of the attached blind spot region based on a bounding box around the side view mirrors. Alternatively, an average position relative to a bounding box around the vehicle could be used to connect the blind spot region.
  • the predetermined blind spot region is enlarged based on a relative speed between the other vehicle and the motorcycle 100 .
  • the relative speed is known from the detected object data output from the object detection and tracking module 242 as described above. In this way, the faster the closing speed, the more likely that a reverse blind spot detection is made to reflect that the driver of the other vehicle would have less time to check the side mirrors and spot the motorcycle 100 .
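The speed-dependent enlargement might be expressed as a scale factor applied to the predetermined region; the linear gain and cap below are assumptions for illustration:

```python
def blind_spot_scale(relative_speed_mps: float,
                     base: float = 1.0, gain: float = 0.05,
                     cap: float = 2.0) -> float:
    """Enlarge the predetermined blind spot region as closing speed grows:
    the faster the other vehicle approaches, the less time its driver has
    to check the mirrors, so the region is treated as larger. The linear
    gain and the cap are illustrative assumptions."""
    return min(cap, base + gain * max(0.0, relative_speed_mps))
```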
  • the reverse blind spot detection module 244 may compare a location of the motorcycle 100 with the blind spot region(s) to determine whether the motorcycle is located within the blind spot region of any other vehicle.
  • the location of the motorcycle 100 can be obtained by the GPS device 218 or from localization based on computer vision processing of the video data from the forward and backward looking cameras 210 , 212 .
  • the reverse blind spot detection module 244 can compare a path trajectory of the motorcycle 100 , which can be determined based on a motion model for the motorcycle 100 , and location, acceleration and velocity data for the motorcycle 100 .
  • the location, acceleration and velocity data for the motorcycle is obtained from GPS device 218 and/or motion sensors of the motorcycle 100 .
  • Based on whether the location of the motorcycle 100 is within a blind spot region or is projected to be within the blind spot region within a predetermined amount of time, the reverse blind spot detection module 244 provides output data indicating a reverse blind spot threat and a directionality thereof.
  • the reverse blind spot detection module 244 may additionally output a threat level based on the proximity of the vehicle and the motorcycle in its blind spot region or based on their relative speeds.
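A hedged sketch of the containment test, assuming a rectangular blind spot region attached in the detected vehicle's local frame (the offset, size and heading convention are illustrative assumptions):

```python
import math

def in_blind_spot(moto_xy, vehicle_xy, vehicle_heading_deg,
                  region_offset=(-2.0, -3.0), region_size=(2.5, 4.0)):
    """Test whether the motorcycle lies inside a rectangular blind spot
    region attached to a detected vehicle. The region is defined in the
    vehicle's local frame (x to the vehicle's right, y forward; heading
    measured counterclockwise from the world +y axis), offset beside and
    behind the driver; the offsets and sizes are illustrative."""
    dx = moto_xy[0] - vehicle_xy[0]
    dy = moto_xy[1] - vehicle_xy[1]
    h = math.radians(vehicle_heading_deg)
    # Rotate the world-frame offset into the vehicle's local frame.
    local_x = dx * math.cos(h) + dy * math.sin(h)
    local_y = -dx * math.sin(h) + dy * math.cos(h)
    ox, oy = region_offset
    w, l = region_size
    return ox <= local_x <= ox + w and oy <= local_y <= oy + l
```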
  • the output system 206 outputs visual, tactile or audible feedback, which identifies direction, severity and optionally also type of threat as described above.
  • the collision risk prediction module 248 predicts locations and times when there is an added risk of collision with an obstacle using crowd data regarding motorcycle (and possibly other vehicle) accidents or near accidents included in a collision risk map layer 270 of the enhanced map database 254.
  • the collision risk map layer 270 is a map layer that is regularly updated with accident or near accident information through the cellular connectivity device 216 .
  • the cellular connectivity device 216 can be a 4G or 5G data communications device, for example. In some embodiments, the cellular connectivity device 216 is facilitated through a rider's mobile telecommunications device.
  • the collision risk map layer 270 provides georeferenced and time referenced crowd data on where collisions or near collisions have occurred.
  • the collision risk map layer 270 distinguishes between motorcycle accidents or near accidents and those of other vehicles, since the threats to a motorcycle can be different from those to other types of vehicle, and the collision risk prediction module 248 operates on the motorcycle specific data.
  • the collision risk map layer 270 may reflect a greater probability of collisions at a bar driveway location and time of day (e.g. after happy hour).
  • the collision risk prediction module 248 filters the collision risk map layer 270 using the current time and current location of the motorcycle 100 , which is known from the location data provided by the GPS device 218 so as to determine upcoming relatively high risk locations (e.g. locations where the collision risk is predicted to be greater than a threshold at the current time).
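A hedged sketch of that filtering step — the entry format, distance radius, time window and risk threshold below are all assumptions for illustration:

```python
import math

def nearby_risks(risk_entries, lat, lon, hour, radius_m=500.0,
                 hour_window=1, risk_threshold=0.5):
    """Filter a collision risk map layer down to upcoming high-risk
    entries near the motorcycle's current position and time of day.
    Each entry is assumed to be a dict with lat, lon, hour, risk."""
    hits = []
    for e in risk_entries:
        if e["risk"] <= risk_threshold:
            continue
        # Wrap-around difference in hour of day.
        dh = abs(e["hour"] - hour)
        if min(dh, 24 - dh) > hour_window:
            continue
        # Equirectangular approximation is adequate at a ~500 m radius.
        dx = math.radians(e["lon"] - lon) * math.cos(math.radians(lat))
        dy = math.radians(e["lat"] - lat)
        if 6_371_000.0 * math.hypot(dx, dy) <= radius_m:
            hits.append(e)
    return hits
```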
  • the collision risk prediction module 248 may provide output data to the output system 206 indicating the directionality, a threat level and optionally a type of threat.
  • the collision risk prediction module 248 may provide the output data to the obstacle proximity detection module 246 and/or the object detection and tracking module 242 .
  • the obstacle proximity detection module 246 and/or the object detection and tracking module 242 is responsive to the collision risk indication and location in the output data to increase a frequency of, or activate, the object detection and obstacle proximity detection processes and/or to spatially focus the object detection and obstacle proximity processes based on the location of the collision risk.
  • the obstacle proximity detection module 246 can increase in sensitivity in response to the collision risk data from the collision risk prediction module so as to indicate a higher threat level or to lower proximity thresholds so as to more readily provide output data to the output system 206 describing a collision threat.
  • the ODNS is placed on high alert and is specifically focused when the collision risk map layer 270 predicts an upcoming (e.g. within a set range of the forward looking and backward looking cameras 210 , 212 ) collision risk.
  • the controller 204 determines accident conditions based on high acceleration information being obtained from the IMU (not shown) or the GPS device 218 .
  • the controller 204 can ascertain from the acceleration information whether the motorcycle has been subjected to an impact or an emergency stop.
  • the controller 204 reports such accident conditions along with location, time and optionally date information, which can be obtained from the GPS device 218 , to a remote map server (not shown) through the cellular connectivity device 216 .
  • a crowd of motorcycles will similarly report accident conditions, allowing the remote map server to create a continually updating collision risk map layer 270 that is periodically pushed to the motorcycle 100 via the cellular connectivity device 216 .
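The accident-condition classification from acceleration data might be sketched as follows (the g-force thresholds and axis convention are illustrative assumptions):

```python
G = 9.81  # standard gravity, m/s^2

def accident_condition(accel_xyz, impact_g=4.0, hard_stop_g=1.0):
    """Classify IMU acceleration into accident conditions: a very high
    acceleration magnitude suggests an impact, while a strong longitudinal
    deceleration suggests an emergency stop. The thresholds and the
    forward-positive y axis are illustrative assumptions."""
    ax, ay, az = accel_xyz
    magnitude = (ax * ax + ay * ay + az * az) ** 0.5
    if magnitude >= impact_g * G:
        return "impact"
    if -ay >= hard_stop_g * G:  # ay: longitudinal, forward positive
        return "emergency_stop"
    return None
```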
  • the location and time of non-human animals are reported to the remote map server as an accident condition for motorcycles.
  • the collision risk map layer 270 can integrate crowd sourced times and locations for non-human animals being on the road. This information could alternatively be included in a different map layer.
  • the collision risk prediction module 248 can factor in high risk of non-human animals being on the road at certain locations and at certain times of the day and provide output data representing the collision risk to the output system 206 and the object detection and tracking module 242 and the obstacle proximity detection module 246 . In this way, object detection and tracking processing can be activated or increased in frequency when there is a collision risk above a predetermined threshold according to the non-human animal data in the collision risk map layer.
  • the output system 206 can provide an audible, tactile or visual alert concerning the collision risk, which can identify directionality and optionally also type of non-human animal (e.g. a snake graphic).
  • the obstacle proximity detection module 246 may increase in sensitivity (as described above) when there is a high risk of collision with a non-human animal on the road.
  • the ODNS 200 is in operable communication with an application on a rider's mobile telecommunications device.
  • the application on the rider's mobile telecommunications device may be used for configuration of the various motorcycle module settings (e.g. warning thresholds, preferred alert methods, etc.).
  • the application could also be used for pulling stored video (action video, etc.) from the cameras 210 , 212 or recorded video from the storage device 236 .
  • FIG. 4 is a flowchart of a method 400 for reverse blind spot detection, in accordance with one embodiment.
  • the method 400 can be implemented in connection with the motorcycle 100 of FIG. 1 and the ODNS 200 of FIG. 2 , in accordance with an exemplary embodiment.
  • the method 400 may be implemented continuously during motorcycle operation or be invoked or increased in regularity based on collision risk data from the collision risk prediction module 248 .
  • the method 400 includes the step 410 of receiving video data from the forward looking camera 210 and/or the backward looking camera 212. Object detection and tracking is performed on the video data to classify and localize other vehicles in step 430.
  • a blind spot region is defined around each detected vehicle or each detected vehicle within a certain range of the motorcycle 100 .
  • the blind spot regions may be changed in size based on a relative speed between the motorcycle 100 and the other vehicle, being enlarged as the relative speed increases.
  • a determination is made whether the motorcycle 100 is located in a blind spot region or is projected to be located in a blind spot region within a predetermined amount of time. When a determination has been made of a reverse blind spot threat, audible, visual or tactile feedback is provided to the rider of the motorcycle 100 warning of the reverse blind spot threat, the direction of the threat and optionally also identifying the type of threat.
  • FIG. 5 is a flowchart of a method 500 for collision risk determination, in accordance with one embodiment.
  • the method 500 can be implemented in connection with the motorcycle 100 of FIG. 1 and the ODNS 200 of FIG. 2 , in accordance with an exemplary embodiment.
  • the method 500 may be implemented continuously during motorcycle operation.
  • In step 510, a collision risk is determined from collision risk data in a collision risk map layer 270 of the enhanced map database 254.
  • the collision risk data is time referenced and georeferenced accident or near accident data reported from a crowd of motorcycles. Based on the current time and current location of the motorcycle 100, any collision risk information is extracted from the collision risk map layer 270 and assessed for relevance based on the collision risk being above a predetermined threshold. An indication of the upcoming collision risk and the direction of the collision risk is output for further processing in step 560.
  • In step 520, video data is received from the forward looking camera 210 and/or the backward looking camera 212.
  • In step 530, object detection and tracking is performed based on the video data to classify and localize objects in the captured scene.
  • In step 540, obstacle proximity detection is performed using the detected object data from step 530, including detecting vehicles in the rider's blind spots, detecting common road objects (e.g. approaching an unexpectedly stopped vehicle, pedestrians, etc.), and detecting unusual road objects (e.g. desert animals lying on a warm road at nighttime).
  • the obstacle proximity detection step 540 further determines directionality and immediacy of any proximity threat and outputs corresponding collision threat data.
  • In step 550, audible, visual and/or tactile feedback is provided to the rider through the output system 206 based on the collision threat data to indicate to the rider the direction of the collision threat, the existence of the collision threat and optionally also the type of collision threat.
  • In step 560, the object detection and tracking step 530 is adapted and/or the obstacle proximity detection step 540 is adapted and/or the rider feedback step 550 is adapted when collision risk data from step 510 is determined based on the collision risk map layer 270. That is, steps 530 and 540 may be adapted so as to become active from a dormant state, or steps 530 and 540 may be increased in frequency. Additionally, or alternatively, steps 530 and 540 may be adapted to spatially focus on a region of the video data in which a high collision risk has been determined. Additionally, or alternatively, the sensitivity of step 540 may be heightened by changing proximity thresholds used in detecting collision threats so as to more readily output an indication of a proximal collision threat.
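The adaptation described for the detection steps could be sketched as a parameter update of this general shape (the parameter names, doubling factor and threshold factor are illustrative assumptions):

```python
def adapt_detection(params, risk_active: bool, focus_region=None):
    """When the collision risk map flags an upcoming risk, wake the
    detection pipeline from dormancy, raise its processing rate, widen
    the range at which a proximal threat is flagged (sensitivity up),
    and optionally narrow its spatial focus to a region of interest."""
    p = dict(params)
    if risk_active:
        p["active"] = True
        p["detection_hz"] = params.get("detection_hz", 5) * 2
        p["proximity_threshold_m"] = params.get("proximity_threshold_m", 20.0) * 1.5
        if focus_region is not None:
            p["roi"] = focus_region
    return p
```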
  • the output system 206 may issue an audible, tactile and/or visual alert regarding a potential collision risk identified in step 510 including the direction and optionally also the type of collision risk (e.g. increased traffic from a bar at this time causing accident conditions, snakes on the road at this time, etc.).
  • FIG. 6 is a flowchart of a method 600 for detecting non-human animals on a road, in accordance with one embodiment.
  • the method 600 can be implemented in connection with the motorcycle 100 of FIG. 1 and the ODNS 200 of FIG. 2 , in accordance with an exemplary embodiment.
  • the method 600 may be implemented continuously during motorcycle operation or may become active or increased in frequency based on the collision risk data from the collision risk prediction module indicating potential for non-human animals on the road in the vicinity of the motorcycle 100 .
  • In step 610, video data is received from a forward looking thermal camera and/or a backward looking thermal camera.
  • In step 620, object detection and tracking are performed to localize and classify objects, including non-human animals, which show up vividly in the thermal imaging.
  • the object detection and tracking processes may be trained to label non-human animals as a category or to specify one or more types of non-human animal such as snakes.
  • In step 630, obstacle proximity detection is performed to detect proximal collision threats including collision threats with detected non-human animals.
  • In step 640, the detected non-human animals are reported to a remote map server through the cellular connectivity device 216 .
  • In step 650, a map update including crowd detected non-human animals is received.
  • the map update includes time and location for each detection.
  • the map update is included in the enhanced map database 254 possibly as part of the collision risk map layer.
  • the method 500 described above with respect to FIG. 5 takes into account georeferenced and time referenced non-human animals on the road when determining the collision risk data.
  • audible, tactile and/or visual rider feedback is provided to the rider when step 630 determines a proximal collision threat because of a detected non-human animal on the road.
  • the rider feedback can indicate direction, severity (e.g. immediacy) and optionally the type of threat (e.g. distinguishing a non-human animal from other types of threat or specifying the detected type of non-human animal such as a snake).
  • the disclosed methods, systems, and motorcycles may vary from those depicted in the Figures and described herein.
  • the motorcycle 100 and the ODNS 200 and/or various components thereof may vary from that depicted in FIGS. 1 and 2 and described in connection therewith.
  • certain steps of the methods 400 , 500 and 600 may vary from those depicted in FIGS. 4 , 5 and 6 and/or described above in connection therewith. It will similarly be appreciated that certain steps of the methods described above may occur simultaneously or in a different order than that depicted in FIGS. 4 , 5 and 6 .

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Mechanical Engineering (AREA)
  • Signal Processing (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Databases & Information Systems (AREA)
  • Data Mining & Analysis (AREA)
  • General Engineering & Computer Science (AREA)
  • Library & Information Science (AREA)
  • Remote Sensing (AREA)
  • Traffic Control Systems (AREA)

Abstract

An obstacle detection and notification system for a motorcycle. The system includes a forward looking camera and a backward looking camera mountable to the motorcycle and a processor in operable communication with the forward looking camera and the backward looking camera. The processor executes program instructions to execute processes including: receiving video from the forward looking camera and the backward looking camera, performing a computer vision and machine learning based object detection and tracking process to detect, classify and track obstacles in the video and to output detected object data, and outputting directional audible, tactile or visual feedback, via an output system, to a rider of the motorcycle based on the detected object data.

Description

    INTRODUCTION
  • The technical field generally relates to obstacle detection and notification for motorcycles, and more particularly relates to use of computer vision and machine learning to provide feedback of potential obstacles to a rider.
  • Automotive Advanced Driver Assistance Systems (also known as “ADAS”) have become, in recent years, a standard in the car industry, inter alia due to the fact that safety is a main concern for car manufacturers. A primary concern of motorcycle riders is collisions with obstacles of any kind. There are some obstacles and situations that are of particular concern for motorcycle riders that would desirably be addressed by a suitable obstacle detection and notification system.
  • The motorcycle industry has not, generally, implemented ADAS features, which may be because of the relative cost of ADAS and a motorcycle and also because there are various difficulties that are specific to the motorcycle's environment. For example, motorcycles have very limited space to place ADAS. Providing alerts to motorcycle riders is also a challenge, as the riders wear a helmet, and operate in a noisy environment that is affected by wind, engine noise, etc. Furthermore, the viewing angle of a motorcycle rider wearing a helmet is limited, and placing visual indicators (such as a display for providing visual indications) on the motorcycle itself is challenging in terms of its positioning on the motorcycle at a location that is visible to the rider when riding the motorcycle. Still further, motorcycles behave differently than cars, their angles (e.g. lean angle) relative to the road shift much quicker and more dramatically than car angles with respect to the road, especially when the motorcycle leans, accelerates and brakes.
  • Accordingly, it is desirable to provide systems and methods for obstacle detection and notification for a motorcycle that are low in complexity and cost to implement on a motorcycle and that are able to provide enhanced situational awareness for a motorcycle rider to support avoiding collisions and accidents. Furthermore, other desirable features and characteristics of the present invention will become apparent from the subsequent detailed description and the appended claims, taken in conjunction with the accompanying drawings and the foregoing technical field and background.
  • SUMMARY
  • In an aspect, an obstacle detection and notification system for a motorcycle is provided. The system comprises: at least one of a forward looking camera and a backward looking camera mountable to the motorcycle; at least one processor in operable communication with the at least one of the forward looking camera and the backward looking camera, the at least one processor configured to execute program instructions, wherein the program instructions are configured to cause the at least one processor to execute processes including: receiving video from the at least one of the forward looking camera and the backward looking camera; performing a computer vision and machine learning based object detection and tracking process to detect, classify and track obstacles in the video and to output detected object data; and outputting audible, tactile or visual feedback, via an output system of the motorcycle, to a rider of the motorcycle based on the detected object data, wherein the audible, tactile or visual feedback is directional in that a direction of a threat is indicated.
  • In embodiments, visual feedback is output to the rider.
  • In embodiments, the system includes a rider lighting device in the form of a plurality of lights, wherein individual lights are lit to indicate directionality of the threat relative to the motorcycle.
  • In embodiments, the plurality of lights is arranged in a ring.
  • In embodiments, the plurality of lights emit different colors depending on immediacy of a threat.
  • In embodiments, the audible, tactile or visual feedback differentiates direction and severity of the threat.
  • In embodiments, the audible, tactile or visual feedback indicates type of threat including the motorcycle being in a blind spot of a vehicle, a vehicle being in a blind spot of the motorcycle, road objects and fast approaching vehicles.
  • In another aspect, a method of obstacle detection and notification for a motorcycle is provided. The method includes: receiving video from at least one of a forward looking camera and a backward looking camera; performing a computer vision and machine learning based object detection and tracking process to detect, classify and track obstacles in the video and to output detected object data; and outputting audible, tactile or visual feedback, via an output system of the motorcycle, to a rider of the motorcycle based on the detected object data, wherein the audible, tactile or visual feedback is directional in that a direction of a threat is indicated.
  • In embodiments, visual feedback is output to the rider.
  • In embodiments, the output system comprises a rider lighting device in the form of a plurality of lights, wherein individual lights are lit to indicate directionality of the threat relative to the motorcycle.
  • In embodiments, the plurality of lights is arranged in a ring.
  • In embodiments, the plurality of lights emit different colors depending on immediacy of a threat.
  • In embodiments, the audible, tactile or visual feedback differentiates direction and severity of the threat.
  • In embodiments, the audible, tactile or visual feedback indicates type of threat including the motorcycle being in a blind spot of a vehicle, a vehicle being in a blind spot of the motorcycle, road objects and fast approaching vehicles.
  • In a further aspect, a motorcycle is provided. The motorcycle includes: at least one of a forward looking camera and a backward looking camera; at least one processor in operable communication with the at least one of the forward looking camera and the backward looking camera, the at least one processor configured to execute program instructions, wherein the program instructions are configured to cause the at least one processor to execute processes including: receiving video from the at least one of the forward looking camera and the backward looking camera; performing a computer vision and machine learning based object detection and tracking process to detect, classify and track obstacles in the video and to output detected object data; and outputting audible, tactile or visual feedback, via an output system of the motorcycle, to a rider of the motorcycle based on the detected object data, wherein the audible, tactile or visual feedback is directional in that a direction of a threat is indicated.
  • In embodiments, visual feedback is output to the rider.
  • In embodiments, the motorcycle comprises a rider lighting device in the form of a plurality of lights, wherein individual lights are lit to indicate directionality of the threat relative to the motorcycle.
  • In embodiments, the plurality of lights is arranged in a ring.
  • In embodiments, the plurality of lights emit different colors depending on immediacy of a threat.
  • In embodiments, the audible, tactile or visual feedback differentiates direction and severity of the threat.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • The exemplary embodiments will hereinafter be described in conjunction with the following drawing figures, wherein like numerals denote like elements, and wherein:
  • FIG. 1 is a functional block diagram of a motorcycle that includes an obstacle detection and notification system, in accordance with an exemplary embodiment;
  • FIG. 2 is a functional block diagram of the obstacle detection and notification system of FIG. 1 , in accordance with an exemplary embodiment;
  • FIG. 3 is a rider lighting display device included in the obstacle detection and notification system of FIGS. 1 and 2 , in accordance with an exemplary embodiment;
  • FIG. 4 is a flowchart of a method for implementing obstacle detection and notification including reverse blind spot detection, which can be used in connection with the motorcycle of FIG. 1 and the obstacle detection and notification system of FIG. 2 , in accordance with an exemplary embodiment;
  • FIG. 5 is a flowchart of a method for implementing obstacle detection and notification including use of a collision risk heat map, which can be used in connection with the motorcycle of FIG. 1 and the obstacle detection and notification system of FIG. 2 , in accordance with an exemplary embodiment; and
  • FIG. 6 is a flowchart of a method for implementing obstacle detection and notification including performing animal detection using thermal imaging, which can be used in connection with the motorcycle of FIG. 1 and the obstacle detection and notification system of FIG. 2 , in accordance with an exemplary embodiment.
  • DETAILED DESCRIPTION
  • The following detailed description is merely exemplary in nature and is not intended to limit the application and uses. Furthermore, there is no intention to be bound by any expressed or implied theory presented in the preceding technical field, background, brief summary or the following detailed description. As used herein, the term module refers to an application specific integrated circuit (ASIC), an electronic circuit, a processor (shared, dedicated, or group) and memory that executes one or more software or firmware programs, a combinational logic circuit, and/or other suitable components that provide the described functionality.
  • FIG. 1 illustrates a motorcycle 100 according to an exemplary embodiment. As described in greater detail further below, the motorcycle 100 includes an obstacle detection and notification system 200 (ODNS) including dual camera machine visioning (MV) with artificial intelligent technology (AI) to detect and predict obstacles and provide feedback to a rider of the motorcycle 100.
  • As depicted in FIG. 1 , the motorcycle 100 includes, in addition to the above-referenced ODNS 200, a body 114 and two wheels 116. The rider wears a helmet 102, which may be communicatively coupled to the ODNS 200, as described further below. The body 114 includes an engine (not shown), a braking system (not shown) and handles (not shown) for steering a front wheel. In one embodiment, the engine comprises a combustion engine. In other embodiments, the engine is an electric motor/generator, instead of, or in addition to, the combustion engine. Still referring to FIG. 1 , the engine is coupled to at least one of the wheels 116 through one or more transmission systems. The braking system provides braking for the motorcycle 100. The braking system receives inputs from the rider via a brake pedal (not depicted) or a brake lever and provides appropriate braking via brake units (also not depicted). The rider also provides inputs via an accelerator handle (not depicted) as to a desired speed or acceleration of the motorcycle 100.
  • Referring back to the exemplary embodiment of FIG. 1 , the motorcycle 100 includes one or more cameras 210, 212 as part of a computer vision system. The one or more cameras 210, 212 can include a forward-looking camera 210 to capture an external scene ahead of the motorcycle 100 and a backward-looking camera 212 to capture an external scene behind the motorcycle 100. The forward-looking camera(s) 210 can be positioned above a motorcycle headlight, beneath the motorcycle headlight, within the motorcycle headlight (e.g. if it is integrated thereto during the manufacturing thereof), or in any other manner that provides the forward-looking camera(s) with a clear view to the area in front of the motorcycle 100. The backward-looking camera(s) 212 can be positioned above a motorcycle rear light, beneath the motorcycle rear light, within the motorcycle rear light (e.g. if it is integrated thereto during the manufacturing thereof), or in any other manner that provides the backward-looking camera(s) 212 with a clear view to the area in the back of the motorcycle 100. The cameras 210, 212 may be wide angled cameras capable of viewing any angle above 60°, 90°, or even in the range of 130° to 175° or more of a forward scene or backward scene. The cameras 210, 212 may be monocular cameras and may provide at least RGB (Red, Green, Blue) video (made up of frames of image data). In other embodiments, the cameras 210, 212 are stereoscopic cameras. In some embodiments herein, the cameras 210, 212 include thermal imaging (or infrared) capabilities. The forward-looking camera(s) 210 and the backward-looking camera(s) 212 can have a resolution of at least two Mega-Pixel (MP), and in some embodiments at least five MP. The forward-looking camera(s) 210 and the backward-looking camera(s) 212 can have a frame rate of at least twenty Frames-Per-Second (FPS), and in some embodiments at least thirty FPS. 
Additional cameras may be included, such as forward-looking and backward-looking narrow-angle cameras, which may have greater accuracy at longer ranges.
  • Although FIG. 1 shows the forward-looking camera(s) 210 and the backward-looking camera(s) 212, the motorcycle can include additional sensors including forward-looking and/or backward-looking radar device(s) 214 (as shown in FIG. 2 ), a plurality of laser range finders, or any other sensor that can support obstacle detection and prediction.
  • With additional reference to FIG. 2 , the ODNS 200 includes a controller 204, an output system 206, the forward and backward looking cameras 210, 212, the radar device 214, a cellular connectivity device 216, a GPS device 218, a local communications device 252 and an enhanced map database 254, in an exemplary embodiment. The ODNS 200 monitors a surrounding area of the motorcycle 100 for proximity events including at least one of: vehicles in the rider's blind spots, the rider possibly being in other vehicles' blind spots (reverse blind spot), common road objects (e.g. approaching an unexpected stopped vehicle, pedestrians, etc.), unusual road objects (e.g. desert animals laying on a warm road at nighttime), and fast approaching vehicles from behind. The ODNS 200 may predict potential collisions (e.g. a vehicle unexpectedly pulling out in front of the motorcycle 100) using georeferenced high-risk motorcycle collision locations obtained from previous collision data via a telematics feed. Various outputs may be provided to both the rider and external vehicles such as activating a rear brake light on the motorcycle when a vehicle is fast approaching the motorcycle 100, for example. The ODNS 200, in one example, provides these functions in accordance with the methods 400, 500 and 600 described further below in connection with FIGS. 4 to 6 . The ODNS 200 includes hardware to be installed onto the motorcycle and associated software embodied in the controller 204 that controls the functions described herein. The ODNS 200 may be installed on the motorcycle 100 by technicians as a retrofit or during manufacturing of the motorcycle 100. Some elements of the ODNS 200, such as the controller 204 (or part thereof), the rider lighting display device 220 and the video display device 228, may be included in a rider's mobile telecommunications device. In such embodiments, the rider lighting display device 220 would be graphically presented on a display of the mobile telecommunications device rather than through LEDs as in the integrated hardware system described further herein.
  • Continuing to refer to FIG. 2 , a functional block diagram is provided for the ODNS 200 of FIG. 1 , in accordance with an exemplary embodiment. The controller 204 is coupled to the cameras 210, 212, the radar device 214, the cellular connectivity device 216, the GPS device 218, the enhanced map database 254, the local communications device 252 and the output system 206. The controller 204 receives video data from the cameras 210, 212 and, based thereon, performs computer vision based and machine learning based object detection and tracking, reverse blind spot detection, collision risk prediction, obstacle proximity detection and other operations described further herein. The controller 204 provides rider feedback concerning detected obstacles or potential obstacles. The controller 204 may additionally provide feedback to other vehicles and potentially also to electronically controlled components of the motorcycle 100 so as to implement, for example, automatic braking, automatic throttle control, automatic gear shifting, etc. The controller 204 can be located under a seat of the motorcycle 100, but can alternatively be located in other places in a motorcycle such as behind a display panel between the handles of the motorcycle 100. The controller 204 can be connected to a battery of the motorcycle 100, or it can have its own power supply.
  • As depicted in FIG. 2 , the controller 204 comprises a computer system. In the depicted embodiment, the computer system of the controller 204 includes a processor 230, a memory 232, a storage device 236, and a bus 238. The processor 230 performs the computation and control functions of the controller 204, and may comprise any type of processor or multiple processors, single integrated circuits such as a microprocessor, or any suitable number of integrated circuit devices and/or circuit boards working in cooperation to accomplish the functions of a processing unit. During operation, the processor 230 executes one or more programs 240 contained within the memory 232 and, as such, controls the general operation of the controller 204 and the computer system of the controller 204, generally in executing the processes described herein, such as the methods 400, 500, 600 described further below in connection with FIGS. 4 to 6 . The one or more computer programs 240 include at least an object detection and tracking module 242, an obstacle proximity detection module 246, a reverse blind spot detection module 244 and a collision risk prediction module 248 for performing steps of the methods 400, 500, 600 described in detail below.
  • The processor 230 is capable of executing one or more programs (i.e., running software) to perform various tasks encoded in the program(s), particularly the object detection and tracking module 242, the obstacle proximity detection module 246, the reverse blind spot detection module 244 and the collision risk prediction module 248. The processor 230 may be a microprocessor, microcontroller, application specific integrated circuit (ASIC) or other suitable device as realized by those skilled in the art.
  • The memory 232 can be any type of suitable memory. This would include the various types of dynamic random access memory (DRAM) such as SDRAM, the various types of static RAM (SRAM), and the various types of non-volatile memory (PROM, EPROM, and flash). In certain examples, the memory 232 is located on and/or co-located on the same computer chip as the processor 230.
  • The bus(es) 238 serves to transmit programs, data, status and other information or signals between the various components of the computer system of the controller 204 and between the various hardware components including the output system 206, forward and backward looking cameras 210, 212, the cellular connectivity device 216, the GPS device 218, the local communications device 252 and the enhanced map database 254. The bus(es) 238 can be any suitable physical or logical means of connecting computer systems and components. This includes, but is not limited to, direct hard-wired connections, fiber optics, infrared and wireless bus technologies.
  • The storage device 236 can be any suitable type of storage apparatus, including direct access storage devices such as hard disk drives, flash systems, floppy disk drives and optical disk drives. In one exemplary embodiment, the storage device 236 comprises a program product from which memory 232 can receive a program 240 (including computer modules 242, 244, 246, 248) that executes one or more embodiments of one or more processes of the present disclosure, such as the steps of the methods 400, 500 and 600 (and any sub-processes thereof). In another exemplary embodiment, the program product may be directly stored in and/or otherwise accessed by the memory 232 and/or a disk (e.g., disk), such as that referenced below. The enhanced map database 254 may be stored on the memory 232.
  • It will be appreciated that while this exemplary embodiment is described in the context of a fully functioning computer system, those skilled in the art will recognize that the mechanisms of the present disclosure are capable of being distributed as a program product with one or more types of non-transitory computer-readable signal bearing media used to store the program and the instructions thereof and carry out the distribution thereof, such as a non-transitory computer readable medium bearing the program and containing computer instructions stored therein for causing a computer processor (such as the processor 230) to perform and execute the program. Such a program product may take a variety of forms, and the present disclosure applies equally regardless of the particular type of computer-readable signal bearing media used to carry out the distribution. Examples of signal bearing media include: recordable media such as floppy disks, hard drives, memory cards and optical disks, and transmission media such as digital and analog communication links. It will similarly be appreciated that the computer system of the controller 204 may also otherwise differ from the embodiment depicted in FIG. 2 , for example in that the computer system of the controller 204 may be coupled to or may otherwise utilize one or more remote computer systems and/or other control systems.
  • In the exemplary embodiment of FIG. 2 , the output system 206 includes at least one of: a rider lighting display device 220, an external lighting system 222, a rider speaker 224, a tactile feedback device 226, a video display device 228 and an external speaker 256. The output system 206 is responsive to output data from the controller 204 to provide visual, audible or tactile feedback to a rider of the motorcycle 100 or to a driver of an external vehicle. Upon identification of a threat to the motorcycle 100 by the controller 204, the controller 204 commands an alert to a rider of the motorcycle 100 in order to enable the rider to perform measures to eliminate or reduce any risk. The alerts can be provided in any manner that can be sensed by a rider of the motorcycle 100. In some cases, the alert can be visual provided via the rider lighting display device 220, tactile via the tactile feedback device 226 and/or audible via the rider speaker 224. Some parts of the output system 206 may be included in the helmet 102 such as the rider speaker 224, the tactile feedback device 226 and/or the rider lighting display device 220. The local communications device 252 allows the controller 204 to send data to the helmet 102 through any suitable local communications protocol such as Bluetooth or WiFi. In one embodiment, the local communications device 252 is facilitated through a local communications capability of a rider's mobile telecommunications device.
  • The rider lighting display device 220 of one example embodiment is shown in FIG. 3 . The rider lighting display device 220 includes a plurality of light emitters 302 that can be activated to indicate presence and directionality of an obstacle or a potential obstacle. The rider lighting display device 220 includes a plurality of light emitters 302 arranged in a ring around a motorcycle orientation reference graphic 304. In the exemplary embodiment, there are eight light emitters 302 evenly distributed in the ring shape to provide eight degrees of directionality to the alert. However, more or fewer light emitters 302 could be provided. For example, four light emitters 302 could be included to provide four degrees of alert directionality including forward, backward, left and right. The light emitters 302 may be LEDs in one embodiment. The controller 204 may output data indicating directionality and severity of a collision threat and command the rider lighting display device 220 accordingly. In such an embodiment, the light emitters 302 can be controlled to emit different colors depending on the threat level such as red for a severe alert, orange for a high alert, yellow for an elevated alert and green for a low alert, or some subset of two or three of these alert levels. Other arrangements of light emitters 302 that allow directionality and threat level severity are possible such as light strips arranged on mirrors of the motorcycle 100. The rider lighting display device 220 may also be controlled to differentiate a type of threat such as having different flashing frequencies for different threat types. The rider lighting display device 220 could be located in a display panel between the handles of the motorcycle 100 or could be projected onto or displayed by a shield of the helmet 102.
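The ring display logic described above can be sketched as a simple mapping from a threat's bearing and severity to an emitter index and color. This is an illustrative sketch only; the names (`NUM_LEDS`, `LEVEL_COLORS`, the 0° = dead-ahead convention) are assumptions, not taken from the patent.

```python
# Hypothetical sketch: eight emitters at 45° spacing around the motorcycle
# orientation reference graphic, colored by threat level.
NUM_LEDS = 8
LEVEL_COLORS = {"low": "green", "elevated": "yellow", "high": "orange", "severe": "red"}

def led_for_bearing(bearing_deg):
    """Map a threat bearing (0° = dead ahead, increasing clockwise)
    to the index of the nearest emitter in the ring."""
    return round((bearing_deg % 360) / (360 / NUM_LEDS)) % NUM_LEDS

def ring_command(bearing_deg, level):
    """Return the (led_index, color) pair to light for one threat."""
    return led_for_bearing(bearing_deg), LEVEL_COLORS[level]
```

With four emitters instead of eight, the same mapping yields only forward, backward, left and right directionality, as in the four-emitter variant described above.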
  • The external lighting system 222 includes existing or additional lights of the motorcycle 100 to indicate to drivers of external vehicles that they are a collision threat to the motorcycle when such a determination is made by the controller 204. For example, the external lighting system 222 provides notifications to drivers of other vehicles through a rear brake light (such as an LED) on the motorcycle 100 when, for example, the controller 204 determines that a vehicle is fast approaching the motorcycle from behind. In this instance, the motorcycle rider may get a notification via the rider lighting display device 220 and the driver of the vehicle receives a notification via the rear brake light or other rear light of the external lighting system 222. Front and/or side external lights could also be included as additional lights or as part of the existing lights of the motorcycle 100 to alert external drivers in front of the motorcycle 100 and to the sides of the motorcycle 100. The external speaker 256, which may be an integrated horn of the motorcycle 100 or an additional device, may additionally or alternatively provide a warning to external vehicles or humans of a collision threat with the motorcycle when such a collision threat has been determined by the controller 204. More than one external speaker 256 could be arranged around the motorcycle 100 to allow for directionality in the audible warning such as front, rear, left side and right side external speakers 256.
  • In embodiments, the warning notification is a vibration provided to the rider of the motorcycle 100 via one or more vibrating or other tactile elements included in the tactile feedback device 226. In some cases, the vibration can be adjusted in accordance with the threat severity, so that the higher the risk, the stronger the vibration. The vibrating elements may additionally or alternatively be integrated into a jacket worn by the rider of the motorcycle 100, into the seat of the motorcycle 100, or into a helmet 102 worn by the rider of the motorcycle 100.
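The severity-to-vibration scaling described above could, for instance, be a monotonic mapping from time to collision to haptic intensity. The linear form and the threshold values below are illustrative assumptions, not values from the patent.

```python
def vibration_intensity(time_to_collision_s, min_ttc=1.0, max_ttc=6.0):
    """Scale haptic intensity into [0, 1]: the shorter the time to
    collision, the stronger the vibration (hypothetical linear mapping).
    Below min_ttc the intensity saturates at full strength; beyond
    max_ttc no vibration is commanded."""
    if time_to_collision_s <= min_ttc:
        return 1.0
    if time_to_collision_s >= max_ttc:
        return 0.0
    return (max_ttc - time_to_collision_s) / (max_ttc - min_ttc)
```

The same mapping could drive the rider speaker 224 volume, since the patent describes the audible alert volume scaling with risk severity in the same way.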
  • In embodiments, the alert is provided through the rider speaker 224. The alert is provided as a sound notification to the rider of the motorcycle 100 via the one or more rider speakers 224. The rider speakers 224 can be integrated into the helmet 102 of the rider, or any other speakers that generate sounds that can be heard by the rider of the motorcycle 100. In some cases, the sound notification can be a natural language voice notification, providing information of the specific threat type and/or severity identified and/or the direction of the threat. In some cases, the volume can be adjusted in accordance with the risk severity, so that the higher the risk, the higher the volume.
  • In embodiments, the video display device 228 provides a live video feed from the forward looking camera 210 and/or the backward looking camera 212. The live video feed may provide a focused area of the total video data based on a direction of the threat. The live video feed may be supplemented with graphical indications of any collision threats distinguishing different types of threats, different threat levels and the direction of the threat as determined by the controller 204. In some embodiments, video from the forward and backward looking cameras 210, 212 is recorded in the storage device 236.
  • The various possible output options of the output system 206 described above may be provided alone or in any combination. Having described the output system 206 and some example audible, visual or tactile feedback mechanisms to threat severity and directions determined by the controller 204, a more detailed description of the software operations of the controller will be provided.
  • Continuing to refer to the exemplary embodiment of FIG. 2 , the object detection and tracking module 242 can be implemented through a number of object detection and tracking algorithms. In one embodiment, the object detection and tracking module 242 receives video data from the forward and backward looking cameras 210, 212 and runs the video data, or a derivative thereof, through a machine learning algorithm to classify and localize obstacles that the machine learning algorithm is trained to detect. The machine learning algorithm may include a Convolutional Neural Network (CNN) or other neural network. One example suitable machine learning algorithm is You Only Look Once (YOLO). The object detection part of the object detection and tracking module 242 provides detected object data including bounding box size, location and classification information. Various obstacle classifications are possible including vehicle, pedestrian, cyclist, non-human animal, etc. The object tracking part of the object detection and tracking module 242 tracks a detected object over time (plural frames of video data) in order to derive velocity and acceleration information for tracked objects and to predict the obstacle's path. In one embodiment, an extended Kalman filter using a motion model for the tracked object can be included in the object tracking part. The object detection and tracking module 242 uses intrinsic and extrinsic camera parameters and possibly also motion parameters from an Inertial Measurement Unit (not shown) or other motion sensors of the motorcycle 100 to provide detected object data in real world coordinates in a coordinate frame relative to the motorcycle 100. The object detection and tracking module 242 is thus able to output location, velocity, acceleration, path projection and classification data for each detected object in forward and backward looking scenes. 
This data is included in detected object data provided to the obstacle proximity detection module 246 and the reverse blind spot detection module 244. The object detection and tracking module 242 has been described at a relatively high level for the purposes of conciseness. It should be appreciated that a number of object detection and tracking applications are available in the literature that receive video data and use computer vision and machine learning processing to classify and track detected objects.
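The tracking step described above can be sketched as a per-object state update that derives velocity from successive positions and projects a future path. A production system would use a trained detector (e.g. YOLO) and an extended Kalman filter with a motion model, as the passage notes; this constant-velocity `Track` class is an illustrative simplification with hypothetical names.

```python
class Track:
    """Minimal constant-velocity track for one detected object,
    in real-world coordinates relative to the motorcycle."""

    def __init__(self, obj_id, x, y, t):
        self.obj_id = obj_id
        self.x, self.y, self.t = x, y, t
        self.vx = self.vy = 0.0

    def update(self, x, y, t):
        """Refresh position from a new frame and re-estimate velocity
        from the displacement over the elapsed time."""
        dt = t - self.t
        if dt > 0:
            self.vx = (x - self.x) / dt
            self.vy = (y - self.y) / dt
        self.x, self.y, self.t = x, y, t

    def predict(self, horizon_s):
        """Project the tracked object's position horizon_s seconds ahead."""
        return self.x + self.vx * horizon_s, self.y + self.vy * horizon_s
```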
  • The obstacle proximity detection module 246 receives the detected object data and data on the motion of the motorcycle 100 from the IMU or from other motion sensors such as a wheel speed sensor. In this way, the obstacle proximity detection module 246 is able to project the path of the motorcycle 100 and the projected paths of moving obstacles to determine whether there is any collision threat or any potential spatial overlap with detected stationary obstacles. A collision threat may be determined by a projected collision occurring in less than a first predetermined time threshold. In some embodiments, a plurality of different time thresholds may be used so as to define different threat levels. Furthermore, the mutual motion projections between the motorcycle 100 and the various moving obstacles can be compared to determine a directionality of the threat by determining a direction relative to the motorcycle 100 that an obstacle is travelling. For stationary obstacles, the directionality can be determined based on a relative location between the motorcycle 100 and the detected location of the stationary object. The obstacle proximity detection module 246 can output collision threat data that is indicative of collision threat severity level and directionality, which can be included in output data for the output system 206 to activate various output devices as described above. The obstacle proximity detection module 246 may additionally distinguish detected threat types in the output data.
  • The obstacle proximity detection module 246 detects a plurality of kinds of proximity events including vehicles in the rider's blind spots, the rider possibly being in other vehicles' blind spots (reverse blind spot) as described further below, common road objects (e.g. approaching an unexpected stopped vehicle, pedestrians, etc.), unusual road objects (e.g. desert animals laying on a warm road at nighttime) as described further herein, fast approaching vehicles from behind, potential collisions (e.g. a vehicle may unexpectedly pull out in front of the rider but has not yet) as described further herein, etc.
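The time-threshold scheme described above can be sketched as a time-to-collision computation bucketed into threat levels. The closing-time formula is the standard range-over-range-rate estimate; the threshold values and level names are illustrative assumptions, not values from the patent.

```python
def time_to_collision(rel_pos_m, rel_vel_mps):
    """Estimate closing time between the motorcycle and an obstacle from
    the obstacle's relative position and velocity; None if not closing."""
    px, py = rel_pos_m
    vx, vy = rel_vel_mps
    closing = -(px * vx + py * vy)  # positive when the range is shrinking
    if closing <= 0:
        return None
    return (px * px + py * py) / closing

def threat_level(ttc_s, thresholds=(1.5, 3.0, 6.0)):
    """Bucket a time to collision into severe/high/elevated/low using
    example thresholds (seconds)."""
    if ttc_s is None or ttc_s > thresholds[2]:
        return "low"
    if ttc_s <= thresholds[0]:
        return "severe"
    if ttc_s <= thresholds[1]:
        return "high"
    return "elevated"
```

For an obstacle 10 m ahead closing at 5 m/s, this yields a 2 s closing time and therefore a "high" alert under these example thresholds.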
  • In some embodiments, the object detection and tracking module 242 is trained to detect non-human animals based on thermal imaging received from the forward-looking camera 210, the backward looking camera 212 or other forward or backward looking camera particularly suited to thermal imaging. Such an embodiment is designed to detect non-human animals in low visibility conditions such as fog and nighttime. The obstacle proximity detection module 246 receives the detected non-human animal data and responsively outputs a collision threat based on time to potential collision with the non-human animal, which will determine a threat level, and a direction of the threat. The output system 206 responsively outputs an indication of threat level and directionality and optionally also type of threat (e.g. via a specific color or sequence of light emitters 302 or a particular sound or annunciation from the rider speaker 224). In one embodiment, the non-human animal is a snake. Snakes and other desert animals can present a particular danger to motorcycles because they often lay on warm roads at nighttime.
  • In embodiments, the reverse blind spot detection module 244 receives the detected object data from the object detection and tracking module 242. The reverse blind spot detection module 244 determines one or more blind spot regions for one or more detected vehicles. A blind spot in a vehicle is an area around the vehicle that cannot be directly observed by the driver while at the controls. A blind spot may occur behind the side window at a location that is also not visible in the side view mirrors. Motorcycles are narrower than cars and are more liable to being wholly located within a vehicle's blind spot. The reverse blind spot detection module 244 may retrieve a predetermined blind spot region and connect it to a detected vehicle at a location where a blind spot would be. The predetermined blind spot region may be in image coordinates and scaled and rotated in image space according to a distance and orientation between the vehicle and the motorcycle 100, which can be performed based on camera intrinsic and extrinsic parameters. Alternatively, the predetermined blind spot region is provided in real world coordinates and the detected vehicles are transformed into real world space so that the predetermined blind spot regions can be connected thereto. In some embodiments, the object detection and tracking module 242 is trained to detect vehicles and additionally side view mirrors on the vehicles, which can additionally support accurately locating the attached blind spot region based on a bounding box around the side view mirrors. Alternatively, an average position relative to a bounding box around the vehicle could be used to connect the blind spot region. In some embodiments, the predetermined blind spot region is enlarged based on a relative speed between the other vehicle and the motorcycle 100. The relative speed is known from the detected object data output from the object detection and tracking module 242 as described above. 
In this way, the faster the closing speed, the more readily a reverse blind spot detection is made, reflecting that the driver of the other vehicle would have less time to check the side mirrors and spot the motorcycle 100.
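The blind spot region construction described above (a predetermined region attached to a detected vehicle and enlarged with closing speed) might look like the following sketch in real world, vehicle-relative coordinates. The rectangle geometry, scaling law and constants are assumptions for illustration.

```python
# Illustrative sketch: attach a predetermined blind-spot rectangle to a
# detected vehicle and lengthen it with the relative (closing) speed, as the
# text describes. The box shape and the growth factor are invented.

def blind_spot_region(vehicle_xy, relative_speed_mps,
                      base_length_m=6.0, base_width_m=3.5, growth_s=0.5):
    """Return an axis-aligned (xmin, xmax, ymin, ymax) blind-spot box for a
    vehicle at vehicle_xy = (x, y), trailing the vehicle (negative x) and
    offset to its left (+y), lengthened by growth_s seconds of travel at
    the relative speed."""
    x, y = vehicle_xy
    length = base_length_m + growth_s * max(relative_speed_mps, 0.0)
    return (x - length, x, y + 1.0, y + 1.0 + base_width_m)

def in_region(point_xy, region):
    """True if point_xy = (x, y) lies inside the axis-aligned region."""
    xmin, xmax, ymin, ymax = region
    px, py = point_xy
    return xmin <= px <= xmax and ymin <= py <= ymax

region = blind_spot_region((0.0, 0.0), relative_speed_mps=10.0)
print(in_region((-4.0, 2.0), region))  # → True
```

A motorcycle 4 m behind and slightly left of the vehicle falls inside the enlarged box, so a reverse blind spot threat would be indicated.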
  • The reverse blind spot detection module 244 may compare a location of the motorcycle 100 with the blind spot region(s) to determine whether the motorcycle is located within the blind spot region of any other vehicle. The location of the motorcycle 100 can be obtained from the GPS device 218 or from localization based on computer vision processing of the video data from the forward and backward looking cameras 210, 212. In another embodiment, the reverse blind spot detection module 244 can compare a projected path trajectory of the motorcycle 100 with the blind spot region(s); the trajectory can be determined based on a motion model for the motorcycle 100 together with location, acceleration and velocity data for the motorcycle 100. The location, acceleration and velocity data for the motorcycle is obtained from the GPS device 218 and/or motion sensors of the motorcycle 100. Based on whether the location of the motorcycle 100 is within a blind spot region or is projected to be within the blind spot region within a predetermined amount of time, the reverse blind spot detection module 244 provides output data indicating a reverse blind spot threat and a directionality thereof. The reverse blind spot detection module 244 may additionally output a threat level based on the proximity of the vehicle and the motorcycle in its blind spot region or based on their relative speeds. The output system 206 outputs visual, tactile or audible feedback, which identifies direction, severity and optionally also type of threat as described above.
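The trajectory-based check described above — projecting the motorcycle forward under a motion model and flagging a threat if it would enter a blind spot region within a predetermined amount of time — can be sketched as follows, assuming a constant-velocity motion model and an axis-aligned region; all names and numbers are illustrative.

```python
# Illustrative sketch: project the motorcycle's position forward under a
# constant-velocity motion model and report the first time at which the
# projected path enters a blind-spot box, if within the lookahead horizon.

def projected_blind_spot_threat(pos, vel, region, horizon_s=3.0, dt=0.1):
    """pos, vel: (x, y) in metres and m/s; region: (xmin, xmax, ymin, ymax).
    Returns the first time the projected path enters the region, or None
    if it stays clear over the horizon."""
    xmin, xmax, ymin, ymax = region
    steps = round(horizon_s / dt)
    for i in range(steps + 1):
        t = i * dt
        x = pos[0] + vel[0] * t
        y = pos[1] + vel[1] * t
        if xmin <= x <= xmax and ymin <= y <= ymax:
            return round(t, 1)
    return None

# Motorcycle 5 m behind the box, closing at 5 m/s, laterally aligned with it.
print(projected_blind_spot_threat((-15.0, 2.0), (5.0, 0.0),
                                  (-10.0, 0.0, 1.0, 4.5)))  # → 1.0
```

The returned entry time could feed the threat level (sooner entry, higher severity), while the region's bearing supplies the directionality of the feedback.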
  • In embodiments, the collision risk prediction module 248 predicts locations and times when there is an added risk of collision with an obstacle using crowd data regarding accidents or near accidents involving motorcycles, and possibly other vehicles, included in a collision risk map layer 270 of the enhanced map database 254. The collision risk map layer 270 is a map layer that is regularly updated with accident or near accident information through the cellular connectivity device 216. The cellular connectivity device 216 can be a 4G or 5G data communications device, for example. In some embodiments, the cellular connectivity device 216 is facilitated through a rider's mobile telecommunications device. The collision risk map layer 270 provides georeferenced and time referenced crowd data on where collisions or near collisions have occurred. In one embodiment, the collision risk map layer 270 distinguishes between motorcycle accidents or near accidents and those of other vehicles, since the threats to a motorcycle can be different from those to other types of vehicle, and the collision risk prediction module 248 operates on the motorcycle specific data. In one example, the collision risk map layer 270 may reflect a greater probability of collisions at a bar driveway location and time of day (e.g. after happy hour). The collision risk prediction module 248 filters the collision risk map layer 270 using the current time and current location of the motorcycle 100, which is known from the location data provided by the GPS device 218, so as to determine upcoming relatively high risk locations (e.g. locations where the collision risk is predicted to be greater than a threshold at the current time). The collision risk prediction module 248 may provide output data to the output system 206 indicating the directionality, a threat level and optionally a type of threat.
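The filtering described above — extracting map-layer entries near the motorcycle's current location whose risky hours include the current time and whose risk exceeds a threshold — might be sketched as follows. The record layout, radius and threshold are assumptions for illustration.

```python
# Illustrative sketch: filter a crowd-sourced collision-risk map layer by
# current position (great-circle distance) and current hour, keeping only
# entries above a risk threshold.
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two lat/lon points."""
    r = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def upcoming_risks(layer, lat, lon, hour, radius_m=500.0, threshold=0.5):
    """layer: list of dicts with lat, lon, hours (set of risky hours), risk.
    Returns the entries relevant to the given position and hour."""
    return [e for e in layer
            if e["risk"] >= threshold
            and hour in e["hours"]
            and haversine_m(lat, lon, e["lat"], e["lon"]) <= radius_m]

layer = [
    {"lat": 42.33, "lon": -83.05, "hours": {17, 18, 19}, "risk": 0.8},  # bar driveway
    {"lat": 42.40, "lon": -83.10, "hours": {2, 3}, "risk": 0.9},
]
print(len(upcoming_risks(layer, 42.3301, -83.0501, hour=18)))  # → 1
```

At 18:00 near the first entry, only the bar-driveway risk survives the filter; at 04:00 at the same spot, nothing does.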
  • The collision risk prediction module 248 may provide the output data to the obstacle proximity detection module 246 and/or the object detection and tracking module 242. The obstacle proximity detection module 246 and/or the object detection and tracking module 242 is responsive to the collision risk indication and location in the output data to increase a frequency of, or activate, the object detection and obstacle proximity detection processes and/or to spatially focus the object detection and obstacle proximity processes based on the location of the collision risk. In an additional or alternative embodiment, the obstacle proximity detection module 246 can increase in sensitivity in response to the collision risk data from the collision risk prediction module 248 so as to indicate a higher threat level or to lower proximity thresholds so as to more readily provide output data to the output system 206 describing a collision threat. As such, the ODNS is placed on high alert and is specifically focused when the collision risk map layer 270 predicts an upcoming (e.g. within a set range of the forward looking and backward looking cameras 210, 212) collision risk.
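One hedged way to picture the high-alert adaptation described above is as a configuration switch: raise the processing rate and relax the proximity criterion so collision threats are reported more readily. The field names and numbers are invented for illustration, and whether relaxing means raising or lowering a threshold depends on how that threshold is defined; a wider distance threshold is used here so obstacles are flagged earlier.

```python
# Illustrative sketch: detection settings under normal and high-alert modes.
DEFAULT = {"detection_hz": 10, "proximity_threshold_m": 15.0}

def detector_config(risk_flagged: bool) -> dict:
    """Return detection settings; while a mapped collision risk is upcoming,
    triple the frame-processing rate and double the distance threshold so
    proximal threats are reported more readily."""
    cfg = dict(DEFAULT)
    if risk_flagged:
        cfg["detection_hz"] *= 3
        cfg["proximity_threshold_m"] *= 2
    return cfg

print(detector_config(True))  # → {'detection_hz': 30, 'proximity_threshold_m': 30.0}
```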
  • In embodiments, the controller 204 determines accident conditions based on high acceleration information being obtained from the IMU (not shown) or the GPS device 218. The controller 204 can ascertain from the acceleration information whether the motorcycle has been subjected to an impact or an emergency stop. The controller 204 reports such accident conditions along with location, time and optionally date information, which can be obtained from the GPS device 218, to a remote map server (not shown) through the cellular connectivity device 216. A crowd of motorcycles will similarly report accident conditions, allowing the remote map server to create a continually updating collision risk map layer 270 that is periodically pushed to the motorcycle 100 via the cellular connectivity device 216.
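The accident-condition check described above — ascertaining from acceleration information whether the motorcycle has been subjected to an impact or an emergency stop — might be sketched as a simple thresholding of IMU readings. The thresholds (in g) are assumptions, not values from the patent.

```python
# Illustrative sketch: classify an accident condition from IMU acceleration,
# before reporting it with time and location to the remote map server.
import math

def accident_condition(accel_g):
    """accel_g: (x, y, z) acceleration in g, gravity removed, x longitudinal.
    Returns 'impact', 'emergency_stop', or None."""
    magnitude = math.sqrt(sum(a * a for a in accel_g))
    longitudinal = abs(accel_g[0])
    if magnitude > 4.0:          # violent deceleration in any direction
        return "impact"
    if longitudinal > 0.9:       # hard braking along the direction of travel
        return "emergency_stop"
    return None

print(accident_condition((-1.1, 0.0, 0.2)))  # → emergency_stop
```

A report would then bundle this classification with GPS time and location, letting the server aggregate the crowd's reports into the collision risk map layer 270.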
  • In one embodiment, the location and time of non-human animals (e.g. snakes on a road) detected by thermal imaging are reported to the remote map server as an accident condition for motorcycles. The collision risk map layer 270 can integrate crowd sourced times and locations for non-human animals being on the road. This information could alternatively be included in a different map layer. The collision risk prediction module 248 can factor in high risk of non-human animals being on the road at certain locations and at certain times of the day and provide output data representing the collision risk to the output system 206 and the object detection and tracking module 242 and the obstacle proximity detection module 246. In this way, object detection and tracking processing can be activated or increased in frequency when there is a collision risk above a predetermined threshold according to the non-human animal data in the collision risk map layer. The output system 206 can provide an audible, tactile or visual alert concerning the collision risk, which can identify directionality and optionally also type of non-human animal (e.g. a snake graphic). The obstacle proximity detection module 246 may increase in sensitivity (as described above) when there is a high risk of collision with a non-human animal on the road.
  • In embodiments, the ODNS 200 is in operable communication with an application on a rider's mobile telecommunications device. The rider's mobile telecommunications device, with the application installed, may be used for configuration of the various motorcycle module settings (e.g. warning thresholds, preferred alert methods, etc.). The application could also be used for pulling stored video (action video, etc.) from the cameras 210, 212 or recorded video from the storage device 236.
  • FIG. 4 is a flowchart of a method 400 for reverse blind spot detection, in accordance with one embodiment. The method 400 can be implemented in connection with the motorcycle 100 of FIG. 1 and the ODNS 200 of FIG. 2 , in accordance with an exemplary embodiment. The method 400 may be implemented continuously during motorcycle operation or be invoked or increased in regularity based on collision risk data from the collision risk prediction module 248.
  • As depicted in FIG. 4 , the method 400 includes the step 410 of receiving video data from the forward looking camera 210 and/or the backward looking camera 212. Object detection and tracking is performed on the video data to classify and localize other vehicles in step 420. In step 430, a blind spot region is defined around each detected vehicle, or around each detected vehicle within a certain range of the motorcycle 100. The blind spot regions may be changed in size based on a relative speed between the motorcycle 100 and the other vehicle, being enlarged the greater the relative speed. In step 440, a determination is made whether the motorcycle 100 is located in a blind spot region or is projected to be located in a blind spot region within a predetermined amount of time. When a determination has been made of a reverse blind spot threat, audible, visual or tactile feedback is provided to the rider of the motorcycle 100 warning of the reverse blind spot threat, the direction of the threat and optionally also identifying the type of threat.
  • FIG. 5 is a flowchart of a method 500 for collision risk determination, in accordance with one embodiment. The method 500 can be implemented in connection with the motorcycle 100 of FIG. 1 and the ODNS 200 of FIG. 2 , in accordance with an exemplary embodiment. The method 500 may be implemented continuously during motorcycle operation.
  • In step 510, a collision risk is determined from collision risk data in a collision risk map layer 270 of the enhanced map database 254. The collision risk data is time referenced and georeferenced accident or near accident data reported from a crowd of motorcycles. Based on the current time and current location of the motorcycle 100, any collision risk information is extracted from the collision risk map layer 270 and assessed for relevance based on the collision risk being above a predetermined threshold. An indication of the upcoming collision risk and the direction of the collision risk is output for further processing in step 560.
  • In step 520, video data is received from the forward looking camera 210 and/or the backward looking camera 212. In step 530, object detection and tracking is performed based on the video data to classify and localize objects in the captured scene. In step 540, obstacle proximity detection is performed using the detected object data from step 530, including detecting vehicles in the rider's blind spots, detecting common road objects (e.g. approaching an unexpected stopped vehicle, pedestrians, etc.), and detecting unusual road objects (e.g. desert animals lying on a warm road at night). The obstacle proximity detection step 540 further determines directionality and immediacy of any proximity threat and outputs corresponding collision threat data. In step 550, audible, visual and/or tactile feedback is provided to the rider through the output system 206 based on the collision threat data to indicate to the rider the existence of the collision threat, the direction of the collision threat and optionally also the type of collision threat.
  • In step 560, the object detection and tracking step 530 is adapted and/or the obstacle proximity detection step 540 is adapted and/or the rider feedback step 550 is adapted when collision risk data from step 510 is determined based on the collision risk map layer 270. That is, steps 530 and 540 may be adapted so as to become active from a dormant state, or steps 530 and 540 may be increased in frequency. Additionally, or alternatively, steps 530 and 540 may be adapted to spatially focus on a region of the video data in which a high collision risk has been determined. Additionally, or alternatively, the sensitivity of step 540 may be heightened by changing proximity thresholds used in detecting collision threats so as to more readily output an indication of a proximal collision threat. Additionally, the output system 206 may issue an audible, tactile and/or visual alert regarding a potential collision risk identified in step 510, including the direction and optionally also the type of collision risk (e.g. increased traffic from a bar at this time causing accident conditions, snakes on the road at this time, etc.).
  • FIG. 6 is a flowchart of a method 600 for detecting non-human animals on a road, in accordance with one embodiment. The method 600 can be implemented in connection with the motorcycle 100 of FIG. 1 and the ODNS 200 of FIG. 2 , in accordance with an exemplary embodiment. The method 600 may be implemented continuously during motorcycle operation or may become active or be increased in frequency based on the collision risk data from the collision risk prediction module 248 indicating potential for non-human animals on the road in the vicinity of the motorcycle 100.
  • In step 610, video data is received from a forward looking thermal camera and/or a backward looking thermal camera. In step 620, object detection and tracking are performed to localize and classify objects including non-human animals, which will show up vividly in the thermal imaging. The object detection and tracking processes may be trained to label non-human animals as a category or to specify one or more types of non-human animal such as snakes. In step 630, obstacle proximity detection is performed to detect proximal collision threats including collision threats with detected non-human animals. In step 640, the detected non-human animals are reported to a remote map server through the cellular connectivity device 216. In step 650, a map update including crowd detected non-human animals is received. The map update includes a time and location for each detection. The map update is included in the enhanced map database 254, possibly as part of the collision risk map layer 270. In this way, the method 500 described above with respect to FIG. 5 takes into account georeferenced and time referenced non-human animals on the road when determining the collision risk data. In step 660, audible, tactile and/or visual rider feedback is provided to the rider when step 630 determines a proximal collision threat because of a detected non-human animal on the road. The rider feedback can indicate direction, severity (e.g. immediacy) and optionally the type of threat (e.g. distinguishing a non-human animal from other types of threat or specifying the detected type of non-human animal such as a snake).
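As a toy illustration of why non-human animals show up vividly in the thermal imaging, warm pixels can be separated from a cool road by thresholding and connected-component grouping. A real system would use a trained detector, as the text describes; this sketch only shows the thresholding idea, with invented temperatures.

```python
# Illustrative sketch: find warm objects (e.g. an animal on a cool road) in a
# thermal frame by thresholding and 4-connected component labelling.

def hot_regions(frame, threshold):
    """frame: 2D list of temperatures. Returns a list of pixel counts, one
    per connected region hotter than threshold."""
    rows, cols = len(frame), len(frame[0])
    seen = [[False] * cols for _ in range(rows)]
    sizes = []
    for r in range(rows):
        for c in range(cols):
            if frame[r][c] > threshold and not seen[r][c]:
                stack, count = [(r, c)], 0      # flood-fill one warm blob
                seen[r][c] = True
                while stack:
                    y, x = stack.pop()
                    count += 1
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and frame[ny][nx] > threshold
                                and not seen[ny][nx]):
                            seen[ny][nx] = True
                            stack.append((ny, nx))
                sizes.append(count)
    return sizes

road = [[18, 18, 18, 18],
        [18, 30, 31, 18],   # one warm blob against an 18-degree road
        [18, 30, 18, 18],
        [18, 18, 18, 18]]
print(hot_regions(road, threshold=25))  # → [3]
```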
  • It will be appreciated that the disclosed methods, systems, and motorcycles may vary from those depicted in the Figures and described herein. For example, the motorcycle 100 and the ODNS 200 and/or various components thereof may vary from that depicted in FIGS. 1 and 2 and described in connection therewith. In addition, it will be appreciated that certain steps of the methods 400, 500 and 600 may vary from those depicted in FIGS. 4, 5 and 6 and/or described above in connection therewith. It will similarly be appreciated that certain steps of the methods described above may occur simultaneously or in a different order than that depicted in FIGS. 4, 5 and 6 .
  • While at least one exemplary embodiment has been presented in the foregoing detailed description, it should be appreciated that a vast number of variations exist. It should also be appreciated that the exemplary embodiment or exemplary embodiments are only examples, and are not intended to limit the scope, applicability, or configuration of the disclosure in any way. Rather, the foregoing detailed description will provide those skilled in the art with a convenient road map for implementing the exemplary embodiment or exemplary embodiments. It should be understood that various changes can be made in the function and arrangement of elements without departing from the scope of the disclosure as set forth in the appended claims and the legal equivalents thereof.

Claims (20)

What is claimed is:
1. An obstacle detection and notification system for a motorcycle, the system comprising:
at least one of a forward looking camera and a backward looking camera mountable to the motorcycle;
at least one processor in operable communication with the at least one of the forward looking camera and the backward looking camera, the at least one processor configured to execute program instructions, wherein the program instructions are configured to cause the at least one processor to execute processes including:
receiving video from the at least one of the forward looking camera and the backward looking camera;
performing a computer vision and machine learning based object detection and tracking process to detect, classify and track obstacles in the video and to output detected object data; and
outputting audible, tactile or visual feedback, via an output system of the motorcycle, to a rider of the motorcycle based on the detected object data, wherein the audible, tactile or visual feedback is directional in that a direction of a threat is indicated.
2. The obstacle detection and notification system of claim 1, wherein visual feedback is output to the rider.
3. The obstacle detection and notification system of claim 1, comprising a rider lighting device in the form of a plurality of lights, wherein individual lights are lit to indicate directionality of the threat relative to the motorcycle.
4. The obstacle detection and notification system of claim 3, wherein the plurality of lights is arranged in a ring.
5. The obstacle detection and notification system of claim 3, wherein the plurality of lights emit different colors depending on immediacy of a threat.
6. The obstacle detection and notification system of claim 1, wherein the audible, tactile or visual feedback differentiates direction and severity of the threat.
7. The obstacle detection and notification system of claim 1, wherein the audible, tactile or visual feedback indicates type of threat including the motorcycle being in a blind spot of a vehicle, a vehicle being in a blind spot of the motorcycle, road objects and fast approaching vehicles.
8. A method of obstacle detection and notification for a motorcycle, the method comprising:
receiving video from at least one of a forward looking camera and a backward looking camera;
performing a computer vision and machine learning based object detection and tracking process to detect, classify and track obstacles in the video and to output detected object data; and
outputting audible, tactile or visual feedback, via an output system of the motorcycle, to a rider of the motorcycle based on the detected object data, wherein the audible, tactile or visual feedback is directional in that a direction of a threat is indicated.
9. The method of claim 8, wherein visual feedback is output to the rider.
10. The method of claim 8, the output system comprising a rider lighting device in the form of a plurality of lights, wherein individual lights are lit to indicate directionality of the threat relative to the motorcycle.
11. The method of claim 10, wherein the plurality of lights is arranged in a ring.
12. The method of claim 10, wherein the plurality of lights emit different colors depending on immediacy of a threat.
13. The method of claim 8, wherein the audible, tactile or visual feedback differentiates direction and severity of the threat.
14. The method of claim 8, wherein the audible, tactile or visual feedback indicates type of threat including the motorcycle being in a blind spot of a vehicle, a vehicle being in a blind spot of the motorcycle, road objects and fast approaching vehicles.
15. A motorcycle, the motorcycle comprising:
at least one of a forward looking camera and a backward looking camera;
at least one processor in operable communication with the at least one of the forward looking camera and the backward looking camera, the at least one processor configured to execute program instructions, wherein the program instructions are configured to cause the at least one processor to execute processes including:
receiving video from the at least one of the forward looking camera and the backward looking camera;
performing a computer vision and machine learning based object detection and tracking process to detect, classify and track obstacles in the video and to output detected object data; and
outputting audible, tactile or visual feedback, via an output system of the motorcycle, to a rider of the motorcycle based on the detected object data, wherein the audible, tactile or visual feedback is directional in that a direction of a threat is indicated.
16. The motorcycle of claim 15, wherein visual feedback is output to the rider.
17. The motorcycle of claim 15, comprising a rider lighting device in the form of a plurality of lights, wherein individual lights are lit to indicate directionality of the threat relative to the motorcycle.
18. The motorcycle of claim 17, wherein the plurality of lights is arranged in a ring.
19. The motorcycle of claim 17, wherein the plurality of lights emit different colors depending on immediacy of a threat.
20. The motorcycle of claim 15, wherein the audible, tactile or visual feedback differentiates direction and severity of the threat.
US17/822,571 2021-01-13 2022-08-26 Obstacle detection and notification for motorcycles Abandoned US20220406072A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US17/822,571 US20220406072A1 (en) 2021-01-13 2022-08-26 Obstacle detection and notification for motorcycles

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US17/148,398 US11462021B2 (en) 2021-01-13 2021-01-13 Obstacle detection and notification for motorcycles
US17/822,571 US20220406072A1 (en) 2021-01-13 2022-08-26 Obstacle detection and notification for motorcycles

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US17/148,398 Division US11462021B2 (en) 2021-01-13 2021-01-13 Obstacle detection and notification for motorcycles

Publications (1)

Publication Number Publication Date
US20220406072A1

Family

ID=82116182

Family Applications (5)

Application Number Title Priority Date Filing Date
US17/148,398 Active US11462021B2 (en) 2021-01-13 2021-01-13 Obstacle detection and notification for motorcycles
US17/822,585 Active US11798290B2 (en) 2021-01-13 2022-08-26 Obstacle detection and notification for motorcycles
US17/822,571 Abandoned US20220406072A1 (en) 2021-01-13 2022-08-26 Obstacle detection and notification for motorcycles
US17/822,578 Abandoned US20220406073A1 (en) 2021-01-13 2022-08-26 Obstacle detection and notification for motorcycles
US17/822,593 Abandoned US20220406075A1 (en) 2021-01-13 2022-08-26 Obstacle detection and notification for motorcycles

Family Applications Before (2)

Application Number Title Priority Date Filing Date
US17/148,398 Active US11462021B2 (en) 2021-01-13 2021-01-13 Obstacle detection and notification for motorcycles
US17/822,585 Active US11798290B2 (en) 2021-01-13 2022-08-26 Obstacle detection and notification for motorcycles

Family Applications After (2)

Application Number Title Priority Date Filing Date
US17/822,578 Abandoned US20220406073A1 (en) 2021-01-13 2022-08-26 Obstacle detection and notification for motorcycles
US17/822,593 Abandoned US20220406075A1 (en) 2021-01-13 2022-08-26 Obstacle detection and notification for motorcycles

Country Status (3)

Country Link
US (5) US11462021B2 (en)
CN (1) CN114763190A (en)
DE (1) DE102021129444A1 (en)

US20180129889A1 (en) * 2016-11-08 2018-05-10 Samsung Electronics Co., Ltd. Electronic apparatus and control method thereof
US20180170327A1 (en) * 2016-12-21 2018-06-21 Hyundai Motor Company Vehicle and method for controlling the same
US20190318179A1 (en) * 2017-01-16 2019-10-17 Fujitsu Limited Recording medium recording information processing program, information processing method, and information processing apparatus
US20180244198A1 (en) * 2017-02-27 2018-08-30 GM Global Technology Operations LLC Overlaying on an in-vehicle display road objects associated with potential hazards
US20180253611A1 (en) * 2017-03-02 2018-09-06 Ricoh Company, Ltd. Display controller, display control method, and recording medium storing program
US20180299289A1 (en) * 2017-04-18 2018-10-18 Garmin Switzerland Gmbh Mobile application interface device for vehicle navigation assistance
US20200168099A1 (en) * 2017-06-07 2020-05-28 Mitsubishi Electric Corporation Hazardous vehicle prediction device, hazardous vehicle warning system, and hazardous vehicle prediction method
US20190092347A1 (en) * 2017-09-25 2019-03-28 Mando Corporation Method and system of vehicle alarm that alarm area is changed by visible distance, and vision system for vehicle
US20190095731A1 (en) * 2017-09-28 2019-03-28 Nec Laboratories America, Inc. Generative adversarial inverse trajectory optimization for probabilistic vehicle forecasting
US20190126831A1 (en) * 2017-10-31 2019-05-02 Panasonic Automotive Systems Company Of America, Division Of Panasonic Corporation Of North America Irms ar object detection and classification
US20190137622A1 (en) * 2017-11-09 2019-05-09 Brennan Lopez-Hinojosa Method and System for Gauging External Object Movement and Conditions for Connected and Autonomous Vehicle Safety
US10388157B1 (en) * 2018-03-13 2019-08-20 Allstate Insurance Company Processing system having a machine learning engine for providing a customized driving assistance output
US20210027074A1 (en) * 2018-04-02 2021-01-28 Denso Corporation Vehicle system, space area estimation method, and space area estimation apparatus
US20200023797A1 (en) * 2018-07-17 2020-01-23 Denso International America, Inc. Automatic Crowd Sensing and Reporting System for Road Incidents
US20200041997A1 (en) * 2018-08-03 2020-02-06 Here Global B.V. Method and apparatus for visualizing future events for passengers of autonomous vehicles
US20210094577A1 (en) * 2018-08-14 2021-04-01 Mobileye Vision Technologies Ltd. Systems and Methods for Navigating with Safe Distances
US10984262B2 (en) * 2018-10-08 2021-04-20 StradVision, Inc. Learning method and testing method for monitoring blind spot of vehicle, and learning device and testing device using the same
US20200189614A1 (en) * 2018-12-17 2020-06-18 Toyota Jidosha Kabushiki Kaisha Notification device
US20200207375A1 (en) * 2018-12-26 2020-07-02 Uatc, Llc All Mover Priors
US10698222B1 (en) * 2019-01-31 2020-06-30 StradVision, Inc. Method for monitoring blind spot of cycle using smart helmet for cycle rider and blind spot monitoring device using them
US20200257908A1 (en) * 2019-02-13 2020-08-13 Sap Se Blind spot implementation in neural networks
US20200290637A1 (en) * 2019-03-14 2020-09-17 Caterpillar Inc. Method and system for providing object detection warning
US20210089048A1 (en) * 2019-09-21 2021-03-25 Ha Q Tran Smart vehicle
US20210108926A1 (en) * 2019-10-12 2021-04-15 Ha Q. Tran Smart vehicle
US20210304611A1 (en) * 2020-03-27 2021-09-30 Toyota Research Institute, Inc. Detection of cyclists near ego vehicles
US20210343148A1 (en) * 2020-04-03 2021-11-04 Mando Corporation Driver assistance system and control method for the same
US11462021B2 (en) * 2021-01-13 2022-10-04 GM Global Technology Operations LLC Obstacle detection and notification for motorcycles
US20220406074A1 (en) * 2021-01-13 2022-12-22 GM Global Technology Operations LLC Obstacle detection and notification for motorcycles
US20220406073A1 (en) * 2021-01-13 2022-12-22 GM Global Technology Operations LLC Obstacle detection and notification for motorcycles
US20220406075A1 (en) * 2021-01-13 2022-12-22 GM Global Technology Operations LLC Obstacle detection and notification for motorcycles

Cited By (1)

Publication number Priority date Publication date Assignee Title
DE102022212489A1 (en) 2022-11-23 2024-05-23 Robert Bosch Gesellschaft mit beschränkter Haftung Method and control device for operating a motorcycle in a formation of motorcycles

Also Published As

Publication number Publication date
DE102021129444A1 (en) 2022-07-14
US11798290B2 (en) 2023-10-24
CN114763190A (en) 2022-07-19
US20220406073A1 (en) 2022-12-22
US20220406074A1 (en) 2022-12-22
US20220222475A1 (en) 2022-07-14
US11462021B2 (en) 2022-10-04
US20220406075A1 (en) 2022-12-22

Similar Documents

Publication Publication Date Title
US11798290B2 (en) Obstacle detection and notification for motorcycles
JP7397807B2 (en) Rider assistance system and method
US10286905B2 (en) Driver assistance apparatus and control method for the same
US10077007B2 (en) Sidepod stereo camera system for an autonomous vehicle
US9352683B2 (en) Traffic density sensitivity selector
CN103987577B (en) Method for monitoring the traffic conditions in the surrounding environment with signalling vehicle
US20180338117A1 (en) Surround camera system for autonomous driving
EP3293051A1 (en) Front end sensor for pedestrians
US20140063248A1 (en) Vehicle periphery monitoring device
US20210166564A1 (en) Systems and methods for providing warnings to surrounding vehicles to avoid collisions
US10759334B2 (en) System for exchanging information between vehicles and control method thereof
CN113808418B (en) Road condition information display system, method, vehicle, computer device and storage medium
WO2017115371A1 (en) Apparatus and method for avoiding vehicular accidents
KR20160091040A (en) Vehicle and Control Method Thereof
WO2023241521A1 (en) Blind area monitoring system and method
JP2023052970A (en) Display device
CN111098864B (en) Prompt method, device, automatic driving vehicle and storage medium
JP2019175372A (en) Danger prediction device, method for predicting dangers, and program
JP2022100852A (en) Attention evocation device and attention evocation method
US20230206470A1 (en) Electronic device, method, and computer readable storage medium for obtaining location information of at least one subject by using plurality of cameras
JP3222638U (en) Safe driving support device
US20230316919A1 (en) Hazard notification method and system for implementing
US20230064724A1 (en) Danger notification method, danger notification device, and non-transitory storage medium
CN117533326A (en) Vehicle control method, device, storage medium and electronic equipment
JP2022101272A (en) Attention evocation device and attention evocation method

Legal Events

Date Code Title Description
AS Assignment

Owner name: GM GLOBAL TECHNOLOGY OPERATIONS LLC, MICHIGAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:OESTERLING, CHRISTOPHER L;CLIFFORD, DAVID H;SIGNING DATES FROM 20220825 TO 20220826;REEL/FRAME:060914/0315

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: FINAL REJECTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE AFTER FINAL ACTION FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: ADVISORY ACTION MAILED

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION