WO2022104295A1 - Object detection using fusion of vision, location, and/or other signals

Object detection using fusion of vision, location, and/or other signals

Info

Publication number
WO2022104295A1
Authority
WO
WIPO (PCT)
Prior art keywords
vehicle
location
computing system
alert condition
predicted path
Application number
PCT/US2021/065468
Other languages
French (fr)
Inventor
Hsiang-Huang Wu
Zhebin Zhang
Hongyu Sun
Jian Sun
Original Assignee
Innopeak Technology, Inc.
Application filed by Innopeak Technology, Inc. filed Critical Innopeak Technology, Inc.
Priority to PCT/US2021/065468 priority Critical patent/WO2022104295A1/en
Publication of WO2022104295A1 publication Critical patent/WO2022104295A1/en


Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W 4/00 Services specially adapted for wireless communication networks; Facilities therefor
    • H04W 4/30 Services specially adapted for particular environments, situations or purposes
    • H04W 4/40 Services specially adapted for particular environments, situations or purposes for vehicles, e.g. vehicle-to-pedestrians [V2P]
    • H04W 4/44 Services specially adapted for particular environments, situations or purposes for vehicles, e.g. vehicle-to-pedestrians [V2P] for communication between vehicles and infrastructures, e.g. vehicle-to-cloud [V2C] or vehicle-to-home [V2H]
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W 40/00 Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub-unit, e.g. by using mathematical models
    • B60W 40/02 Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub-unit, e.g. by using mathematical models related to ambient conditions
    • B60W 40/04 Traffic conditions
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W 50/00 Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W 50/08 Interaction between the driver and the control system
    • B60W 50/14 Means for informing the driver, warning the driver or prompting a driver intervention
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W 4/00 Services specially adapted for wireless communication networks; Facilities therefor
    • H04W 4/30 Services specially adapted for particular environments, situations or purposes
    • H04W 4/38 Services specially adapted for particular environments, situations or purposes for collecting sensor information

Definitions

  • the present disclosure relates, in general, to methods, systems, and apparatuses for implementing driver assistance technologies (e.g., advanced driver assistance systems ("ADASs"), other vision-based object detection, other location signal-based object detection, other vision and location signal-based object detection, or multiple types of signal-based object detection, or the like), and, more particularly, to methods, systems, and apparatuses for implementing object detection using fusion of vision, location, and/or other signals, and/or implementing alert condition messaging based at least in part on object detection using fusion of vision, location, and/or other signals.
  • Object detection is a critical application in computer vision, and is widely used in autonomous driving.
  • The technique of object detection is mainly used to localize objects and to recognize what they are based on visual images. With the help of deep learning and a large number of images for training, object detection can deliver great results when the target objects are not blurred or mostly occluded. In reality, however, it is precisely such blurred or occluded objects that are likely to cause many accidents.
  • Many existing systems and methods for improving the accuracy of detection of occluded and/or blurred objects rely on a single technique, such as vision-based detection or location-based detection.
  • Each of these techniques has limitations for detecting objects that stem from the characteristics of vision or location signals. For example, some conventional systems and methods are poor at detection when objects are in a blind spot; location signals (e.g., radar, or the like), on the other hand, may easily interfere with one another and may exhibit errors where high precision is required.
  • the techniques of this disclosure generally relate to tools and techniques for implementing driver assistance technologies, and, more particularly, to methods, systems, and apparatuses for implementing object detection using fusion of vision, location, and/or other signals, and/or implementing alert condition messaging based at least in part on object detection using fusion of vision, location, and/or other signals.
  • a method may comprise determining, using a computing system, at least one of a first current vehicle location or a first predicted path of a vehicle, based at least in part on one or more first location-based data from corresponding one or more first location data signals received from one or more different types of location data signal sources; sending, using the computing system, a first communication regarding the determined at least one of the first current vehicle location or the first predicted path of the vehicle to a remote computing system over one or more networks; and, in response to existence of at least one first alert condition for corresponding at least one first alert condition location that is in proximity to one or more of the determined at least one of the first current vehicle location or the first predicted path of the vehicle or that is within a first region encompassing the determined at least one of the first current vehicle location or the first predicted path of the vehicle, receiving, using the computing system, a second communication regarding the at least one first alert condition for the corresponding at least one first alert condition location from the remote computing system over the one or more networks.
  • the method may further comprise, based on a determination that the vehicle is approaching the at least one first alert condition location based at least in part on one or more second location-based data from corresponding one or more second location data signals received from the one or more different types of location data signal sources, performing the following tasks: receiving, using the computing system, one or more first object detection data signals from one or more different types of object detection data signal sources; fusing, using the computing system, the at least one first alert condition with one or more of the received one or more first object detection data signals, the at least one first alert condition location, a second current vehicle location of the vehicle, or a second predicted path of the vehicle to generate first fused data; and generating and presenting, using the computing system and via one or more user devices, a first alert message indicating that the vehicle is approaching the at least one first alert condition, based at least in part on the generated first fused data.
  • a system might comprise a computing system, which might comprise at least one first processor and a first non-transitory computer readable medium communicatively coupled to the at least one first processor.
  • The first non-transitory computer readable medium might have stored thereon computer software comprising a first set of instructions that, when executed by the at least one first processor, causes the computing system to: determine at least one of a first current vehicle location or a first predicted path of a vehicle, based at least in part on one or more first location-based data from corresponding one or more first location data signals received from one or more different types of location data signal sources; send the determined at least one of the first current vehicle location or the first predicted path of the vehicle to a remote computing system over one or more networks; and, in response to existence of at least one first alert condition for corresponding at least one first alert condition location that is in proximity to one or more of the determined at least one of the first current vehicle location or the first predicted path of the vehicle or that is within a first region encompassing the determined at least one of the first current vehicle location or the first predicted path of the vehicle, receive a second communication regarding the at least one first alert condition for the corresponding at least one first alert condition location from the remote computing system over the one or more networks.
  • A method may comprise receiving, using a remote computing system and from each of one or more first computing systems associated with corresponding one or more first vehicles among a plurality of vehicles and over one or more networks, one or more first communications indicating that a first alert condition has been detected at or near one or more of a first current vehicle location of each of the one or more first vehicles or a first alert condition location corresponding to the first alert condition; receiving, using the remote computing system and from each of one or more second computing systems associated with corresponding one or more second vehicles among the plurality of vehicles and over the one or more networks, one or more second communications regarding at least one of a second current vehicle location or a second predicted path for each of the one or more second vehicles; analyzing, using the remote computing system, the at least one of the second current vehicle location or the second predicted path for each of the one or more second vehicles in relation to at least one of the first alert condition or the first alert condition location; and, based on a determination that at least one second vehicle among the one or more second vehicles is in proximity to or approaching the first alert condition location, sending, using the remote computing system, one or more third communications to each of the second computing systems associated with each of the corresponding at least one second vehicle indicating that said second vehicle is in proximity to or approaching the first alert condition location.
  • a system might comprise a remote computing system, which might comprise at least one first processor and a first non-transitory computer readable medium communicatively coupled to the at least one first processor.
  • The first non-transitory computer readable medium might have stored thereon computer software comprising a first set of instructions that, when executed by the at least one first processor, causes the remote computing system to: receive, from each of one or more first computing systems associated with corresponding one or more first vehicles among a plurality of vehicles and over one or more networks, one or more first communications indicating that a first alert condition has been detected at or near one or more of a first current vehicle location of each of the one or more first vehicles or a first alert condition location corresponding to the first alert condition; receive, from each of one or more second computing systems associated with corresponding one or more second vehicles among the plurality of vehicles and over the one or more networks, one or more second communications regarding at least one of a second current vehicle location or a second predicted path for each of the one or more second vehicles; analyze the at least one of the second current vehicle location or the second predicted path for each of the one or more second vehicles in relation to at least one of the first alert condition or the first alert condition location; and, based on a determination that at least one second vehicle among the one or more second vehicles is in proximity to or approaching the first alert condition location, send one or more third communications to each of the second computing systems associated with each of the corresponding at least one second vehicle indicating that said second vehicle is in proximity to or approaching the first alert condition location.
  • Fig. 1 is a schematic diagram illustrating a system for implementing object detection using fusion of vision, location, and/or other signals, and/or implementing alert condition messaging based at least in part on object detection using fusion of vision, location, and/or other signals, in accordance with various embodiments.
  • Fig. 2 is a schematic block flow diagram illustrating a non-limiting example of object detection using fusion of vision, location, and/or other signals, and/or alert condition messaging based at least in part on object detection using fusion of vision, location, and/or other signals, in accordance with various embodiments.
  • Figs. 3A-3C are schematic block flow diagrams illustrating various non-limiting examples of alert condition messaging based at least in part on object detection using fusion of vision, location, and/or other signals, in accordance with various embodiments.
  • Figs. 4A-4F are flow diagrams illustrating a method for implementing object detection using fusion of vision, location, and/or other signals, and/or implementing alert condition messaging based at least in part on object detection using fusion of vision, location, and/or other signals, in accordance with various embodiments.
  • Fig. 5 is a block diagram illustrating an example of computer or system hardware architecture, in accordance with various embodiments.
  • Fig. 6 is a block diagram illustrating a networked system of computers, computing systems, or system hardware architecture, which can be used in accordance with various embodiments.
  • Various embodiments provide tools and techniques for implementing driver assistance technologies (e.g., advanced driver assistance systems (“ADASs”), other vision-based object detection, other location signal-based object detection, other vision and location signal-based object detection, or multiple types of signal-based object detection, or the like), and, more particularly, methods, systems, and apparatuses for implementing object detection using fusion of vision, location, and/or other signals, and/or implementing alert condition messaging based at least in part on object detection using fusion of vision, location, and/or other signals.
  • a computing system may determine at least one of a first current vehicle location or a first predicted path of a vehicle, based at least in part on one or more first location-based data from corresponding one or more first location data signals received from one or more different types of location data signal sources.
  • the computing system may send a first communication regarding the determined at least one of the first current vehicle location or the first predicted path of the vehicle to a remote computing system over one or more networks.
  • the computing system may receive a second communication regarding the at least one first alert condition for the corresponding at least one first alert condition location from the remote computing system over the one or more networks.
  • the computing system may perform the following tasks: receive one or more first object detection data signals from one or more different types of object detection data signal sources; fuse the at least one first alert condition with one or more of the received one or more first object detection data signals, the at least one first alert condition location, a second current vehicle location of the vehicle, or a second predicted path of the vehicle to generate first fused data; and generate and present, via one or more user devices, a first alert message indicating that the vehicle is approaching the at least one first alert condition, based at least in part on the generated first fused data.
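  • Merely by way of example, the client-side tasks above might be sketched as follows; the names, data structures, and proximity threshold in this sketch are illustrative assumptions and are not specified by the disclosure:

```python
# A minimal sketch (assumed names and threshold) of the client-side flow:
# when the vehicle approaches a known alert condition location, fuse the
# alert condition with fresh object detection signals and the current
# vehicle location, then present an alert message via a user device.
from dataclasses import dataclass

APPROACH_RADIUS_M = 250.0  # assumed proximity threshold


@dataclass
class AlertCondition:
    kind: str   # e.g., "occluded_pedestrian" or "construction_site"
    lat: float
    lon: float


def on_location_update(vehicle_loc, alert, detections, distance_m, present):
    """vehicle_loc: (lat, lon); detections: object detection data signals;
    distance_m: callable giving meters between two (lat, lon) pairs;
    present: callable that renders the alert on one or more user devices."""
    if distance_m(vehicle_loc, (alert.lat, alert.lon)) > APPROACH_RADIUS_M:
        return  # not yet approaching the alert condition location
    fused = {  # the "first fused data"
        "alert_condition": alert,
        "detections": detections,
        "vehicle_location": vehicle_loc,
    }
    present(f"Caution: approaching {alert.kind}", fused)
```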
  • the computing system may comprise at least one of a data signal fusing computing system, at least one processor on the user device, at least one processor on a mobile device associated with a user, a vehicle-based computing system, an object detection system, or a driver assistance system, and/or the like.
  • The remote computing system may comprise at least one of a remote data signal fusing computing system, a remote object detection system, a remote driver assistance system, a server computer over the one or more networks, an image processing server, a graphics processing unit (“GPU”)-based server, a positioning and mapping server, a machine learning system, an artificial intelligence (“AI”) system, a deep learning system, a neural network, a convolutional neural network (“CNN”), a fully convolutional network (“FCN”), a cloud computing system, or a distributed computing system, and/or the like.
  • The one or more different types of location data signal sources may each comprise one of a global positioning system (“GPS”) device, a global navigation satellite system (“GNSS”) device, a text recognition-based location identification system, an image recognition-based landmark identification system, a telecommunications signal triangulation-based location identification system, a radar-based location identification system, a sonar-based location identification system, or a lidar-based location identification system, and/or the like.
  • Determining the at least one of the first current vehicle location or the first predicted path of a vehicle may comprise at least one of: determining, using the computing system, the first current vehicle location based at least in part on GPS data from the GPS device; determining, using the computing system, the first predicted path of the vehicle based at least in part on a series of GPS data from the GPS device over time; determining, using the computing system, the first current vehicle location based at least in part on GNSS data from the GNSS device; determining, using the computing system, the first predicted path of the vehicle based at least in part on a series of GNSS data from the GNSS device over time; or determining, using the computing system, the first current vehicle location based at least in part on text recognition of one or more location-identifying signs, the one or more location-identifying signs comprising at least one of one or more street signs, one or more address signs, one or more business signs, one or more highway signs, one or more city limits signs, one or more county boundary signs, or one or more state boundary signs, and/or the like.
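  • As a non-limiting illustration of predicting a path from a series of GPS data over time, the sketch below linearly extrapolates the two most recent fixes over a short horizon; this is an assumed approach, not a method prescribed by the disclosure:

```python
# A minimal sketch: dead-reckon a near-term path point from two GPS fixes,
# assuming roughly constant velocity over a short prediction horizon.
def predict_position(fix_prev, fix_curr, horizon_s):
    """Each fix is (lat_deg, lon_deg, t_s); returns the (lat, lon) expected
    horizon_s seconds after the current fix."""
    lat1, lon1, t1 = fix_prev
    lat2, lon2, t2 = fix_curr
    dt = max(t2 - t1, 1e-6)  # guard against identical timestamps
    return (lat2 + (lat2 - lat1) / dt * horizon_s,
            lon2 + (lon2 - lon1) / dt * horizon_s)


# Example: two fixes one second apart, extrapolated five seconds ahead.
path_point = predict_position((37.7740, -122.4194, 0.0),
                              (37.7741, -122.4193, 1.0), 5.0)
```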
  • the one or more different types of object detection data signal sources may each comprise one of a vision-based object detection system, a radar-based object detection system, a sonar-based object detection system, or a lidar-based object detection system.
  • the vehicle may comprise one of a car, a minivan, a pickup truck, a motorcycle, an all-terrain vehicle, a scooter, a police vehicle, a fire engine, an ambulance, a recreational vehicle, a bus, a commercial van, a commercial truck, a semi-tractor-trailer truck, a boat, a ship, a submersible, an amphibious vehicle, an aircraft, a space vehicle, a satellite, an autonomous vehicle, or a drone, and/or the like.
  • The at least one first alert condition may each comprise at least one of traffic congestion along the first predicted path of the vehicle potentially causing a slow-down, a traffic accident along the first predicted path of the vehicle potentially causing a slow-down, a construction site along the first predicted path of the vehicle potentially causing a slow-down, one or more people along the first predicted path of the vehicle who are occluded from a perspective of the vehicle, one or more animals along the first predicted path of the vehicle that are occluded from the perspective of the vehicle, one or more objects along the first predicted path of the vehicle that are occluded from the perspective of the vehicle, one or more people potentially blocking the first predicted path of the vehicle, one or more animals potentially blocking the first predicted path of the vehicle, one or more objects potentially blocking the first predicted path of the vehicle, a tracked weather event along or near the first predicted path of the vehicle, a natural hazard potentially blocking the first predicted path of the vehicle, or a manmade hazard potentially blocking the first predicted path of the vehicle, and/or the like.
  • each user device may comprise at least one of a smartphone, a tablet computer, a display device, an augmented reality (“AR”) device, a virtual reality (“VR”) device, a mixed reality (“MR”) device, a vehicle console display, a vehicle heads-up display (“HUD”), a vehicle remote controller display, one or more audio speakers, or one or more haptic response devices, and/or the like.
  • Generating and presenting the first alert message may comprise at least one of: generating a graphical display depicting one or more of the at least one first alert condition or the generated first fused data, and presenting the generated graphical display on a display device; generating a text-based message depicting one or more of the at least one first alert condition or the generated first fused data, and presenting the text-based message on a display device; generating at least one message regarding one or more of the at least one first alert condition or the generated first fused data, and sending the at least one message to the user device, wherein the at least one message may comprise at least one of an e-mail message, a short message service (“SMS”) app, a multimedia messaging service (“MMS”) app, or a text message app, and/or the like; or generating at least one audio message regarding one or more of the at least one first alert condition or the generated first fused data, and sending the at least one audio message to the user device for playback on at least one audio speaker; and/or the like.
  • At least one of the second current vehicle location or the second predicted path of the vehicle may be determined based at least in part on the one or more second location-based data from the corresponding one or more second data signals received from the one or more different types of location data signal sources.
  • the computing system may receive one or more second object detection data signals from the one or more different types of object detection data signal sources. The computing system may analyze the one or more second object detection data signals.
  • the computing system may send a third communication to the remote computing system over the one or more networks indicating that the at least one first alert condition is no longer present at the at least one first alert condition location.
  • the computing system may receive one or more third object detection data signals from the one or more different types of object detection data signal sources.
  • the computing system may analyze the one or more third object detection data signals. Based on a determination that the one or more third object detection data signals correspond to at least one second alert condition, the computing system may determine at least one of a third current vehicle location or at least one second alert condition location corresponding to the at least one second alert condition, based at least in part on one or more third location-based data from corresponding one or more third data signals received from the one or more different types of location data signal sources.
  • the computing system may send a fourth communication to the remote computing system over the one or more networks indicating that the at least one second alert condition has been detected at or near the at least one of the third current vehicle location or the at least one second alert condition location.
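  • Merely by way of example, the third and fourth communications above might be serialized as follows; the wire format and field names are assumptions for illustration, as the disclosure does not specify a message format:

```python
# Hypothetical payloads for reporting a newly detected alert condition and
# for reporting that a previously detected alert condition has cleared.
import json
import time


def report_alert_detected(alert_kind, vehicle_loc, alert_loc):
    """A new alert condition detected at or near the current vehicle
    location and/or the alert condition location."""
    return json.dumps({
        "type": "alert_detected",
        "alert_kind": alert_kind,         # e.g., "traffic_accident"
        "vehicle_location": vehicle_loc,  # (lat, lon) of reporting vehicle
        "alert_location": alert_loc,      # (lat, lon) of the condition
        "timestamp": time.time(),
    })


def report_alert_cleared(alert_id, vehicle_loc):
    """The alert condition is no longer present at its location."""
    return json.dumps({
        "type": "alert_cleared",
        "alert_id": alert_id,
        "vehicle_location": vehicle_loc,
        "timestamp": time.time(),
    })
```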
  • a remote computing system may receive, from each of one or more first computing systems associated with corresponding one or more first vehicles among a plurality of vehicles and over one or more networks, one or more first communications indicating that a first alert condition has been detected at or near one or more of a first current vehicle location of each of the one or more first vehicles or a first alert condition location corresponding to the first alert condition.
  • the remote computing system may receive, from each of one or more second computing systems associated with corresponding one or more second vehicles among the plurality of vehicles and over the one or more networks, one or more second communications regarding at least one of a second current vehicle location or a second predicted path for each of the one or more second vehicles.
  • the remote computing system may analyze the at least one of the second current vehicle location or the second predicted path for each of the one or more second vehicles in relation to at least one of the first alert condition or the first alert condition location. Based on a determination that at least one second vehicle among the one or more second vehicles is in proximity to or approaching the first alert condition location or is within a first region encompassing the first alert condition, based at least in part on the analysis, the remote computing system may send one or more third communications to each of the second computing systems associated with each of the corresponding at least one second vehicle indicating that said second vehicle is in proximity to or approaching the first alert condition location.
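  • A minimal sketch of this server-side analysis, under assumed data structures and an assumed notification radius, might look like the following; the disclosure does not prescribe a particular proximity test:

```python
# Compare each vehicle's reported location / predicted path points against
# known alert condition locations and collect the vehicles to notify.
from math import asin, cos, radians, sin, sqrt

PROXIMITY_M = 500.0  # assumed notification radius


def haversine_m(a, b):
    """Great-circle distance in meters between two (lat, lon) pairs."""
    lat1, lon1, lat2, lon2 = map(radians, (*a, *b))
    h = (sin((lat2 - lat1) / 2) ** 2
         + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6_371_000 * asin(sqrt(h))


def vehicles_to_notify(vehicle_reports, alerts):
    """vehicle_reports: {vehicle_id: [(lat, lon), ...]} current location and
    predicted path points; alerts: {alert_id: (lat, lon)} alert condition
    locations.  Returns {vehicle_id: [alert_id, ...]} needing notification."""
    to_notify = {}
    for vid, points in vehicle_reports.items():
        hits = [aid for aid, loc in alerts.items()
                if any(haversine_m(p, loc) <= PROXIMITY_M for p in points)]
        if hits:
            to_notify[vid] = hits
    return to_notify
```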
  • the one or more first computing systems and the one or more second computing systems may each comprise at least one of a data signal fusing computing system, at least one processor on the user device, at least one processor on a mobile device associated with a user, a vehicle-based computing system, an object detection system, or a driver assistance system, and/or the like.
  • The remote computing system may comprise at least one of a remote data signal fusing computing system, a remote object detection system, a remote driver assistance system, a server computer over the one or more networks, an image processing server, a graphics processing unit (“GPU”)-based server, a positioning and mapping server, a machine learning system, an artificial intelligence (“AI”) system, a deep learning system, a neural network, a convolutional neural network (“CNN”), a fully convolutional network (“FCN”), a cloud computing system, or a distributed computing system, and/or the like.
  • the plurality of vehicles may each comprise one of a car, a minivan, a pickup truck, a motorcycle, an all-terrain vehicle, a scooter, a police vehicle, a fire engine, an ambulance, a recreational vehicle, a bus, a commercial van, a commercial truck, a semi-tractor-trailer truck, a boat, a ship, a submersible, an amphibious vehicle, an aircraft, a space vehicle, a satellite, an autonomous vehicle, or a drone, and/or the like.
  • The first alert condition may comprise at least one of traffic congestion along the second predicted path of the at least one second vehicle potentially causing a slow-down, a traffic accident along the second predicted path of the at least one second vehicle potentially causing a slow-down, a construction site along the second predicted path of the at least one second vehicle potentially causing a slow-down, one or more people along the second predicted path of the at least one second vehicle who are occluded from a perspective of the at least one second vehicle, one or more animals along the second predicted path of the at least one second vehicle that are occluded from the perspective of the at least one second vehicle, one or more objects along the second predicted path of the at least one second vehicle that are occluded from the perspective of the at least one second vehicle, one or more people potentially blocking the second predicted path of the at least one second vehicle, one or more animals potentially blocking the second predicted path of the at least one second vehicle, one or more objects potentially blocking the second predicted path of the at least one second vehicle, a tracked weather event along or near the second predicted path of the at least one second vehicle, a natural hazard potentially blocking the second predicted path of the at least one second vehicle, or a manmade hazard potentially blocking the second predicted path of the at least one second vehicle, and/or the like.
  • the remote computing system may receive, from each of one or more third computing systems associated with corresponding one or more third vehicles among the plurality of vehicles and over the one or more networks, one or more fourth communications indicating that the first alert condition is no longer present at the first alert condition location.
  • the remote computing system may receive, from each of one or more fourth computing systems associated with corresponding one or more fourth vehicles among the plurality of vehicles and over the one or more networks, one or more fifth communications regarding at least one of a fourth current vehicle location or a fourth predicted path for each of the one or more fourth vehicles.
  • the remote computing system may analyze the at least one of the fourth current vehicle location or the fourth predicted path for each of the one or more fourth vehicles in relation to at least one of the first alert condition or the first alert condition location.
  • the remote computing system may send one or more fifth communications to each of the fourth computing systems associated with each of the corresponding at least one fourth vehicle indicating that the first alert condition is no longer present at the first alert condition location.
  • the remote computing system may continue sending the one or more third communications to each of the fourth computing systems associated with each of the corresponding at least one fourth vehicle indicating that said fourth vehicle is in proximity to or approaching the first alert condition location.
  • In various embodiments, a fusion framework may be provided that integrates object detection and location signals to deliver more useful information.
  • location signals are not limited to location signals based on GPS data (as described herein), and any location signal may be easily used in this framework.
  • the system (and corresponding service) described in accordance with the various embodiments operates by interchanging and/or exchanging locations (e.g., alert condition locations, or the like) according to the area or region in which potentially affected vehicles may be located, and, in some cases, storing the event locations (or alert condition locations) and pushing them to the mobile phones (or other user devices or computing systems) of users and/or vehicles that are determined to potentially be affected by the events (or alert conditions).
  • Object detection models running on the mobile phone (or other user devices or computing systems) may be supplemented with additional location information to enhance the driver assistance information, or the like.
  • a system or method that integrates visual images or other object detection signals with sensor signals or other location signals may enable delivery of more useful information to assist drivers.
  • This framework may be based on mobile devices, extending their capabilities by integrating visual images and location sensor signals to generate new driver assistance information. Further, this framework may be scalable because its services could be distributed based on location(s) and may be run on the cloud, or the like.
  • The service may be started once the number of users within a region exceeds some threshold value, and that service serves only the users in that area. As the number of users increases, the services may be expanded accordingly.
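  • One way such region-based scaling might be organized is sketched below; the grid-cell bucketing and the threshold value are assumptions for illustration only:

```python
# Start a per-region service only once enough users subscribe in that
# region; each user is then served by the service for their own region.
from collections import defaultdict

START_THRESHOLD = 100  # assumed minimum user count before a service starts


def region_key(lat, lon, cell_deg=0.5):
    """Bucket locations into coarse grid cells, one service per cell."""
    return (round(lat / cell_deg), round(lon / cell_deg))


class RegionManager:
    def __init__(self):
        self.users = defaultdict(set)  # region -> subscribed user ids
        self.active = set()            # regions with a running service

    def subscribe(self, user_id, lat, lon):
        region = region_key(lat, lon)
        self.users[region].add(user_id)
        if region not in self.active and len(self.users[region]) >= START_THRESHOLD:
            self.active.add(region)    # e.g., spin up a cloud instance here
        return region if region in self.active else None
```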
  • The object detection model may be trained for purposes other than the common ones. For example, occluded pedestrians may be revealed by the framework when other subscribed users are nearby and have those pedestrians in detection range; such information about the occluded pedestrians may then be shared with a user in a vehicle approaching them, even though that user has no line of sight to the pedestrians.
  • the various embodiments are not limited to traffic use, but may be applicable to any vehicle use in any environment, so long as information regarding location and object detection of alert conditions may be shared.
  • Some embodiments can improve the functioning of user equipment or systems themselves (e.g., object detection systems, location detection systems, driver assistance systems, etc.), for example, by receiving, using a remote computing system and from each of one or more first computing systems associated with corresponding one or more first vehicles among a plurality of vehicles and over one or more networks, one or more first communications indicating that a first alert condition has been detected at or near one or more of a first current vehicle location of each of the one or more first vehicles or a first alert condition location corresponding to the first alert condition; receiving, using the remote computing system and from each of one or more second computing systems associated with corresponding one or more second vehicles among the plurality of vehicles and over the one or more networks, one or more second communications regarding at least one of a second current vehicle location or a second predicted path for each of the one or more second vehicles; analyzing, using the remote computing system, the at least one of the second current vehicle location or the second predicted path for each of the one or more second vehicles in relation to at least one of the first alert condition or the first alert condition location; and, based on a determination that at least one second vehicle among the one or more second vehicles is in proximity to or approaching the first alert condition location, sending, using the remote computing system, one or more third communications to each of the second computing systems associated with each of the corresponding at least one second vehicle indicating that said second vehicle is in proximity to or approaching the first alert condition location; and/or the like.
  • These functionalities can produce tangible results outside of the implementing computer system, including, merely by way of example, providing a system to perform fusion of vision, location, and/or other signals for object detection that may be exchanged among a plurality of users and/or computing systems associated with a plurality of vehicles to satisfy some tasks, at least some of which may be observed or measured by users, ADAS content developers, and/or user device and/or vehicle manufacturers.
  • Figs. 1-6 illustrate some of the features of the method, system, and apparatus for implementing driver assistance technologies (e.g., advanced driver assistance systems (“ADASs”), other vision-based object detection, other location signal-based object detection, other vision and location signal-based object detection, or multiple types of signal-based object detection, or the like), and, more particularly, of methods, systems, and apparatuses for implementing object detection using fusion of vision, location, and/or other signals, and/or implementing alert condition messaging based at least in part on object detection using fusion of vision, location, and/or other signals, as referred to above.
  • Figs. 1-6 refer to examples of different embodiments that include various components and steps, which can be considered alternatives or which can be used in conjunction with one another in the various embodiments.
  • the description of the illustrated methods, systems, and apparatuses shown in Figs. 1-6 is provided for purposes of illustration and should not be considered to limit the scope of the different embodiments.
  • Fig. 1 is a schematic diagram illustrating a system 100 for implementing object detection using fusion of vision, location, and/or other signals, and/or implementing alert condition messaging based at least in part on object detection using fusion of vision, location, and/or other signals, in accordance with various embodiments.
  • system 100 may comprise a plurality of vehicles, including, but not limited to, a first vehicle 105a, a second vehicle 105b, and a third vehicle 105c, and so on.
  • System 100 may further comprise computing system 110 that may be disposed within or on each of at least one vehicle 105 or may be disposed external to vehicle 105 yet associated therewith (e.g., computing systems 170a-170n, or the like).
  • system 100 may further comprise user device(s) 115 that may be disposed within or on each of at least one vehicle 105 or may be disposed external to vehicle 105 yet associated therewith and/or associated with a user (e.g., user devices 175a-175n associated with corresponding users 180a-180n, respectively, or the like).
  • Each vehicle 105 may further include, without limitation, one or more location sensors (or location data signal source(s)) 120 and one or more object detection sensors (or object detection data signal source(s)) 125, or the like.
  • System 100 may further comprise network(s) 145, as well as remote computing system(s) 140 and corresponding database(s) 140a and, in some cases, one or both of location determination server 150 and corresponding database(s) 150a and/or image recognition server 155 and corresponding database(s) 155a.
  • Remote computing system(s) 140 (and database(s) 140a), location determination server 150 (and database(s) 150a), and image recognition server 155 (and corresponding database(s) 155a) may each be accessible by one or more of computing systems 110 and/or 170a-170n, user devices 115 and/or 175a-175n, location sensor(s) 120, and object detection sensor(s) 125, and/or the like, via network(s) 145, and in some cases, via wireless communication (such as depicted in Fig. 1 by the lightning bolt symbols, or the like).
  • The wireless communications may use protocols including, but not limited to, at least one of Bluetooth™ communications protocol, WiFi communications protocol or other 802.11 suite of communications protocols, ZigBee communications protocol, Z-wave communications protocol, or other 802.15.4 suite of communications protocols, cellular communications protocol (e.g., 3G, 4G, 4G LTE, 5G, etc.), or other suitable communications protocols, and/or the like.
  • The network(s) 145 may each include a local area network (“LAN”), including, without limitation, a fiber network, an Ethernet network, a Token-Ring™ network, and/or the like; a wide-area network (“WAN”); a wireless wide area network (“WWAN”); a virtual network, such as a virtual private network (“VPN”); the Internet; an intranet; an extranet; a public switched telephone network (“PSTN”); an infra-red network; a wireless network, including, without limitation, a network operating under any of the IEEE 802.11 suite of protocols, the Bluetooth™ protocol known in the art, and/or any other wireless protocol; and/or any combination of these and/or other networks.
  • the network(s) 145 might include an access network of the service provider (e.g., an Internet service provider ("ISP")). In another embodiment, the network(s) 145 may include a core network of the service provider, and/or the Internet.
  • Vehicles 105 and 105a-105c may each include, without limitation, one of a car, a minivan, a pickup truck, a motorcycle, an all-terrain vehicle, a scooter, a police vehicle, a fire engine, an ambulance, a recreational vehicle, a bus, a commercial van, a commercial truck, a semi-tractor-trailer truck, a boat, a ship, a submersible, an amphibious vehicle, an aircraft, a space vehicle, a satellite, an autonomous vehicle, or a drone (including, but not limited to, one of an aerial drone, a land-based drone, a water-based drone, an amphibious drone, or a space-based drone, and/or the like), and/or the like.
  • the computing system 110 may include, without limitation, at least one of a data signal fusing computing system, at least one processor on the user device, at least one processor on a mobile device associated with a user, a vehicle-based computing system, an object detection system, or a driver assistance system, and/or the like.
  • The computing system(s) may be disposed within or on the vehicle (e.g., computing system(s) 110, or the like) or may be associated with the vehicle yet located outside the vehicle (e.g., computing systems 170a-170n, or the like; in some cases, at a remote location relative to the location of the vehicle, such as in the case of a computing system for controlling operations of a drone or a computing system that is communicatively coupled with an autonomous vehicle, or the like).
  • user devices 115 and/or 175a-175n which may be associated with at least one of a user or the vehicle, may each include, without limitation, at least one of a smartphone, a tablet computer, a display device, an augmented reality (“AR”) device, a virtual reality (“VR”) device, a mixed reality (“MR”) device, a vehicle console display, a vehicle heads-up display (“HUD”), a vehicle remote controller display, one or more audio speakers, or one or more haptic response devices, and/or the like.
  • the user device(s) may be disposed within or on the vehicle (e.g., user device(s) 115, or the like) or may be associated with the vehicle yet located outside the vehicle (e.g., user devices 175a-175n, or the like; in some cases, at a remote location relative to the location of the vehicle, such as in the case of a user device for controlling a drone or a user device that is communicatively coupled with an autonomous vehicle, or the like).
  • one or more different types of location data signal sources may be used and may each include, but is not limited to, one of a global positioning system (“GPS”) device, a global navigation satellite system (“GNSS”) device, a text recognition-based location identification system, an image recognition-based landmark identification system, a telecommunications signal triangulation-based location identification system, a radar-based location identification system, a sonar-based location identification system, or a lidar-based location identification system, and/or the like.
  • One or more different types of object detection data signal sources may be used, and may each include, but is not limited to, one of a vision-based object detection system, a radar-based object detection system, a sonar-based object detection system, or a lidar-based object detection system, and/or the like.
  • The remote computing system 140 may include, but is not limited to, at least one of a remote data signal fusing computing system, a remote object detection system, a remote driver assistance system, a server computer over the one or more networks, an image processing server, a graphics processing unit (“GPU”)-based server, a positioning and mapping server, a machine learning system, an artificial intelligence (“AI”) system, a deep learning system, a neural network, a convolutional neural network (“CNN”), a fully convolutional network (“FCN”), a cloud computing system, or a distributed computing system, and/or the like.
  • the location determination server 150 may be used to further process location data obtained from one or more location sensors (e.g., location sensor(s) 120, or the like).
  • the image recognition server 155 may be used to further process object detection data obtained from one or more object detection sensors (e.g., object detection sensor(s) 125, or the like).
  • Computing system 110 may determine at least one of a first current vehicle location (e.g., vehicle locations 130a-130c for corresponding vehicles 105a-105c, or the like) or a first predicted path of a vehicle (e.g., vehicle paths 135a-135c, denoted in Fig. 1, or the like), based at least in part on one or more first location-based data from corresponding one or more first location data signals received from one or more different types of location data signal sources (e.g., location sensor(s) 120, or the like).
  • the computing system may send a first communication regarding the determined at least one of the first current vehicle location or the first predicted path of the vehicle (e.g., a message containing current location data 185, or the like) to a remote computing system (e.g., remote computing system 140, or the like) over one or more networks (e.g., network(s) 145, or the like).
  • the computing system may receive a second communication (e.g., alert condition data 195, or the like) regarding the at least one first alert condition (e.g., alert condition 160, or the like) for the corresponding at least one first alert condition location (e.g., alert condition location 165, or the like) from the remote computing system over the one or more networks.
  • The at least one first alert condition may each include, without limitation, at least one of traffic congestion along the first predicted path of the vehicle potentially causing a slow-down, a traffic accident along the first predicted path of the vehicle potentially causing a slow-down, a construction site along the first predicted path of the vehicle potentially causing a slow-down, one or more people along the first predicted path of the vehicle who are occluded from a perspective of the vehicle, one or more animals along the first predicted path of the vehicle that are occluded from the perspective of the vehicle, one or more objects along the first predicted path of the vehicle that are occluded from the perspective of the vehicle, one or more people potentially blocking the first predicted path of the vehicle, one or more animals potentially blocking the first predicted path of the vehicle, one or more objects potentially blocking the first predicted path of the vehicle, a tracked weather event along or near the first predicted path of the vehicle, a natural hazard potentially blocking the first predicted path of the vehicle, or a manmade hazard potentially blocking the first predicted path of the vehicle, and/or the like.
  • The computing system may perform the following tasks: receive one or more first object detection data signals (e.g., signals containing object detection data 190, or the like) from one or more different types of object detection data signal sources (e.g., object detection sensor(s) 125, or the like); fuse the at least one first alert condition with one or more of the received one or more first object detection data signals, the at least one first alert condition location, a second current vehicle location of the vehicle, or a second predicted path of the vehicle (e.g., as shown at block 210 of Fig. 2, or the like) to generate first fused data; and generate and present, via one or more user devices (e.g., user devices 115 and/or 175a-175n, or the like), a first alert message indicating that the vehicle is approaching the at least one first alert condition, based at least in part on the generated first fused data.
  • At least one of the second current vehicle location or the second predicted path of the vehicle may be determined based at least in part on the one or more second location-based data from the corresponding one or more second data signals received from the one or more different types of location data signal sources (similar to the first current vehicle location or the first predicted path, or the like).
  • Determining the at least one of the first current vehicle location or the first predicted path of a vehicle may comprise the computing system performing at least one of: (1) determining the first current vehicle location based at least in part on GPS data from the GPS device; (2) determining the first predicted path of the vehicle based at least in part on a series of GPS data from the GPS device over time; (3) determining the first current vehicle location based at least in part on GNSS data from the GNSS device; (4) determining the first predicted path of the vehicle based at least in part on a series of GNSS data from the GNSS device over time; (5) determining the first current vehicle location based at least in part on text recognition of one or more location-identifying signs, the one or more location-identifying signs comprising at least one of one or more street signs, one or more address signs, one or more business signs, one or more highway signs, one or more city limits signs, one or more county boundary signs, one or more state boundary signs, one or more province boundary signs, one or more territory boundary signs, and/or the like; or (6) determining the first current vehicle location based at least in part on one or more of image recognition-based landmark identification, telecommunications signal triangulation, radar, sonar, or lidar, as described above; and/or the like.
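  • As a non-limiting illustration of combining the alternative location determinations enumerated above, the following sketch queries each available source and keeps the estimate with the best reported accuracy; the source interface shown is an assumption, not part of the disclosure:

```python
# Pick the best available location estimate among heterogeneous sources
# (e.g., GPS, GNSS, sign-text recognition, landmark recognition,
# triangulation, radar, sonar, or lidar), each reporting its own accuracy.
def best_location(sources):
    """sources: iterable of objects whose .estimate() returns either
    ((lat, lon), accuracy_m) or None when no fix is available."""
    best = None
    for source in sources:
        result = source.estimate()
        if result is None:
            continue  # this source currently has no fix
        loc, accuracy_m = result
        if best is None or accuracy_m < best[1]:
            best = (loc, accuracy_m)
    return best  # None if no source produced a usable fix
```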
  • Generating and presenting the first alert message may comprise at least one of: generating a graphical display depicting one or more of the at least one first alert condition or the generated first fused data, and presenting the generated graphical display on a display device (e.g., user devices 115 and/or 175a-175n, or the like); generating a text-based message depicting one or more of the at least one first alert condition or the generated first fused data, and presenting the text-based message on a display device (e.g., user devices 115 and/or 175a-175n, or the like); generating at least one message regarding one or more of the at least one first alert condition or the generated first fused data, and sending the at least one message to the user device (e.g., user devices 115 and/or 175a-175n, or the like); or generating at least one audio message regarding one or more of the at least one first alert condition or the generated first fused data, and sending the at least one audio message to the user device for playback on at least one audio speaker; and/or the like.
  • the at least one message may include, but is not limited to, at least one of an e-mail message, a short message service (“SMS”) app, a multimedia messaging service (“MMS”) app, or a text message app, and/or the like.
  • the computing system may receive one or more second object detection data signals from the one or more different types of object detection data signal sources.
  • the computing system may analyze the one or more second object detection data signals. Based on a determination that the at least one first alert condition is no longer present at the at least one first alert condition location, the computing system may send a third communication to the remote computing system over the one or more networks indicating that the at least one first alert condition is no longer present at the at least one first alert condition location.
  • the computing system may receive one or more third object detection data signals from the one or more different types of object detection data signal sources.
  • the computing system may analyze the one or more third object detection data signals. Based on a determination that the one or more third object detection data signals correspond to at least one second alert condition, the computing system may determine at least one of a third current vehicle location or at least one second alert condition location corresponding to the at least one second alert condition, based at least in part on one or more third location-based data from corresponding one or more third data signals received from the one or more different types of location data signal sources.
  • the computing system may send a fourth communication to the remote computing system over the one or more networks indicating that the at least one second alert condition has been detected at or near the at least one of the third current vehicle location or the at least one second alert condition location.
  • remote computing system 140 may receive, from each of one or more first computing systems associated with corresponding one or more first vehicles among a plurality of vehicles and over one or more networks, one or more first communications indicating that a first alert condition has been detected at or near one or more of a first current vehicle location of each of the one or more first vehicles or a first alert condition location corresponding to the first alert condition.
  • the remote computing system may receive, from each of one or more second computing systems associated with corresponding one or more second vehicles among the plurality of vehicles and over the one or more networks, one or more second communications regarding at least one of a second current vehicle location or a second predicted path for each of the one or more second vehicles.
  • the remote computing system may analyze the at least one of the second current vehicle location or the second predicted path for each of the one or more second vehicles in relation to at least one of the first alert condition or the first alert condition location. Based on a determination that at least one second vehicle among the one or more second vehicles is in proximity to or approaching the first alert condition location or is within a first region encompassing the first alert condition, based at least in part on the analysis, the remote computing system may send one or more third communications to each of the second computing systems associated with each of the corresponding at least one second vehicle indicating that said second vehicle is in proximity to or approaching the first alert condition location.
  • the remote computing system may receive, from each of one or more third computing systems associated with corresponding one or more third vehicles among the plurality of vehicles and over the one or more networks, one or more fourth communications indicating that the first alert condition is no longer present at the first alert condition location.
  • the remote computing system may receive, from each of one or more fourth computing systems associated with corresponding one or more fourth vehicles among the plurality of vehicles and over the one or more networks, one or more fifth communications regarding at least one of a fourth current vehicle location or a fourth predicted path for each of the one or more fourth vehicles.
  • the remote computing system may analyze the at least one of the fourth current vehicle location or the fourth predicted path for each of the one or more fourth vehicles in relation to at least one of the first alert condition or the first alert condition location.
  • the remote computing system may send one or more fifth communications to each of the fourth computing systems associated with each of the corresponding at least one fourth vehicle indicating that the first alert condition is no longer present at the first alert condition location.
  • the remote computing system may continue sending the one or more third communications to each of the fourth computing systems associated with each of the corresponding at least one fourth vehicle indicating that said fourth vehicle is in proximity to or approaching the first alert condition location.
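  • as a non-limiting sketch of the proximity analysis described in the preceding bullets, a great-circle distance test against each vehicle's reported location and predicted-path waypoints is assumed here as one plausible criterion (the radius and data structures are hypothetical):

      # Hypothetical sketch: pick the vehicles that are near, approaching, or
      # within the region encompassing an alert condition location.
      from math import asin, cos, radians, sin, sqrt

      def haversine_km(lat1, lon1, lat2, lon2):
          # Great-circle distance between two (lat, lon) points, in km.
          dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
          a = (sin(dlat / 2) ** 2
               + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2)
          return 2 * 6371.0 * asin(sqrt(a))

      def vehicles_to_notify(alert, vehicles, radius_km=8.0):
          targets = []
          for v in vehicles:
              near = haversine_km(v["lat"], v["lon"],
                                  alert["lat"], alert["lon"]) < radius_km
              # "Approaching" is proxied by any predicted waypoint falling
              # inside the radius around the alert condition location.
              approaching = any(
                  haversine_km(wlat, wlon, alert["lat"], alert["lon"]) < radius_km
                  for (wlat, wlon) in v.get("predicted_path", []))
              if near or approaching:
                  targets.append(v)
          return targets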
  • Fig. 2 is a schematic block flow diagram illustrating a non-limiting example 200 of object detection using fusion of vision, location, and/or other signals, and/or alert condition messaging based at least in part on object detection using fusion of vision, location, and/or other signals, in accordance with various embodiments.
  • a computing system 110, user device(s) 115, location data signal source(s) 120, and object detection data signal source(s) 125 may be disposed within or on a vehicle 105.
  • the computing system 110 may communicatively couple with a remote computing system(s) 140 (and corresponding database(s) 140a) (and, in some cases, with user device(s) 175 as well) via network(s) 145.
  • computing system 110 may receive object detection data 190 from object detection data signal source(s) 125, and may perform object detection (at block 205).
  • Computing system 110 may also receive location data 185 from location data signal source(s) 120, and may perform fusion of object detection and location (at block 210), in some cases, based at least in part on at least one of object detection data 190, results of object detection (from block 205), and/or location data 185, and/or the like. Computing system 110 may then send vehicle location and/or detection result 215 to remote computing system 140 (and corresponding database(s) 140a) via network(s) 145.
  • remote computing system 140 (and/or corresponding database(s) 140a, which in some cases may include, but is not limited to, a structured query language ("SQL") database(s), or the like) may be configured to store and/or update alert condition information (in some cases, event information, or the like) corresponding to alert condition (e.g., alert condition(s) 160 of Fig. 1, or the like) and/or alert condition location (e.g., alert condition location(s) 165 of Fig. 1, or the like), to interchange and/or exchange the location information, and/or the like.
  • Remote computing system(s) 140 may send alert condition data and location 220 to computing system 110 via network(s) 145.
  • Computing system 110 may then perform coordinate transformation (at block 225) of the alert condition data and location 220, the results of which may be used as input to the fusion process at block 210.
  • Computing system 110 may then display and/or present (at block 230) the alert information (i.e., the results of the fusion at block 210) on user device(s) 115 (which may be located within or on vehicle 105) and/or user device(s) 175 (which may be located external to vehicle 105; in some cases, in communication with computing system 110 via network(s) 145, or the like).
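  • the coordinate transformation at block 225 can be illustrated, under the assumption that alert locations arrive as GPS coordinates and that the vehicle's heading is known, as a world-to-vehicle-frame conversion such as the following (the names and the flat-earth approximation are illustrative only):

      # Hypothetical sketch: express an alert's GPS location in the vehicle's
      # local frame so it can be fused with detections and displayed.
      from math import cos, radians, sin

      EARTH_R = 6371000.0  # mean Earth radius, meters

      def gps_to_local(veh_lat, veh_lon, heading_deg, tgt_lat, tgt_lon):
          # Equirectangular approximation; adequate over a few kilometers.
          east = radians(tgt_lon - veh_lon) * cos(radians(veh_lat)) * EARTH_R
          north = radians(tgt_lat - veh_lat) * EARTH_R
          h = radians(heading_deg)
          forward = north * cos(h) + east * sin(h)
          right = east * cos(h) - north * sin(h)
          return forward, right

      # An alert about 1 km ahead of a north-facing vehicle:
      print(gps_to_local(37.0, -122.0, 0.0, 37.009, -122.0))  # ~(1001, 0)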
  • the application for fusion of object detection and location information may be invoked when turning on the phone camera (e.g., the object detection data signal source(s) 125, or the like).
  • each frame captured by the phone camera may then be passed to the object detection and location information threads (e.g., blocks 205 and 220/225, or the like).
  • the frame may be added to the video display thread (e.g., block 230, or the like).
  • the object detection result may be handled by the same location information thread to send the corresponding location to the remote computing system 140 and/or the database 140a, or the like.
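  • one plausible arrangement of these threads (queue and function names are hypothetical) is sketched below:

      # Hypothetical sketch: camera frames feed the detection thread; results
      # feed the video display thread; alert detections are handed to the same
      # location thread that reports locations to the remote computing system.
      import queue
      import threading

      frames = queue.Queue()    # camera -> object detection (block 205)
      overlays = queue.Queue()  # detection -> video display (block 230)
      uploads = queue.Queue()   # detection -> location thread (blocks 210/220)

      def detection_loop(detect):
          while True:
              frame = frames.get()
              result = detect(frame)         # run the object detection model
              overlays.put((frame, result))  # frame joins the display thread
              if result.get("is_alert"):
                  uploads.put(result)        # location thread sends it upstream

      def location_loop(current_location, send):
          while True:
              result = uploads.get()
              send({"alert": result, "location": current_location()})

      # Wiring (detect/send/current_location supplied by the application):
      # threading.Thread(target=detection_loop, args=(detect,), daemon=True).start()
      # threading.Thread(target=location_loop, args=(gps_fix, send), daemon=True).start()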
  • Various embodiments provide a system and service that may be configured to exchange or interchange location signals and/or object detection results.
  • a SQL database(s) may be used to store the location data, while the mobile phone may be used to perform object detection, to send and receive the information regarding locations.
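  • a hypothetical schema for such a SQL database(s), sketched here with Python's built-in sqlite3 module (the table and column names are assumptions, not the disclosed schema), might be:

      import sqlite3

      conn = sqlite3.connect("alerts.db")
      conn.execute("""
      CREATE TABLE IF NOT EXISTS alert_conditions (
          id         INTEGER PRIMARY KEY,
          kind       TEXT NOT NULL,      -- e.g., 'accident', 'construction'
          lat        REAL NOT NULL,
          lon        REAL NOT NULL,
          reports    INTEGER DEFAULT 1,  -- corroborating reports received
          cleared    INTEGER DEFAULT 0,  -- set once 'alert over' is confirmed
          updated_at REAL NOT NULL
      )
      """)
      conn.commit()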
  • object detection models may run on the frameworks of mobile Pytorch or TensorflowLite, and/or the like.
  • inferencing performed by the object detection model may be processed by a graphics processing unit (“GPU”), in some cases, through C++ or the like to achieve faster runtimes.
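  • although the text contemplates GPU inferencing through C++ or the like, the equivalent logic can be sketched in Python with a TorchScript-exported detector (the model path, input shape, and output format are assumptions):

      import torch

      model = torch.jit.load("detector.torchscript.pt")  # exported detector
      device = "cuda" if torch.cuda.is_available() else "cpu"
      model = model.to(device).eval()

      frame = torch.rand(1, 3, 320, 320, device=device)  # stand-in camera frame
      with torch.no_grad():
          detections = model(frame)  # e.g., boxes/scores/classes, model-specific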
  • sensor data, like GPS data, or the like, on the mobile phone may be accessed through the mobile operating system (“OS”), in some cases via an application programming interface (“API”), or the like.
  • a virtual private network (“VPN”) may encrypt 100% of the Internet traffic sent from a computer and may deliver it to an alternate server somewhere else on the Internet.
  • the subscription of one WiFi device may be recognized as a tracking identifier ("ID") that can provide information of the localizations for a period of time.
  • only subscribers of the VPN service may be allowed to send and receive the locations.
  • a fusion framework that integrates object detection and location signals to provide more useful information may be provided.
  • location signals are not limited to location signals based on GPS data (as described herein), and any location signal may be easily used in this framework.
  • the system (and corresponding service) described in accordance with the various embodiments operates by interchanging and/or exchanging locations (e.g., alert condition locations, or the like) according to the area or region in which potentially affected vehicles may be located, and, in some cases, storing the event locations (or alert condition locations) and pushing them to the mobile phones (or other user devices or computing systems) of users and/or vehicles that are determined to potentially be affected by the events (or alert conditions).
  • various embodiments may employ object detection models running on the mobile phone (or other user devices or computing systems) with additional location information to enhance the driving assistant information, or the like.
  • a system or method that integrates visual images or other object detection signals with sensor signals or other location signals may enable delivery of more useful information to assist drivers.
  • this framework may be based on mobile devices to extend its ability by integrating visual images and location sensor signals to generate new driver assist information. Further, this framework may be scalable because its services could be distributed based on location(s) and may be run on the cloud, or the like.
  • the service may be started if the number of users exceeds some threshold value (e.g., within a region), and that service only serves the users in that area. Once the number of users increases, the services may be expanded accordingly.
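  • the region-based start-up and expansion described above might be sketched as follows (the threshold value and region keying are assumptions):

      # Hypothetical sketch: start a regional service instance only once
      # enough users subscribe in that region; each instance serves only
      # the users of its own region.
      START_THRESHOLD = 100  # assumed minimum users per region

      class RegionalServices:
          def __init__(self):
              self.users = {}      # region key -> subscriber count
              self.active = set()  # regions with a running service instance

          def subscribe(self, region):
              self.users[region] = self.users.get(region, 0) + 1
              if self.users[region] >= START_THRESHOLD and region not in self.active:
                  self.active.add(region)  # e.g., spin up a cloud instance here

          def serves(self, region):
              return region in self.active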
  • the object detection model may be trained for different purposes other than the common ones. For example, occluded pedestrians may be revealed by the framework in the case that other subscribed users are nearby and have these pedestrians in detection range (where such information about the occluded pedestrians may be shared to a user in a vehicle approaching these occluded pedestrians, where such a user is not in line of sight of these pedestrians).
  • the various embodiments are not limited to traffic use, but may be applicable to any vehicle use in any environment, so long as information regarding location and object detection of alert conditions may be shared.
  • since the framework provides a general solution to the fusion of the object detection and the location signals, it may be extended to, but is not limited to, the following aspects: (a) allowing tracking algorithms to be performed on the detection results, since the detection result with ID is the tracking result; (b) allowing various wireless communications such as 4G/5G or WiFi that can contain the location information; (c) allowing depth/distance information measured from and/or by the end users; and (d) allowing post-processing of the fused results; and/or the like.
  • Figs. 3A-3C are schematic block flow diagrams illustrating various non-limiting examples 300, 300', and 300" of alert condition messaging based at least in part on object detection using fusion of vision, location, and/or other signals, in accordance with various embodiments.
  • although Fig. 3 is directed to automobiles as the vehicles 105 and road traffic-related alert conditions, or the like, the various embodiments are not so limited, and any suitable vehicle and/or alert condition (as described herein with respect to Figs. 1 and 4, or the like) may be the focus of the object detection and/or the alert condition messaging as described herein.
  • Fig. 3 merely provides a simple set of examples for illustrating implementation of object detection and/or alert condition messaging, in accordance with the various embodiments.
  • a plurality of vehicles 105 may travel along with bi-directional traffic on lanes 305a and 305b.
  • destination markers or signs 310 may indicate location or distance to a particular location (e.g., signs 310a and 310b indicate that the distance to the town of Springfield is 15 miles and 20 miles, respectively).
  • an alert condition 160 (in this case, a car accident involving vehicles 105e and 105f) has been observed at alert condition location 165 (in this case, on one of lanes 305a heading to the town of Springfield, near the sign indicating 15 miles to Springfield).
  • Computing systems on at least one of vehicle 105d (travelling in the opposite direction on one of lanes 305b), user device 175a (associated with user 180a, who is standing by the side of the affected lane 305a), vehicle 105g (travelling in the same direction on one of lanes 305a), and/or vehicle 105h (travelling in the same direction on one of lanes 305a), and/or the like, may detect the alert condition 160 using object detection and location determination (such as described in detail above with respect to Figs. 1 and 2, or the like), and may send respective alert condition messages 195a, 195b, 195c, and 195d to remote computing system 140, over one or more networks (not shown in Fig. 3A).
  • the alert condition messages 195a, 195b, 195c, and 195d may each include, but is not limited to, object detection data regarding the alert condition 160 and location data regarding the alert condition location 165, and/or the like.
  • computing systems on at least one of vehicles 105i, 105j, and/or 105k (which are near sign 310b, and thus about 5 miles from the alert condition location 165, or the like), and/or the like, may send at least location data 185a, 185b, and 185c, respectively (similar to location data 185 in Figs. 1 and 2, or the like), to remote computing system 140, over the one or more networks.
  • computing systems on at least one of vehicles 105i, 105j, and/or 105k, and/or the like may also send at least object detection data 190a, 190b, and 190c, respectively (similar to object detection data 190 in Figs. 1 and 2, or the like), to remote computing system 140, over the one or more networks.
  • computing systems on the at least one of vehicle 105d, user device 175a, vehicle 105g, and/or vehicle 105h may also send location data 185 and/or object detection data 190, or the like, to remote computing system 140, over the one or more networks.
  • Remote computing system 140 may receive the alert condition messages (e.g., alert condition messages 195a, 195b, 195c, and 195d, or the like) (at block 315), may analyze the alert condition data (at block 320), may identify the alert condition and/or the extent of the alert condition (e.g., the size of the area affected by the alert condition, or the like) (at block 325), and may identify the alert condition location and range (at block 330).
  • Remote computing system 140 may also receive location data of the vehicles (e.g., location data 185a, 185b, and 185c; in some cases, object detection data 190a, 190b, and 190c, as well; in some instances, location data 185 and/or object detection data 190 from computing systems on the at least one of vehicle 105d, user device 175a, vehicle 105g, and/or vehicle 105h, as well) (at block 335).
  • Remote computing system 140 may analyze the location data of the vehicles (as well as any received object detection data) (at block 340).
  • remote computing system 140 may identify one or more vehicles at, near, or approaching the alert condition location, based at least in part on the results of blocks 330 and 340.
  • Remote computing system 140 may subsequently send alert condition messages (e.g., alert condition message 195, or the like) to the computing systems corresponding to the identified vehicles (in this case, vehicles 105i and 105j, which are heading toward the alert condition location 165, but not vehicle 105k, which is going in the opposite direction, as well as away, from the alert condition location 165).
  • the alert condition over messages 195a', 195b', and 195c' may each include, but is not limited to, object detection data regarding the former alert condition 160' and location data regarding the former alert condition location 165', as well as data regarding the absence of the alert condition 160 at the alert condition location 165, and/or the like.
  • computing systems on at least one of vehicles 105o, 105p, and/or 105q may send at least location data 185d, 185e, and 185f, respectively (similar to location data 185 in Figs. 1 and 2, or the like), to remote computing system 140, over the one or more networks.
  • computing systems on at least one of vehicles 105o, 105p, and/or 105q, and/or the like may also send at least object detection data 190d, 190e, and 190f, respectively (similar to object detection data 190 in Figs. 1 and 2, or the like), to remote computing system 140, over the one or more networks.
  • computing systems on the at least one of vehicles 105l, 105m, and/or 105n may also send location data 185 and/or object detection data 190, or the like, to remote computing system 140, over the one or more networks.
  • Remote computing system 140 may receive the alert condition over messages (e.g., alert condition over messages 195a', 195b', and 195c', or the like) (at block 355), and may determine whether a threshold number of such messages have been received (i.e., whether such alert condition over messages 195' from a threshold number of the computing systems corresponding to the threshold number of vehicles has been received by the remote computing system 140) (at block 360).
  • the threshold number may include, but is not limited to, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, or the like.
  • remote computing system 140 may identify vehicles at, near, or approaching the former alert condition location (at block 365), and may subsequently send, at block 370, alert condition over messages (e.g., alert condition over message 195', or the like) to the computing systems corresponding to the identified vehicles (in this case, vehicles 105o and 105p, which are heading toward the former alert condition location 165', but not vehicle 105q, which is going in the opposite direction, as well as away, from the former alert condition location 165').
  • otherwise (i.e., if the threshold number has not been met), remote computing system 140 may identify vehicles at, near, or approaching the alert condition location (at block 345), and may subsequently continue to send, at block 350, alert condition messages (e.g., alert condition message 195, or the like) to the computing systems corresponding to the identified vehicles (in this case, vehicles 105o and 105p, which are heading toward the alert condition location 165, but not vehicle 105q, which is going in the opposite direction, as well as away, from the alert condition location 165).
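  • the threshold logic at blocks 360-370 might be sketched as follows (the particular threshold value and callback names are assumptions):

      # Hypothetical sketch: clear an alert only after enough independent
      # vehicles report that the condition is gone; until then, keep warning
      # vehicles approaching the alert condition location.
      CLEAR_THRESHOLD = 5  # e.g., any of the 3..12 values suggested above

      def on_alert_over_message(alert, reporter_id, notify_cleared, notify_alert):
          alert.setdefault("over_reports", set()).add(reporter_id)
          if len(alert["over_reports"]) >= CLEAR_THRESHOLD:
              alert["cleared"] = True
              notify_cleared(alert)  # push 'alert over' to approaching vehicles
          else:
              notify_alert(alert)    # continue sending the original alerts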
  • a plurality of vehicles 105 may travel along with bi-directional traffic on lanes 305c and 305d.
  • a destination marker(s) or sign(s) 310 may indicate location or distance to a particular location (e.g., sign 310c indicates that the distance to the town of Fairview is 12 miles).
  • an alert condition 160" (in this case, approaching vehicles from the opposite direction making it dangerous for a vehicle attempting to pass the vehicle(s) in front of it) has been observed at alert condition location 165" (in this case, both lands 305c and 305d near the sign indicating 12 miles to Fairview, where vehicles are converging from opposite directions).
  • Computing systems on at least one of vehicle 105r (in this case, a car travelling in a second direction on lane 305d), vehicle 105s (in this case, a semi-tractor-trailer truck travelling in the second direction on lane 305d), vehicle 105t (in this case, a car travelling in a first direction, opposite to the second direction, on lane 305c), and/or vehicle 105u (in this case, another semi-tractor-trailer truck travelling in the first direction on lane 305c), and/or the like, may detect the alert condition 160" using object detection and location determination (such as described in detail above with respect to Figs. 1 and 2, or the like), and may send respective alert condition messages 195e, 195f, 195g, and 195h to remote computing system 140, over the one or more networks.
  • alert condition messages 195e, 195f, 195g, and 195h may each include, but is not limited to, object detection data regarding the alert condition 160" and location data regarding the alert condition location 165", and/or the like.
  • computing systems on at least one of vehicles 105v, 105w, and/or 105x, and/or the like may send at least location data 185g, 185h, and 185i, respectively (similar to location data 185 in Figs. 1 and 2, or the like), to remote computing system 140, over the one or more networks.
  • computing systems on at least one of vehicles 105v, 105w, and/or 105x, and/or the like may also send at least object detection data 190g, 190h, and 190i, respectively (similar to object detection data 190 in Figs. 1 and 2, or the like), to remote computing system 140, over the one or more networks.
  • computing systems on the at least one of vehicles 105r, 105s, 105t, and/or 105u may also send location data 185 and/or object detection data 190, or the like, to remote computing system 140, over the one or more networks.
  • Remote computing system 140 may receive the alert condition messages (e.g., alert condition messages 195e, 195f, 195g, and 195h, or the like) (at block 315), may analyze the alert condition data (at block 320), may identify the alert condition and/or the extent of the alert condition (e.g., the size of the area affected by the alert condition, or the like) (at block 325), and may identify the alert condition location and range (at block 330).
  • Remote computing system 140 may also receive location data of the vehicles (e.g., location data 185g, 185h, and 185i; in some cases, object detection data 190g, 190h, and 190i, as well; in some instances, location data 185 and/or object detection data 190 from computing systems on the at least one of vehicles 105r, 105s, 105t, and/or 105u, as well) (at block 335).
  • Remote computing system 140 may analyze the location data of the vehicles (as well as any received object detection data) (at block 340).
  • remote computing system 140 may identify one or more vehicles at, near, or approaching the alert condition location, based at least in part on the results of blocks 330 and 340.
  • Remote computing system 140 may subsequently send alert condition messages (e.g., alert condition message 195", or the like) to the computing systems corresponding to the identified vehicles (in this case, vehicles 105v and 105w, which are heading toward the alert condition location 165", but not vehicle 105x, which is going in the opposite direction, as well as away, from the alert condition location 165").
  • alert condition messages e.g., alert condition message 195", or the like
  • a similar alert condition may exist for vehicle 105r in terms of the danger in passing vehicle 105s due to oncoming traffic in the form of vehicles 105t, 105u, 105v, and/or 105w, or the like, and the appropriate messages would be sent and received in a manner similar to that described above with respect to vehicles 105v and/or 105w attempting to pass vehicle 105u due to oncoming traffic in the form of vehicles 105r and/or 105s, or the like.
  • Fig. 3A illustrates an example of “object detection to the location” (in this case, where there is a car accident), while Fig. 3C illustrates an example of “location to object detection” (in this case, approaching a dangerous pass zone).
  • the object detection model may detect the car accident and may send its GPS location (or other location data) to the remote computing system (and corresponding database(s)).
  • mobile phones or other user devices of users in those other cars may show the warning message(s) to give notice to these users that there is a car accident ahead and that they should be aware of it.
  • the remote computing system may update its database(s) accordingly and may push updated information to the other users so that they would no longer be warned when they are approaching the affected (now, cleared) location(s).
  • Fig. 3C illustrates an example of "location to object detection" when the occluded object (in this case, oncoming traffic) may be approaching.
  • in this case, the occluded objects are vehicles 105s and 105r approaching from the opposite direction on the adjacent lane of the two-lane road.
  • such information is important for the drivers to know.
  • Figs. 4A-4F are flow diagrams illustrating a method 400 for implementing object detection using fusion of vision, location, and/or other signals, and/or implementing alert condition messaging based at least in part on object detection using fusion of vision, location, and/or other signals, in accordance with various embodiments.
  • Method 400 of Fig. 4A either continues onto Fig. 4C following the circular marker denoted, "A,” or continues onto Fig. 4D following the circular marker denoted, "B.”
  • Method 400 of Fig. 4E continues onto Fig. 4F following the circular marker denoted, "C.” In some cases, method 400 of Fig. 4F may return to Fig. 4E following the circular marker denoted, "D.”
  • while the systems, examples, or embodiments 100, 200, 300, 300', and 300" of Figs. 1, 2, 3A, 3B, and 3C, respectively (or components thereof), can operate according to the method 400 illustrated by Fig. 4 (e.g., by executing instructions embodied on a computer readable medium), the systems, examples, or embodiments 100, 200, 300, 300', and 300" of Figs. 1, 2, 3A, 3B, and 3C can each also operate according to other modes of operation and/or perform other suitable procedures.
  • method 400 at block 402, may comprise determining, using a computing system, at least one of a first current vehicle location or a first predicted path of a vehicle, based at least in part on one or more first location-based data from corresponding one or more first location data signals received from one or more different types of location data signal sources.
  • the computing system may include, without limitation, at least one of a data signal fusing computing system, at least one processor on the user device, at least one processor on a mobile device associated with a user, a vehicle-based computing system, an object detection system, or a driver assistance system, and/or the like.
  • the one or more different types of location data signal sources may each include, but is not limited to, one of a global positioning system (“GPS”) device, a global navigation satellite system (“GNSS”) device, a text recognition-based location identification system, an image recognition-based landmark identification system, a telecommunications signal triangulation-based location identification system, a radar-based location identification system, a sonar-based location identification system, or a lidar-based location identification system, and/or the like.
  • the vehicle may include, without limitation, one of a car, a minivan, a pickup truck, a motorcycle, an all-terrain vehicle, a scooter, a police vehicle, a fire engine, an ambulance, a recreational vehicle, a bus, a commercial van, a commercial truck, a semi-tractor-trailer truck, a boat, a ship, a submersible, an amphibious vehicle, an aircraft, a space vehicle, a satellite, an autonomous vehicle, or a drone (including, but not limited to, one of an aerial drone, a land-based drone, a water-based drone, an amphibious drone, or a space-based drone, and/or the like), and/or the like.
  • determining the at least one of the first current vehicle location or the first predicted path of a vehicle may comprise at least one of: determining, using the computing system, the first current vehicle location based at least in part on GPS data from the GPS device; determining, using the computing system, the first predicted path of the vehicle based at least in part on a series of GPS data from the GPS device over time; determining, using the computing system, the first current vehicle location based at least in part on GNSS data from the GNSS device; determining, using the computing system, the first predicted path of the vehicle based at least in part on a series of GNSS data from the GNSS device over time; or determining, using the computing system, the first current vehicle location based at least in part on text recognition of one or more location-identifying signs, the one or more location-identifying signs comprising at least one of one or more street signs, one or more address signs, one or more business signs, one or more highway signs, one or more city limits signs, one or more county boundary signs, or one or more state boundary signs, and/or the like; with similar determinations being possible based at least in part on corresponding data from the image recognition-based landmark identification system, the telecommunications signal triangulation-based location identification system, the radar-based location identification system, the sonar-based location identification system, or the lidar-based location identification system, and/or the like.
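  • one simple, assumed way to determine the first predicted path from a series of GPS data over time (illustrative only, not necessarily the method of the embodiments) is a bearing-and-rate extrapolation from the most recent fixes:

      # Hypothetical sketch: estimate heading from the last two timestamped
      # GPS fixes and extrapolate a short list of future waypoints.
      from math import atan2, cos, degrees, radians, sin

      def bearing_deg(lat1, lon1, lat2, lon2):
          # Initial bearing from the first fix to the second, in degrees.
          dlon = radians(lon2 - lon1)
          y = sin(dlon) * cos(radians(lat2))
          x = (cos(radians(lat1)) * sin(radians(lat2))
               - sin(radians(lat1)) * cos(radians(lat2)) * cos(dlon))
          return degrees(atan2(y, x)) % 360

      def predict_path(fixes, steps=5):
          # fixes: [(t, lat, lon), ...], oldest first; needs two or more.
          (t1, lat1, lon1), (t2, lat2, lon2) = fixes[-2], fixes[-1]
          dt = t2 - t1
          dlat, dlon = (lat2 - lat1) / dt, (lon2 - lon1) / dt
          heading = bearing_deg(lat1, lon1, lat2, lon2)
          path = [(lat2 + dlat * dt * k, lon2 + dlon * dt * k)
                  for k in range(1, steps + 1)]
          return heading, path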
  • method 400 may comprise sending, using the computing system, a first communication regarding the determined at least one of the first current vehicle location or the first predicted path of the vehicle to a remote computing system over one or more networks.
  • the remote computing system may include, but is not limited to, at least one of a remote data signal fusing computing system, a remote object detection system, or a remote driver assistance system, a server computer over the one or more networks, an image processing server, a graphics processing unit (“GPU”)-based server, a positioning and mapping server, a machine learning system, an artificial intelligence (“AI”) system, a deep learning system, a neural network, a convolutional neural network (“CNN”), a fully convolutional network (“FCN”), a cloud computing system, or a distributed computing system, and/or the like.
  • Method 400 may further comprise, at block 406, in response to existence of at least one first alert condition for corresponding at least one first alert condition location that is in proximity to one or more of the determined at least one of the first current vehicle location or the first predicted path of the vehicle or that is within a first region encompassing the determined at least one of the first current vehicle location or the first predicted path of the vehicle, receiving, using the computing system, a second communication regarding the at least one first alert condition for the corresponding at least one first alert condition location from the remote computing system over the one or more networks.
  • the at least one first alert condition may each include, without limitation, at least one of traffic congestion along the first predicted path of the vehicle potentially causing a slow-down, a traffic accident along the first predicted path of the vehicle potentially causing a slow-down, a construction site along the first predicted path of the vehicle potentially causing a slow-down, one or more people along the first predicted path of the vehicle who are occluded from a perspective of the vehicle, one or more animals along the first predicted path of the vehicle who are occluded from the perspective of the vehicle, one or more objects along the first predicted path of the vehicle who are occluded from the perspective of the vehicle, one or more people potentially blocking the first predicted path of the vehicle, one or more animals potentially blocking the first predicted path of the vehicle, one or more objects potentially blocking the first predicted path of the vehicle, a tracked weather event along or near the first predicted path of the vehicle, a natural hazard potentially blocking the first predicted path of the vehicle, or a manmade hazard potentially blocking the first predicted path of the vehicle, and/or the like.
  • method 400 may comprise determining whether the vehicle is approaching the at least one first alert condition location based at least in part on one or more second location-based data from corresponding one or more second location data signals received from the one or more different types of location data signal sources.
  • method 400 may comprise performing the following tasks: receiving, using the computing system, one or more first object detection data signals from one or more different types of object detection data signal sources (block 410); fusing, using the computing system, the at least one first alert condition with one or more of the received one or more first object detection data signals, the at least one first alert condition location, a second current vehicle location of the vehicle, or a second predicted path of the vehicle to generate first fused data (block 412); and generating and presenting, using the computing system and via one or more user devices, a first alert message indicating that the vehicle is approaching the at least one first alert condition, based at least in part on the generated first fused data (block 414).
  • At least one of the second current vehicle location or the second predicted path of the vehicle may be determined based at least in part on the one or more second location-based data from the corresponding one or more second data signals received from the one or more different types of location data signal sources.
  • the one or more different types of object detection data signal sources may each include, but is not limited to, one of a vision-based object detection system, a radar-based object detection system, a sonar-based object detection system, or a lidar-based object detection system, and/or the like.
  • each user device which may be associated with at least one of a user or the vehicle, may include, without limitation, at least one of a smartphone, a tablet computer, a display device, an augmented reality (“AR”) device, a virtual reality (“VR”) device, a mixed reality (“MR”) device, a vehicle console display, a vehicle heads-up display (“HUD”), a vehicle remote controller display, one or more audio speakers, or one or more haptic response devices, and/or the like.
  • the user device(s) may be disposed within or on the vehicle or may be associated with the vehicle yet located outside the vehicle (in some cases, at a remote location relative to the location of the vehicle, such as in the case of a user device for controlling a drone or a user device that is communicatively coupled with an autonomous vehicle, or the like).
  • Method 400 may either continue onto the process at block 424 in Fig. 4C following the circular marker denoted, "A,” or continue onto the process at block 430 in Fig. 4D following the circular marker denoted, "B.”
  • generating and presenting the first alert message may comprise at least one of: generating a graphical display depicting one or more of the at least one first alert condition or the generated first fused data, and presenting the generated graphical display on a display device (block 416); generating a text-based message depicting one or more of the at least one first alert condition or the generated first fused data, and presenting the text-based message on a display device (block 418); generating at least one message regarding one or more of the at least one first alert condition or the generated first fused data, and sending the at least one message to the user device (block 420); or generating at least one audio message regarding one or more of the at least one first alert condition or the generated first fused data, and sending the at least one audio message to the user device for playback on at least one audio speaker (block 422); and/or the like.
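  • selection among these presentation forms might be sketched as follows (the device-capability interface is hypothetical):

      # Hypothetical sketch: dispatch the first alert message in one or more
      # of the forms of blocks 416-422, based on the target device's abilities.
      def present_alert(device, alert_text, fused_overlay=None):
          if device.get("has_display") and fused_overlay is not None:
              device["show_graphic"](fused_overlay)  # graphical display (416)
          elif device.get("has_display"):
              device["show_text"](alert_text)        # text-based message (418)
          if device.get("messaging_app"):
              device["send_message"](alert_text)     # e-mail/SMS/MMS (420)
          if device.get("has_speaker"):
              device["play_audio"](alert_text)       # audio playback (422)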
  • the at least one message may include, but is not limited to, at least one of an e-mail message, a message sent via a short message service (“SMS”) app, a message sent via a multimedia messaging service (“MMS”) app, or a message sent via a text message app, and/or the like.
  • method 400 may comprise, at or near the at least one first alert condition location, receiving, using the computing system, one or more second object detection data signals from the one or more different types of object detection data signal sources; analyzing, using the computing system, the one or more second object detection data signals (block 426); and based on a determination that the at least one first alert condition is no longer present at the at least one first alert condition location, sending, using the computing system, a third communication to the remote computing system over the one or more networks indicating that the at least one first alert condition is no longer present at the at least one first alert condition location (block 428).
  • method 400 may comprise receiving, using the computing system, one or more third object detection data signals from the one or more different types of object detection data signal sources; analyzing, using the computing system, the one or more third object detection data signals (block 432); based on a determination that the one or more third object detection data signals correspond to at least one second alert condition, determining, using the computing system, at least one of a third current vehicle location or at least one second alert condition location corresponding to the at least one second alert condition, based at least in part on one or more third location-based data from corresponding one or more third data signals received from the one or more different types of location data signal sources (block 434); and sending, using the computing system, a fourth communication to the remote computing system over the one or more networks indicating that the at least one second alert condition has been detected at or near the at least one of the third current vehicle location or the at least one second alert condition location (block 436).
  • method 400 may comprise receiving, using a remote computing system and from each of one or more first computing systems associated with corresponding one or more first vehicles among a plurality of vehicles and over one or more networks, one or more first communications indicating that a first alert condition has been detected at or near one or more of a first current vehicle location of each of the one or more first vehicles or a first alert condition location corresponding to the first alert condition.
  • the one or more first communications may, in some cases, correspond to the fourth communication in Fig. 4D, albeit from one or more computing systems associated with corresponding one or more vehicles.
  • the one or more first computing systems and the one or more second computing systems may each include, without limitation, at least one of a data signal fusing computing system, at least one processor on the user device, at least one processor on a mobile device associated with a user, a vehicle-based computing system, an object detection system, or a driver assistance system, and/or the like.
  • the remote computing system may include, but is not limited to, at least one of a remote data signal fusing computing system, a remote object detection system, or a remote driver assistance system, a server computer over the one or more networks, an image processing server, a graphics processing unit (“GPU”)-based server, a positioning and mapping server, a machine learning system, an artificial intelligence (“AI”) system, a deep learning system, a neural network, a convolutional neural network (“CNN”), a fully convolutional network (“FCN”), a cloud computing system, or a distributed computing system, and/or the like.
  • the plurality of vehicles may each include, without limitation, one of a car, a minivan, a pickup truck, a motorcycle, an all-terrain vehicle, a scooter, a police vehicle, a fire engine, an ambulance, a recreational vehicle, a bus, a commercial van, a commercial truck, a semi-tractor-trailer truck, a boat, a ship, a submersible, an amphibious vehicle, an aircraft, a space vehicle, a satellite, an autonomous vehicle, or a drone (including, but not limited to, one of an aerial drone, a land-based drone, a water-based drone, an amphibious drone, or a space-based drone, and/or the like), and/or the like.
  • the first alert condition may include, but is not limited to, at least one of traffic congestion along the second predicted path of the at least one second vehicle potentially causing a slow-down, a traffic accident along the second predicted path of the at least one second vehicle potentially causing a slow-down, a construction site along the second predicted path of the at least one second vehicle potentially causing a slow-down, one or more people along the second predicted path of the at least one second vehicle who are occluded from a perspective of the at least one second vehicle, one or more animals along the second predicted path of the at least one second vehicle who are occluded from the perspective of the at least one second vehicle, one or more objects along the second predicted path of the at least one second vehicle who are occluded from the perspective of the at least one second vehicle, one or more people potentially blocking the second predicted path of the at least one second vehicle, one or more animals potentially blocking the second predicted path of the at least one second vehicle, one or more objects potentially blocking the second predicted path of the at least one second vehicle, a tracked weather event along or near the second predicted path of the at least one second vehicle, a natural hazard potentially blocking the second predicted path of the at least one second vehicle, or a manmade hazard potentially blocking the second predicted path of the at least one second vehicle, and/or the like.
  • method 400 may comprise receiving, using the remote computing system and from each of one or more second computing systems associated with corresponding one or more second vehicles among the plurality of vehicles and over the one or more networks, one or more second communications regarding at least one of a second current vehicle location or a second predicted path for each of the one or more second vehicles.
  • the one or more second communications may, in some cases, correspond to the first communication in Fig. 4A, albeit from one or more computing systems associated with corresponding one or more vehicles.
  • Method 400 may further comprise, at block 442, analyzing, using the remote computing system, the at least one of the second current vehicle location or the second predicted path for each of the one or more second vehicles in relation to at least one of the first alert condition or the first alert condition location.
  • method 400 may comprise determining whether at least one second vehicle among the one or more second vehicles is in proximity to or approaching the first alert condition location or is within a first region encompassing the first alert condition, based at least in part on the analysis. If so, method 400 may further comprise sending, using the remote computing system, one or more third communications to each of the second computing systems associated with each of the corresponding at least one second vehicle indicating that said second vehicle is in proximity to or approaching the first alert condition location (block 446).
  • the one or more third communications may, in some cases, correspond to the second communication in Fig. 4A, albeit to one or more computing systems associated with corresponding one or more vehicles.
  • Method 400 may continue onto the process at block 448 in Fig. 4F following the circular marker denoted, "C.”
  • method 400 may comprise receiving, using the remote computing system and from each of one or more third computing systems associated with corresponding one or more third vehicles among the plurality of vehicles and over the one or more networks, one or more fourth communications indicating that the first alert condition is no longer present at the first alert condition location; receiving, using the remote computing system and from each of one or more fourth computing systems associated with corresponding one or more fourth vehicles among the plurality of vehicles and over the one or more networks, one or more fifth communications regarding at least one of a fourth current vehicle location or a fourth predicted path for each of the one or more fourth vehicles (block 450); and analyzing, using the remote computing system, the at least one of the fourth current vehicle location or the fourth predicted path for each of the one or more fourth vehicles in relation to at least one of the first alert condition or the first alert condition location (block 452).
  • the one or more fourth communications may, in some cases, correspond to the third communication in Fig. 4C, albeit from one or more computing systems associated with corresponding one or more vehicles.
  • method 400 may comprise determining whether at least one fourth vehicle among the one or more fourth vehicles is in proximity to or approaching the first alert condition location or is within the first region encompassing the first alert condition, based at least in part on the analysis. If so, at block 456, method 400 may further comprise determining whether the one or more fourth communications from a threshold number of third computing systems associated with the corresponding one or more third vehicles have been received. If so, method 400 may further comprise, at block 458, sending, using the remote computing system, one or more fifth communications to each of the fourth computing systems associated with each of the corresponding at least one fourth vehicle indicating that the first alert condition is no longer present at the first alert condition location. If not, method 400 may return to the process at block 446 in Fig. 4E, following the circular marker denoted, "D.”
  • Fig. 5 is a block diagram illustrating an example of computer or system hardware architecture, in accordance with various embodiments.
  • Fig. 5 provides a schematic illustration of one embodiment of a computer system 500 of the service provider system hardware that can perform the methods provided by various other embodiments, as described herein, and/or can perform the functions of computer or hardware system (i.e., computing systems 110 and 170a-170n, user devices 115, 175, and 175a-175n, remote computing systems 140, location determination server 150, and image recognition server 155, etc.), as described above.
  • Fig. 5 is meant only to provide a generalized illustration of various components, of which one or more (or none) of each may be utilized as appropriate.
  • Fig. 5, therefore, broadly illustrates how individual system elements may be implemented in a relatively separated or relatively more integrated manner.
  • the computer or hardware system 500 - which might represent an embodiment of the computer or hardware system (i.e., computing systems 110 and 170a-170n, user devices 115, 175, and 175a-175n, remote computing systems 140, location determination server 150, and image recognition server 155, etc.), described above with respect to Figs. 1-4 - is shown comprising hardware elements that can be electrically coupled via a bus 505 (or may otherwise be in communication, as appropriate).
  • the hardware elements may include one or more processors 510, including, without limitation, one or more general-purpose processors and/or one or more special-purpose processors (such as microprocessors, digital signal processing chips, graphics acceleration processors, and/or the like); one or more input devices 515, which can include, without limitation, a mouse, a keyboard, and/or the like; and one or more output devices 520, which can include, without limitation, a display device, a printer, and/or the like.
  • the computer or hardware system 500 may further include (and/or be in communication with) one or more storage devices 525, which can comprise, without limitation, local and/or network accessible storage, and/or can include, without limitation, a disk drive, a drive array, an optical storage device, or a solid-state storage device such as a random access memory (“RAM”) and/or a read-only memory (“ROM”), which can be programmable, flash-updateable, and/or the like.
  • Such storage devices may be configured to implement any appropriate data stores, including, without limitation, various file systems, database structures, and/or the like.
  • the computer or hardware system 500 might also include a communications subsystem 530, which can include, without limitation, a modem, a network card (wireless or wired), an infra-red communication device, a wireless communication device and/or chipset (such as a Bluetooth™ device, an 802.11 device, a WiFi device, a WiMax device, a WWAN device, cellular communication facilities, etc.), and/or the like.
  • the communications subsystem 530 may permit data to be exchanged with a network (such as the network described below, to name one example), with other computer or hardware systems, and/or with any other devices described herein.
  • the computer or hardware system 500 will further comprise a working memory 535, which can include a RAM or ROM device, as described above.
  • the computer or hardware system 500 also may comprise software elements, shown as being currently located within the working memory 535, including an operating system 540, device drivers, executable libraries, and/or other code, such as one or more application programs 545, which may comprise computer programs provided by various embodiments (including, without limitation, hypervisors, VMs, and the like), and/or may be designed to implement methods, and/or configure systems, provided by other embodiments, as described herein.
  • one or more procedures described with respect to the method(s) discussed above might be implemented as code and/or instructions executable by a computer (and/or a processor within a computer); in an aspect, then, such code and/or instructions can be used to configure and/or adapt a general purpose computer (or other device) to perform one or more operations in accordance with the described methods.
  • a set of these instructions and/or code might be encoded and/or stored on a non-transitory computer readable storage medium, such as the storage device(s) 525 described above.
  • the storage medium might be incorporated within a computer system, such as the system 500.
  • the storage medium might be separate from a computer system (i.e., a removable medium, such as a compact disc, etc.), and/or provided in an installation package, such that the storage medium can be used to program, configure, and/or adapt a general purpose computer with the instructions/code stored thereon.
  • These instructions might take the form of executable code, which is executable by the computer or hardware system 500 and/or might take the form of source and/or installable code, which, upon compilation and/or installation on the computer or hardware system 500 (e.g., using any of a variety of generally available compilers, installation programs, compression/decompression utilities, etc.) then takes the form of executable code.
  • some embodiments may employ a computer or hardware system (such as the computer or hardware system 500) to perform methods in accordance with various embodiments of the invention.
  • some or all of the procedures of such methods are performed by the computer or hardware system 500 in response to processor 510 executing one or more sequences of one or more instructions (which might be incorporated into the operating system 540 and/or other code, such as an application program 545) contained in the working memory 535.
  • Such instructions may be read into the working memory 535 from another computer readable medium, such as one or more of the storage device(s) 525.
  • execution of the sequences of instructions contained in the working memory 535 might cause the processor(s) 510 to perform one or more procedures of the methods described herein.
  • machine readable medium and “computer readable medium,” as used herein, refer to any medium that participates in providing data that causes a machine to operate in some fashion.
  • various computer readable media might be involved in providing instructions/code to processor(s) 510 for execution and/or might be used to store and/or carry such instructions/code (e.g., as signals).
  • a computer readable medium is a non-transitory, physical, and/or tangible storage medium.
  • a computer readable medium may take many forms, including, but not limited to, non-volatile media, volatile media, or the like.
  • Non-volatile media includes, for example, optical and/or magnetic disks, such as the storage device(s) 525.
  • Volatile media includes, without limitation, dynamic memory, such as the working memory 535.
  • a computer readable medium may take the form of transmission media, which includes, without limitation, coaxial cables, copper wire, and fiber optics, including the wires that comprise the bus 505, as well as the various components of the communication subsystem 530 (and/or the media by which the communications subsystem 530 provides communication with other devices).
  • transmission media can also take the form of waves (including without limitation radio, acoustic, and/or light waves, such as those generated during radiowave and infra-red data communications).
  • Common forms of physical and/or tangible computer readable media include, for example, a floppy disk, a flexible disk, a hard disk, magnetic tape, or any other magnetic medium, a CD-ROM, any other optical medium, punch cards, paper tape, any other physical medium with patterns of holes, a RAM, a PROM, an EPROM, a FLASH-EPROM, any other memory chip or cartridge, a carrier wave as described hereinafter, or any other medium from which a computer can read instructions and/or code.
  • Various forms of computer readable media may be involved in carrying one or more sequences of one or more instructions to the processor(s) 510 for execution.
  • the instructions may initially be carried on a magnetic disk and/or optical disc of a remote computer.
  • a remote computer might load the instructions into its dynamic memory and send the instructions as signals over a transmission medium to be received and/or executed by the computer or hardware system 500.
  • These signals, which might be in the form of electromagnetic signals, acoustic signals, optical signals, and/or the like, are all examples of carrier waves on which instructions can be encoded, in accordance with various embodiments of the invention.
  • the communications subsystem 530 (and/or components thereof) generally will receive the signals, and the bus 505 then might carry the signals (and/or the data, instructions, etc. carried by the signals) to the working memory 535, from which the processor(s) 510 retrieves and executes the instructions.
  • the instructions received by the working memory 535 may optionally be stored on a storage device 525 either before or after execution by the processor(s) 510.
  • a set of embodiments comprises methods and systems for implementing driver assistance technologies (e.g., advanced driver assistance systems ("ADASs"), other vision-based object detection, other location signal-based object detection, other vision and location signal-based object detection, or multiple types of signal-based object detection, or the like), and, more particularly, to methods, systems, and apparatuses for implementing object detection using fusion of vision, location, and/or other signals, and/or implementing alert condition messaging based at least in part on object detection using fusion of vision, location, and/or other signals.
  • Fig. 6 illustrates a schematic diagram of a system 600 that can be used in accordance with one set of embodiments.
  • the system 600 can include one or more user computers, user devices, or customer devices 605.
  • a user computer, user device, or customer device 605 can be a general purpose personal computer (including, merely by way of example, desktop computers, tablet computers, laptop computers, handheld computers, and the like, running any appropriate operating system, several of which are available from vendors such as Apple, Microsoft Corp., and the like), cloud computing devices, a server(s), and/or a workstation computer(s) running any of a variety of commercially-available UNIX™ or UNIX-like operating systems.
  • a user computer, user device, or customer device 605 can also have any of a variety of applications, including one or more applications configured to perform methods provided by various embodiments (as described above, for example), as well as one or more office applications, database client and/or server applications, and/or web browser applications.
  • a user computer, user device, or customer device 605 can be any other electronic device, such as a thin-client computer, Internet-enabled mobile telephone, and/or personal digital assistant, capable of communicating via a network (e.g., the network(s) 610 described below) and/or of displaying and navigating web pages or other types of electronic documents.
  • Although the system 600 is shown with two user computers, user devices, or customer devices 605, any number of user computers, user devices, or customer devices can be supported.
  • Some embodiments operate in a networked environment, which can include a network(s) 610.
  • the network(s) 610 can be any type of network familiar to those skilled in the art that can support data communications using any of a variety of commercially-available (and/or free or proprietary) protocols, including, without limitation, TCP/IP, SNA™, IPX™, AppleTalk™, and the like.
  • merely by way of example, the network(s) 610 can each include a local area network ("LAN"); a wide-area network ("WAN"); a wireless wide area network ("WWAN"); a virtual network, such as a virtual private network ("VPN"); a public switched telephone network ("PSTN"); a wireless network (including, without limitation, a network operating under any of the IEEE 802.11 suite of protocols, the Bluetooth™ protocol known in the art, and/or any other wireless protocol); and/or any combination of these and/or other networks.
  • the network might include an access network of the service provider (e.g., an Internet service provider (“ISP”)).
  • the network might include a core network of the service provider, and/or the Internet.
  • Embodiments can also include one or more server computers 615.
  • Each of the server computers 615 may be configured with an operating system, including, without limitation, any of those discussed above, as well as any commercially (or freely) available server operating systems.
  • Each of the servers 615 may also be running one or more applications, which can be configured to provide services to one or more clients 605 and/or other servers 615.
  • one of the servers 615 might be a data server, a web server, a cloud computing device(s), or the like, as described above.
  • the data server might include (or be in communication with) a web server, which can be used, merely by way of example, to process requests for web pages or other electronic documents from user computers 605.
  • the web server can also run a variety of server applications, including HTTP servers, FTP servers, CGI servers, database servers, Java servers, and the like.
  • the web server may be configured to serve web pages that can be operated within a web browser on one or more of the user computers 605 to perform methods of the invention.
  • the server computers 615 might include one or more application servers, which can be configured with one or more applications accessible by a client running on one or more of the client computers 605 and/or other servers 615.
  • the server(s) 615 can be one or more general purpose computers capable of executing programs or scripts in response to the user computers 605 and/or other servers 615, including, without limitation, web applications (which might, in some cases, be configured to perform methods provided by various embodiments).
  • a web application can be implemented as one or more scripts or programs written in any suitable programming language, such as Java™, C, C#™, or C++, and/or any scripting language, such as Perl, Python, or TCL, as well as combinations of any programming and/or scripting languages.
  • the application server(s) can also include database servers, including, without limitation, those commercially available from Oracle™, Microsoft™, Sybase™, IBM™, and the like, which can process requests from clients (including, depending on the configuration, dedicated database clients, API clients, web browsers, etc.) running on a user computer, user device, or customer device 605 and/or another server 615.
  • an application server can perform one or more of the processes for implementing driver assistance technologies and, more particularly, for implementing object detection using fusion of vision, location, and/or other signals, and/or implementing alert condition messaging based at least in part on object detection using fusion of vision, location, and/or other signals, as described in detail above.
  • Data provided by an application server may be formatted as one or more web pages (comprising HTML, JavaScript, etc., for example) and/or may be forwarded to a user computer 605 via a web server (as described above, for example).
  • a web server might receive web page requests and/or input data from a user computer 605 and/or forward the web page requests and/or input data to an application server.
  • a web server may be integrated with an application server.
  • one or more servers 615 can function as a file server and/or can include one or more of the files (e.g., application code, data files, etc.) necessary to implement various disclosed methods, incorporated by an application running on a user computer 605 and/or another server 615.
  • a file server can include all necessary files, allowing such an application to be invoked remotely by a user computer, user device, or customer device 605 and/or server 615.
  • the system can include one or more databases 620a-620n (collectively, "databases 620").
  • The location of each of the databases 620 is discretionary: merely by way of example, a database 620a might reside on a storage medium local to (and/or resident in) a server 615a (and/or a user computer, user device, or customer device 605).
  • a database 620n can be remote from any or all of the computers 605, 615, so long as it can be in communication (e.g., via the network 610) with one or more of these.
  • a database 620 can reside in a storage-area network ("SAN") familiar to those skilled in the art.
  • the database 620 can be a relational database, such as an Oracle database, that is adapted to store, update, and retrieve data in response to SQL-formatted commands (a minimal illustrative sketch of such storage and retrieval follows this list).
  • the database might be controlled and/or maintained by a database server, as described above, for example.
  • system 600 might further comprise a first vehicle 625a among a plurality of vehicles (similar to vehicles 105 and 105a-105x of Figs. 1-3, or the like).
  • System 600 may further comprise computing system 630 that may be disposed within or on vehicle 625 (similar to computing systems 110 of Figs. 1 and 2, or the like) or external to vehicle 625 yet associated therewith (not shown in Fig. 6; similar to computing systems 170a-170n of Fig. 1, or the like).
  • system 600 may further comprise user device(s) 635 that may be disposed within or on vehicle 625 (optional; similar to optional user device(s) 115 of Figs. 1 and 2, or the like).
  • Each vehicle 625 may further comprise one or more location sensors 640 (similar to location sensor(s) or location data signal source(s) 120 of Figs. 1 and 2, or the like) and one or more object detection sensors 645 (similar to object detection sensor(s) or object detection data signal source(s) 125 of Figs. 1 and 2, or the like).
  • System 600 may further comprise remote computing system(s) 660 and corresponding database(s) 660a (similar to remote computing systems 140 and corresponding database(s) 140a of Figs. 1-3, or the like), location determination service 665 and corresponding database(s) 665a (similar to location determination server 150 and corresponding database(s) 150a of Fig. 1, or the like), and image recognition server 670 and corresponding database(s) 670a (similar to image recognition server 155 and corresponding database(s) 155a of Fig. 1, or the like).
  • computing system 630 and/or user device(s) 605 or 635 may determine at least one of a first current vehicle location (e.g., vehicle location 650a for corresponding vehicle 625a, or the like) or a first predicted path of a vehicle (e.g., vehicle path 655a (denoted in Fig. 6 by the broad arrow, or the like) for vehicle 625a, or the like), based at least in part on one or more first location-based data (e.g., current location data 685, or the like) from corresponding one or more first location data signals received from one or more different types of location data signal sources (e.g., location sensor(s) 640, or the like).
  • the computing system may send a first communication regarding the determined at least one of the first current vehicle location or the first predicted path of the vehicle (e.g., a message containing current location data 685, or the like) to a remote computing system (e.g., remote computing system 660, or the like) over one or more networks (e.g., network(s) 610, or the like).
  • the computing system may receive a second communication (e.g., alert condition data 695, or the like) regarding the at least one first alert condition (e.g., alert condition 675, or the like) for the corresponding at least one first alert condition location (e.g., alert condition location 680, or the like) from the remote computing system over the one or more networks.
  • the at least one first alert condition may each include, without limitation, at least one of traffic congestion along the first predicted path of the vehicle potentially causing a slow-down, a traffic accident along the first predicted path of the vehicle potentially causing a slow-down, a construction site along the first predicted path of the vehicle potentially causing a slow-down, one or more people along the first predicted path of the vehicle who are occluded from a perspective of the vehicle, one or more animals along the first predicted path of the vehicle who are occluded from the perspective of the vehicle, one or more objects along the first predicted path of the vehicle who are occluded from the perspective of the vehicle, one or more people potentially blocking the first predicted path of the vehicle, one or more animals potentially blocking the first predicted path of the vehicle, one or more objects potentially blocking the first predicted path of the vehicle, a tracked weather event along or near the first predicted path of the vehicle, a natural hazard potentially blocking the first predicted path of the vehicle, a manmade hazard potentially blocking the first predicted path of the vehicle, one or more people potentially intercepting the vehicle along the first predicted path, one or more animals potentially intercepting the vehicle along the first predicted path, one or more objects potentially intercepting the vehicle along the first predicted path, or one or more other vehicles potentially intercepting the vehicle along the first predicted path, and/or the like.
  • the computing system may perform the following tasks: receive one or more first object detection data signals (e.g., signals containing object detection data 690, or the like) from one or more different types of object detection data signal sources (e.g., object detection sensor(s) 645, or the like); fuse the at least one first alert condition with one or more of the received one or more first object detection data signals, the at least one first alert condition location, a second current vehicle location of the vehicle, or a second predicted path of the vehicle to generate first fused data; and generate and present, via one or more user devices (e.g., user device(s) 605, 605a, 605b, and/or 635, or the like), a first alert message indicating that the vehicle is approaching the at least one first alert condition, based at least in part on the generated first fused data.
  • At least one of the second current vehicle location or the second predicted path of the vehicle may be determined based at least in part on the one or more second location-based data from the corresponding one or more second data signals received from the one or more different types of location data signal sources (similar to the first current vehicle location or the first predicted path, or the like).
  • determining the at least one of the first current vehicle location or the first predicted path of a vehicle may comprise the computing system performing at least one of: (1) determining the first current vehicle location based at least in part on GPS data from the GPS device; (2) determining the first predicted path of the vehicle based at least in part on a series of GPS data from the GPS device over time; (3) determining the first current vehicle location based at least in part on GNSS data from the GNSS device; (4) determining the first predicted path of the vehicle based at least in part on a series of GNSS data from the GNSS device over time; (5) determining the first current vehicle location based at least in part on text recognition of one or more location-identifying signs, the one or more location-identifying signs comprising at least one of one or more street signs, one or more address signs, one or more business signs, one or more highway signs, one or more city limits signs, one or more county boundary signs, one or more state boundary signs, one or more province boundary signs, one or more territory boundary signs, one or more regional boundary signs, one or more national border signs, one or more landmark identification signs, one or more distance to destination signs, one or more route markers, one or more highway location markers, one or more driver location signs, or one or more mile markers, and/or the like; (6) determining the first predicted path of the vehicle based at least in part on text recognition of a plurality of location-identifying signs over time; (7) determining the first current vehicle location based at least in part on image recognition of one or more landmarks; (8) determining the first predicted path of the vehicle based at least in part on image recognition of a plurality of landmarks over time; (9) determining the first current vehicle location based at least in part on analysis of signal strength of telecommunications signals from a plurality of stationary telecommunications transceivers; (10) determining the first predicted path of the vehicle based at least in part on analysis of changes in signal strength of telecommunications signals from the plurality of stationary telecommunications transceivers over time; (11) determining the first current vehicle location based at least in part on analysis of radar signal data; (12) determining the first predicted path of the vehicle based at least in part on analysis of changes in radar signal data over time; (13) determining the first current vehicle location based at least in part on analysis of sonar signal data; (14) determining the first predicted path of the vehicle based at least in part on analysis of changes in sonar signal data over time; (15) determining the first current vehicle location based at least in part on analysis of lidar signal data; or (16) determining the first predicted path of the vehicle based at least in part on analysis of changes in lidar signal data over time; and/or the like.
  • generating and presenting the first alert message may comprise at least one of: generating a graphical display depicting one or more of the at least one first alert condition or the generated first fused data, and presenting the generated graphical display on a display device (e.g., user devices 605, 605a, 605b, and/or 635, or the like); generating a text-based message depicting one or more of the at least one first alert condition or the generated first fused data, and presenting the text-based message on a display device (e.g., user devices 605, 605a, 605b, and/or 635, or the like); generating at least one message regarding one or more of the at least one first alert condition or the generated first fused data, and sending the at least one message to the user device (e.g., user devices 605, 605a, 605b, and/or 635, or the like); or generating at least one audio message regarding one or more of the at least one first alert condition or the generated first fused data, and sending the at least one audio message to the user device for playback on at least one audio speaker; and/or the like.
  • the at least one message may include, but is not limited to, at least one of an e-mail message, a short message service ("SMS") app, a multimedia messaging service ("MMS") app, or a text message app, and/or the like.
  • the computing system may receive one or more second object detection data signals from the one or more different types of object detection data signal sources.
  • the computing system may analyze the one or more second object detection data signals. Based on a determination that the at least one first alert condition is no longer present at the at least one first alert condition location, the computing system may send a third communication to the remote computing system over the one or more networks indicating that the at least one first alert condition is no longer present at the at least one first alert condition location.
  • the computing system may receive one or more third object detection data signals from the one or more different types of object detection data signal sources.
  • the computing system may analyze the one or more third object detection data signals. Based on a determination that the one or more third object detection data signals correspond to at least one second alert condition, the computing system may determine at least one of a third current vehicle location or at least one second alert condition location corresponding to the at least one second alert condition, based at least in part on one or more third location-based data from corresponding one or more third data signals received from the one or more different types of location data signal sources.
  • the computing system may send a fourth communication to the remote computing system over the one or more networks indicating that the at least one second alert condition has been detected at or near the at least one of the third current vehicle location or the at least one second alert condition location.
  • remote computing system 660 may receive, from each of one or more first computing systems associated with corresponding one or more first vehicles among a plurality of vehicles and over one or more networks, one or more first communications indicating that a first alert condition has been detected at or near one or more of a first current vehicle location of each of the one or more first vehicles or a first alert condition location corresponding to the first alert condition.
  • the remote computing system may receive, from each of one or more second computing systems associated with corresponding one or more second vehicles among the plurality of vehicles and over the one or more networks, one or more second communications regarding at least one of a second current vehicle location or a second predicted path for each of the one or more second vehicles.
  • the remote computing system may analyze the at least one of the second current vehicle location or the second predicted path for each of the one or more second vehicles in relation to at least one of the first alert condition or the first alert condition location. Based on a determination that at least one second vehicle among the one or more second vehicles is in proximity to or approaching the first alert condition location or is within a first region encompassing the first alert condition, based at least in part on the analysis, the remote computing system may send one or more third communications to each of the second computing systems associated with each of the corresponding at least one second vehicle indicating that said second vehicle is in proximity to or approaching the first alert condition location.
  • the remote computing system may receive, from each of one or more third computing systems associated with corresponding one or more third vehicles among the plurality of vehicles and over the one or more networks, one or more fourth communications indicating that the first alert condition is no longer present at the first alert condition location.
  • the remote computing system may receive, from each of one or more fourth computing systems associated with corresponding one or more fourth vehicles among the plurality of vehicles and over the one or more networks, one or more fifth communications regarding at least one of a fourth current vehicle location or a fourth predicted path for each of the one or more fourth vehicles.
  • the remote computing system may analyze the at least one of the fourth current vehicle location or the fourth predicted path for each of the one or more fourth vehicles in relation to at least one of the first alert condition or the first alert condition location.
  • the remote computing system may send one or more fifth communications to each of the fourth computing systems associated with each of the corresponding at least one fourth vehicle indicating that the first alert condition is no longer present at the first alert condition location.
  • the remote computing system may continue sending the one or more third communications to each of the fourth computing systems associated with each of the corresponding at least one fourth vehicle indicating that said fourth vehicle is in proximity to or approaching the first alert condition location.
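
Merely by way of illustration, the following Python sketch shows how an alert-condition store such as databases 620 might store, update, and retrieve alert records in response to SQL-formatted commands (this is the sketch referenced above). The table name, columns, and coordinate values are illustrative assumptions, not drawn from the disclosure.

```python
# Hypothetical sketch of an alert-condition store; schema is illustrative.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    """CREATE TABLE alert_conditions (
           id INTEGER PRIMARY KEY,
           kind TEXT NOT NULL,       -- e.g. 'accident', 'occluded_pedestrian'
           lat REAL NOT NULL,
           lon REAL NOT NULL,
           active INTEGER DEFAULT 1  -- cleared alerts are flagged inactive
       )"""
)

def store_alert(kind: str, lat: float, lon: float) -> int:
    cur = conn.execute(
        "INSERT INTO alert_conditions (kind, lat, lon) VALUES (?, ?, ?)",
        (kind, lat, lon),
    )
    conn.commit()
    return cur.lastrowid

def active_alerts_in_box(lat_min, lat_max, lon_min, lon_max):
    # Retrieve active alerts within a bounding box around a vehicle's
    # current location or predicted path.
    return conn.execute(
        "SELECT id, kind, lat, lon FROM alert_conditions "
        "WHERE active = 1 AND lat BETWEEN ? AND ? AND lon BETWEEN ? AND ?",
        (lat_min, lat_max, lon_min, lon_max),
    ).fetchall()

store_alert("accident", 40.7580, -73.9855)
print(active_alerts_in_box(40.75, 40.77, -74.00, -73.97))
```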

Landscapes

  • Engineering & Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • Mathematical Physics (AREA)
  • Human Computer Interaction (AREA)
  • Traffic Control Systems (AREA)

Abstract

Novel tools and techniques are provided for implementing object detection using fusion of vision, location, and/or other signals, and/or implementing alert condition messaging based thereon. In various embodiments, based on a determination that an alert condition(s) is near a location of a vehicle(s) or its predicted path(s), an alert condition message (including information regarding the alert condition and its determined location) may be sent or pushed to a computing system or user device associated with (and in some cases, located within or on) the vehicle(s). As the vehicle(s) approaches the alert condition location(s), its computing system or user device may fuse the alert condition with one or more of object detection data signals, alert condition location, and/or current location and/or current predicted path of the vehicle to generate fused data, which may then be used to generate and present alert messages indicating that the vehicle(s) is(are) approaching the alert condition.

Description

OBJECT DETECTION USING FUSION OF
VISION, LOCATION, AND/OR OTHER SIGNALS
COPYRIGHT STATEMENT
[0001] A portion of the disclosure of this patent document contains material that is subject to copyright protection. The copyright owner has no objection to the facsimile reproduction by anyone of the patent document or the patent disclosure as it appears in the Patent and Trademark Office patent file or records, but otherwise reserves all copyright rights whatsoever.
FIELD
[0002] The present disclosure relates, in general, to methods, systems, and apparatuses for implementing driver assistance technologies (e.g., advanced driver assistance systems ("ADASs"), other vision-based object detection, other location signal-based object detection, other vision and location signal-based object detection, or multiple types of signal-based object detection, or the like), and, more particularly, to methods, systems, and apparatuses for implementing object detection using fusion of vision, location, and/or other signals, and/or implementing alert condition messaging based at least in part on object detection using fusion of vision, location, and/or other signals.
BACKGROUND
[0003] Object detection is a critical application in computer vision and is widely used in autonomous driving. Object detection is mainly used to localize objects and to recognize what they are based on visual images. With the help of deep learning and a large number of training images, object detection can deliver excellent results when the target objects are not blurred or mostly occluded. In reality, however, it is often blurred or occluded objects that cause accidents. [0004] Many existing systems and methods for improving the accuracy of the detection of occluded and/or blurred objects rely on a single technique, such as vision-based detection or location-based detection. Each of these techniques has limitations stemming from the characteristics of vision or location signals (e.g., some conventional vision-based systems and methods perform poorly when objects are in a blind spot, or the like; on the other hand, location signals (e.g., radar, or the like) may easily interfere with one another and may exhibit errors where high precision is required, or the like).
[0005] Another issue with conventional object detection systems is that they serve only their end users. That is, the detection results are generated from the point of view of the end user and are usually not shared because of privacy concerns.
[0006] Hence, there is a need for more robust and scalable solutions for implementing driver assistance technologies.
SUMMARY
[0007] The techniques of this disclosure generally relate to tools and techniques for implementing driver assistance technologies, and, more particularly, to methods, systems, and apparatuses for implementing object detection using fusion of vision, location, and/or other signals, and/or implementing alert condition messaging based at least in part on object detection using fusion of vision, location, and/or other signals. [0008] In an aspect, a method may comprise determining, using a computing system, at least one of a first current vehicle location or a first predicted path of a vehicle, based at least in part on one or more first location-based data from corresponding one or more first location data signals received from one or more different types of location data signal sources; sending, using the computing system, a first communication regarding the determined at least one of the first current vehicle location or the first predicted path of the vehicle to a remote computing system over one or more networks; and, in response to existence of at least one first alert condition for corresponding at least one first alert condition location that is in proximity to one or more of the determined at least one of the first current vehicle location or the first predicted path of the vehicle or that is within a first region encompassing the determined at least one of the first current vehicle location or the first predicted path of the vehicle, receiving, using the computing system, a second communication regarding the at least one first alert condition for the corresponding at least one first alert condition location from the remote computing system over the one or more networks. The method may further comprise, based on a determination that the vehicle is approaching the at least one first alert condition location based at least in part on one or more second location-based data from corresponding one or more second location data signals received from the one or more different types of location data signal sources, performing the following tasks: receiving, using the computing system, one or more first object detection data signals from one or more different types of object detection data signal sources; fusing, using the computing system, the at least one first alert condition with one or more of the received one or more first object detection data signals, the at least one first alert condition location, a second current vehicle location of the vehicle, or a second predicted path of the vehicle to generate first fused data; and generating and presenting, using the computing system and via one or more user devices, a first alert message indicating that the vehicle is approaching the at least one first alert condition, based at least in part on the generated first fused data.
[0009] In another aspect, a system might comprise a computing system, which might comprise at least one first processor and a first non-transitory computer readable medium communicatively coupled to the at least one first processor. The first non-transitory computer readable medium might have stored thereon computer software comprising a first set of instructions that, when executed by the at least one first processor, causes the computing system to: determine at least one of a first current vehicle location or a first predicted path of a vehicle, based at least in part on one or more first location-based data from corresponding one or more first location data signals received from one or more different types of location data signal sources; send the determined at least one of the first current vehicle location or the first predicted path of the vehicle to a remote computing system over one or more networks; in response to existence of at least one first alert condition for corresponding at least one first alert condition location that is in proximity to one or more of the determined at least one of the first current vehicle location or the first predicted path of the vehicle or that is within a first region encompassing the determined at least one of the first current vehicle location or the first predicted path of the vehicle, receive the at least one first alert condition for the corresponding at least one first alert condition location from the remote computing system over the one or more networks; based on a determination that the vehicle is approaching the at least one first alert condition location based at least in part on one or more second location-based data from corresponding one or more second location data signals received from the one or more different types of location data signal sources, perform the following tasks: receive one or more first object detection data signals from one or more different types of object detection data signal sources; fuse the at least one first alert condition with one or more of the received one or more first object detection data signals, the at least one first alert condition location, a second current vehicle location of the vehicle, or a second predicted path of the vehicle to generate first fused data; and generate and present, using the computing system and via one or more user devices, a first alert message indicating that the vehicle is approaching the at least one first alert condition, based at least in part on the generated first fused data.
[0010] In yet another aspect, a method may comprise receiving, using a remote computing system and from each of one or more first computing systems associated with corresponding one or more first vehicles among a plurality of vehicles and over one or more networks, one or more first communications indicating that a first alert condition has been detected at or near one or more of a first current vehicle location of each of the one or more first vehicles or a first alert condition location corresponding to the first alert condition; receiving, using the remote computing system and from each of one or more second computing systems associated with corresponding one or more second vehicles among the plurality of vehicles and over the one or more networks, one or more second communications regarding at least one of a second current vehicle location or a second predicted path for each of the one or more second vehicles; analyzing, using the remote computing system, the at least one of the second current vehicle location or the second predicted path for each of the one or more second vehicles in relation to at least one of the first alert condition or the first alert condition location; and based on a determination that at least one second vehicle among the one or more second vehicles is in proximity to or approaching the first alert condition location or is within a first region encompassing the first alert condition, based at least in part on the analysis, sending, using the remote computing system, one or more third communications to each of the second computing systems associated with each of the corresponding at least one second vehicle indicating that said second vehicle is in proximity to or approaching the first alert condition location. [0011] In still another aspect, a system might comprise a remote computing system, which might comprise at least one first processor and a first non-transitory computer readable medium communicatively coupled to the at least one first processor. 
The first non-transitory computer readable medium might have stored thereon computer software comprising a first set of instructions that, when executed by the at least one first processor, causes the remote computing system to: receive, from each of one or more first computing systems associated with corresponding one or more first vehicles among a plurality of vehicles and over one or more networks, one or more first communications indicating that a first alert condition has been detected at or near one or more of a first current vehicle location of each of the one or more first vehicles or a first alert condition location corresponding to the first alert condition; receive, from each of one or more second computing systems associated with corresponding one or more second vehicles among the plurality of vehicles and over the one or more networks, one or more second communications regarding at least one of a second current vehicle location or a second predicted path for each of the one or more second vehicles; analyze the at least one of the second current vehicle location or the second predicted path for each of the one or more second vehicles in relation to at least one of the first alert condition or the first alert condition location; and based on a determination that at least one second vehicle among the one or more second vehicles is in proximity to or approaching the first alert condition location or is within a first region encompassing the first alert condition, based at least in part on the analysis, send one or more third communications to each of the second computing systems associated with each of the corresponding at least one second vehicle indicating that said second vehicle is in proximity to or approaching the first alert condition location.
[0012] Various modifications and additions can be made to the embodiments discussed without departing from the scope of the invention. For example, while the embodiments described above refer to particular features, the scope of this invention also includes embodiments having different combinations of features and embodiments that do not include all of the above-described features.
[0013] The details of one or more aspects of the disclosure are set forth in the accompanying drawings and the description below. Other features, objects, and advantages of the techniques described in this disclosure will be apparent from the description and drawings, and from the claims.
BRIEF DESCRIPTION OF THE DRAWINGS
[0014] A further understanding of the nature and advantages of particular embodiments may be realized by reference to the remaining portions of the specification and the drawings, in which like reference numerals are used to refer to similar components. In some instances, a sub-label is associated with a reference numeral to denote one of multiple similar components. When reference is made to a reference numeral without specification to an existing sub-label, it is intended to refer to all such multiple similar components.
[0015] Fig. 1 is a schematic diagram illustrating a system for implementing object detection using fusion of vision, location, and/or other signals, and/or implementing alert condition messaging based at least in part on object detection using fusion of vision, location, and/or other signals, in accordance with various embodiments.
[0016] Fig. 2 is a schematic block flow diagram illustrating a non-limiting example of object detection using fusion of vision, location, and/or other signals, and/or alert condition messaging based at least in part on object detection using fusion of vision, location, and/or other signals, in accordance with various embodiments.
[0017] Figs. 3A-3C are schematic block flow diagrams illustrating various non-limiting examples of alert condition messaging based at least in part on object detection using fusion of vision, location, and/or other signals, in accordance with various embodiments.
[0018] Figs. 4A-4F are flow diagrams illustrating a method for implementing object detection using fusion of vision, location, and/or other signals, and/or implementing alert condition messaging based at least in part on object detection using fusion of vision, location, and/or other signals, in accordance with various embodiments. [0019] Fig. 5 is a block diagram illustrating an example of computer or system hardware architecture, in accordance with various embodiments.
[0020] Fig. 6 is a block diagram illustrating a networked system of computers, computing systems, or system hardware architecture, which can be used in accordance with various embodiments.
DETAILED DESCRIPTION
[0021] Overview
[0022] Various embodiments provide tools and techniques for implementing driver assistance technologies (e.g., advanced driver assistance systems ("ADASs"), other vision-based object detection, other location signal-based object detection, other vision and location signal-based object detection, or multiple types of signal-based object detection, or the like), and, more particularly, to methods, systems, and apparatuses for implementing object detection using fusion of vision, location, and/or other signals, and/or implementing alert condition messaging based at least in part on object detection using fusion of vision, location, and/or other signals.
[0023] In various embodiments, a computing system may determine at least one of a first current vehicle location or a first predicted path of a vehicle, based at least in part on one or more first location-based data from corresponding one or more first location data signals received from one or more different types of location data signal sources. The computing system may send a first communication regarding the determined at least one of the first current vehicle location or the first predicted path of the vehicle to a remote computing system over one or more networks. In response to existence of at least one first alert condition for corresponding at least one first alert condition location that is in proximity to one or more of the determined at least one of the first current vehicle location or the first predicted path of the vehicle or that is within a first region encompassing the determined at least one of the first current vehicle location or the first predicted path of the vehicle, the computing system may receive a second communication regarding the at least one first alert condition for the corresponding at least one first alert condition location from the remote computing system over the one or more networks. Based on a determination that the vehicle is approaching the at least one first alert condition location based at least in part on one or more second location-based data from corresponding one or more second location data signals received from the one or more different types of location data signal sources, the computing system may perform the following tasks: receive one or more first object detection data signals from one or more different types of object detection data signal sources; fuse the at least one first alert condition with one or more of the received one or more first object detection data signals, the at least one first alert condition location, a second current vehicle location of the vehicle, or a second predicted path of the vehicle to generate first fused data; and generate and present, via one or more user devices, a first alert message indicating that the vehicle is approaching the at least one first alert condition, based at least in part on the generated first fused data.
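[0023.1] Merely by way of illustration, the following Python sketch traces the client-side flow of paragraph [0023] end to end: report location and predicted path, receive a pushed alert condition, fuse it with local object detection data, and emit an alert message. The message shapes, the stubbed remote endpoint, and all names and coordinate values are assumptions for exposition, not part of the disclosure.

```python
from dataclasses import dataclass

@dataclass
class Alert:
    kind: str
    location: tuple  # (lat, lon) of the alert condition location

class StubRemote:
    """Stand-in for the remote computing system."""
    def push_state(self, location, predicted_path):
        # First communication goes up; the second communication (nearby
        # alert conditions) comes back as the reply.
        return [Alert("occluded_pedestrian", (40.7581, -73.9850))]

def fuse(alert, detections, location, predicted_path):
    # Fuse the pushed alert with locally sensed object detections and the
    # refreshed vehicle location/path into one record for the alert message.
    return {
        "alert": alert,
        "confirmed_locally": any(d["kind"] == alert.kind for d in detections),
        "vehicle_location": location,
        "predicted_path": predicted_path,
    }

remote = StubRemote()
location = (40.7579, -73.9860)
path = [(40.7580, -73.9857), (40.7581, -73.9853)]
for alert in remote.push_state(location, path):
    detections = [{"kind": "occluded_pedestrian", "bbox": (10, 20, 40, 80)}]
    fused = fuse(alert, detections, location, path)
    print(f"Alert message: approaching {alert.kind} at {alert.location} "
          f"(confirmed locally: {fused['confirmed_locally']})")
```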
[0024] In some embodiments, the computing system may comprise at least one of a data signal fusing computing system, at least one processor on the user device, at least one processor on a mobile device associated with a user, a vehicle-based computing system, an object detection system, or a driver assistance system, and/or the like. In some instances, the remote computing system may comprise at least one of a remote data signal fusing computing system, a remote object detection system, or a remote driver assistance system, a server computer over the one or more networks, an image processing server, a graphics processing unit ("GPU")-based server, a positioning and mapping server, a machine learning system, an artificial intelligence ("AI") system, a deep learning system, a neural network, a convolutional neural network ("CNN"), a fully convolutional network ("FCN"), a cloud computing system, or a distributed computing system, and/or the like.
[0025] According to some embodiments, the one or more different types of location data signal sources may each comprise one of a global positioning system ("GPS") device, a global navigation satellite system ("GNSS") device, a text recognition-based location identification system, an image recognition-based landmark identification system, a telecommunications signal triangulation-based location identification system, a radar-based location identification system, a sonar-based location identification system, or a lidar-based location identification system, and/or the like.
[0026] In some instances, determining the at least one of the first current vehicle location or the first predicted path of a vehicle may comprise at least one of: determining, using the computing system, the first current vehicle location based at least in part on GPS data from the GPS device; determining, using the computing system, the first predicted path of the vehicle based at least in part on a series of GPS data from the GPS device over time; determining, using the computing system, the first current vehicle location based at least in part on GNSS data from the GNSS device; determining, using the computing system, the first predicted path of the vehicle based at least in part on a series of GNSS data from the GNSS device over time; determining, using the computing system, the first current vehicle location based at least in part on text recognition of one or more location-identifying signs, the one or more location-identifying signs comprising at least one of one or more street signs, one or more address signs, one or more business signs, one or more highway signs, one or more city limits signs, one or more county boundary signs, one or more state boundary signs, one or more province boundary signs, one or more territory boundary signs, one or more regional boundary signs, one or more national border signs, one or more landmark identification signs, one or more distance to destination signs, one or more route markers, one or more highway location markers, one or more driver location signs, or one or more mile markers, and/or the like; determining, using the computing system, the first predicted path of the vehicle based at least in part on text recognition of a plurality of location-identifying signs over time; determining, using the computing system, the first current vehicle location based at least in part on image recognition of one or more landmarks, the one or more landmarks comprising at least one of one or more cityscapes, one or more skylines, one or more unique buildings, one or more franchise buildings, one or more unique street views, one or more overlooks, one or more scenic locations, one or more mountain ranges, one or more individual mountains, one or more bodies of water, one or more manmade monuments, one or more landscape art pieces, or one or more unique structures, and/or the like; determining, using the computing system, the first predicted path of the vehicle based at least in part on image recognition of a plurality of landmarks over time; determining, using the computing system, the first current vehicle location based at least in part on analysis of signal strength of telecommunications signals from a plurality of stationary telecommunications transceivers; determining, using the computing system, the first predicted path of the vehicle based at least in part on analysis of changes in signal strength of telecommunications signals from the plurality of stationary telecommunications transceivers over time; determining, using the computing system, the first current vehicle location based at least in part on analysis of radar signal data; determining, using the computing system, the first predicted path of the vehicle based at least in part on analysis of changes in radar signal data over time; determining, using the computing system, the first current vehicle location based at least in part on analysis of sonar signal data; determining, using the computing system, the first predicted path of the vehicle based at least in part on analysis of changes in sonar 
signal data over time; determining, using the computing system, the first current vehicle location based at least in part on analysis of lidar signal data; or determining, using the computing system, the first predicted path of the vehicle based at least in part on analysis of changes in lidar signal data over time; and/or the like.
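[0026.1] Merely by way of illustration, one simple way to realize determining the first predicted path from a series of GPS data over time is linear extrapolation of the most recent displacement. The function below is a sketch under that assumption, not a prescribed path model; the fix values are invented.

```python
# Predict a short path from chronological GPS fixes by extrapolating the
# displacement between the two most recent samples.
def predict_path(fixes, steps=3):
    """fixes: chronological [(lat, lon), ...]; returns extrapolated points."""
    (lat0, lon0), (lat1, lon1) = fixes[-2], fixes[-1]
    dlat, dlon = lat1 - lat0, lon1 - lon0  # per-sample displacement
    return [(lat1 + dlat * k, lon1 + dlon * k) for k in range(1, steps + 1)]

gps_series = [(40.7570, -73.9870), (40.7575, -73.9865), (40.7580, -73.9860)]
print(predict_path(gps_series))
# approximately [(40.7585, -73.9855), (40.7590, -73.9850), (40.7595, -73.9845)]
```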
[0027] In some embodiments, the one or more different types of object detection data signal sources may each comprise one of a vision-based object detection system, a radar-based object detection system, a sonar-based object detection system, or a lidar-based object detection system. In some cases, the vehicle may comprise one of a car, a minivan, a pickup truck, a motorcycle, an all-terrain vehicle, a scooter, a police vehicle, a fire engine, an ambulance, a recreational vehicle, a bus, a commercial van, a commercial truck, a semi-tractor-trailer truck, a boat, a ship, a submersible, an amphibious vehicle, an aircraft, a space vehicle, a satellite, an autonomous vehicle, or a drone, and/or the like.
[0028] Merely by way of example, in some cases, the at least one first alert condition may each comprise at least one of traffic congestion along the first predicted path of the vehicle potentially causing a slow-down, a traffic accident along the first predicted path of the vehicle potentially causing a slow-down, a construction site along the first predicted path of the vehicle potentially causing a slow-down, one or more people along the first predicted path of the vehicle who are occluded from a perspective of the vehicle, one or more animals along the first predicted path of the vehicle who are occluded from the perspective of the vehicle, one or more objects along the first predicted path of the vehicle who are occluded from the perspective of the vehicle, one or more people potentially blocking the first predicted path of the vehicle, one or more animals potentially blocking the first predicted path of the vehicle, one or more objects potentially blocking the first predicted path of the vehicle, a tracked weather event along or near the first predicted path of the vehicle, a natural hazard potentially blocking the first predicted path of the vehicle, a manmade hazard potentially blocking the first predicted path of the vehicle, one or more people potentially intercepting the vehicle along the first predicted path, one or more animals potentially intercepting the vehicle along the first predicted path, one or more objects potentially intercepting the vehicle along the first predicted path, or one or more other vehicles potentially intercepting the vehicle along the first predicted path, and/or the like.
[0029] In some instances, each user device may comprise at least one of a smartphone, a tablet computer, a display device, an augmented reality ("AR") device, a virtual reality ("VR") device, a mixed reality ("MR") device, a vehicle console display, a vehicle heads-up display ("HUD"), a vehicle remote controller display, one or more audio speakers, or one or more haptic response devices, and/or the like. [0030] According to some embodiments, generating and presenting the first alert message may comprise at least one of: generating a graphical display depicting one or more of the at least one first alert condition or the generated first fused data, and presenting the generated graphical display on a display device; generating a text-based message depicting one or more of the at least one first alert condition or the generated first fused data, and presenting the text-based message on a display device; generating at least one message regarding one or more of the at least one first alert condition or the generated first fused data, and sending the at least one message to the user device, wherein the at least one message may comprise at least one of an e-mail message, a short message service ("SMS") app, a multimedia messaging service ("MMS") app, or a text message app, and/or the like; or generating at least one audio message regarding one or more of the at least one first alert condition or the generated first fused data, and sending the at least one audio message to the user device for playback on at least one audio speaker; and/or the like.
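[0030.1] Merely by way of illustration, the sketch below renders one fused-data payload as both a text-based message and an audio prompt, per the options in paragraph [0030]. The payload shape and the stand-in `speak` callback are assumptions; a real system would hand text to a text-to-speech engine or play a prerecorded clip on the vehicle's audio speakers.

```python
def text_alert(fused):
    # Text-based message for presentation on a display device.
    alert = fused["alert"]
    kind = alert["kind"].replace("_", " ")
    return f"Caution: {kind} ahead near {alert['location']}."

def audio_alert(fused, speak=print):
    # `speak` stands in for a TTS engine driving the vehicle's speakers.
    speak(text_alert(fused))

fused = {"alert": {"kind": "construction_site", "location": (40.7600, -73.9800)}}
print(text_alert(fused))   # text-based message
audio_alert(fused)         # audio message for playback
```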
[0031] In some cases, at least one of the second current vehicle location or the second predicted path of the vehicle may be determined based at least in part on the one or more second location-based data from the corresponding one or more second data signals received from the one or more different types of location data signal sources. [0032] In some embodiments, at or near the at least one first alert condition location, the computing system may receive one or more second object detection data signals from the one or more different types of object detection data signal sources. The computing system may analyze the one or more second object detection data signals. Based on a determination that the at least one first alert condition is no longer present at the at least one first alert condition location, the computing system may send a third communication to the remote computing system over the one or more networks indicating that the at least one first alert condition is no longer present at the at least one first alert condition location.
[0033] According to some embodiments, the computing system may receive one or more third object detection data signals from the one or more different types of object detection data signal sources. The computing system may analyze the one or more third object detection data signals. Based on a determination that the one or more third object detection data signals correspond to at least one second alert condition, the computing system may determine at least one of a third current vehicle location or at least one second alert condition location corresponding to the at least one second alert condition, based at least in part on one or more third location-based data from corresponding one or more third data signals received from the one or more different types of location data signal sources. The computing system may send a fourth communication to the remote computing system over the one or more networks indicating that the at least one second alert condition has been detected at or near the at least one of the third current vehicle location or the at least one second alert condition location.
[0034] In another aspect, a remote computing system may receive, from each of one or more first computing systems associated with corresponding one or more first vehicles among a plurality of vehicles and over one or more networks, one or more first communications indicating that a first alert condition has been detected at or near one or more of a first current vehicle location of each of the one or more first vehicles or a first alert condition location corresponding to the first alert condition. The remote computing system may receive, from each of one or more second computing systems associated with corresponding one or more second vehicles among the plurality of vehicles and over the one or more networks, one or more second communications regarding at least one of a second current vehicle location or a second predicted path for each of the one or more second vehicles. The remote computing system may analyze the at least one of the second current vehicle location or the second predicted path for each of the one or more second vehicles in relation to at least one of the first alert condition or the first alert condition location. Based on a determination that at least one second vehicle among the one or more second vehicles is in proximity to or approaching the first alert condition location or is within a first region encompassing the first alert condition, based at least in part on the analysis, the remote computing system may send one or more third communications to each of the second computing systems associated with each of the corresponding at least one second vehicle indicating that said second vehicle is in proximity to or approaching the first alert condition location.
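[0034.1] Merely by way of illustration, the proximity determination described above can be sketched as a great-circle distance test between the alert condition location and the reported vehicle location or predicted-path points. The 200-meter radius is an arbitrary example value; the disclosure does not fix one.

```python
from math import asin, cos, radians, sin, sqrt

def haversine_m(p, q):
    # Great-circle distance in meters between two (lat, lon) points.
    lat1, lon1, lat2, lon2 = map(radians, (*p, *q))
    a = (sin((lat2 - lat1) / 2) ** 2
         + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6_371_000 * asin(sqrt(a))

def vehicle_affected(vehicle_loc, predicted_path, alert_loc, radius_m=200.0):
    # A vehicle is "in proximity to or approaching" the alert condition if
    # its current location or any predicted-path point falls in the radius.
    return any(haversine_m(pt, alert_loc) <= radius_m
               for pt in [vehicle_loc, *predicted_path])

alert = (40.7585, -73.9855)
print(vehicle_affected((40.7500, -74.0000), [(40.7580, -73.9860)], alert))
# True: a predicted-path point lies within 200 m of the alert location
```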
[0035] In some embodiments, the one or more first computing systems and the one or more second computing systems may each comprise at least one of a data signal fusing computing system, at least one processor on the user device, at least one processor on a mobile device associated with a user, a vehicle-based computing system, an object detection system, or a driver assistance system, and/or the like. In some instances, the remote computing system may comprise at least one of a remote data signal fusing computing system, a remote object detection system, or a remote driver assistance system, a server computer over the one or more networks, an image processing server, a graphics processing unit ("GPU")-based server, a positioning and mapping server, a machine learning system, an artificial intelligence ("AI") system, a deep learning system, a neural network, a convolutional neural network ("CNN"), a fully convolutional network ("FCN"), a cloud computing system, or a distributed computing system, and/or the like. In some cases, the plurality of vehicles may each comprise one of a car, a minivan, a pickup truck, a motorcycle, an all-terrain vehicle, a scooter, a police vehicle, a fire engine, an ambulance, a recreational vehicle, a bus, a commercial van, a commercial truck, a semi-tractor-trailer truck, a boat, a ship, a submersible, an amphibious vehicle, an aircraft, a space vehicle, a satellite, an autonomous vehicle, or a drone, and/or the like.
[0036] According to some embodiments, the first alert condition may comprise at least one of traffic congestion along the second predicted path of the at least one second vehicle potentially causing a slow-down, a traffic accident along the second predicted path of the at least one second vehicle potentially causing a slow-down, a construction site along the second predicted path of the at least one second vehicle potentially causing a slow-down, one or more people along the second predicted path of the at least one second vehicle who are occluded from a perspective of the at least one second vehicle, one or more animals along the second predicted path of the at least one second vehicle who are occluded from the perspective of the at least one second vehicle, one or more objects along the second predicted path of the at least one second vehicle who are occluded from the perspective of the at least one second vehicle, one or more people potentially blocking the second predicted path of the at least one second vehicle, one or more animals potentially blocking the second predicted path of the at least one second vehicle, one or more objects potentially blocking the second predicted path of the at least one second vehicle, a tracked weather event along or near the second predicted path of the at least one second vehicle, a natural hazard potentially blocking the second predicted path of the at least one second vehicle, a manmade hazard potentially blocking the second predicted path of the at least one second vehicle, one or more people potentially intercepting the at least one second vehicle along the second predicted path, one or more animals potentially intercepting the at least one second vehicle along the second predicted path, one or more objects potentially intercepting the at least one second vehicle along the second predicted path, or one or more other vehicles potentially intercepting the at least one second vehicle along the second predicted path, and/or the like.
[0037] In some embodiments, the remote computing system may receive, from each of one or more third computing systems associated with corresponding one or more third vehicles among the plurality of vehicles and over the one or more networks, one or more fourth communications indicating that the first alert condition is no longer present at the first alert condition location. The remote computing system may receive, from each of one or more fourth computing systems associated with corresponding one or more fourth vehicles among the plurality of vehicles and over the one or more networks, one or more fifth communications regarding at least one of a fourth current vehicle location or a fourth predicted path for each of the one or more fourth vehicles. The remote computing system may analyze the at least one of the fourth current vehicle location or the fourth predicted path for each of the one or more fourth vehicles in relation to at least one of the first alert condition or the first alert condition location. Based on a determination that at least one fourth vehicle among the one or more fourth vehicles is in proximity to or approaching the first alert condition location or is within the first region encompassing the first alert condition, based at least in part on the analysis, and after receiving the one or more fourth communications from a threshold number of third computing systems associated with the corresponding one or more third vehicles, the remote computing system may send one or more fifth communications to each of the fourth computing systems associated with each of the corresponding at least one fourth vehicle indicating that the first alert condition is no longer present at the first alert condition location. If the one or more fourth communications from a threshold number of third computing systems associated with the corresponding one or more third vehicles have not been received, the remote computing system may continue sending the one or more third communications to each of the fourth computing systems associated with each of the corresponding at least one fourth vehicle indicating that said fourth vehicle is in proximity to or approaching the first alert condition location.
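[0037.1] Merely by way of illustration, the threshold-based clearing logic of paragraph [0037] can be sketched as follows; the threshold of three clear reports and all names are assumed example values, as the disclosure does not specify them.

```python
CLEAR_THRESHOLD = 3  # assumed example value; the disclosure does not fix one

class AlertState:
    def __init__(self):
        self.cleared_by = set()  # vehicles that reported "no longer present"

    def report_cleared(self, vehicle_id):
        self.cleared_by.add(vehicle_id)

    def message_for(self, vehicle_id):
        # Keep warning approaching vehicles until enough distinct vehicles
        # have reported the alert condition gone.
        if len(self.cleared_by) >= CLEAR_THRESHOLD:
            return "the first alert condition is no longer present"
        return "you are in proximity to or approaching the first alert condition"

state = AlertState()
state.report_cleared("veh-1")
state.report_cleared("veh-2")
print(state.message_for("veh-9"))  # still warned: only 2 of 3 clear reports
state.report_cleared("veh-3")
print(state.message_for("veh-9"))  # threshold met: alert reported cleared
```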
[0038] In the various aspects described herein, a fusion framework that integrates object detection and location signals to provide more useful information may be provided. Here, location signals are not limited to location signals based on GPS data (as described herein), and any location signal may be easily used in this framework. The system (and corresponding service) described in accordance with the various embodiments operates by interchanging and/or exchanging locations (e.g., alert condition locations, or the like) according to the area or region in which potentially affected vehicles may be located, and, in some cases, storing the event locations (or alert condition locations) and pushing them to the mobile phones (or other user devices or computing systems) of users and/or vehicles that are determined to potentially be affected by the events (or alert conditions). Further, object detection models running on the mobile phone (or other user devices or computing systems) may use the additional location information to enhance the driver assistance information, or the like.
[0039] This allows scalability and robust object detection. For example, driver assistance software applications ("apps") (such as navigation apps, or the like) have a large user base and a growing demand for additional support. A system or method that integrates visual images or other object detection signals with sensor signals or other location signals (such as described herein in accordance with the various embodiments) may enable delivery of more useful information to assist drivers. In addition, this framework may run on mobile devices, extending their capabilities by integrating visual images and location sensor signals to generate new driver assistance information. Further, this framework may be scalable because its services could be distributed based on location(s) and may be run on the cloud, or the like. For example, a service may be started once the number of users within a region exceeds some threshold value, with that service serving only the users in that area. As the number of users increases, the services may be expanded accordingly. Also, the object detection model may be trained for purposes other than the common ones. For example, occluded pedestrians may be revealed by the framework in the case that other subscribed users are nearby and have these pedestrians in detection range (where such information about the occluded pedestrians may be shared with a user in a vehicle approaching these occluded pedestrians, where such a user is not in line of sight of these pedestrians). As described herein, the various embodiments are not limited to traffic use, but may be applicable to any vehicle use in any environment, so long as information regarding location and object detection of alert conditions may be shared.
[0040] These and other aspects of the system and method for implementing object detection using fusion of vision, location, and/or other signals, and/or implementing alert condition messaging based at least in part on object detection using fusion of vision, location, and/or other signals are described in greater detail with respect to the figures.
[0041] The following detailed description illustrates a few embodiments in further detail to enable one of skill in the art to practice such embodiments. The described examples are provided for illustrative purposes and are not intended to limit the scope of the invention.
[0042] In the following description, for the purposes of explanation, numerous details are set forth in order to provide a thorough understanding of the described embodiments. It will be apparent to one skilled in the art, however, that other embodiments of the present invention may be practiced without some of these details. In other instances, some structures and devices are shown in block diagram form. Several embodiments are described herein, and while various features are ascribed to different embodiments, it should be appreciated that the features described with respect to one embodiment may be incorporated with other embodiments as well. By the same token, however, no single feature or features of any described embodiment should be considered essential to every embodiment of the invention, as other embodiments of the invention may omit such features.
[0043] Unless otherwise indicated, all numbers used herein to express quantities, dimensions, and so forth should be understood as being modified in all instances by the term "about." In this application, the use of the singular includes the plural unless specifically stated otherwise, and use of the terms "and" and "or" means "and/or" unless otherwise indicated. Moreover, the use of the term "including," as well as other forms, such as "includes" and "included," should be considered nonexclusive. Also, terms such as "element" or "component" encompass both elements and components comprising one unit and elements and components that comprise more than one unit, unless specifically stated otherwise.
[0044] Various embodiments as described herein - while embodying (in some cases) software products, computer-performed methods, and/or computer systems - represent tangible, concrete improvements to existing technological areas, including, without limitation, object detection technology, location detection technology, driver assistance technology, and/or the like. In other aspects, some embodiments can improve the functioning of user equipment or systems themselves (e.g., object detection systems, location detection systems, driver assistance systems, etc.), for example, by receiving, using a remote computing system and from each of one or more first computing systems associated with corresponding one or more first vehicles among a plurality of vehicles and over one or more networks, one or more first communications indicating that a first alert condition has been detected at or near one or more of a first current vehicle location of each of the one or more first vehicles or a first alert condition location corresponding to the first alert condition; receiving, using the remote computing system and from each of one or more second computing systems associated with corresponding one or more second vehicles among the plurality of vehicles and over the one or more networks, one or more second communications regarding at least one of a second current vehicle location or a second predicted path for each of the one or more second vehicles; analyzing, using the remote computing system, the at least one of the second current vehicle location or the second predicted path for each of the one or more second vehicles in relation to at least one of the first alert condition or the first alert condition location; and based on a determination that at least one second vehicle among the one or more second vehicles is in proximity to or approaching the first alert condition location or is within a first region encompassing the first alert condition, based at least in part on the analysis, sending, using the remote computing system, one or more third communications to each of the second computing systems associated with each of the corresponding at least one second vehicle indicating that said second vehicle is in proximity to or approaching the first alert condition location; and/or the like.
[0045] In particular, to the extent any abstract concepts are present in the various embodiments, those concepts can be implemented as described herein by devices, software, systems, and methods that involve novel functionality (e.g., steps or operations), such as, implementing object detection using fusion of vision, location, and/or other signals, and/or implementing alert condition messaging based at least in part on object detection using fusion of vision, location, and/or other signals, and/or the like, to name a few examples, that extend beyond mere conventional computer processing operations. These functionalities can produce tangible results outside of the implementing computer system, including, merely by way of example, providing a system to perform fusion of vision, location, and/or other signals for object detection that may be exchanged among a plurality of users and/or computing systems associated with a plurality of vehicles to satisfy some tasks, at least some of which may be observed or measured by users, ADAS content developers, and/or user device and/or vehicle manufacturers.
[0046] Some Embodiments
[0047] We now turn to the embodiments as illustrated by the drawings. Figs. 1-6 illustrate some of the features of the method, system, and apparatus for implementing driver assistance technologies (e.g., advanced driver assistance systems ("ADASs"), other vision-based object detection, other location signal-based object detection, other vision and location signal-based object detection, or multiple types of signal-based object detection, or the like), and, more particularly, to methods, systems, and apparatuses for implementing object detection using fusion of vision, location, and/or other signals, and/or implementing alert condition messaging based at least in part on object detection using fusion of vision, location, and/or other signals, as referred to above. The methods, systems, and apparatuses illustrated by Figs. 1-6 refer to examples of different embodiments that include various components and steps, which can be considered alternatives or which can be used in conjunction with one another in the various embodiments. The description of the illustrated methods, systems, and apparatuses shown in Figs. 1-6 is provided for purposes of illustration and should not be considered to limit the scope of the different embodiments.
[0048] With reference to the figures, Fig. 1 is a schematic diagram illustrating a system 100 for implementing object detection using fusion of vision, location, and/or other signals, and/or implementing alert condition messaging based at least in part on object detection using fusion of vision, location, and/or other signals, in accordance with various embodiments.
[0049] In the non-limiting embodiment of Fig. 1, system 100 may comprise a plurality of vehicles, including, but not limited to, a first vehicle 105a, a second vehicle 105b, and a third vehicle 105c, and so on. System 100 may further comprise computing system 110 that may be disposed within or on each of at least one vehicle 105 or may be disposed external to vehicle 105 yet associated therewith (e.g., computing systems 170a-170n, or the like). Similarly, system 100 may further comprise user device(s) 115 that may be disposed within or on each of at least one vehicle 105 or may be disposed external to vehicle 105 yet associated therewith and/or associated with a user (e.g., user devices 175a-175n associated with corresponding users 180a-180n, respectively, or the like). Each vehicle 105 may further include, without limitation, one or more location sensors (or location data signal source(s)) 120 and one or more object detection sensors (or object detection data signal source(s)) 125, or the like.
[0050] System 100 may further comprise network(s) 145, as well as remote computing system(s) 140 and corresponding database(s) 140a and, in some cases, one or both of location determination server 150 and corresponding database(s) 150a and/or image recognition server 155 and corresponding database(s) 155a. Remote computing system(s) 140 (and database(s) 140a), location determination server 150 (and database(s) 150a), and image recognition server 155 (and corresponding database(s) 155a) may each be accessible by one or more of computing systems 110 and/or 170a-170n, user devices 115 and/or 175a-175n, location sensor(s) 120, and object detection sensor(s) 125, and/or the like, via network(s) 145, and in some cases, via wireless communication (such as depicted in Fig. 1 by the lightning bolt symbols, or the like). In some embodiments, the wireless communications may include wireless communications using protocols including, but not limited to, at least one of Bluetooth™ communications protocol, WiFi communications protocol, or other 802.11 suite of communications protocols, ZigBee communications protocol, Z-wave communications protocol, or other 802.15.4 suite of communications protocols, cellular communications protocol (e.g., 3G, 4G, 4G LTE, 5G, etc.), or other suitable communications protocols, and/or the like.
[0051] In some cases, the network(s) 145 may each include a local area network ("LAN"), including, without limitation, a fiber network, an Ethernet network, a Token-Ring™ network, and/or the like; a wide-area network ("WAN"); a wireless wide area network ("WWAN"); a virtual network, such as a virtual private network ("VPN"); the Internet; an intranet; an extranet; a public switched telephone network ("PSTN"); an infra-red network; a wireless network, including, without limitation, a network operating under any of the IEEE 802.11 suite of protocols, the Bluetooth™ protocol known in the art, and/or any other wireless protocol; and/or any combination of these and/or other networks. In a particular embodiment, the network(s) 145 might include an access network of the service provider (e.g., an Internet service provider ("ISP")). In another embodiment, the network(s) 145 may include a core network of the service provider, and/or the Internet.
[0052] Merely by way of example, in some cases, vehicles 105 and 105a-105c may each include, without limitation, one of a car, a minivan, a pickup truck, a motorcycle, an all-terrain vehicle, a scooter, a police vehicle, a fire engine, an ambulance, a recreational vehicle, a bus, a commercial van, a commercial truck, a semi-tractor-trailer truck, a boat, a ship, a submersible, an amphibious vehicle, an aircraft, a space vehicle, a satellite, an autonomous vehicle, or a drone (including, but not limited to, one of an aerial drone, a land-based drone, a water-based drone, an amphibious drone, or a space-based drone, and/or the like), and/or the like.
[0053] In some embodiments, the computing system 110 may include, without limitation, at least one of a data signal fusing computing system, at least one processor on the user device, at least one processor on a mobile device associated with a user, a vehicle-based computing system, an object detection system, or a driver assistance system, and/or the like. In some cases, the computing system(s) may be disposed within or on the vehicle (e.g., computing system(s) 110, or the like) or may be associated with the vehicle yet located outside the vehicle (e.g., computing systems 170a-170n, or the like; in some cases, at a remote location relative to the location of the vehicle, such as in the case of a computing system for controlling operations of a drone or a computing system that is communicatively coupled with an autonomous vehicle, or the like).
[0054] In some instances, user devices 115 and/or 175a-175n, which may be associated with at least one of a user or the vehicle, may each include, without limitation, at least one of a smartphone, a tablet computer, a display device, an augmented reality ("AR") device, a virtual reality ("VR") device, a mixed reality ("MR") device, a vehicle console display, a vehicle heads-up display ("HUD"), a vehicle remote controller display, one or more audio speakers, or one or more haptic response devices, and/or the like. In some cases, the user device(s) may be disposed within or on the vehicle (e.g., user device(s) 115, or the like) or may be associated with the vehicle yet located outside the vehicle (e.g., user devices 175a-175n, or the like; in some cases, at a remote location relative to the location of the vehicle, such as in the case of a user device for controlling a drone or a user device that is communicatively coupled with an autonomous vehicle, or the like).
[0055] According to some embodiments, one or more different types of location data signal sources (e.g., location sensor(s) 120, or the like) may be used and may each include, but is not limited to, one of a global positioning system ("GPS") device, a global navigation satellite system ("GNSS") device, a text recognition-based location identification system, an image recognition-based landmark identification system, a telecommunications signal triangulation-based location identification system, a radar-based location identification system, a sonar-based location identification system, or a lidar-based location identification system, and/or the like. In some embodiments, one or more different types of object detection data signal sources (e.g., object detection sensor(s) 125, or the like) may be used and may each include, but is not limited to, one of a vision-based object detection system, a radar-based object detection system, a sonar-based object detection system, or a lidar-based object detection system, and/or the like.
[0056] In some instances, the remote computing system 140 may include, but is not limited to, at least one of a remote data signal fusing computing system, a remote object detection system, or a remote driver assistance system, a server computer over the one or more networks, an image processing server, a graphics processing unit ("GPU")-based server, a positioning and mapping server, a machine learning system, an artificial intelligence ("AI") system, a deep learning system, a neural network, a convolutional neural network ("CNN"), a fully convolutional network ("FCN"), a cloud computing system, or a distributed computing system, and/or the like. In some embodiments, the location determination server 150 may be used to further process location data obtained from one or more location sensors (e.g., location sensor(s) 120, or the like). Similarly, the image recognition server 155 may be used to further process object detection data obtained from one or more object detection sensors (e.g., object detection sensor(s) 125, or the like).
[0057] In operation, computing system 110, user device(s) 115, one or more computing systems 170a-170n, and/or one or more user devices 175a-175n (collectively, "computing system") may determine at least one of a first current vehicle location (e.g., vehicle locations 130a-130c for corresponding vehicles 105a-105c, or the like) or a first predicted path of a vehicle (e.g., vehicle paths 135a-135c (denoted in Fig. 1 by broad arrows, or the like) for corresponding vehicles 105a-105c, or the like), based at least in part on one or more first location-based data (e.g., current location data 185, or the like) from corresponding one or more first location data signals received from one or more different types of location data signal sources (e.g., location sensor(s) 120, or the like). The computing system may send a first communication regarding the determined at least one of the first current vehicle location or the first predicted path of the vehicle (e.g., a message containing current location data 185, or the like) to a remote computing system (e.g., remote computing system 140, or the like) over one or more networks (e.g., network(s) 145, or the like).

[0058] In response to existence of at least one first alert condition (e.g., alert condition(s) 160, or the like) for corresponding at least one first alert condition location that is in proximity to one or more of the determined at least one of the first current vehicle location or the first predicted path of the vehicle or that is within a first region encompassing the determined at least one of the first current vehicle location or the first predicted path of the vehicle, the computing system may receive a second communication (e.g., alert condition data 195, or the like) regarding the at least one first alert condition (e.g., alert condition 160, or the like) for the corresponding at least one first alert condition location (e.g., alert condition location 165, or the like) from the remote computing system over the one or more networks.
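By way of illustration only, this exchange might be sketched as follows in Python; every message field, function name, and payload shape here is hypothetical rather than part of the disclosed embodiments:

```python
import json
import time

def build_first_communication(vehicle_id, gps_fixes):
    """Package the determined current vehicle location (plus the raw fix
    history from which a predicted path could be derived) into a first
    communication for the remote computing system."""
    lat, lon = gps_fixes[-1]  # latest (lat, lon) fix
    return json.dumps({
        "vehicle_id": vehicle_id,
        "timestamp": time.time(),
        "current_location": {"lat": lat, "lon": lon},
        "recent_fixes": [{"lat": f[0], "lon": f[1]} for f in gps_fixes],
    })

def handle_second_communication(payload):
    """Unpack a second communication describing an alert condition and
    its location, as received from the remote computing system."""
    msg = json.loads(payload)
    loc = msg["alert_location"]
    return msg["alert_condition"], (loc["lat"], loc["lon"])

# Example round trip, with a stubbed server message:
print(build_first_communication("veh-1", [(44.05, -123.09), (44.06, -123.09)]))
print(handle_second_communication(
    '{"alert_condition": "accident", '
    '"alert_location": {"lat": 44.1, "lon": -123.1}}'))
```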
[0059] Merely by way of example, in some cases, the at least one first alert condition (e.g., alert condition(s) 160, or the like) may each include, without limitation, at least one of traffic congestion along the first predicted path of the vehicle potentially causing a slow-down, a traffic accident along the first predicted path of the vehicle potentially causing a slow-down, a construction site along the first predicted path of the vehicle potentially causing a slow-down, one or more people along the first predicted path of the vehicle who are occluded from a perspective of the vehicle, one or more animals along the first predicted path of the vehicle that are occluded from the perspective of the vehicle, one or more objects along the first predicted path of the vehicle that are occluded from the perspective of the vehicle, one or more people potentially blocking the first predicted path of the vehicle, one or more animals potentially blocking the first predicted path of the vehicle, one or more objects potentially blocking the first predicted path of the vehicle, a tracked weather event along or near the first predicted path of the vehicle, a natural hazard potentially blocking the first predicted path of the vehicle, a manmade hazard potentially blocking the first predicted path of the vehicle, one or more people potentially intercepting the vehicle along the first predicted path, one or more animals potentially intercepting the vehicle along the first predicted path, one or more objects potentially intercepting the vehicle along the first predicted path, or one or more other vehicles potentially intercepting the vehicle along the first predicted path, and/or the like.
[0060] Based on a determination that the vehicle is approaching the at least one first alert condition location (e.g., alert condition location 165, or the like) based at least in part on one or more second location-based data from corresponding one or more second location data signals received from the one or more different types of location data signal sources, the computing system may perform the following tasks: receive one or more first object detection data signals (e.g., signals containing object detection data 190, or the like) from one or more different types of object detection data signal sources (e.g., object detection sensor(s) 125, or the like); fuse the at least one first alert condition with one or more of the received one or more first object detection data signals, the at least one first alert condition location, a second current vehicle location of the vehicle, or a second predicted path of the vehicle (e.g., as shown at block 210 of Fig. 2, or the like) to generate first fused data; and generate and present, via one or more user devices (e.g., user device(s) 115 and/or 175a-175n, or the like), a first alert message indicating that the vehicle is approaching the at least one first alert condition, based at least in part on the generated first fused data. In some cases, at least one of the second current vehicle location or the second predicted path of the vehicle may be determined based at least in part on the one or more second location-based data from the corresponding one or more second data signals received from the one or more different types of location data signal sources (similar to the first current vehicle location or the first predicted path, or the like).
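As a minimal sketch of the fusion step (block 210 of Fig. 2) — assuming hypothetical dictionary shapes, and assuming detections have already been geolocated — the pairing of a server-reported alert condition with local object detection results might look like the following:

```python
import math

def approx_distance_m(a, b):
    """Flat-earth distance in meters between two (lat, lon) points;
    adequate over the short ranges involved here."""
    dlat = (b[0] - a[0]) * 111_320.0
    dlon = (b[1] - a[1]) * 111_320.0 * math.cos(math.radians(a[0]))
    return math.hypot(dlat, dlon)

def fuse_alert_with_detections(alert, alert_loc, detections, vehicle_loc):
    """Toy stand-in for the fusion step: pair the server-reported alert
    with the local detection nearest the alert location, and record how
    far away the alert still is so the alert message can be ranked."""
    nearest = min(detections,
                  key=lambda d: approx_distance_m(d["location"], alert_loc),
                  default=None)
    return {
        "alert": alert,
        "distance_m": approx_distance_m(vehicle_loc, alert_loc),
        "corroborating_detection": nearest,  # None if nothing seen locally
    }
```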
[0061] In some instances, determining the at least one of the first current vehicle location or the first predicted path of a vehicle may comprise the computing system performing at least one of: (1) determining the first current vehicle location based at least in part on GPS data from the GPS device; (2) determining the first predicted path of the vehicle based at least in part on a series of GPS data from the GPS device over time; (3) determining the first current vehicle location based at least in part on GNSS data from the GNSS device; (4) determining the first predicted path of the vehicle based at least in part on a series of GNSS data from the GNSS device over time; (5) determining the first current vehicle location based at least in part on text recognition of one or more location-identifying signs, the one or more location-identifying signs comprising at least one of one or more street signs, one or more address signs, one or more business signs, one or more highway signs, one or more city limits signs, one or more county boundary signs, one or more state boundary signs, one or more province boundary signs, one or more territory boundary signs, one or more regional boundary signs, one or more national border signs, one or more landmark identification signs, one or more distance to destination signs, one or more route markers, one or more highway location markers, one or more driver location signs, or one or more mile markers, and/or the like; (6) determining the first predicted path of the vehicle based at least in part on text recognition of a plurality of location-identifying signs over time; (7) determining the first current vehicle location based at least in part on image recognition of one or more landmarks, the one or more landmarks comprising at least one of one or more cityscapes, one or more skylines, one or more unique buildings, one or more franchise buildings, one or more unique street views, one or more overlooks, one or more scenic locations, one or more mountain ranges, one or more individual mountains, one or more bodies of water, one or more manmade monuments, one or more landscape art pieces, or one or more unique structures; (8) determining the first predicted path of the vehicle based at least in part on image recognition of a plurality of landmarks over time, and/or the like; (9) determining the first current vehicle location based at least in part on analysis of signal strength of telecommunications signals from a plurality of stationary telecommunications transceivers; (10) determining the first predicted path of the vehicle based at least in part on analysis of changes in signal strength of telecommunications signals from the plurality of stationary telecommunications transceivers over time; (11) determining the first current vehicle location based at least in part on analysis of radar signal data; (12) determining the first predicted path of the vehicle based at least in part on analysis of changes in radar signal data over time; (13) determining the first current vehicle location based at least in part on analysis of sonar signal data; (14) determining the first predicted path of the vehicle based at least in part on analysis of changes in sonar signal data over time; (15) determining the first current vehicle location based at least in part on analysis of lidar signal data; or (16) determining the first predicted path of the vehicle based at least in part on analysis of changes in lidar signal data over time.
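For instance, option (2) above — deriving a predicted path from a series of GPS fixes over time — might be approximated by simple dead reckoning, as in this illustrative sketch (a real implementation would smooth over many fixes and account for road geometry):

```python
def predicted_path_from_fixes(fixes, steps=10):
    """Dead-reckon a short predicted path by repeating the displacement
    between the two most recent (lat, lon) GPS fixes."""
    (lat1, lon1), (lat2, lon2) = fixes[-2], fixes[-1]
    dlat, dlon = lat2 - lat1, lon2 - lon1
    return [(lat2 + i * dlat, lon2 + i * dlon) for i in range(1, steps + 1)]

# Example: a vehicle moving steadily north.
print(predicted_path_from_fixes([(44.050, -123.09), (44.051, -123.09)], steps=3))
```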
[0062] According to some embodiments, generating and presenting the first alert message may comprise at least one of: generating a graphical display depicting one or more of the at least one first alert condition or the generated first fused data, and presenting the generated graphical display on a display device (e.g., user devices 115 and/or 175a-175n, or the like); generating a text-based message depicting one or more of the at least one first alert condition or the generated first fused data, and presenting the text-based message on a display device (e.g., user devices 115 and/or 175a-175n, or the like); generating at least one message regarding one or more of the at least one first alert condition or the generated first fused data, and sending the at least one message to the user device (e.g., user devices 115 and/or 175a-175n, or the like); or generating at least one audio message regarding one or more of the at least one first alert condition or the generated first fused data, and sending the at least one audio message to the user device for playback on at least one audio speaker (e.g., user devices 115 and/or 175a-175n, or the like); and/or the like. In some cases, the at least one message may include, but is not limited to, at least one of an e-mail message, a short message service ("SMS") message, a multimedia messaging service ("MMS") message, or a text message, and/or the like.
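A minimal sketch of such multi-modality presentation follows, reusing the hypothetical fused-result shape from the earlier sketch; print() stands in for real display, messaging, and audio outputs, and the channel names are illustrative:

```python
def present_alert(fused, channels):
    """Fan the fused result out to whatever output modalities the user
    device offers (graphical, text-based, or audio, per the options
    above). `channels` maps a modality name to a callable."""
    text = (f"Alert: {fused['alert']} ahead, about "
            f"{fused['distance_m'] / 1000:.1f} km away.")
    for name in ("display", "message", "speaker"):
        if name in channels:
            channels[name](text)  # e.g. HUD overlay, SMS send, TTS playback

# print() stands in for a real display device:
present_alert({"alert": "traffic accident", "distance_m": 3200},
              {"display": print})
```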
[0063] In some embodiments, at or near the at least one first alert condition location, the computing system may receive one or more second object detection data signals from the one or more different types of object detection data signal sources. The computing system may analyze the one or more second object detection data signals. Based on a determination that the at least one first alert condition is no longer present at the at least one first alert condition location, the computing system may send a third communication to the remote computing system over the one or more networks indicating that the at least one first alert condition is no longer present at the at least one first alert condition location.
[0064] According to some embodiments, the computing system may receive one or more third object detection data signals from the one or more different types of object detection data signal sources. The computing system may analyze the one or more third object detection data signals. Based on a determination that the one or more third object detection data signals correspond to at least one second alert condition, the computing system may determine at least one of a third current vehicle location or at least one second alert condition location corresponding to the at least one second alert condition, based at least in part on one or more third location-based data from corresponding one or more third data signals received from the one or more different types of location data signal sources. The computing system may send a fourth communication to the remote computing system over the one or more networks indicating that the at least one second alert condition has been detected at or near the at least one of the third current vehicle location or the at least one second alert condition location.
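Purely as an illustration of this detection-to-report flow — with the alert class list, confidence cutoff, and payload shape all being hypothetical — the client-side logic might resemble:

```python
def maybe_report_alert(detections, gps_fix, send):
    """If a local detection maps onto a recognized alert class, attach
    the current GPS fix and report it to the remote computing system
    (the fourth communication described above)."""
    ALERT_CLASSES = {"accident", "construction", "animal_on_road"}
    for det in detections:
        if det["label"] in ALERT_CLASSES and det["score"] >= 0.6:
            send({
                "alert_condition": det["label"],
                "alert_location": {"lat": gps_fix[0], "lon": gps_fix[1]},
                "confidence": det["score"],
            })

# print() stands in for the network sender:
maybe_report_alert([{"label": "accident", "score": 0.9}],
                   (44.05, -123.09), print)
```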
[0065] In some aspects, remote computing system 140 (herein, simply "remote computing system") may receive, from each of one or more first computing systems associated with corresponding one or more first vehicles among a plurality of vehicles and over one or more networks, one or more first communications indicating that a first alert condition has been detected at or near one or more of a first current vehicle location of each of the one or more first vehicles or a first alert condition location corresponding to the first alert condition.

[0066] The remote computing system may receive, from each of one or more second computing systems associated with corresponding one or more second vehicles among the plurality of vehicles and over the one or more networks, one or more second communications regarding at least one of a second current vehicle location or a second predicted path for each of the one or more second vehicles. The remote computing system may analyze the at least one of the second current vehicle location or the second predicted path for each of the one or more second vehicles in relation to at least one of the first alert condition or the first alert condition location. Based on a determination that at least one second vehicle among the one or more second vehicles is in proximity to or approaching the first alert condition location or is within a first region encompassing the first alert condition, based at least in part on the analysis, the remote computing system may send one or more third communications to each of the second computing systems associated with each of the corresponding at least one second vehicle indicating that said second vehicle is in proximity to or approaching the first alert condition location.
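A simplified sketch of this server-side analysis — assuming a hypothetical in-memory registry of vehicle locations and predicted paths, and an arbitrary fixed notification radius standing in for the first region — might look like the following:

```python
import math

def haversine_m(a, b):
    """Great-circle distance in meters between two (lat, lon) points."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*a, *b))
    h = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6_371_000 * math.asin(math.sqrt(h))

def vehicles_to_notify(alert_loc, vehicles, radius_m=8_000):
    """Select the vehicles that are within the region around the alert or
    whose predicted path passes near it. `vehicles` maps a vehicle id to
    {"location": (lat, lon), "path": [(lat, lon), ...]} (illustrative)."""
    notify = []
    for vid, v in vehicles.items():
        near_now = haversine_m(v["location"], alert_loc) <= radius_m
        near_soon = any(haversine_m(p, alert_loc) <= radius_m
                        for p in v.get("path", []))
        if near_now or near_soon:
            notify.append(vid)
    return notify
```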
[0067] In some embodiments, the remote computing system may receive, from each of one or more third computing systems associated with corresponding one or more third vehicles among the plurality of vehicles and over the one or more networks, one or more fourth communications indicating that the first alert condition is no longer present at the first alert condition location. The remote computing system may receive, from each of one or more fourth computing systems associated with corresponding one or more fourth vehicles among the plurality of vehicles and over the one or more networks, one or more fifth communications regarding at least one of a fourth current vehicle location or a fourth predicted path for each of the one or more fourth vehicles. The remote computing system may analyze the at least one of the fourth current vehicle location or the fourth predicted path for each of the one or more fourth vehicles in relation to at least one of the first alert condition or the first alert condition location. Based on a determination that at least one fourth vehicle among the one or more fourth vehicles is in proximity to or approaching the first alert condition location or is within the first region encompassing the first alert condition, based at least in part on the analysis, and based on a determination that the one or more fourth communications from a threshold number of third computing systems associated with the corresponding one or more third vehicles have been received (i.e., after receiving the one or more fourth communications from a threshold number of third computing systems associated with the corresponding one or more third vehicles), the remote computing system may send one or more fifth communications to each of the fourth computing systems associated with each of the corresponding at least one fourth vehicle indicating that the first alert condition is no longer present at the first alert condition location. If the one or more fourth communications from a threshold number of third computing systems associated with the corresponding one or more third vehicles have not been received, the remote computing system may continue sending the one or more third communications to each of the fourth computing systems associated with each of the corresponding at least one fourth vehicle indicating that said fourth vehicle is in proximity to or approaching the first alert condition location.
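One possible (illustrative, not prescriptive) way to express this threshold-gated clearing logic on the server side is sketched below; the default threshold of 3 simply echoes the smallest example threshold given later in the text:

```python
class AlertRecord:
    """Server-side bookkeeping for one alert condition: keep the alert
    live until a threshold number of distinct vehicles have reported it
    cleared (the fourth communications described above)."""

    def __init__(self, alert, location, clear_threshold=3):
        self.alert = alert
        self.location = location
        self.clear_threshold = clear_threshold
        self.cleared_by = set()  # ids of vehicles reporting the alert gone

    def report_cleared(self, vehicle_id):
        self.cleared_by.add(vehicle_id)

    def is_cleared(self):
        return len(self.cleared_by) >= self.clear_threshold

    def message_for(self, vehicle_id):
        """Fifth (cleared) communication once the threshold is met;
        otherwise keep sending the third (warning) communication."""
        if self.is_cleared():
            return {"type": "alert_cleared", "location": self.location}
        return {"type": "alert_warning", "alert": self.alert,
                "location": self.location}
```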
[0068] These and other functions of the system 100 (and its components) are described in greater detail below with respect to Figs. 2-4.
[0069] Fig. 2 is a schematic block flow diagram illustrating a non-limiting example 200 of object detection using fusion of vision, location, and/or other signals, and/or alert condition messaging based at least in part on object detection using fusion of vision, location, and/or other signals, in accordance with various embodiments.

[0070] With reference to Fig. 2, a computing system 110, user device(s) 115, location data signal source(s) 120, and object detection data signal source(s) 125 may be disposed within or on a vehicle 105. The computing system 110 may communicatively couple with a remote computing system(s) 140 (and corresponding database(s) 140a) (and, in some cases, with user device(s) 175 as well) via network(s) 145. Vehicle 105, computing system 110, user device(s) 115, location data signal source(s) 120, location data 185, object detection data signal source(s) 125, object detection data 190, remote computing system(s) 140, database(s) 140a, user device(s) 175, and network(s) 145 of Fig. 2 are otherwise similar, if not identical, to the corresponding vehicles 105 and/or 105a-105c, computing system 110, user device(s) 115, location sensor(s) 120, current location data 185, object detection sensor(s) 125, object detection data 190, remote computing system(s) 140, database(s) 140a, user device(s) 175a-175n, and network(s) 145 of Fig. 1, or the like, and the descriptions of these components of Fig. 1 are applicable to the corresponding components of Fig. 2, and vice versa.

[0071] In operation, computing system 110 may receive object detection data 190 from object detection data signal source(s) 125, and may perform object detection (at block 205). Computing system 110 may also receive location data 185 from location data signal source(s) 120, and may perform fusion of object detection and location (at block 210), in some cases, based at least in part on at least one of object detection data 190, results of object detection (from block 205), and/or location data 185, and/or the like. Computing system 110 may then send vehicle location and/or detection result 215 to remote computing system 140 (and corresponding database(s) 140a) via network(s) 145. In some embodiments, remote computing system 140 (and/or corresponding database(s) 140a, which in some cases may include, but is not limited to, one or more structured query language ("SQL") databases, or the like) may be configured to store and/or update alert condition information (in some cases, event information, or the like) corresponding to an alert condition (e.g., alert condition(s) 160 of Fig. 1, or the like) and/or an alert condition location (e.g., alert condition location(s) 165 of Fig. 1, or the like), to interchange and/or exchange the location information, and/or the like.
[0072] Remote computing system(s) 140 may send alert condition data and location 220 to computing system 110 via network(s) 145. Computing system 110 may then perform coordinate transformation (at block 225) of the alert condition data and location 220, the results of which may be used as input to the fusion process at block 210. Computing system 110 may then display and/or present (at block 230) the alert information (i.e., the results of the fusion at block 210) on user device(s) 115 (which may be located within or on vehicle 105) and/or user device(s) 175 (which may be located external to vehicle 105; in some cases, in communication with computing system 110 via network(s) 145, or the like).
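The coordinate transformation at block 225 could, for example, express the received alert condition location as local east/north offsets relative to the vehicle so that it can be fused with camera-frame detections. A small-area approximation is sketched here; the function name and approach are illustrative assumptions, not the disclosed method:

```python
import math

def to_local_enu(vehicle_loc, alert_loc):
    """Express the alert location as east/north offsets in meters
    relative to the vehicle (small-area flat-earth approximation)."""
    lat0, lon0 = vehicle_loc
    dlat = alert_loc[0] - lat0
    dlon = alert_loc[1] - lon0
    north = dlat * 111_320.0                                # m per deg lat
    east = dlon * 111_320.0 * math.cos(math.radians(lat0))  # m per deg lon
    return east, north

# Example: an alert roughly 1.1 km north of the vehicle.
print(to_local_enu((44.05, -123.09), (44.06, -123.09)))
```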
[0073] In some embodiments, in the case that the computing system 110 comprises a mobile phone (e.g., a smartphone, or the like), the application for fusion of object detection and location information (e.g., at block 210, or the like) may be invoked when the phone camera (e.g., the object detection data signal source(s) 125, or the like) is turned on. As soon as a frame is provided, the object detection and location information threads (e.g., blocks 205 and 220/225, or the like) may be triggered and the frame may be added to the video display thread (e.g., block 230, or the like). Concurrently, the object detection result may be handled by the same location information thread, which sends the corresponding location to the remote computing system 140 and/or the database 140a, or the like.
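A rough sketch of this frame-driven threading arrangement follows, with stub callables standing in for the real camera, object detection model, and location-information sender (all names hypothetical):

```python
import queue
import threading

frames = queue.Queue()   # camera thread -> object detection thread
display = queue.Queue()  # camera thread -> video display thread

def camera_loop(capture_frame):
    """Feed each captured frame to both the detection thread and the
    display thread, as described above; None signals shutdown."""
    while (frame := capture_frame()) is not None:
        frames.put(frame)
        display.put(frame)  # a display thread would drain this queue
    frames.put(None)

def detection_loop(detect, send_location):
    """Run the object detection model off the camera thread, handing
    each result to the location-information callback for forwarding to
    the remote computing system."""
    while (frame := frames.get()) is not None:
        send_location(detect(frame))

# Stubs stand in for the real camera, model, and network sender:
fake_frames = iter([object(), object(), None])
threading.Thread(target=detection_loop,
                 args=(lambda f: {"detections": []}, print)).start()
camera_loop(lambda: next(fake_frames))
```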
[0074] Various embodiments provide a system and service that may be configured to exchange or interchange location signals and/or object detection results. In some cases, one or more SQL databases may be used to store the location data, while the mobile phone may be used to perform object detection and to send and receive location information. In some instances, object detection models may run on frameworks such as PyTorch Mobile or TensorFlow Lite, and/or the like. With the Java native interface ("JNI") or the like, inferencing performed by the object detection model may be processed by a graphics processing unit ("GPU"), in some cases through C++ or the like, to achieve faster runtimes. Sensor data (such as GPS data on the mobile phone) may be accessed via the operating system ("OS") application programming interface ("API") (e.g., the Android API, or the like).
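For illustration, a SQL table of the kind mentioned above might be shaped as follows; the schema is purely hypothetical, sketched here with Python's built-in sqlite3 module for self-containedness:

```python
import sqlite3

def open_alert_db(path=":memory:"):
    """Create (if needed) and open an illustrative alert-location table."""
    db = sqlite3.connect(path)
    db.execute("""CREATE TABLE IF NOT EXISTS alerts (
                      id        INTEGER PRIMARY KEY,
                      condition TEXT NOT NULL,
                      lat       REAL NOT NULL,
                      lon       REAL NOT NULL,
                      active    INTEGER NOT NULL DEFAULT 1)""")
    return db

def store_alert(db, condition, lat, lon):
    """Persist one reported alert condition and its location."""
    db.execute("INSERT INTO alerts (condition, lat, lon) VALUES (?, ?, ?)",
               (condition, lat, lon))
    db.commit()

db = open_alert_db()
store_alert(db, "traffic accident", 44.05, -123.09)
print(db.execute("SELECT * FROM alerts").fetchall())
```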
[0075] To protect the privacy of users, a virtual private network ("VPN") may be used. A VPN may encrypt all of the Internet traffic sent from a device and deliver it through an alternate server elsewhere on the Internet. Otherwise, the subscription of a single WiFi device may be recognized as a tracking identifier ("ID") that can reveal that device's locations over a period of time. In some embodiments, only subscribers of the VPN service may be allowed to send and receive the locations.
[0076] In the various aspects described herein, a fusion framework that integrates object detection and location signals to provide more useful information may be provided. Here, location signals are not limited to location signals based on GPS data (as described herein), and any location signal may be easily used in this framework. The system (and corresponding service) described in accordance with the various embodiments operates by interchanging and/or exchanging locations (e.g., alert condition locations, or the like) according to the area or region in which potentially affected vehicles may be located, and, in some cases, storing the event locations (or alert condition locations) and pushing them to the mobile phones (or other user devices or computing systems) of users and/or vehicles that are determined to potentially be affected by the events (or alert conditions). Further, object detection models running on the mobile phone (or other user devices or computing systems) may use the additional location information to enhance the driver assistance information, or the like.
[0077] This allows scalability and robust object detection. For example, driver assistance software applications ("apps") (such as navigation apps, or the like) have a large user base and a growing demand for additional support. A system or method that integrates visual images or other object detection signals with sensor signals or other location signals (such as described herein in accordance with the various embodiments) may enable delivery of more useful information to assist drivers. In addition, this framework may run on mobile devices, extending their capabilities by integrating visual images and location sensor signals to generate new driver assistance information. Further, this framework may be scalable because its services could be distributed based on location(s) and may be run on the cloud, or the like. For example, a service may be started once the number of users within a region exceeds some threshold value, with that service serving only the users in that area (see the sketch following this paragraph). As the number of users increases, the services may be expanded accordingly. Also, the object detection model may be trained for purposes other than the common ones. For example, occluded pedestrians may be revealed by the framework in the case that other subscribed users are nearby and have these pedestrians in detection range (where such information about the occluded pedestrians may be shared with a user in a vehicle approaching these occluded pedestrians, where such a user is not in line of sight of these pedestrians). As described herein, the various embodiments are not limited to traffic use, but may be applicable to any vehicle use in any environment, so long as information regarding location and object detection of alert conditions may be shared.
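As an illustrative sketch of this region-based scaling policy — with the cell size and user threshold being arbitrary assumed values — users might be bucketed into coarse geographic cells, and a per-region service started only for sufficiently populated cells:

```python
from collections import defaultdict

def region_key(lat, lon, cell_deg=0.5):
    """Bucket a (lat, lon) position into a coarse geographic cell."""
    return (round(lat / cell_deg), round(lon / cell_deg))

def regions_to_serve(user_locations, min_users=100):
    """Return the cells whose user population crosses the threshold,
    i.e., where a per-region service would be started."""
    counts = defaultdict(int)
    for lat, lon in user_locations:
        counts[region_key(lat, lon)] += 1
    return [cell for cell, n in counts.items() if n >= min_users]
```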
[0078] Although the framework provides a general solution to the fusion of object detection and location signals, it may be extended to, but is not limited to, the following aspects: (a) allowing tracking algorithms to be performed on the detection results, since a detection result tagged with an ID is effectively a tracking result; (b) allowing various wireless communications, such as 4G/5G or WiFi, that can carry the location information; (c) allowing depth/distance information measured from and/or by the end users; (d) allowing post-processing of the fused results; and/or the like.
[0079] These and other functions of the example 200 (and the components of Fig. 2) are described in greater detail herein with respect to Figs. 1, 3, and 4.

[0080] Figs. 3A-3C (collectively, "Fig. 3") are schematic block flow diagrams illustrating various non-limiting examples 300, 300', and 300" of alert condition messaging based at least in part on object detection using fusion of vision, location, and/or other signals, in accordance with various embodiments. Although Fig. 3 is directed to automobiles as the vehicles 105 and road traffic-related alert conditions, or the like, the various embodiments are not so limited, and any suitable vehicle and/or alert condition (as described herein with respect to Figs. 1 and 4, or the like) may be the focus of the object detection and/or the alert condition messaging as described herein. Fig. 3 merely provides a simple set of examples for illustrating implementation of object detection and/or alert condition messaging, in accordance with the various embodiments.
[0081] Referring to the non-limiting example 300 of Fig. 3A, on a four-lane roadway 305, a plurality of vehicles 105 may travel in bi-directional traffic on lanes 305a and 305b. In some cases, destination markers or signs 310 may indicate location or distance to a particular location (e.g., signs 310a and 310b indicate that the distance to the town of Springfield is 15 miles and 20 miles, respectively).
[0082] As shown in the non-limiting example 300 of Fig. 3A, an alert condition 160 (in this case, a car accident involving vehicles 105e and 105f) has been observed at alert condition location 165 (in this case, on one of lanes 305a heading to the town of Springfield, near the sign indicating 15 miles to Springfield). Computing systems on at least one of vehicle 105d (travelling in the opposite direction on one of lanes 305b), user device 175a (associated with user 180a, who is standing by the side of the affected lane 305a), vehicle 105g (travelling in the same direction on one of lanes 305a), and/or vehicle 105h (travelling in the same direction on one of lanes 305a), and/or the like, may detect the alert condition 160 using object detection and location determination (such as described in detail above with respect to Figs. 1 and 2, or the like), and may send respective alert condition messages 195a, 195b, 195c, and 195d to remote computing system 140, over one or more networks (not shown in Fig. 3A; similar to network(s) 145 of Figs. 1 and 2, or the like). The alert condition messages 195a, 195b, 195c, and 195d may each include, but is not limited to, object detection data regarding the alert condition 160 and location data regarding the alert condition location 165, and/or the like. Meanwhile, computing systems on at least one of vehicles 105i, 105j, and/or 105k (which are near sign 310b, and thus about 5 miles from the alert condition location 165, or the like), and/or the like, may send at least location data 185a, 185b, and 185c, respectively (similar to location data 185 in Figs. 1 and 2, or the like), to remote computing system 140, over the one or more networks. Although not shown, computing systems on at least one of vehicles 105i, 105j, and/or 105k, and/or the like, may also send at least object detection data 190a, 190b, and 190c, respectively (similar to object detection data 190 in Figs. 1 and 2, or the like), to remote computing system 140, over the one or more networks. Also, although not shown, computing systems on the at least one of vehicle 105d, user device 175a, vehicle 105g, and/or vehicle 105h may also send location data 185 and/or object detection data 190, or the like, to remote computing system 140, over the one or more networks.
[0083] Remote computing system 140 may receive the alert condition messages (e.g., alert condition messages 195a, 195b, 195c, and 195d, or the like) (at block 315), may analyze the alert condition data (at block 320), may identify the alert condition and/or the extent of the alert condition (e.g., the size of the area affected by the alert condition, or the like) (at block 325), and may identify the alert condition location and range (at block 330). Remote computing system 140 may also receive location data of the vehicles (e.g., location data 185a, 185b, and 185c; in some cases, object detection data 190a, 190b, and 190c, as well; in some instances, location data 185 and/or object detection data 190 from computing systems on the at least one of vehicle 105d, user device 175a, vehicle 105g, and/or vehicle 105h, as well) (at block 335). Remote computing system 140 may analyze the location data of the vehicles (as well as any received object detection data) (at block 340). At block 345, remote computing system 140 may identify one or more vehicles at, near, or approaching the alert condition location, based at least in part on the results of blocks 330 and 340. Remote computing system 140, at block 350, may subsequently send alert condition messages (e.g., alert condition message 195, or the like) to the computing systems corresponding to the identified vehicles (in this case, vehicles 105i and 105j, which are heading toward the alert condition location 165, but not vehicle 105k, which is going in the opposite direction, as well as away, from the alert condition location 165).

[0084] Turning to the non-limiting example 300' of Fig. 3B, after some time, during which the alert condition has been addressed or handled such that the alert condition is no longer present at the alert condition location (in this case, after the car accident has been cleared), computing systems on at least one of vehicle 105l (travelling in the opposite direction on one of lanes 305b), vehicle 105m (travelling in the same direction on one of lanes 305a), and/or vehicle 105n (travelling in the same direction on one of lanes 305a), and/or the like, each having received the alert condition message 195 (in a manner as described above with respect to Fig. 3A), may detect the absence of the alert condition 160 at the alert condition location 165, and may each send respective alert condition over messages 195a', 195b', and 195c' to remote computing system 140, over one or more networks (not shown in Fig. 3B; similar to network(s) 145 of Figs. 1 and 2, or the like). The alert condition over messages 195a', 195b', and 195c' may each include, but is not limited to, object detection data regarding the former alert condition 160' and location data regarding the former alert condition location 165', as well as data regarding the absence of the alert condition 160 at the alert condition location 165, and/or the like. Meanwhile, computing systems on at least one of vehicles 105o, 105p, and/or 105q (which are near sign 310b, and thus about 5 miles from the alert condition location 165 or former alert condition location 165', or the like), and/or the like, may send at least location data 185d, 185e, and 185f, respectively (similar to location data 185 in Figs. 1 and 2, or the like), to remote computing system 140, over the one or more networks.
Although not shown, computing systems on at least one of vehicles 105o, 105p, and/or 105q, and/or the like, may also send at least object detection data 190d, 190e, and 190f, respectively (similar to object detection data 190 in Figs. 1 and 2, or the like), to remote computing system 140, over the one or more networks. Also, although not shown, computing systems on the at least one of vehicles 105l, 105m, and/or 105n may also send location data 185 and/or object detection data 190, or the like, to remote computing system 140, over the one or more networks.
[0085] Remote computing system 140 may receive the alert condition over messages (e.g., alert condition over messages 195a', 195b', and 195c', or the like) (at block 355), and may determine whether a threshold number of such messages have been received (i.e., whether such alert condition over messages 195' from a threshold number of the computing systems corresponding to the threshold number of vehicles have been received by the remote computing system 140) (at block 360). In some cases, the threshold number may include, but is not limited to, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, or the like. If the threshold number of alert condition over messages have been received, then remote computing system 140 may identify vehicles at, near, or approaching the former alert condition location (at block 365), and may subsequently send, at block 370, alert condition over messages (e.g., alert condition over message 195', or the like) to the computing systems corresponding to the identified vehicles (in this case, vehicles 105o and 105p, which are heading toward the former alert condition location 165', but not vehicle 105q, which is going in the opposite direction, as well as away, from the former alert condition location 165').
[0086] However, if the threshold number of alert condition over messages have not been received, then remote computing system 140 may identify vehicles at, near, or approaching the alert condition location (at block 345), and may subsequently continue to send, at block 350, alert condition messages (e.g., alert condition message 195, or the like) to the computing systems corresponding to the identified vehicles (in this case, vehicles 105o and 105p, which are heading toward the alert condition location 165, but not vehicle 105q, which is going in the opposite direction, as well as away, from the alert condition location 165).
[0087] With reference to the non-limiting example 300" of Fig. 3C, on a two-lane roadway 305', a plurality of vehicles 105 may travel in bi-directional traffic on lanes 305c and 305d. In some cases, a destination marker(s) or sign(s) 310 may indicate location or distance to a particular location (e.g., sign 310c indicates that the distance to the town of Fairview is 12 miles).
[0088] As shown in the non- limiting example 300' of Fig. 3C, an alert condition 160" (in this case, approaching vehicles from the opposite direction making it dangerous for a vehicle attempting to pass the vehicle(s) in front of it) has been observed at alert condition location 165" (in this case, both lands 305c and 305d near the sign indicating 12 miles to Fairview, where vehicles are converging from opposite directions). Computing systems on at least one of vehicle 105r (in this case, a car travelling in a second direction on lane 305d), vehicle 105s (in this case, a semi- tractor-trailer truck travelling in the second direction on lane 305d), vehicle 105t (in this case, a car travelling in a first direction, opposite to the second direction, on lane 305c), and/or vehicle 105u (in this case, another semi-tractor-trailer truck travelling in the first direction on lane 305c), and/or the like, may detect the alert condition 160" using object detection and location determination (such as described in detail above with respect to Figs. 1 and 2, or the like), and may send respective alert condition messages 195e, 195f, 195g, and 195h to remote computing system 140, over one or more networks (not shown in Fig. 3C; similar to network(s) 145 of Figs. 1 and 2, or the like). The alert condition messages 195e, 195f, 195g, and 195h may each include, but is not limited to, object detection data regarding the alert condition 160" and location data regarding the alert condition location 165", and/or the like. Meanwhile, computing systems on at least one of vehicles 105v, 105w, and/or 105x, and/or the like, may send at least location data 185g, 185h, and 185i, respectively (similar to location data 185 in Figs. 1 and 2, or the like), to remote computing system 140, over the one or more networks. Although not shown, computing systems on at least one of vehicles 105v, 105w, and/or 105x, and/or the like, may also send at least object detection data 190g, 190h, and 190i, respectively (similar to object detection data 190 in Figs. 1 and 2, or the like), to remote computing system 140, over the one or more networks. Also, although not shown, computing systems on the at least one of vehicles 105r, 105s, 105t, and/or 105u may also send location data 185 and/or object detection data 190, or the like, to remote computing system 140, over the one or more networks.
[0089] Remote computing system 140 may receive the alert condition messages (e.g., alert condition messages 195e, 195f, 195g, and 195h, or the like) (at block 315), may analyze the alert condition data (at block 320), may identify the alert condition and/or the extent of the alert condition (e.g., the size of the area affected by the alert condition, or the like) (at block 325), and may identify the alert condition location and range (at block 330). Remote computing system 140 may also receive location data of the vehicles (e.g., location data 185g, 185h, and 185i; in some cases, object detection data 190g, 190h, and 190i, as well; in some instances, location data 185 and/or object detection data 190 from computing systems on the at least one of vehicles 105r, 105s, 105t, and/or 105u, as well) (at block 335). Remote computing system 140 may analyze the location data of the vehicles (as well as any received object detection data) (at block 340). At block 345, remote computing system 140 may identify one or more vehicles at, near, or approaching the alert condition location, based at least in part on the results of blocks 330 and 340. Remote computing system 140, at block 350, may subsequently send alert condition messages (e.g., alert condition message 195", or the like) to the computing systems corresponding to the identified vehicles (in this case, vehicles 105v and 105w, which are heading toward the alert condition location 165", but not vehicle 105x, which is going in the opposite direction, as well as away, from the alert condition location 165").
[0090] In addition, although not shown in Fig. 3C, a similar alert condition may exist for vehicle 105r in terms of the danger in passing vehicle 105s due to oncoming traffic in the form of vehicles 105t, 105u, 105v, and/or 105w, or the like, and the appropriate messages would be sent and received in a manner similar to that described above with respect to vehicles 105v and/or 105w attempting to pass vehicle 105u due to oncoming traffic in the form of vehicles 105r and/or 105s, or the like.
[0091] In some embodiments, Fig. 3A (and 3B) illustrates an example of "object detection to the location" (in this case, where there is a car accident), while Fig. 3C illustrates an example of "location to object detection" (in this case, approaching a dangerous pass zone). The object detection model may detect the car accident and may send its GPS location (or other location data) to the remote computing system (and corresponding database(s)). There may be a running service on the server that is configured to push this information of the car accident to other (nearby) users, for example, to their mobile phones or other user devices, or the like. When the other cars are approaching the alert condition location(s) and/or entering some (encompassing) region (e.g., as determined by the location signals, GPS in this example), mobile phones or other user devices of users in those other cars may show the warning message(s) to notify these users that there is a car accident ahead.
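One non-limiting way to implement the "entering some (encompassing) region" test above is a circular geofence around the alert condition location; the sketch below assumes GPS fixes in decimal degrees and a hypothetical default radius of 500 m:

```python
import math

EARTH_RADIUS_M = 6_371_000.0

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two GPS fixes (decimal degrees)."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))

def entered_region(v_lat, v_lon, alert_lat, alert_lon, radius_m=500.0):
    """True when a vehicle's GPS fix falls inside the circular region
    encompassing the alert condition location (the radius is an assumption)."""
    return haversine_m(v_lat, v_lon, alert_lat, alert_lon) <= radius_m
```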
[0092] Once the car accident has been cleared (as shown in Fig. 3B), and as observed by some number of the vehicles' object detection systems (e.g., 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, or the like, with the number of systems corresponding to a like number of vehicles), the remote computing system may update its database(s) accordingly and may push updated information to the other users so that they would no longer be warned when they are approaching the affected (now, cleared) location(s).
[0093] According to some embodiments, Fig. 3C illustrates an example of "location to object detection" when the occluded object (in this case, oncoming traffic) may be approaching. As shown in Fig. 3C, for the drivers of vehicles 105v and 105w, it is extremely difficult (if not impossible) to see or detect the occluded objects (in this case, vehicles 105s and 105r approaching from the opposite direction on the adjacent lane of the two-lane road). However, such information is important for the drivers to know. If such information could be known to these drivers earlier and/or if the user devices (e.g., mobile phones) of these drivers could draw virtual images of the oncoming vehicles, it would better prepare these drivers (and would likely discourage them from attempting to dangerously pass vehicle 105u). Further, once the location(s) is(are) (or has(have) become) consistent with the object detection result(s), they may then be fused together to enhance the user experience (particularly in terms of the alert condition presentation, or the like).
[0094] These and other functions of the examples 300, 300', and 300" (and the components of Figs. 3A-3C) are described in greater detail herein with respect to Figs. 1, 2, and 4.
[0095] Figs. 4A-4F (collectively, "Fig. 4") are flow diagrams illustrating a method 400 for implementing object detection using fusion of vision, location, and/or other signals, and/or implementing alert condition messaging based at least in part on object detection using fusion of vision, location, and/or other signals, in accordance with various embodiments. Method 400 of Fig. 4A either continues onto Fig. 4C following the circular marker denoted, "A," or continues onto Fig. 4D following the circular marker denoted, "B." Method 400 of Fig. 4E continues onto Fig. 4F following the circular marker denoted, "C." In some cases, method 400 of Fig. 4F may return to Fig. 4E following the circular marker denoted, "D."
[0096] While the techniques and procedures are depicted and/or described in a certain order for purposes of illustration, it should be appreciated that certain procedures may be reordered and/or omitted within the scope of various embodiments. Moreover, while the method 400 illustrated by Fig. 4 can be implemented by or with (and, in some cases, is described below with respect to) the systems, examples, or embodiments 100, 200, 300, 300', and 300" of Figs. 1, 2, 3A, 3B, and 3C, respectively (or components thereof), such methods may also be implemented using any suitable hardware (or software) implementation. Similarly, while each of the systems, examples, or embodiments 100, 200, 300, 300', and 300" of Figs. 1, 2, 3A, 3B, and 3C, respectively (or components thereof), can operate according to the method 400 illustrated by Fig. 4 (e.g., by executing instructions embodied on a computer readable medium), the systems, examples, or embodiments 100, 200, 300, 300', and 300" of Figs. 1, 2, 3A, 3B, and 3C can each also operate according to other modes of operation and/or perform other suitable procedures.
[0097] In the non- limiting embodiment of Fig. 4A, method 400, at block 402, may comprise determining, using a computing system, at least one of a first current vehicle location or a first predicted path of a vehicle, based at least in part on one or more first location-based data from corresponding one or more first location data signals received from one or more different types of location data signal sources.
[0098] In some embodiments, the computing system may include, without limitation, at least one of a data signal fusing computing system, at least one processor on the user device, at least one processor on a mobile device associated with a user, a vehicle-based computing system, an object detection system, or a driver assistance system, and/or the like. According to some embodiments, the one or more different types of location data signal sources may each include, but is not limited to, one of a global positioning system ("GPS") device, a global navigation satellite system ("GNSS") device, a text recognition-based location identification system, an image recognition-based landmark identification system, a telecommunications signal triangulation-based location identification system, a radar-based location identification system, a sonar-based location identification system, or a lidar-based location identification system, and/or the like. In some cases, the vehicle may include, without limitation, one of a car, a minivan, a pickup truck, a motorcycle, an all-terrain vehicle, a scooter, a police vehicle, a fire engine, an ambulance, a recreational vehicle, a bus, a commercial van, a commercial truck, a semi-tractor-trailer truck, a boat, a ship, a submersible, an amphibious vehicle, an aircraft, a space vehicle, a satellite, an autonomous vehicle, or a drone (including, but not limited to, one of an aerial drone, a land-based drone, a water-based drone, an amphibious drone, or a space-based drone, and/or the like), and/or the like.
[0099] In some instances, determining the at least one of the first current vehicle location or the first predicted path of a vehicle may comprise at least one of: determining, using the computing system, the first current vehicle location based at least in part on GPS data from the GPS device; determining, using the computing system, the first predicted path of the vehicle based at least in part on a series of GPS data from the GPS device over time; determining, using the computing system, the first current vehicle location based at least in part on GNSS data from the GNSS device; determining, using the computing system, the first predicted path of the vehicle based at least in part on a series of GNSS data from the GNSS device over time; determining, using the computing system, the first current vehicle location based at least in part on text recognition of one or more location-identifying signs, the one or more location-identifying signs comprising at least one of one or more street signs, one or more address signs, one or more business signs, one or more highway signs, one or more city limits signs, one or more county boundary signs, one or more state boundary signs, one or more province boundary signs, one or more territory boundary signs, one or more regional boundary signs, one or more national border signs, one or more landmark identification signs, one or more distance to destination signs, one or more route markers, one or more highway location markers, one or more driver location signs, or one or more mile markers, and/or the like; determining, using the computing system, the first predicted path of the vehicle based at least in part on text recognition of a plurality of location-identifying signs over time; determining, using the computing system, the first current vehicle location based at least in part on image recognition of one or more landmarks, the one or more landmarks comprising at least one of one or more cityscapes, one or more skylines, one or more unique buildings, one or more franchise buildings, one or more unique street views, one or more overlooks, one or more scenic locations, one or more mountain ranges, one or more individual mountains, one or more bodies of water, one or more manmade monuments, one or more landscape art pieces, or one or more unique structures, and/or the like; determining, using the computing system, the first predicted path of the vehicle based at least in part on image recognition of a plurality of landmarks over time; determining, using the computing system, the first current vehicle location based at least in part on analysis of signal strength of telecommunications signals from a plurality of stationary telecommunications transceivers; determining, using the computing system, the first predicted path of the vehicle based at least in part on analysis of changes in signal strength of telecommunications signals from the plurality of stationary telecommunications transceivers over time; determining, using the computing system, the first current vehicle location based at least in part on analysis of radar signal data; determining, using the computing system, the first predicted path of the vehicle based at least in part on analysis of changes in radar signal data over time; determining, using the computing system, the first current vehicle location based at least in part on analysis of sonar signal data; determining, using the computing system, the first predicted path of the vehicle based at least in part on analysis of changes in sonar 
signal data over time; determining, using the computing system, the first current vehicle location based at least in part on analysis of lidar signal data; or determining, using the computing system, the first predicted path of the vehicle based at least in part on analysis of changes in lidar signal data over time.
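Merely by way of illustration, the predicted path determination from "a series of GPS data from the GPS device over time" may be approximated, in a deliberately simplified and hypothetical form, by differencing recent timestamped fixes; a production system would smooth over many fixes and/or snap the prediction to the road network:

```python
import math

def predict_path(fixes, horizon_s=10.0):
    """Estimate heading (degrees clockwise from north) and a short
    straight-line extrapolated position from the two most recent
    timestamped GPS fixes [(t_s, lat, lon), ...]. Simplified sketch only."""
    (t0, lat0, lon0), (t1, lat1, lon1) = fixes[-2], fixes[-1]
    dt = t1 - t0
    if dt <= 0:
        raise ValueError("fixes must have increasing timestamps")
    dlat, dlon = (lat1 - lat0) / dt, (lon1 - lon0) / dt  # degrees per second
    east = dlon * math.cos(math.radians(lat1))           # scale longitude rate
    heading = math.degrees(math.atan2(east, dlat)) % 360.0
    return heading, (lat1 + dlat * horizon_s, lon1 + dlon * horizon_s)
```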
[0100] At block 404, method 400 may comprise sending, using the computing system, a first communication regarding the determined at least one of the first current vehicle location or the first predicted path of the vehicle to a remote computing system over one or more networks. In some instances, the remote computing system may include, but is not limited to, at least one of a remote data signal fusing computing system, a remote object detection system, a remote driver assistance system, a server computer over the one or more networks, an image processing server, a graphics processing unit ("GPU")-based server, a positioning and mapping server, a machine learning system, an artificial intelligence ("AI") system, a deep learning system, a neural network, a convolutional neural network ("CNN"), a fully convolutional network ("FCN"), a cloud computing system, or a distributed computing system, and/or the like.
[0101] Method 400 may further comprise, at block 406, in response to existence of at least one first alert condition for corresponding at least one first alert condition location that is in proximity to one or more of the determined at least one of the first current vehicle location or the first predicted path of the vehicle or that is within a first region encompassing the determined at least one of the first current vehicle location or the first predicted path of the vehicle, receiving, using the computing system, a second communication regarding the at least one first alert condition for the corresponding at least one first alert condition location from the remote computing system over the one or more networks.
[0102] Merely by way of example, in some cases, the at least one first alert condition may each include, without limitation, at least one of traffic congestion along the first predicted path of the vehicle potentially causing a slow-down, a traffic accident along the first predicted path of the vehicle potentially causing a slow-down, a construction site along the first predicted path of the vehicle potentially causing a slow-down, one or more people along the first predicted path of the vehicle who are occluded from a perspective of the vehicle, one or more animals along the first predicted path of the vehicle who are occluded from the perspective of the vehicle, one or more objects along the first predicted path of the vehicle that are occluded from the perspective of the vehicle, one or more people potentially blocking the first predicted path of the vehicle, one or more animals potentially blocking the first predicted path of the vehicle, one or more objects potentially blocking the first predicted path of the vehicle, a tracked weather event along or near the first predicted path of the vehicle, a natural hazard potentially blocking the first predicted path of the vehicle, a manmade hazard potentially blocking the first predicted path of the vehicle, one or more people potentially intercepting the vehicle along the first predicted path, one or more animals potentially intercepting the vehicle along the first predicted path, one or more objects potentially intercepting the vehicle along the first predicted path, or one or more other vehicles potentially intercepting the vehicle along the first predicted path, and/or the like.
[0103] At block 408, method 400 may comprise determining whether the vehicle is approaching the at least one first alert condition location based at least in part on one or more second location-based data from corresponding one or more second location data signals received from the one or more different types of location data signal sources. If so, method 400 may comprise performing the following tasks: receiving, using the computing system, one or more first object detection data signals from one or more different types of object detection data signal sources (block 410); fusing, using the computing system, the at least one first alert condition with one or more of the received one or more first object detection data signals, the at least one first alert condition location, a second current vehicle location of the vehicle, or a second predicted path of the vehicle to generate first fused data (block 412); and generating and presenting, using the computing system and via one or more user devices, a first alert message indicating that the vehicle is approaching the at least one first alert condition, based at least in part on the generated first fused data (block 414).
[0104] In some cases, at least one of the second current vehicle location or the second predicted path of the vehicle may be determined based at least in part on the one or more second location-based data from the corresponding one or more second data signals received from the one or more different types of location data signal sources. In some embodiments, the one or more different types of object detection data signal sources may each include, but is not limited to, one of a vision-based object detection system, a radar-based object detection system, a sonar-based object detection system, or a lidar-based object detection system, and/or the like. In some instances, each user device, which may be associated with at least one of a user or the vehicle, may include, without limitation, at least one of a smartphone, a tablet computer, a display device, an augmented reality ("AR") device, a virtual reality ("VR") device, a mixed reality ("MR") device, a vehicle console display, a vehicle heads-up display ("HUD"), a vehicle remote controller display, one or more audio speakers, or one or more haptic response devices, and/or the like. In some cases, the user device(s) may be disposed within or on the vehicle or may be associated with the vehicle yet located outside the vehicle (in some cases, at a remote location relative to the location of the vehicle, such as in the case of a user device for controlling a drone or a user device that is communicatively coupled with an autonomous vehicle, or the like).
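Merely by way of illustration, the fusing of block 412 might, in a simplified and hypothetical form, look like the following; the detection format (a dict with a "class" key) and the field names are assumptions, and the AlertCondition type is reused from the earlier sketch:

```python
def fuse_alert(alert, detections, current_location, predicted_path):
    """Illustrative fusion of blocks 410-412: combine the server-pushed
    alert condition with locally sensed object detections and the
    vehicle's own (second) location/predicted path into one record."""
    return {
        "condition": alert.kind,
        "condition_location": (alert.lat, alert.lon),
        "vehicle_location": current_location,
        "predicted_path": predicted_path,
        # keep only local detections that corroborate the pushed alert,
        # e.g. occluded oncoming vehicles near the reported location
        "corroborating_detections": [
            d for d in detections if d.get("class") == alert.kind
        ],
    }
```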
[0105] Method 400 may either continue onto the process at block 424 in Fig. 4C following the circular marker denoted, "A," or continue onto the process at block 430 in Fig. 4D following the circular marker denoted, "B."
[0106] Referring to Fig. 4B, generating and presenting the first alert message (at block 414) may comprise at least one of: generating a graphical display depicting one or more of the at least one first alert condition or the generated first fused data, and presenting the generated graphical display on a display device (block 416); generating a text-based message depicting one or more of the at least one first alert condition or the generated first fused data, and presenting the text-based message on a display device (block 418); generating at least one message regarding one or more of the at least one first alert condition or the generated first fused data, and sending the at least one message to the user device (block 420); or generating at least one audio message regarding one or more of the at least one first alert condition or the generated first fused data, and sending the at least one audio message to the user device for playback on at least one audio speaker (block 422); and/or the like. In some cases, the at least one message may include, but is not limited to, at least one of an e-mail message, a short message service ("SMS") app, a multimedia messaging service ("MMS") app, or a text message app, and/or the like.
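A hypothetical dispatch over these presentation modes (blocks 416-422) might look like the following; the device capability flags and methods shown are assumptions for illustration, not an actual device API:

```python
def deliver_alert(fused, device):
    """Illustrative dispatch over the presentation modes of blocks 416-422;
    the device capability flags and methods are assumptions."""
    text = f"Warning: {fused['condition']} ahead - proceed with caution."
    if getattr(device, "has_display", False):
        device.show_graphic(fused)   # block 416: graphical depiction
        device.show_text(text)       # block 418: on-screen text
    if getattr(device, "has_messaging", False):
        device.send_message(text)    # block 420: e-mail/SMS/MMS/text app
    if getattr(device, "has_speaker", False):
        device.play_audio(text)      # block 422: spoken audio message
```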
[0107] At block 424 in Fig. 4C (following the circular marker denoted, "A"), method 400 may comprise, at or near the at least one first alert condition location, receiving, using the computing system, one or more second object detection data signals from the one or more different types of object detection data signal sources; analyzing, using the computing system, the one or more second object detection data signals (block 426); and based on a determination that the at least one first alert condition is no longer present at the at least one first alert condition location, sending, using the computing system, a third communication to the remote computing system over the one or more networks indicating that the at least one first alert condition is no longer present at the at least one first alert condition location (block 428).
[0108] At block 430 in Fig. 4D (following the circular marker denoted, "B"), method 400 may comprise receiving, using the computing system, one or more third object detection data signals from the one or more different types of object detection data signal sources; analyzing, using the computing system, the one or more third object detection data signals (block 432); based on a determination that the one or more third object detection data signals correspond to at least one second alert condition, determining, using the computing system, at least one of a third current vehicle location or at least one second alert condition location corresponding to the at least one second alert condition, based at least in part on one or more third location-based data from corresponding one or more third data signals received from the one or more different types of location data signal sources (block 434); and sending, using the computing system, a fourth communication to the remote computing system over the one or more networks indicating that the at least one second alert condition has been detected at or near the at least one of the third current vehicle location or the at least one second alert condition location (block 436).
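Merely by way of illustration, this client-side detect-and-report flow (blocks 430-436) might, under stated assumptions, be sketched as follows; classify, get_location, and send_to_server are placeholders for whatever components a given deployment actually uses:

```python
def report_new_alert(detections, classify, get_location, send_to_server):
    """Illustrative flow of blocks 430-436: analyze local object detection
    signals; when they match a known alert pattern, tag them with the
    current fix and report upstream (the 'fourth communication')."""
    condition = classify(detections)   # blocks 430-432
    if condition is None:              # no alert condition recognized
        return
    lat, lon = get_location()          # block 434
    send_to_server({                   # block 436
        "condition": condition,
        "lat": lat,
        "lon": lon,
    })
```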
[0109] With reference to the non-limiting embodiment of Fig. 4E, method 400, at block 438, may comprise receiving, using a remote computing system and from each of one or more first computing systems associated with corresponding one or more first vehicles among a plurality of vehicles and over one or more networks, one or more first communications indicating that a first alert condition has been detected at or near one or more of a first current vehicle location of each of the one or more first vehicles or a first alert condition location corresponding to the first alert condition. Here, the one or more first communications may, in some cases, correspond to the fourth communication in Fig. 4D, albeit from one or more computing systems associated with corresponding one or more vehicles.
[0110] In some embodiments, the one or more first computing systems and the one or more second computing systems may each include, without limitation, at least one of a data signal fusing computing system, at least one processor on the user device, at least one processor on a mobile device associated with a user, a vehicle-based computing system, an object detection system, or a driver assistance system, and/or the like. In some instances, the remote computing system may include, but is not limited to, at least one of a remote data signal fusing computing system, a remote object detection system, a remote driver assistance system, a server computer over the one or more networks, an image processing server, a graphics processing unit ("GPU")-based server, a positioning and mapping server, a machine learning system, an artificial intelligence ("AI") system, a deep learning system, a neural network, a convolutional neural network ("CNN"), a fully convolutional network ("FCN"), a cloud computing system, or a distributed computing system, and/or the like. In some cases, the plurality of vehicles may each include, without limitation, one of a car, a minivan, a pickup truck, a motorcycle, an all-terrain vehicle, a scooter, a police vehicle, a fire engine, an ambulance, a recreational vehicle, a bus, a commercial van, a commercial truck, a semi-tractor-trailer truck, a boat, a ship, a submersible, an amphibious vehicle, an aircraft, a space vehicle, a satellite, an autonomous vehicle, or a drone (including, but not limited to, one of an aerial drone, a land-based drone, a water-based drone, an amphibious drone, or a space-based drone, and/or the like), and/or the like.
[0111] According to some embodiments, the first alert condition may include, but is not limited to, at least one of traffic congestion along the second predicted path of the at least one second vehicle potentially causing a slow-down, a traffic accident along the second predicted path of the at least one second vehicle potentially causing a slow-down, a construction site along the second predicted path of the at least one second vehicle potentially causing a slow-down, one or more people along the second predicted path of the at least one second vehicle who are occluded from a perspective of the at least one second vehicle, one or more animals along the second predicted path of the at least one second vehicle who are occluded from the perspective of the at least one second vehicle, one or more objects along the second predicted path of the at least one second vehicle that are occluded from the perspective of the at least one second vehicle, one or more people potentially blocking the second predicted path of the at least one second vehicle, one or more animals potentially blocking the second predicted path of the at least one second vehicle, one or more objects potentially blocking the second predicted path of the at least one second vehicle, a tracked weather event along or near the second predicted path of the at least one second vehicle, a natural hazard potentially blocking the second predicted path of the at least one second vehicle, a manmade hazard potentially blocking the second predicted path of the at least one second vehicle, one or more people potentially intercepting the at least one second vehicle along the second predicted path, one or more animals potentially intercepting the at least one second vehicle along the second predicted path, one or more objects potentially intercepting the at least one second vehicle along the second predicted path, or one or more other vehicles potentially intercepting the at least one second vehicle along the second predicted path, and/or the like.
[0112] At block 440, method 400 may comprise receiving, using the remote computing system and from each of one or more second computing systems associated with corresponding one or more second vehicles among the plurality of vehicles and over the one or more networks, one or more second communications regarding at least one of a second current vehicle location or a second predicted path for each of the one or more second vehicles. Here, the one or more second communications may, in some cases, correspond to the first communication in Fig. 4A, albeit from one or more computing systems associated with corresponding one or more vehicles.
[0113] Method 400 may further comprise, at block 442, analyzing, using the remote computing system, the at least one of the second current vehicle location or the second predicted path for each of the one or more second vehicles in relation to at least one of the first alert condition or the first alert condition location. At block 444, method 400 may comprise determining whether at least one second vehicle among the one or more second vehicles is in proximity to or approaching the first alert condition location or is within a first region encompassing the first alert condition, based at least in part on the analysis. If so, method 400 may further comprise sending, using the remote computing system, one or more third communications to each of the second computing systems associated with each of the corresponding at least one second vehicle indicating that said second vehicle is in proximity to or approaching the first alert condition location (block 446). Here, the one or more third communications may, in some cases, correspond to the second communication in Fig. 4A, albeit to one or more computing systems associated with corresponding one or more vehicles.
[0114] Method 400 may continue onto the process at block 448 in Fig. 4F following the circular marker denoted, "C."
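Merely by way of illustration, the approach determination of blocks 442-446 might, in a simplified and hypothetical form, compare the vehicle's heading with the bearing toward the alert condition location (reusing the illustrative VehicleState and AlertCondition types from the sketch following paragraph [0089]); the 45 degree tolerance is an assumption:

```python
import math

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial bearing from fix 1 to fix 2, degrees clockwise from north."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dl = math.radians(lon2 - lon1)
    y = math.sin(dl) * math.cos(p2)
    x = math.cos(p1) * math.sin(p2) - math.sin(p1) * math.cos(p2) * math.cos(dl)
    return math.degrees(math.atan2(y, x)) % 360.0

def is_approaching(v, alert, max_angle_deg=45.0):
    """True when the vehicle's heading points toward the alert condition
    location within a tolerance, filtering out vehicles heading away,
    such as vehicle 105x in Fig. 3C."""
    b = bearing_deg(v.lat, v.lon, alert.lat, alert.lon)
    diff = abs((v.heading_deg - b + 180.0) % 360.0 - 180.0)
    return diff <= max_angle_deg
```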
[0115] At block 448 in Fig. 4F (following the circular marker denoted, "C"), method 400 may comprise receiving, using the remote computing system and from each of one or more third computing systems associated with corresponding one or more third vehicles among the plurality of vehicles and over the one or more networks, one or more fourth communications indicating that the first alert condition is no longer present at the first alert condition location; receiving, using the remote computing system and from each of one or more fourth computing systems associated with corresponding one or more fourth vehicles among the plurality of vehicles and over the one or more networks, one or more fifth communications regarding at least one of a fourth current vehicle location or a fourth predicted path for each of the one or more fourth vehicles (block 450); and analyzing, using the remote computing system, the at least one of the fourth current vehicle location or the fourth predicted path for each of the one or more fourth vehicles in relation to at least one of the first alert condition or the first alert condition location (block 452). Here, the one or more fourth communications may, in some cases, correspond to the third communication in Fig. 4C, albeit from one or more computing systems associated with corresponding one or more vehicles.
[0116] At block 454, method 400 may comprise determining whether at least one fourth vehicle among the one or more fourth vehicles is in proximity to or approaching the first alert condition location or is within the first region encompassing the first alert condition, based at least in part on the analysis. If so, at block 456, method 400 may further comprise determining whether the one or more fourth communications from a threshold number of third computing systems associated with the corresponding one or more third vehicles have been received. If so, method 400 may further comprise, at block 458, sending, using the remote computing system, one or more fifth communications to each of the fourth computing systems associated with each of the corresponding at least one fourth vehicle indicating that the first alert condition is no longer present at the first alert condition location. If not, method 400 may return to the process at block 446 in Fig. 4E, following the circular marker denoted, "D."
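Merely by way of illustration, the threshold test of blocks 454-458 might be sketched as follows; the threshold of three reports is an assumption drawn from the example counts given with respect to Fig. 3B, and the storage and transport names are hypothetical:

```python
from collections import defaultdict

CLEAR_THRESHOLD = 3  # assumed number of independent reports (block 456)

clear_reports = defaultdict(set)  # alert_id -> vehicle ids reporting "cleared"

def on_cleared_report(alert_id, vehicle_id, active_alerts, push_all_clear):
    """Illustrative handling of blocks 448-458: retire an alert only after
    enough distinct vehicles have independently observed that it is no
    longer present; until then, approaching vehicles keep being warned
    (the return along circular marker 'D')."""
    clear_reports[alert_id].add(vehicle_id)
    if len(clear_reports[alert_id]) >= CLEAR_THRESHOLD:
        active_alerts.pop(alert_id, None)  # block 458: stop warning for it
        push_all_clear(alert_id)           # notify previously warned vehicles
        del clear_reports[alert_id]
```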
[0117] Examples of System and Hardware Implementation
[0118] Fig. 5 is a block diagram illustrating an example of computer or system hardware architecture, in accordance with various embodiments. Fig. 5 provides a schematic illustration of one embodiment of a computer system 500 of the service provider system hardware that can perform the methods provided by various other embodiments, as described herein, and/or can perform the functions of computer or hardware system (i.e., computing systems 110 and 170a-170n, user devices 115, 175, and 175a-175n, remote computing systems 140, location determination server 150, and image recognition server 155, etc.), as described above. It should be noted that Fig. 5 is meant only to provide a generalized illustration of various components, of which one or more (or none) of each may be utilized as appropriate. Fig. 5, therefore, broadly illustrates how individual system elements may be implemented in a relatively separated or relatively more integrated manner.
[0119] The computer or hardware system 500 - which might represent an embodiment of the computer or hardware system (i.e., computing systems 110 and 170a-170n, user devices 115, 175, and 175a-175n, remote computing systems 140, location determination server 150, and image recognition server 155, etc.), described above with respect to Figs. 1-4 - is shown comprising hardware elements that can be electrically coupled via a bus 505 (or may otherwise be in communication, as appropriate). The hardware elements may include one or more processors 510, including, without limitation, one or more general-purpose processors and/or one or more special-purpose processors (such as microprocessors, digital signal processing chips, graphics acceleration processors, and/or the like); one or more input devices 515, which can include, without limitation, a mouse, a keyboard, and/or the like; and one or more output devices 520, which can include, without limitation, a display device, a printer, and/or the like.
[0120] The computer or hardware system 500 may further include (and/or be in communication with) one or more storage devices 525, which can comprise, without limitation, local and/or network accessible storage, and/or can include, without limitation, a disk drive, a drive array, an optical storage device, a solid-state storage device such as a random access memory ("RAM") and/or a read-only memory ("ROM"), which can be programmable, flash-updateable, and/or the like. Such storage devices may be configured to implement any appropriate data stores, including, without limitation, various file systems, database structures, and/or the like.
[0121] The computer or hardware system 500 might also include a communications subsystem 530, which can include, without limitation, a modem, a network card (wireless or wired), an infra-red communication device, a wireless communication device and/or chipset (such as a Bluetooth™ device, an 802.11 device, a WiFi device, a WiMax device, a WWAN device, cellular communication facilities, etc.), and/or the like. The communications subsystem 530 may permit data to be exchanged with a network (such as the network described below, to name one example), with other computer or hardware systems, and/or with any other devices described herein. In many embodiments, the computer or hardware system 500 will further comprise a working memory 535, which can include a RAM or ROM device, as described above.
[0122] The computer or hardware system 500 also may comprise software elements, shown as being currently located within the working memory 535, including an operating system 540, device drivers, executable libraries, and/or other code, such as one or more application programs 545, which may comprise computer programs provided by various embodiments (including, without limitation, hypervisors, VMs, and the like), and/or may be designed to implement methods, and/or configure systems, provided by other embodiments, as described herein. Merely by way of example, one or more procedures described with respect to the method(s) discussed above might be implemented as code and/or instructions executable by a computer (and/or a processor within a computer); in an aspect, then, such code and/or instructions can be used to configure and/or adapt a general purpose computer (or other device) to perform one or more operations in accordance with the described methods.
[0123] A set of these instructions and/or code might be encoded and/or stored on a non-transitory computer readable storage medium, such as the storage device(s) 525 described above. In some cases, the storage medium might be incorporated within a computer system, such as the system 500. In other embodiments, the storage medium might be separate from a computer system (i.e., a removable medium, such as a compact disc, etc.), and/or provided in an installation package, such that the storage medium can be used to program, configure, and/or adapt a general purpose computer with the instructions/code stored thereon. These instructions might take the form of executable code, which is executable by the computer or hardware system 500 and/or might take the form of source and/or installable code, which, upon compilation and/or installation on the computer or hardware system 500 (e.g., using any of a variety of generally available compilers, installation programs, compression/decompression utilities, etc.) then takes the form of executable code.
[0124] It will be apparent to those skilled in the art that substantial variations may be made in accordance with particular requirements. For example, customized hardware (such as programmable logic controllers, field-programmable gate arrays, application-specific integrated circuits, and/or the like) might also be used, and/or particular elements might be implemented in hardware, software (including portable software, such as applets, etc.), or both. Further, connection to other computing devices such as network input/output devices may be employed.
[0125] As mentioned above, in one aspect, some embodiments may employ a computer or hardware system (such as the computer or hardware system 500) to perform methods in accordance with various embodiments of the invention.
According to a set of embodiments, some or all of the procedures of such methods are performed by the computer or hardware system 500 in response to processor 510 executing one or more sequences of one or more instructions (which might be incorporated into the operating system 540 and/or other code, such as an application program 545) contained in the working memory 535. Such instructions may be read into the working memory 535 from another computer readable medium, such as one or more of the storage device(s) 525. Merely by way of example, execution of the sequences of instructions contained in the working memory 535 might cause the processor(s) 510 to perform one or more procedures of the methods described herein.
[0126] The terms "machine readable medium" and "computer readable medium," as used herein, refer to any medium that participates in providing data that causes a machine to operate in some fashion. In an embodiment implemented using the computer or hardware system 500, various computer readable media might be involved in providing instructions/code to processor(s) 510 for execution and/or might be used to store and/or carry such instructions/code (e.g., as signals). In many implementations, a computer readable medium is a non-transitory, physical, and/or tangible storage medium. In some embodiments, a computer readable medium may take many forms, including, but not limited to, non-volatile media, volatile media, or the like. Non-volatile media includes, for example, optical and/or magnetic disks, such as the storage device(s) 525. Volatile media includes, without limitation, dynamic memory, such as the working memory 535. In some alternative embodiments, a computer readable medium may take the form of transmission media, which includes, without limitation, coaxial cables, copper wire, and fiber optics, including the wires that comprise the bus 505, as well as the various components of the communication subsystem 530 (and/or the media by which the communications subsystem 530 provides communication with other devices). In an alternative set of embodiments, transmission media can also take the form of waves (including without limitation radio, acoustic, and/or light waves, such as those generated during radio-wave and infra-red data communications).
[0127] Common forms of physical and/or tangible computer readable media include, for example, a floppy disk, a flexible disk, a hard disk, magnetic tape, or any other magnetic medium, a CD-ROM, any other optical medium, punch cards, paper tape, any other physical medium with patterns of holes, a RAM, a PROM, an EPROM, a FLASH-EPROM, any other memory chip or cartridge, a carrier wave as described hereinafter, or any other medium from which a computer can read instructions and/or code.
[0128] Various forms of computer readable media may be involved in carrying one or more sequences of one or more instructions to the processor(s) 510 for execution. Merely by way of example, the instructions may initially be carried on a magnetic disk and/or optical disc of a remote computer. A remote computer might load the instructions into its dynamic memory and send the instructions as signals over a transmission medium to be received and/or executed by the computer or hardware system 500. These signals, which might be in the form of electromagnetic signals, acoustic signals, optical signals, and/or the like, are all examples of carrier waves on which instructions can be encoded, in accordance with various embodiments of the invention.
[0129] The communications subsystem 530 (and/or components thereof) generally will receive the signals, and the bus 505 then might carry the signals (and/or the data, instructions, etc. carried by the signals) to the working memory 535, from which the processor(s) 510 retrieves and executes the instructions. The instructions received by the working memory 535 may optionally be stored on a storage device 525 either before or after execution by the processor(s) 510.
[0130] As noted above, a set of embodiments comprises methods and systems for implementing driver assistance technologies (e.g., advanced driver assistance systems ("ADASs"), other vision-based object detection, other location signal-based object detection, other vision and location signal-based object detection, or multiple types of signal-based object detection, or the like), and, more particularly, to methods, systems, and apparatuses for implementing object detection using fusion of vision, location, and/or other signals, and/or implementing alert condition messaging based at least in part on object detection using fusion of vision, location, and/or other signals. Fig. 6 illustrates a schematic diagram of a system 600 that can be used in accordance with one set of embodiments. The system 600 can include one or more user computers, user devices, or customer devices 605. A user computer, user device, or customer device 605 can be a general purpose personal computer (including, merely by way of example, desktop computers, tablet computers, laptop computers, handheld computers, and the like, running any appropriate operating system, several of which are available from vendors such as Apple, Microsoft Corp., and the like), cloud computing devices, a server(s), and/or a workstation computer(s) running any of a variety of commercially-available UNIX™ or UNIX-like operating systems. A user computer, user device, or customer device 605 can also have any of a variety of applications, including one or more applications configured to perform methods provided by various embodiments (as described above, for example), as well as one or more office applications, database client and/or server applications, and/or web browser applications. Alternatively, a user computer, user device, or customer device 605 can be any other electronic device, such as a thin-client computer, Internet-enabled mobile telephone, and/or personal digital assistant, capable of communicating via a network (e.g., the network(s) 610 described below) and/or of displaying and navigating web pages or other types of electronic documents. Although the system 600 is shown with two user computers, user devices, or customer devices 605, any number of user computers, user devices, or customer devices can be supported.
[0131] Some embodiments operate in a networked environment, which can include a network(s) 610. The network(s) 610 can be any type of network familiar to those skilled in the art that can support data communications using any of a variety of commercially-available (and/or free or proprietary) protocols, including, without limitation, TCP/IP, SNA™, IPX™, AppleTalk™, and the like. Merely by way of example, the network(s) 610 (similar to network(s) 145 of Figs. 1 and 2, or the like) can each include a local area network ("LAN"), including, without limitation, a fiber network, an Ethernet network, a Token-Ring™ network, and/or the like; a wide-area network ("WAN"); a wireless wide area network ("WWAN"); a virtual network, such as a virtual private network ("VPN"); the Internet; an intranet; an extranet; a public switched telephone network ("PSTN"); an infra-red network; a wireless network, including, without limitation, a network operating under any of the IEEE 802.11 suite of protocols, the Bluetooth™ protocol known in the art, and/or any other wireless protocol; and/or any combination of these and/or other networks.
In a particular embodiment, the network might include an access network of the service provider (e.g., an Internet service provider ("ISP")). In another embodiment, the network might include a core network of the service provider, and/or the Internet.
[0132] Embodiments can also include one or more server computers 615. Each of the server computers 615 may be configured with an operating system, including, without limitation, any of those discussed above, as well as any commercially (or freely) available server operating systems. Each of the servers 615 may also be running one or more applications, which can be configured to provide services to one or more clients 605 and/or other servers 615.
[0133] Merely by way of example, one of the servers 615 might be a data server, a web server, a cloud computing device(s), or the like, as described above. The data server might include (or be in communication with) a web server, which can be used, merely by way of example, to process requests for web pages or other electronic documents from user computers 605. The web server can also run a variety of server applications, including HTTP servers, FTP servers, CGI servers, database servers, Java servers, and the like. In some embodiments of the invention, the web server may be configured to serve web pages that can be operated within a web browser on one or more of the user computers 605 to perform methods of the invention.
[0134] The server computers 615, in some embodiments, might include one or more application servers, which can be configured with one or more applications accessible by a client running on one or more of the client computers 605 and/or other servers 615. Merely by way of example, the server(s) 615 can be one or more general purpose computers capable of executing programs or scripts in response to the user computers 605 and/or other servers 615, including, without limitation, web applications (which might, in some cases, be configured to perform methods provided by various embodiments). Merely by way of example, a web application can be implemented as one or more scripts or programs written in any suitable programming language, such as Java™, C, C#™ or C++, and/or any scripting language, such as Perl, Python, or TCL, as well as combinations of any programming and/or scripting languages. The application server(s) can also include database servers, including, without limitation, those commercially available from Oracle™, Microsoft™, Sybase™, IBM™, and the like, which can process requests from clients (including, depending on the configuration, dedicated database clients, API clients, web browsers, etc.) running on a user computer, user device, or customer device 605 and/or another server 615. In some embodiments, an application server can perform one or more of the processes for implementing driver assistance technologies, and, more particularly, to methods, systems, and apparatuses for implementing object detection using fusion of vision, location, and/or other signals, and/or implementing alert condition messaging based at least in part on object detection using fusion of vision, location, and/or other signals, as described in detail above. Data provided by an application server may be formatted as one or more web pages (comprising HTML, JavaScript, etc., for example) and/or may be forwarded to a user computer 605 via a web server (as described above, for example). Similarly, a web server might receive web page requests and/or input data from a user computer 605 and/or forward the web page requests and/or input data to an application server. In some cases, a web server may be integrated with an application server.
[0135] In accordance with further embodiments, one or more servers 615 can function as a file server and/or can include one or more of the files (e.g., application code, data files, etc.) necessary to implement various disclosed methods, incorporated by an application running on a user computer 605 and/or another server 615. Alternatively, as those skilled in the art will appreciate, a file server can include all necessary files, allowing such an application to be invoked remotely by a user computer, user device, or customer device 605 and/or server 615.
[0136] It should be noted that the functions described with respect to various servers herein (e.g., application server, database server, web server, file server, etc.) can be performed by a single server and/or a plurality of specialized servers, depending on implementation-specific needs and parameters.
[0137] In some embodiments, the system can include one or more databases 620a- 620n (collectively, "databases 620"). The location of each of the databases 620 is discretionary: merely by way of example, a database 620a might reside on a storage medium local to (and/or resident in) a server 615a (and/or a user computer, user device, or customer device 605). Alternatively, a database 620n can be remote from any or all of the computers 605, 615, so long as it can be in communication (e.g., via the network 610) with one or more of these. In a particular set of embodiments, a database 620 can reside in a storage-area network ("SAN") familiar to those skilled in the art. (Likewise, any necessary files for performing the functions attributed to the computers 605, 615 can be stored locally on the respective computer and/or remotely, as appropriate.) In one set of embodiments, the database 620 can be a relational database, such as an Oracle database, that is adapted to store, update, and retrieve data in response to SQL-formatted commands. The database might be controlled and/or maintained by a database server, as described above, for example.
[0138] According to some embodiments, system 600 might further comprise a first vehicle 625a among a plurality of vehicles (similar to vehicles 105 and 105a-105x of Figs. 1-3, or the like). System 600 may further comprise computing system 630 that may be disposed within or on vehicle 625 (similar to computing systems 110 of Figs. 1 and 2, or the like) or external to vehicle 625 yet associated therewith (not shown in Fig. 6; similar to computing systems 170a-170n of Fig. 1, or the like). Similarly, system 600 may further comprise user device(s) 635 that may be disposed within or on vehicle 625 (optional; similar to optional user device(s) 115 of Figs. 1 and 2, or the like) or external to vehicle 625 yet associated therewith and/or associated with a user (e.g., user devices 605, 605a, and/or 605b; similar to user devices 175 and 175a-175n of Figs. 1 and 2, or the like). Each vehicle 625 may further comprise one or more location sensors 640 (similar to location sensor(s) or location data signal source(s) 120 of Figs. 1 and 2, or the like) and one or more object detection sensors 645 (similar to object detection sensor(s) or object detection data signal source(s) 125 of Figs. 1 and 2, or the like). System 600 may further comprise remote computing system(s) 660 and corresponding database(s) 660a (similar to remote computing systems 140 and corresponding database(s) 140a of Figs. 1-3, or the like), location determination server 665 and corresponding database(s) 665a (similar to location determination server 150 and corresponding database(s) 150a of Fig. 1, or the like), and image recognition server 670 and corresponding database(s) 670a (similar to image recognition server 155 and corresponding database(s) 155a of Fig. 1, or the like).
[0139] In operation, computing system 630 and/or user device(s) 605 or 635 (collectively, "computing system") may determine at least one of a first current vehicle location (e.g., vehicle location 650a for corresponding vehicle 625 a, or the like) or a first predicted path of a vehicle (e.g., vehicle path 655a (denoted in Fig. 6 by the broad arrow, or the like) for vehicle 625a, or the like), based at least in part on one or more first location-based data (e.g., current location data 685, or the like) from corresponding one or more first location data signals received from one or more different types of location data signal sources (e.g., location sensor(s) 640, or the like). The computing system may send a first communication regarding the determined at least one of the first current vehicle location or the first predicted path of the vehicle (e.g., a message containing current location data 685, or the like) to a remote computing system (e.g., remote computing system 660, or the like) over one or more networks (e.g., network(s) 610, or the like).
[0140] In response to existence of at least one first alert condition (e.g., alert condition(s) 675, or the like) for corresponding at least one first alert condition location that is in proximity to one or more of the determined at least one of the first current vehicle location or the first predicted path of the vehicle or that is within a first region encompassing the determined at least one of the first current vehicle location or the first predicted path of the vehicle, the computing system may receive a second communication (e.g., alert condition data 695, or the like) regarding the at least one first alert condition (e.g., alert condition 675, or the like) for the corresponding at least one first alert condition location (e.g., alert condition location 680, or the like) from the remote computing system over the one or more networks.
[0141] Merely by way of example, in some cases, the at least one first alert condition (e.g., alert condition(s) 675, or the like) may each include, without limitation, at least one of traffic congestion along the first predicted path of the vehicle potentially causing a slow-down, a traffic accident along the first predicted path of the vehicle potentially causing a slow-down, a construction site along the first predicted path of the vehicle potentially causing a slow-down, one or more people along the first predicted path of the vehicle who are occluded from a perspective of the vehicle, one or more animals along the first predicted path of the vehicle who are occluded from the perspective of the vehicle, one or more objects along the first predicted path of the vehicle that are occluded from the perspective of the vehicle, one or more people potentially blocking the first predicted path of the vehicle, one or more animals potentially blocking the first predicted path of the vehicle, one or more objects potentially blocking the first predicted path of the vehicle, a tracked weather event along or near the first predicted path of the vehicle, a natural hazard potentially blocking the first predicted path of the vehicle, a manmade hazard potentially blocking the first predicted path of the vehicle, one or more people potentially intercepting the vehicle along the first predicted path, one or more animals potentially intercepting the vehicle along the first predicted path, one or more objects potentially intercepting the vehicle along the first predicted path, or one or more other vehicles potentially intercepting the vehicle along the first predicted path, and/or the like.
[0142] Based on a determination that the vehicle is approaching the at least one first alert condition location (e.g., alert condition location 680, or the like) based at least in part on one or more second location-based data from corresponding one or more second location data signals received from the one or more different types of location data signal sources, the computing system may perform the following tasks: receive one or more first object detection data signals (e.g., signals containing object detection data 690, or the like) from one or more different types of object detection data signal sources (e.g., object detection sensor(s) 645, or the like); fuse the at least one first alert condition with one or more of the received one or more first object detection data signals, the at least one first alert condition location, a second current vehicle location of the vehicle, or a second predicted path of the vehicle to generate first fused data; and generate and present, via one or more user devices (e.g., user device(s) 605, 605a, 605b, and/or 635, or the like), a first alert message indicating that the vehicle is approaching the at least one first alert condition, based at least in part on the generated first fused data. In some cases, at least one of the second current vehicle location or the second predicted path of the vehicle may be determined based at least in part on the one or more second location-based data from the corresponding one or more second data signals received from the one or more different types of location data signal sources (similar to the first current vehicle location or the first predicted path, or the like).
[0143] In some instances, determining the at least one of the first current vehicle location or the first predicted path of a vehicle may comprise the computing system performing at least one of: (1) determining the first current vehicle location based at least in part on GPS data from the GPS device; (2) determining the first predicted path of the vehicle based at least in part on a series of GPS data from the GPS device over time; (3) determining the first current vehicle location based at least in part on GNSS data from the GNSS device; (4) determining the first predicted path of the vehicle based at least in part on a series of GNSS data from the GNSS device over time; (5) determining the first current vehicle location based at least in part on text recognition of one or more location-identifying signs, the one or more location-identifying signs comprising at least one of one or more street signs, one or more address signs, one or more business signs, one or more highway signs, one or more city limits signs, one or more county boundary signs, one or more state boundary signs, one or more province boundary signs, one or more territory boundary signs, one or more regional boundary signs, one or more national border signs, one or more landmark identification signs, one or more distance to destination signs, one or more route markers, one or more highway location markers, one or more driver location signs, or one or more mile markers, and/or the like; (6) determining the first predicted path of the vehicle based at least in part on text recognition of a plurality of location-identifying signs over time; (7) determining the first current vehicle location based at least in part on image recognition of one or more landmarks, the one or more landmarks comprising at least one of one or more cityscapes, one or more skylines, one or more unique buildings, one or more franchise buildings, one or more unique street views, one or more overlooks, one or more scenic locations, one or more mountain ranges, one or more individual mountains, one or more bodies of water, one or more manmade monuments, one or more landscape art pieces, or one or more unique structures; (8) determining the first predicted path of the vehicle based at least in part on image recognition of a plurality of landmarks over time, and/or the like; (9) determining the first current vehicle location based at least in part on analysis of signal strength of telecommunications signals from a plurality of stationary telecommunications transceivers; (10) determining the first predicted path of the vehicle based at least in part on analysis of changes in signal strength of telecommunications signals from the plurality of stationary telecommunications transceivers over time; (11) determining the first current vehicle location based at least in part on analysis of radar signal data; (12) determining the first predicted path of the vehicle based at least in part on analysis of changes in radar signal data over time; (13) determining the first current vehicle location based at least in part on analysis of sonar signal data; (14) determining the first predicted path of the vehicle based at least in part on analysis of changes in sonar signal data over time; (15) determining the first current vehicle location based at least in part on analysis of lidar signal data; or (16) determining the first predicted path of the vehicle based at least in part on analysis of changes in lidar signal data over time.
[0144] According to some embodiments, generating and presenting the first alert message may comprise at least one of: generating a graphical display depicting one or more of the at least one first alert condition or the generated first fused data, and presenting the generated graphical display on a display device (e.g., user devices 605, 605a, 605b, and/or 635, or the like); generating a text-based message depicting one or more of the at least one first alert condition or the generated first fused data, and presenting the text-based message on a display device (e.g., user devices 605, 605a, 605b, and/or 635, or the like); generating at least one message regarding one or more of the at least one first alert condition or the generated first fused data, and sending the at least one message to the user device (e.g., user devices 605, 605a, 605b, and/or 635, or the like); or generating at least one audio message regarding one or more of the at least one first alert condition or the generated first fused data, and sending the at least one audio message to the user device for playback on at least one audio speaker (e.g., user devices 605, 605a, 605b, and/or 635, or the like); and/or the like. In some cases, the at least one message may include, but is not limited to, at least one of an e-mail message, a short message service ("SMS") message, a multimedia messaging service ("MMS") message, or a text message, and/or the like.
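The sketch below illustrates the dispatch pattern just described, fanning one alert message out to several presentation channels. The channel names and the present_alert function are hypothetical; a real implementation would drive a HUD or console display, a messaging app, or a text-to-speech engine rather than printing.

```python
def present_alert(message: str, channels=("display", "sms", "audio")) -> None:
    """Fan one alert message out to the selected presentation channels."""
    for channel in channels:
        if channel == "display":
            print(f"[HUD/console] {message}")    # graphical or text display
        elif channel == "sms":
            print(f"[SMS/MMS/e-mail] {message}") # message sent to user device
        elif channel == "audio":
            print(f"[speaker TTS] {message}")    # audio playback on a speaker
```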
[0145] In some embodiments, at or near the at least one first alert condition location, the computing system may receive one or more second object detection data signals from the one or more different types of object detection data signal sources. The computing system may analyze the one or more second object detection data signals. Based on a determination that the at least one first alert condition is no longer present at the at least one first alert condition location, the computing system may send a third communication to the remote computing system over the one or more networks indicating that the at least one first alert condition is no longer present at the at least one first alert condition location.
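A minimal sketch of this clearing check follows, assuming detections arrive as (label, confidence) pairs; check_condition_cleared, send_to_remote, and the 0.5 threshold are illustrative stand-ins for the application's analysis and network steps, not its actual interfaces.

```python
def check_condition_cleared(detections, condition_kind, min_confidence=0.5):
    """Return True when no current detection still corroborates the alert.

    `detections` is a list of (label, confidence) pairs; this shape is an
    illustrative assumption, not the application's actual data model.
    """
    return not any(label == condition_kind and conf >= min_confidence
                   for label, conf in detections)

def send_to_remote(payload: dict) -> None:
    """Placeholder for the 'third communication' over the network."""
    print("-> remote:", payload)

def report_if_cleared(detections, condition_kind, condition_location) -> None:
    if check_condition_cleared(detections, condition_kind):
        send_to_remote({"event": "condition_cleared",
                        "kind": condition_kind,
                        "location": condition_location})
```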
[0146] According to some embodiments, the computing system may receive one or more third object detection data signals from the one or more different types of object detection data signal sources. The computing system may analyze the one or more third object detection data signals. Based on a determination that the one or more third object detection data signals correspond to at least one second alert condition, the computing system may determine at least one of a third current vehicle location or at least one second alert condition location corresponding to the at least one second alert condition, based at least in part on one or more third location-based data from corresponding one or more third data signals received from the one or more different types of location data signal sources. The computing system may send a fourth communication to the remote computing system over the one or more networks indicating that the at least one second alert condition has been detected at or near the at least one of the third current vehicle location or the at least one second alert condition location.
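A companion sketch for the reverse direction follows: reporting a newly detected condition together with its location fix (the "fourth communication" above). classify_alert, its label set, and the 0.8 threshold are hypothetical placeholders for whatever detector the system actually uses.

```python
def classify_alert(detections):
    """Map (label, confidence) detections to an alert kind, or None.

    The labels and the 0.8 threshold are illustrative placeholders only.
    """
    for label, confidence in detections:
        if confidence >= 0.8 and label in {"accident", "road_debris",
                                           "stalled_vehicle"}:
            return label
    return None

def report_new_condition(detections, current_location) -> None:
    """Send an alert-detected report upstream (network send stubbed as print)."""
    kind = classify_alert(detections)
    if kind is not None:
        print("-> remote:", {"event": "condition_detected",
                             "kind": kind,
                             "location": current_location})
```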
[0147] In some aspects, remote computing system 660 (herein, simply "remote computing system") may receive, from each of one or more first computing systems associated with corresponding one or more first vehicles among a plurality of vehicles and over one or more networks, one or more first communications indicating that a first alert condition has been detected at or near one or more of a first current vehicle location of each of the one or more first vehicles or a first alert condition location corresponding to the first alert condition.

[0148] The remote computing system may receive, from each of one or more second computing systems associated with corresponding one or more second vehicles among the plurality of vehicles and over the one or more networks, one or more second communications regarding at least one of a second current vehicle location or a second predicted path for each of the one or more second vehicles. The remote computing system may analyze the at least one of the second current vehicle location or the second predicted path for each of the one or more second vehicles in relation to at least one of the first alert condition or the first alert condition location. Based on a determination that at least one second vehicle among the one or more second vehicles is in proximity to or approaching the first alert condition location or is within a first region encompassing the first alert condition, based at least in part on the analysis, the remote computing system may send one or more third communications to each of the second computing systems associated with each of the corresponding at least one second vehicle indicating that said second vehicle is in proximity to or approaching the first alert condition location.
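The proximity analysis in paragraph [0148] reduces to a distance test between each second vehicle's reported location (or any point on its predicted path) and the alert condition location. The sketch below uses great-circle distance with an assumed 500 m radius; the radius and function names are illustrative choices, not values from the application.

```python
import math

def haversine_m(a, b):
    """Great-circle distance in metres between two (lat, lon) points."""
    r = 6371000.0  # mean Earth radius in metres
    la1, lo1, la2, lo2 = map(math.radians, (*a, *b))
    h = (math.sin((la2 - la1) / 2) ** 2
         + math.cos(la1) * math.cos(la2) * math.sin((lo2 - lo1) / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(h))

def is_approaching(vehicle_loc, predicted_path, alert_loc, radius_m=500.0):
    """True if the vehicle or any predicted-path point falls within the radius."""
    points = [vehicle_loc] + list(predicted_path)
    return any(haversine_m(p, alert_loc) <= radius_m for p in points)
```

Testing the predicted path as well as the current fix lets the remote system warn a vehicle that is still far away but headed toward the alert condition location.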
[0149] In some embodiments, the remote computing system may receive, from each of one or more third computing systems associated with corresponding one or more third vehicles among the plurality of vehicles and over the one or more networks, one or more fourth communications indicating that the first alert condition is no longer present at the first alert condition location. The remote computing system may receive, from each of one or more fourth computing systems associated with corresponding one or more fourth vehicles among the plurality of vehicles and over the one or more networks, one or more fifth communications regarding at least one of a fourth current vehicle location or a fourth predicted path for each of the one or more fourth vehicles. The remote computing system may analyze the at least one of the fourth current vehicle location or the fourth predicted path for each of the one or more fourth vehicles in relation to at least one of the first alert condition or the first alert condition location. Based on a determination that at least one fourth vehicle among the one or more fourth vehicles is in proximity to or approaching the first alert condition location or is within the first region encompassing the first alert condition, based at least in part on the analysis, and based on a determination that the one or more fourth communications have been received from a threshold number of third computing systems associated with the corresponding one or more third vehicles, the remote computing system may send one or more sixth communications to each of the fourth computing systems associated with each of the corresponding at least one fourth vehicle indicating that the first alert condition is no longer present at the first alert condition location. If the one or more fourth communications from a threshold number of third computing systems associated with the corresponding one or more third vehicles have not been received, the remote computing system may continue sending the one or more third communications to each of the fourth computing systems associated with each of the corresponding at least one fourth vehicle indicating that said fourth vehicle is in proximity to or approaching the first alert condition location.
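A compact sketch of this threshold-gated clearing logic follows. AlertState, the use of vehicle identifiers, and the threshold of 3 are illustrative assumptions; the application leaves the threshold unspecified.

```python
class AlertState:
    """Tracks which vehicles have reported an alert condition as cleared."""

    def __init__(self, clear_threshold: int = 3):
        self.clear_threshold = clear_threshold
        self.cleared_by = set()  # ids of vehicles reporting "cleared"

    def record_cleared(self, vehicle_id: str) -> None:
        self.cleared_by.add(vehicle_id)

    def is_cleared(self) -> bool:
        return len(self.cleared_by) >= self.clear_threshold

def message_for(state: AlertState) -> str:
    """Choose which notice to keep sending to approaching vehicles."""
    return ("condition no longer present" if state.is_cleared()
            else "approaching alert condition location")
```

Tracking distinct vehicle identifiers in a set, rather than a raw counter, reflects the requirement that the clearing reports come from a threshold number of different third computing systems.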
[0150] These and other functions of the system 600 (and its components) are described in greater detail above with respect to Figs. 1-4.
[0151] While particular features and aspects have been described with respect to some embodiments, one skilled in the art will recognize that numerous modifications are possible. For example, the methods and processes described herein may be implemented using hardware components, software components, and/or any combination thereof. Further, while various methods and processes described herein may be described with respect to particular structural and/or functional components for ease of description, methods provided by various embodiments are not limited to any particular structural and/or functional architecture but instead can be implemented on any suitable hardware, firmware, and/or software configuration. Similarly, while particular functionality is ascribed to particular system components, unless the context dictates otherwise, this functionality need not be limited to such and can be distributed among various other system components in accordance with the several embodiments.

[0152] Moreover, while the procedures of the methods and processes described herein are described in a particular order for ease of description, unless the context dictates otherwise, various procedures may be reordered, added, and/or omitted in accordance with various embodiments. Moreover, the procedures described with respect to one method or process may be incorporated within other described methods or processes; likewise, system components described according to a particular structural architecture and/or with respect to one system may be organized in alternative structural architectures and/or incorporated within other described systems. Hence, while various embodiments are described with or without particular features for ease of description and to illustrate some aspects of those embodiments, the various components and/or features described herein with respect to a particular embodiment can be substituted, added, and/or subtracted from among other described embodiments, unless the context dictates otherwise. Consequently, although several embodiments are described above, it will be appreciated that the invention is intended to cover all modifications and equivalents within the scope of the following claims.

Claims

WHAT IS CLAIMED IS:
1. A method, comprising: determining, using a computing system, at least one of a first current vehicle location or a first predicted path of a vehicle, based at least in part on one or more first location-based data from corresponding one or more first location data signals received from one or more different types of location data signal sources; sending, using the computing system, a first communication regarding the determined at least one of the first current vehicle location or the first predicted path of the vehicle to a remote computing system over one or more networks; in response to existence of at least one first alert condition for corresponding at least one first alert condition location that is in proximity to one or more of the determined at least one of the first current vehicle location or the first predicted path of the vehicle or that is within a first region encompassing the determined at least one of the first current vehicle location or the first predicted path of the vehicle, receiving, using the computing system, a second communication regarding the at least one first alert condition for the corresponding at least one first alert condition location from the remote computing system over the one or more networks; based on a determination that the vehicle is approaching the at least one first alert condition location based at least in part on one or more second location-based data from corresponding one or more second location data signals received from the one or more different types of location data signal sources, performing the following tasks: receiving, using the computing system, one or more first object detection data signals from one or more different types of object detection data signal sources; fusing, using the computing system, the at least one first alert condition with one or more of the received one or more first object detection data signals, the at least one first alert condition location, a second current vehicle location of the vehicle, or a second predicted path of the vehicle to generate first fused data; and generating and presenting, using the computing system and via one or more user devices, a first alert message indicating that the vehicle is approaching the at least one first alert condition, based at least in part on the generated first fused data.
2. The method of claim 1, wherein the computing system comprises at least one of a data signal fusing computing system, at least one processor on the user device, at least one processor on a mobile device associated with a user, a vehicle-based computing system, an object detection system, or a driver assistance system, wherein the remote computing system comprises at least one of a remote data signal fusing computing system, a remote object detection system, or a remote driver assistance system, a server computer over the one or more networks, an image processing server, a graphics processing unit ("GPU")-based server, a positioning and mapping server, a machine learning system, an artificial intelligence ("AI") system, a deep learning system, a neural network, a convolutional neural network ("CNN"), a fully convolutional network ("FCN"), a cloud computing system, or a distributed computing system.
3. The method of claim 1 or 2, wherein the one or more different types of location data signal sources each comprises one of a global positioning system ("GPS") device, a global navigation satellite system ("GNSS") device, a text recognition-based location identification system, an image recognition-based landmark identification system, a telecommunications signal triangulation-based location identification system, a radar-based location identification system, a sonar-based location identification system, or a lidar-based location identification system.
4. The method of claim 3, wherein determining the at least one of the first current vehicle location or the first predicted path of a vehicle comprises at least one of: determining, using the computing system, the first current vehicle location based at least in part on GPS data from the GPS device;
determining, using the computing system, the first predicted path of the vehicle based at least in part on a series of GPS data from the GPS device over time; determining, using the computing system, the first current vehicle location based at least in part on GNSS data from the GNSS device; determining, using the computing system, the first predicted path of the vehicle based at least in part on a series of GNSS data from the GNSS device over time; determining, using the computing system, the first current vehicle location based at least in part on text recognition of one or more location-identifying signs, the one or more location-identifying signs comprising at least one of one or more street signs, one or more address signs, one or more business signs, one or more highway signs, one or more city limits signs, one or more county boundary signs, one or more state boundary signs, one or more province boundary signs, one or more territory boundary signs, one or more regional boundary signs, one or more national border signs, one or more landmark identification signs, one or more distance to destination signs, one or more route markers, one or more highway location markers, one or more driver location signs, or one or more mile markers; determining, using the computing system, the first predicted path of the vehicle based at least in part on text recognition of a plurality of location-identifying signs over time; determining, using the computing system, the first current vehicle location based at least in part on image recognition of one or more landmarks, the one or more landmarks comprising at least one of one or more cityscapes, one or more skylines, one or more unique buildings, one or more franchise buildings, one or more unique street views, one or more overlooks, one or more scenic locations, one or more mountain ranges, one or more individual mountains, one or more bodies of water, one or more manmade monuments, one or more landscape art pieces, or one or more unique structures;
determining, using the computing system, the first predicted path of the vehicle based at least in part on image recognition of a plurality of landmarks over time; determining, using the computing system, the first current vehicle location based at least in part on analysis of signal strength of telecommunications signals from a plurality of stationary telecommunications transceivers; determining, using the computing system, the first predicted path of the vehicle based at least in part on analysis of changes in signal strength of telecommunications signals from the plurality of stationary telecommunications transceivers over time; determining, using the computing system, the first current vehicle location based at least in part on analysis of radar signal data; determining, using the computing system, the first predicted path of the vehicle based at least in part on analysis of changes in radar signal data over time; determining, using the computing system, the first current vehicle location based at least in part on analysis of sonar signal data; determining, using the computing system, the first predicted path of the vehicle based at least in part on analysis of changes in sonar signal data over time; determining, using the computing system, the first current vehicle location based at least in part on analysis of lidar signal data; or determining, using the computing system, the first predicted path of the vehicle based at least in part on analysis of changes in lidar signal data over time.
5. The method of any of claims 1-4, wherein the one or more different types of object detection data signal sources each comprises one of a vision-based object detection system, a radar-based object detection system, a sonar-based object detection system, or a lidar-based object detection system.
6. The method of any of claims 1-5, wherein the vehicle comprises one of a car, a minivan, a pickup truck, a motorcycle, an all-terrain vehicle, a scooter, a police vehicle, a fire engine, an ambulance, a recreational vehicle, a bus, a
commercial van, a commercial truck, a semi-tractor-trailer truck, a boat, a ship, a submersible, an amphibious vehicle, an aircraft, a space vehicle, a satellite, an autonomous vehicle, or a drone.
7. The method of any of claims 1-6, wherein the at least one first alert condition each comprises at least one of traffic congestion along the first predicted path of the vehicle potentially causing a slow-down, a traffic accident along the first predicted path of the vehicle potentially causing a slow-down, a construction site along the first predicted path of the vehicle potentially causing a slow-down, one or more people along the first predicted path of the vehicle who are occluded from a perspective of the vehicle, one or more animals along the first predicted path of the vehicle who are occluded from the perspective of the vehicle, one or more objects along the first predicted path of the vehicle that are occluded from the perspective of the vehicle, one or more people potentially blocking the first predicted path of the vehicle, one or more animals potentially blocking the first predicted path of the vehicle, one or more objects potentially blocking the first predicted path of the vehicle, a tracked weather event along or near the first predicted path of the vehicle, a natural hazard potentially blocking the first predicted path of the vehicle, a manmade hazard potentially blocking the first predicted path of the vehicle, one or more people potentially intercepting the vehicle along the first predicted path, one or more animals potentially intercepting the vehicle along the first predicted path, one or more objects potentially intercepting the vehicle along the first predicted path, or one or more other vehicles potentially intercepting the vehicle along the first predicted path.
8. The method of any of claims 1-7, wherein each user device comprises at least one of a smartphone, a tablet computer, a display device, an augmented reality ("AR") device, a virtual reality ("VR") device, a mixed reality ("MR") device, a vehicle console display, a vehicle heads-up display ("HUD"), a vehicle remote controller display, one or more audio speakers, or one or more haptic response devices.
9. The method of any of claims 1-8, wherein generating and presenting the first alert message comprises at least one of:
generating a graphical display depicting one or more of the at least one first alert condition or the generated first fused data, and presenting the generated graphical display on a display device; generating a text-based message depicting one or more of the at least one first alert condition or the generated first fused data, and presenting the text-based message on a display device; generating at least one message regarding one or more of the at least one first alert condition or the generated first fused data, and sending the at least one message to the user device, wherein the at least one message comprises at least one of an e-mail message, a short message service ("SMS") message, a multimedia messaging service ("MMS") message, or a text message; or generating at least one audio message regarding one or more of the at least one first alert condition or the generated first fused data, and sending the at least one audio message to the user device for playback on at least one audio speaker.
10. The method of any of claims 1-9, wherein at least one of the second current vehicle location or the second predicted path of the vehicle is determined based at least in part on the one or more second location-based data from the corresponding one or more second location data signals received from the one or more different types of location data signal sources.
11. The method of any of claims 1-10, further comprising: at or near the at least one first alert condition location, receiving, using the computing system, one or more second object detection data signals from the one or more different types of object detection data signal sources; analyzing, using the computing system, the one or more second object detection data signals; and based on a determination that the at least one first alert condition is no longer present at the at least one first alert condition location, sending, using the computing system, a third communication to the remote computing system over the one or more networks indicating that the at least one first alert condition is no longer present at the at least one first alert condition location.
12. The method of any of claims 1-11, further comprising: receiving, using the computing system, one or more third object detection data signals from the one or more different types of object detection data signal sources; analyzing, using the computing system, the one or more third object detection data signals; based on a determination that the one or more third object detection data signals correspond to at least one second alert condition, determining, using the computing system, at least one of a third current vehicle location or at least one second alert condition location corresponding to the at least one second alert condition, based at least in part on one or more third location-based data from corresponding one or more third data signals received from the one or more different types of location data signal sources; and sending, using the computing system, a fourth communication to the remote computing system over the one or more networks indicating that the at least one second alert condition has been detected at or near the at least one of the third current vehicle location or the at least one second alert condition location.
13. A system, comprising: a computing system, comprising: at least one first processor; and a first non-transitory computer readable medium communicatively coupled to the at least one first processor, the first non-transitory computer readable medium having stored thereon computer software comprising a first set of instructions that, when executed by the at least one first processor, causes the computing system to: determine at least one of a first current vehicle location or a first predicted path of a vehicle, based at least in part on one or more first location-based data from corresponding one or more first location data signals received from one or more different types of location data signal sources; send the determined at least one of the first current vehicle location or the first predicted path of the vehicle to a remote computing system over one or more networks; in response to existence of at least one first alert condition for corresponding at least one first alert condition location that is in proximity to one or more of the determined at least one of the first current vehicle location or the first predicted path of the vehicle or that is within a first region encompassing the determined at least one of the first current vehicle location or the first predicted path of the vehicle, receive the at least one first alert condition for the corresponding at least one first alert condition location from the remote computing system over the one or more networks; based on a determination that the vehicle is approaching the at least one first alert condition location based at least in part on one or more second location-based data from corresponding one or more second location data signals received from the one or more different types of location data signal sources, perform the following tasks: receive one or more first object detection data signals from one or more different types of object detection data signal sources; fuse the at least one first alert condition with one or more of the received one or more first object detection data signals, the at least one first alert condition location, a second current vehicle location of the vehicle, or a second predicted path of the vehicle to generate first fused data; and generate and present, using the computing system and via one or more user devices, a first alert message indicating that the vehicle is approaching the at least one first alert condition, based at least in part on the generated first fused data.
14. The system of claim 13, wherein the computing system comprises at least one of a data signal fusing computing system, at least one processor on the user device, at least one processor on a mobile device associated with a user, a vehicle-based computing system, an object detection system, or a driver assistance system, wherein the remote computing system comprises at least one of a remote data signal fusing computing system, a remote object detection system, or a remote driver assistance system, a server computer over the one or more networks, an image processing server, a graphics processing unit ("GPU")-based server, a positioning and mapping server, a machine learning system, an artificial intelligence ("AI") system, a deep learning system, a neural network, a convolutional neural network ("CNN"), a fully convolutional network ("FCN"), a cloud computing system, or a distributed computing system.
15. A method, comprising: receiving, using a remote computing system and from each of one or more first computing systems associated with corresponding one or more first vehicles among a plurality of vehicles and over one or more networks, one or more first communications indicating that a first alert condition has been detected at or near one or more of a first current vehicle location of each of the one or more first vehicles or a first alert condition location corresponding to the first alert condition; receiving, using the remote computing system and from each of one or more second computing systems associated with corresponding one or more second vehicles among the plurality of vehicles and over the one or more networks, one or more second communications regarding at least one of a second current vehicle location or a second predicted path for each of the one or more second vehicles; analyzing, using the remote computing system, the at least one of the second current vehicle location or the second predicted path for each of the one or
more second vehicles in relation to at least one of the first alert condition or the first alert condition location; and based on a determination that at least one second vehicle among the one or more second vehicles is in proximity to or approaching the first alert condition location or is within a first region encompassing the first alert condition, based at least in part on the analysis, sending, using the remote computing system, one or more third communications to each of the second computing systems associated with each of the corresponding at least one second vehicle indicating that said second vehicle is in proximity to or approaching the first alert condition location.
16. The method of claim 15, wherein the one or more first computing systems and the one or more second computing systems each comprises at least one of a data signal fusing computing system, at least one processor on a user device, at least one processor on a mobile device associated with a user, a vehicle-based computing system, an object detection system, or a driver assistance system, wherein the remote computing system comprises at least one of a remote data signal fusing computing system, a remote object detection system, or a remote driver assistance system, a server computer over the one or more networks, an image processing server, a graphics processing unit ("GPU")-based server, a positioning and mapping server, a machine learning system, an artificial intelligence ("AI") system, a deep learning system, a neural network, a convolutional neural network ("CNN"), a fully convolutional network ("FCN"), a cloud computing system, or a distributed computing system.
17. The method of claim 15 or 16, wherein the plurality of vehicles each comprises one of a car, a minivan, a pickup truck, a motorcycle, an all-terrain vehicle, a scooter, a police vehicle, a fire engine, an ambulance, a recreational vehicle, a bus, a commercial van, a commercial truck, a semi-tractor-trailer truck, a boat, a ship, a submersible, an amphibious vehicle, an aircraft, a space vehicle, a satellite, an autonomous vehicle, or a drone.
18. The method of any of claims 15-17, wherein the first alert condition comprises at least one of traffic congestion along the second predicted path of the at least one second vehicle potentially causing a slow-down, a traffic accident along the
second predicted path of the at least one second vehicle potentially causing a slow-down, a construction site along the second predicted path of the at least one second vehicle potentially causing a slow-down, one or more people along the second predicted path of the at least one second vehicle who are occluded from a perspective of the at least one second vehicle, one or more animals along the second predicted path of the at least one second vehicle who are occluded from the perspective of the at least one second vehicle, one or more objects along the second predicted path of the at least one second vehicle that are occluded from the perspective of the at least one second vehicle, one or more people potentially blocking the second predicted path of the at least one second vehicle, one or more animals potentially blocking the second predicted path of the at least one second vehicle, one or more objects potentially blocking the second predicted path of the at least one second vehicle, a tracked weather event along or near the second predicted path of the at least one second vehicle, a natural hazard potentially blocking the second predicted path of the at least one second vehicle, a manmade hazard potentially blocking the second predicted path of the at least one second vehicle, one or more people potentially intercepting the at least one second vehicle along the second predicted path, one or more animals potentially intercepting the at least one second vehicle along the second predicted path, one or more objects potentially intercepting the at least one second vehicle along the second predicted path, or one or more other vehicles potentially intercepting the at least one second vehicle along the second predicted path.
19. The method of any of claims 15-18, further comprising: receiving, using the remote computing system and from each of one or more third computing systems associated with corresponding one or more third vehicles among the plurality of vehicles and over the one or more networks, one or more fourth communications indicating that the first alert condition is no longer present at the first alert condition location; receiving, using the remote computing system and from each of one or more fourth computing systems associated with corresponding one or more fourth vehicles among the plurality of vehicles and over the one or more networks, one or more fifth communications regarding at least one of a fourth current vehicle location or a fourth predicted path for each of the one or more fourth vehicles;
analyzing, using the remote computing system, the at least one of the fourth current vehicle location or the fourth predicted path for each of the one or more fourth vehicles in relation to at least one of the first alert condition or the first alert condition location; and based on a determination that at least one fourth vehicle among the one or more fourth vehicles is in proximity to or approaching the first alert condition location or is within the first region encompassing the first alert condition, based at least in part on the analysis, and after receiving the one or more fourth communications from a threshold number of third computing systems associated with the corresponding one or more third vehicles, sending, using the remote computing system, one or more sixth communications to each of the fourth computing systems associated with each of the corresponding at least one fourth vehicle indicating that the first alert condition is no longer present at the first alert condition location.
20. A system, comprising: a remote computing system, comprising: at least one first processor; and a first non-transitory computer readable medium communicatively coupled to the at least one first processor, the first non-transitory computer readable medium having stored thereon computer software comprising a first set of instructions that, when executed by the at least one first processor, causes the remote computing system to: receive, from each of one or more first computing systems associated with corresponding one or more first vehicles among a plurality of vehicles and over one or more networks, one or more first communications indicating that a first alert condition has been detected at or near one or more of a first current vehicle location of each of the one or more first vehicles or a first alert condition location corresponding to the first alert condition; receive, from each of one or more second computing systems associated with corresponding one or more second vehicles among the plurality of vehicles and over the one or more networks, one or more second communications regarding at least one of a second current vehicle location or a second predicted path for each of the one or more second vehicles; analyze the at least one of the second current vehicle location or the second predicted path for each of the one or more second vehicles in relation to at least one of the first alert condition or the first alert condition location; and based on a determination that at least one second vehicle among the one or more second vehicles is in proximity to or approaching the first alert condition location or is within a first region encompassing the first alert condition, based at least in part on the analysis, send one or more third communications to each of the second computing systems associated with each of the corresponding at least one second vehicle indicating that said second vehicle is in proximity to or approaching the first alert condition location.
Priority Applications (1)

Application Number: PCT/US2021/065468
Priority Date: 2021-12-29
Filing Date: 2021-12-29
Title: Object detection using fusion of vision, location, and/or other signals (WO2022104295A1)

Publications (1)

Publication Number: WO2022104295A1

Family ID: 81601824


Legal Events

121 (EP): The EPO has been informed by WIPO that EP was designated in this application. Ref document number: 21893045; Country of ref document: EP; Kind code of ref document: A1.