WO2017176550A1 - Method and system for autonomous vehicle sensor assisted selection of route with respect to dynamic route conditions - Google Patents


Info

Publication number
WO2017176550A1
Authority
WO
WIPO (PCT)
Prior art keywords
route
user
location
vehicle
user device
Prior art date
Application number
PCT/US2017/025007
Other languages
French (fr)
Inventor
Jussi RONKAINEN
Marko Palviainen
Mikko Tarkiainen
Jani Mantyjarvi
Ari Virtanen
Pertti PEUSSA
Matti Kutila
Original Assignee
Pcms Holdings, Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Pcms Holdings, Inc. filed Critical Pcms Holdings, Inc.
Publication of WO2017176550A1 publication Critical patent/WO2017176550A1/en

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/34Route searching; Route guidance
    • G01C21/3453Special cost functions, i.e. other than distance or default speed limit of road segments
    • G01C21/3461Preferred or disfavoured areas, e.g. dangerous zones, toll or emission zones, intersections, manoeuvre types, segments such as motorways, toll roads, ferries
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/34Route searching; Route guidance
    • G01C21/3407Route searching; Route guidance specially adapted for specific applications
    • G01C21/3415Dynamic re-routing, e.g. recalculating the route when the user deviates from calculated route or after detecting real-time traffic data or accidents

Definitions

  • a user such as an exerciser, traveler, or tourist, plans an activity involving a route (such as shopping, walking, running, biking, miscellaneous exercise, or other activity) in an unfamiliar area
  • the user might want to know the best/safest/most convenient route or may want to dynamically change the route while the user is on that route already.
  • Such a user might also wonder about the condition of the road: whether it is paved, whether there are (roofed) sidewalks, whether there is a headwind, how hard it is raining, whether there are puddles, and whether there are dark alleys, slippery roads, icy patches, steep hills, road blockages, pets, drunken crowds, or trash on the ground.
  • Planning a pedestrian or cycling route may be demanding, especially if a person (e.g., a traveler or tourist) planning the route lacks knowledge of the region or the condition of the roads, or if the planner has a desire to try out new routes, even in a fairly well-known region.
  • Things to consider when selecting route include, for example: safety (e.g., lighting or dark alleys), convenience (e.g., willingness to face a strong headwind or rain), conditions of the roads (e.g., surface of roads, sidewalks, or other lanes), type of neighborhood (e.g., cleanliness, suspicious people loitering), and suitability of the route for other personal preferences (e.g., exercise style, fitness level, and number of road crossings and stops).
  • Drivers of motor vehicles care about planning a route according to dynamic elements as well. For example, drivers seeking safe travel of a vehicle or motorcade through city streets, or needing road and traffic conditions during rush hour or following a natural disaster (e.g., an earthquake, hurricane, or flood), require up-to-date information about dynamic elements along the route. Such data cannot be determined from static sources. Crowdsourcing the data via services such as Waze may not be feasible, reliable, or fast enough in cases where the safety of the route for human drivers cannot be guaranteed (e.g., after an earthquake). Moreover, apps and services do not report small or unusual items (such as trash, objects, miscellaneous items, events, or people) that are important to drivers, cyclists, and walkers.
  • Users typically plan routes for running or cycling based on ad-hoc routes selected after looking at a static map or according to crowdsourced information provided by services such as Strava, MapMyRun, and GPSies.
  • the services allow other users of the service to report their exercise routes and other optional parameters, such as elevation or workout profile (e.g., calorie burn rate, total time used).
  • users typically select routes from existing, reported routes. The list of possible selections might be small or non-existent in areas with fewer active users. Also, reported routes might not suit a user's desired exercise profile or personal taste.
  • current solutions lack the capability to make route modifications, such as adding preferred points of interest or preferred road sections along the route.
  • Autonomous functions are generally designed to operate best in well-mapped and structured areas, and autonomous vehicles (AV) are designed to obey traffic rules.
  • the same problem may exist for areas around a store, which may prevent an autonomous vehicle from reaching the front door to pick up elderly passengers.
  • An AV may be programmed to override the rules in special cases where overrides are allowed and there are no safety issues. This approach may not make sense for broader area events or venues. An AV may block other traffic inside an area, or an AV may be unable to enter an area because pedestrians are blocking the whole area. Rerouting may not solve the problem because alternative routes do not exist. If a sudden clearance of an area is required (such as due to a fire or other hazard), an AV may not detect such an event and may block traffic. Manually driven cars may be controlled by giving orders to a driver, but a driverless car may not have such an option. Therefore, unwanted access of AVs may be difficult to control if there is no method to communicate such messages between a restricted area and AVs.
  • Deployment of autonomous vehicles may start with an assumption that certain common rules and traffic instructions are available.
  • pedestrian zones may not be well-mapped and may include many types of vulnerable road users (e.g., bicyclists and pedestrians) who do not expect any cars driving in the area.
  • Such areas may have special requirements for autonomous vehicle control units.
  • Restricted areas may not be configured to send instructions to autonomous vehicles on how to enter an area or drive to a front door, for example. In many cases, getting an autonomous taxi to a front door may be a service for which people may pay extra.
  • US patent application 2016/0231746 presents a system and method to operate an automated vehicle by using concurrence estimation with pedestrian cell-phones and bicycle and road infrastructure sensors.
  • US patent 8,509,982 offers a zone driving system where a roadgraph may include a network of information such as roads, lanes, intersections, and connections between these features, as well as geographic zones associated with particular rules.
  • the rules may require an autonomous vehicle to alert a driver that the vehicle is approaching a zone.
  • US patent 8,688,306 describes systems and methods to limit use of autonomous or semi-autonomous vehicles based on user permission data.
  • the vehicle may drop off a passenger at a predefined destination, and the permission data may be used to limit the ability of an occupant to change a vehicle's route completely or by some maximum deviation value.
  • US patent application 2016/0125736 presents a method to improve parking space identification in autonomous driving using identification information and parking space status information.
  • US patent application 2015/0353080 offers an automatic parking system where a vehicle is controlled automatically along a traveling path to a monitored parking space. If a contact determination unit determines that a vehicle makes contact with an obstacle, then the vehicle is stopped and parked in a position for removal of the obstacle.
  • US patent application 2015/0367845 describes a parking assist apparatus that parks a vehicle by generating a set of traveling routes and associated vehicle speeds, selecting a route based on object information, and controlling a vehicle to the selected parking spot.
  • US patent application 2014/00465506 presents a method of autonomous movement of a vehicle in a parking area by an external, stationary control device located in or near a parking area. Impending or actual collisions with other vehicles are detected by a vehicle sensor and in response the control device makes a behavioral decision, such as triggering an alert signal or maneuvering the vehicle.
  • Exemplary embodiments described herein relate to the field of personal security around a smart space (e.g., a campus area, mall, or office building(s), and their parking areas) and connected vehicles with various sensor and communication systems.
  • some embodiments described herein address the problem of how to increase personal safety and security of a user walking between a smart space area and his or her vehicle.
  • Embodiments of systems and methods described herein use connected vehicles and smart space sensor systems to provide dynamic surveillance and guidance capabilities.
  • An exemplary embodiment operates to enhance personal safety and security for people that use the outdoor space (including blind spots) for entry, exit, or outdoor walking for any other purpose.
  • This specification describes new techniques for enhancing the personal security and safety of a user before and while the user walks in a smart space (such as a campus or office area and parking or other area) or between a smart space and a connected vehicle.
  • a smart space security service provides information on the current security situation, coverage of the security surveillance (fixed and vehicle based), and authorized persons and vehicles identified in the area.
  • the system may route connected vehicles entering and exiting the area to cover blind-spots. Autonomous vehicles may be driven to cover the blind spot (perform a security-scanning drive).
  • the system provides the user with safe walking path suggestions and detailed information about current security conditions, surveillance and tracking coverage, route blind-spots, and potential risks. Safety tracking is provided for the user while walking in a smart space area.
  • the system communicates the user's position to the smart space security service and to connected vehicles (via vehicle-to-person (V2P) communications).
  • An exemplary embodiment involves interactions among a primary terminal (which may be a mobile device), a smart space security service (which may be implemented using software on a computer server, for example), and at least one connected vehicle.
  • the primary terminal may be configured to provide a user positioning service, a Personal Security and Safety Monitoring (PSSM) application, and a user interface.
  • the user positioning service tracks the user's position (e.g., via GPS location information). Indoor locations may be tracked by the smart space security service.
  • the PSSM application presents ID information, security tracking information, and route and alert information.
  • the application provides the user or vehicle ID when communicating with the smart space security service or the connected vehicle.
  • the security tracking information includes the current user security tracking status.
  • the route and alert information includes the walking route(s) along with additional alert information, such as security risks, safety risks, and safety metrics.
  • the user interface provides an application interface, an interaction management module, and a communication unit.
  • the application interface enables application and service modules to use the user interface to present the application's or service module's information and to control that information.
  • the interaction management module implements the application interface and controls the UI device's input and output units.
  • the communication unit provides wide area wireless communication systems (e.g., cellular systems) and short-range communication systems (e.g., Wi-Fi V2P communication).
  • the smart space (e.g., a campus, office building, or mall), including entry/exit doors, entry/exit areas, and parking areas may be provided with a smart space security service that includes components such as a smart space security manager, a smart space sensing and communication system, and a smart space person ID registry.
  • the smart space security manager may include components such as a smart space security data manager, a security scanning service, a route planner, a security risk calculator, a person ID service, and a user location and tracking service.
  • the smart space security data manager handles the smart space security service's current safety and security situation data.
  • the security scanning service oversees the security and safety scanning of the area by the smart space's sensor systems.
  • the security scanning service uses the coverage limits of the smart space sensing system and the additional security scans performed by connected vehicles.
  • the service sends security scanning requests to connected vehicles and receives reports from connected vehicles.
  • the walking path planner calculates the safest walking route for the user based on current security conditions, security monitoring, and tracking coverage.
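The planner's "safest walking route" computation can be sketched as a shortest-path search over a walkway graph whose edge weights encode risk rather than pure distance. The sketch below is a minimal illustration of that idea using Dijkstra's algorithm; the graph data, node names, and cost values are assumptions for the example, not part of the disclosure:

```python
import heapq

def safest_path(graph, start, goal):
    """Dijkstra search over a walkway graph whose edge weights are risk
    costs (distance plus penalties for blind spots, poor lighting, etc.).
    graph: dict mapping node -> list of (neighbor, risk_cost) pairs."""
    frontier = [(0.0, start, [start])]
    best = {}
    while frontier:
        cost, node, path = heapq.heappop(frontier)
        if node == goal:
            return cost, path
        if node in best and best[node] <= cost:
            continue
        best[node] = cost
        for neighbor, risk in graph.get(node, []):
            heapq.heappush(frontier, (cost + risk, neighbor, path + [neighbor]))
    return float("inf"), []  # goal unreachable

# Illustrative walkway graph: the direct alley (risk 5.0) loses to the
# lit detour (1.0 + 1.5) even though both reach the car.
walkways = {
    "door": [("alley", 5.0), ("lit_path", 1.0)],
    "alley": [("car", 0.5)],
    "lit_path": [("car", 1.5)],
}
```

A real planner would derive the risk costs from the current security situation and surveillance coverage described above; the structure of the search is unchanged.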
  • the security risk calculator gathers information from the smart space's resources and security scans and person identifications performed by vehicles and the smart space.
  • the security risk calculator uses this gathered information to discern potential risks and raise security alerts, which may occur if a safety metric is below a safety threshold.
  • safety metrics may be calculated based on sensor data measured by a smart space sensor or an AV sensor.
  • the person identification service identifies detected people by using the smart space person ID registry (or a database of information related to identities of people).
  • the user location and tracking service locates the user (e.g., obtains the user's location from connected vehicles or short-range communications with the user's device) and tracks the user while he or she walks to the destination.
  • an updated safety metric is calculated for a portion of a route between a location of a user device and a user destination location. This updated safety metric may be communicated to the user device. For one embodiment, a safety alert message is sent to a user device if an updated safety metric is below a safety threshold.
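The updated-metric and threshold-alert behavior described above might be sketched as follows. This is a minimal sketch: the aggregation rule (taking the minimum of per-sensor scores) and all names are assumptions, since the disclosure does not fix how the metric is computed:

```python
def update_safety_metric(risk_scores):
    """Combine per-sensor scores (0.0 = dangerous, 1.0 = safe) for a
    route portion into one metric; taking the minimum lets one bad
    reading dominate. The aggregation rule is an assumption."""
    return min(risk_scores) if risk_scores else 0.0

def check_route_portion(risk_scores, safety_threshold=0.5):
    """Return (metric, alert); alert is a message for the user device
    when the updated metric falls below the safety threshold."""
    metric = update_safety_metric(risk_scores)
    if metric < safety_threshold:
        return metric, "safety alert: route portion below threshold"
    return metric, None
```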
  • the smart space sensing and communication system may use a variety of available monitoring sensors (e.g., camera feeds) to detect potential risks (e.g., objects, people, and conditions) in the smart space surroundings (or a vicinity of a pedestrian route).
  • the smart space also includes user tracking capabilities with the sensor and short-range communication systems.
  • the smart space ID registry (or database) provides identification of all personnel, people, and vehicles authorized by the smart space management. The registry may also include authorized guests and visitors in the smart space area.
  • An exemplary connected vehicle may include components such as a PSSM application, a vehicle sensor system, and a communication unit.
  • the PSSM application may be, for example, an application running in the vehicle terminal.
  • the PSSM application may provide services such as a security scanning service, a user location and tracking service, and a person recognition service.
  • the security scanning service performs security and safety scanning of the area with the in-vehicle sensor system.
  • the user location and tracking service locates the user (e.g., obtains the user's location from the smart space sensors or calculates the user's location using vehicle sensors and short-range communications) and tracks the user while he or she walks to the destination.
  • the person recognition service detects people (e.g., using V2P communication and/or environment perception systems), ascertains their person identification numbers, and provides those numbers to the smart space security service.
  • the vehicle sensor system uses perception monitoring sensors (e.g., visual cameras, thermal cameras, and LIDARs) to detect potential risks (e.g., objects, people, and conditions).
  • the communication unit provides wide area wireless communication (e.g., cellular) systems and short-range communication (e.g., Wi-Fi and V2P) systems between the vehicle, the smart space security service, and the user's primary terminal device.
  • a Route Planning Service (RPS) may collect data from one or more vehicles for use with planning an exercise route.
  • a user uses the service to determine a route.
  • the user has the primary terminal and other devices (such as a wrist watch, a bike, or a head-mounted device) that contain the route planning application and wide-area wireless communication systems (e.g., cellular systems).
  • the route planning application provides functionality and user interfaces for: (1) connecting the user profile, the cloud service, and the scouting vehicle to one another; (2) providing input for preferences and requirements for the route, such as the desire to stay within a given distance to the starting point; (3) showing the route map on the primary device; (4) requesting and controlling the vehicle usage along the route; and (5) providing input and output via the wrist watch, handheld devices, or other devices (such as route instructions and buttons for calling the car).
  • the RPS runs as a cloud service and includes a route planning application, a sensor data processing service, a sensor information sharing service, and information storage.
  • the route planning application plans the route and may include a route planning module, a map data maintenance module, a vehicle command module, and an external interface module.
  • the route planning module determines a suitable route based on the user's profile, user input, static route information (such as maps), and dynamic route information.
  • the map maintenance module augments and updates stored map data with information obtained from the vehicle and other sources.
  • the vehicle command module may perform tasks including: determining vehicle sensor capability; performing sensor fusion and detection, where multiple sensor inputs are fused to produce more reliable detections than any single sensor alone; determining vehicle availability for scouting routes; instructing routes for the vehicle scouting; parsing reports sent by the vehicle in order to provide up-to-date information on the current route candidate and to update map data; and relaying meeting requests from the user to the vehicle, both static (such as meeting at a fixed location) and dynamic (such as the car meeting a moving user).
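The capability and availability checks performed by the vehicle command module could look like the following minimal sketch. The record layout and field names (`available`, `sensors`, `id`) are illustrative assumptions:

```python
def pick_scout(vehicles, required_sensors):
    """Return the ID of the first registered vehicle that is available
    for scouting and carries every requested sensor modality, or None
    when no vehicle qualifies. Data shapes are illustrative."""
    for vehicle in vehicles:
        if vehicle["available"] and required_sensors <= set(vehicle["sensors"]):
            return vehicle["id"]
    return None

# Illustrative fleet registry.
scout_fleet = [
    {"id": "av1", "available": False, "sensors": ["camera", "lidar"]},
    {"id": "av2", "available": True, "sensors": ["camera"]},
    {"id": "av3", "available": True, "sensors": ["camera", "lidar", "thermal"]},
]
```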
  • the external interfaces module accesses external services, such as maps, traffic information, and weather services.
  • the RPS's Sensor Data Processing Service is capable of extracting route events based on sensor data. Such events may include the presence of trash, cardboard boxes, loitering persons, pets or other animals, and roofed sidewalks.
  • the RPS's Sensor Information Sharing Service enables the sharing of vehicle sensor information between the user and nearby vehicles.
  • the RPS's information storage module stores user preferences and profiles, augmented map data, and stored routes. User preferences and profiles contain at least preferred route parameters for a given movement type (such as inclination, paved vs. non-paved routes, and a preference for sidewalks). Optionally, user preferences and profiles might include other information, such as user fitness level and exercise history.
  • Augmented map data is formed from static map data and dynamic information obtained from vehicles, such as road conditions (potholes, surface material, surface roughness, and traction) and other road information (such as the presence and condition of sidewalks, the presence and type of roadside lighting, and the availability of locations for shelter from weather).
  • Stored routes serve as a basis for creating new routes.
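The formation of augmented map data — static map attributes overlaid with dynamic vehicle reports — can be sketched as a merge that keeps the most recent report per road segment. The segment keys and attribute names below are illustrative assumptions:

```python
def augment_map(static_map, reports):
    """Overlay dynamic vehicle reports onto static map data, keeping
    the most recent report per road segment; the static input is not
    mutated. Each report is a dict with 'segment' and 'time' keys plus
    arbitrary observed attributes (illustrative layout)."""
    augmented = {seg: dict(attrs) for seg, attrs in static_map.items()}
    latest = {}
    for report in reports:
        seg, t = report["segment"], report["time"]
        if seg in augmented and latest.get(seg, -1) < t:
            latest[seg] = t
            augmented[seg].update(
                {k: v for k, v in report.items() if k not in ("segment", "time")}
            )
    return augmented

# Illustrative inputs: static surface data plus two pothole reports,
# of which only the newer one should win.
road_map = {"s1": {"paved": True}, "s2": {"paved": False}}
vehicle_reports = [
    {"segment": "s1", "time": 0, "pothole": False},
    {"segment": "s1", "time": 1, "pothole": True},
]
```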
  • the vehicle links to the primary terminal (such as a cell phone) and the RPS.
  • the vehicle may be provided with a route planning application, a map database, vehicle sensor data, and a sensor access module.
  • the route planning application may include a route reporting module, a driving route module, and a vehicle information provider.
  • the route reporting module reduces and reports sensor data into events along the route as requested by the RPS.
  • the driving route module establishes and executes a driving plan along the route (where the autonomous vehicle is able to drive the route) as requested by the RPS.
  • the vehicle information provider gives information about the vehicle, such as which sensor information the vehicle supports and the vehicle's ability to fulfill route scouting requests in light of fuel and battery levels.
  • the digital map navigation database augments the vehicle's own sensor information.
  • Vehicle sensor data includes access to data such as 3D imaging via cameras or LiDAR, thermal camera data, road roughness, ambient lighting level, rain sensor, traction control, electronic stability control (ESC), antilock braking system (ABS), and other derived events.
  • the sensor access module enables activation of a vehicle's sensors and the sharing of car sensor data for other users.
  • Embodiments disclosed herein may provide various benefits. Users may have confidence in a selected route because a vehicle has recently confirmed the route. A static route service lacks such up-to-date data. Another potential benefit is the ability to check the route for unsafe or unknown conditions, such as following a natural disaster. Exemplary embodiments also use real-time weather and road conditions for selecting a route. A further benefit is the ability to reposition the vehicle near the user for added security and convenience (such as if it suddenly starts raining).
  • Exemplary systems and methods use data received from numerous vehicle sensors (e.g., cameras, LiDAR, radar, vehicle suspension data, and gyros) along the route to provide dynamic data to the user before and during travel on the route (using, for example, V2C, C2V, V2P, and P2V communication), giving the user a sense of trust, safety, and accuracy during the trip.
  • One embodiment of a method of determining conditions along a route may comprise: registering vehicles for ad hoc sensor duty; receiving location information for the registered vehicles; receiving information regarding at least a portion of a potential route of a user; in response to a determination, based on the received location information, that at least one registered vehicle (parked or driving) is at a position with visibility of at least a portion of the planned route, sending to that vehicle information for causing it to collect data from a sensor of a selected modality; receiving the sensor data from that vehicle; and sending information regarding conditions on at least a portion of the potential route, derived from the received sensor data.
  • a route may be a pedestrian route along sidewalks, where vehicles have visibility of sidewalks from non-sidewalk locations.
  • a method may further comprise deploying an autonomous vehicle to fill a gap for a portion of a potential route not covered by sensors.
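The visibility determination and gap-filling steps above can be sketched as below. A plain 2-D distance check stands in for a real line-of-sight model, and all data shapes (`pos`, `id`, the sensor range) are illustrative assumptions:

```python
import math

def vehicles_with_visibility(route_points, vehicles, sensor_range=50.0):
    """IDs of registered vehicles (parked or driving) within sensor
    range of any point on the route."""
    return [
        v["id"] for v in vehicles
        if any(math.dist(v["pos"], p) <= sensor_range for p in route_points)
    ]

def coverage_gaps(route_points, vehicles, sensor_range=50.0):
    """Route points no registered vehicle can see; such a gap would
    trigger deploying an autonomous vehicle to scan that portion."""
    return [
        p for p in route_points
        if not any(math.dist(v["pos"], p) <= sensor_range for v in vehicles)
    ]

# Illustrative scenario: one parked car sees the start of the route
# but not its far end, which is therefore a coverage gap.
walk_route = [(0, 0), (100, 0)]
parked_fleet = [{"id": "v1", "pos": (10, 0)}]
```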
  • One embodiment of a method of determining route conditions may comprise registering vehicles for sensor data collection, receiving location information for registered vehicles, receiving information regarding at least a portion of a potential route of a user, and determining that at least one registered vehicle has visibility of a portion of a potential user route.
  • determining that at least one registered vehicle has visibility of a portion of a potential user route may comprise: sending, to at least one registered vehicle at a position with visibility of the route, information to cause the vehicle to collect data from a sensor of a selected modality; receiving sensor data from that vehicle; and sending information regarding conditions on at least a portion of the potential user route, derived from that sensor data.
  • a potential user route may include sidewalks, and vehicles may have visibility of sidewalks from non-sidewalk locations.
  • a method of determining route conditions may further comprise determining that a portion of a potential user route lacks sensor data and deploying a vehicle to collect the missing data.
  • information regarding conditions on at least a portion of a potential user route may be sent to a user's terminal device and to a cloud service.
  • collecting of sensor data may be done for one or more potential user routes prior to a user starting to traverse a route.
  • collecting of sensor data may be done dynamically with a vehicle traveling a configurable distance ahead of the user while the user starts to traverse the route.
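Keeping the scouting vehicle a configurable distance ahead of the user can be sketched as walking the route polyline from the user's position until the lead distance is consumed. The polyline geometry and point format are illustrative assumptions:

```python
import math

def scout_target_index(route, user_index, lead_distance):
    """Index of the route point a scouting vehicle should head for so
    it stays roughly `lead_distance` ahead of the user along the route.
    `route` is a polyline of 2-D points; `user_index` is the point the
    user is currently at (illustrative geometry, not the disclosed
    implementation)."""
    travelled = 0.0
    for i in range(user_index, len(route) - 1):
        travelled += math.dist(route[i], route[i + 1])
        if travelled >= lead_distance:
            return i + 1
    return len(route) - 1  # past the end: scout waits at the final point

# Illustrative route: four points, 30 units apart along the y-axis.
sample_path = [(0, 0), (0, 30), (0, 60), (0, 90)]
```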
  • an apparatus may be configured to use computer-readable memory storing software instructions for a user route planning application, wherein such instructions are capable of calculating potential user routes, maintaining route map data in a database, instructing one or more vehicles to collect data for such potential user routes, interfacing with external modules to obtain external map, traffic information, and weather service data, and reporting route data to the user.
  • instructions for commanding of one or more vehicles may comprise instructions to perform operations including determining vehicle sensor capability, performing sensor fusion and detection, determining vehicle availability for scouting of routes, instructing vehicles to perform scouting for a particular route, parsing vehicle scouting reports to update map data and determine if alternate route candidates exist, and sending user-vehicle meeting requests to a vehicle and a user.
  • instructions for reporting of route data to a user may comprise instructions to perform operations including: displaying potential routes on a primary terminal device, displaying route conditions for each potential user route on a primary terminal device, allowing a user to pick a route from available choices or to name a new route, enabling a user to modify a route after starting to traverse a route, displaying updated route conditions on a primary terminal device, and interfacing with secondary user devices.
  • an apparatus may be configured to use computer-readable memory holding software instructions for a vehicle route planning application, wherein such instructions may be capable of planning vehicle routes, updating a map database, interfacing with a sensor access module to enable vehicle sensors, and reading vehicle sensor data.
  • instructions for planning of vehicle routes may comprise a route reporting module, a driving route module, and a vehicle information provider.
  • a method may comprise receiving, from a user, an indication of one or more criteria for a route, identifying at least one portion of a route satisfying the criterion, collecting data from a vehicle traversing the identified portion, and proposing a route to the user based at least in part on the collected data.
  • criterion for a route may include a starting point.
  • criterion for a route may include an end point.
  • criterion for a route may include a length.
  • criterion for a route may include an overlap percent.
  • a method may further comprise instructing an autonomous vehicle to traverse the identified portion.
  • a method may further comprise identifying a vehicle traversing an identified portion, wherein data is collected from the identified vehicle.
  • a method may further comprise identifying a plurality of vehicles traversing an identified portion, wherein data is collected from a plurality of vehicles.
  • the user may be a non-motorized road user.
  • the user may be a pedestrian.
  • the user may be a runner.
  • the user may be a cyclist.
  • collected data may include data on road hazards.
  • collected data may include data on litter.
  • collected data may include data on crowding.
  • collected data may include data on temperature.
  • collected data may include data on inclination.
  • collected data may include data on surface quality.
  • collected data may include data on the user.
  • collected data may include data regarding at least two routes satisfying the criteria, and the method may further comprise displaying the collected data to a user and receiving a user selection of one of the routes.
  • a system may comprise a processor and a computer-readable memory storing instructions operative to perform functions including receiving, from a user, an indication of one or more criteria for a route, identifying at least one portion of a route satisfying the criterion, collecting data from a vehicle traversing an identified portion, and proposing a route to a user based at least in part on collected data.
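The receive-criteria, collect-data, propose-route flow summarized above might be sketched as follows. The criteria fields (start, end, maximum length) come from the enumerated embodiments; the record layouts, names, and hazard-count ranking are assumptions for the example:

```python
def propose_route(candidates, criteria, sensor_reports):
    """Pick the candidate route that satisfies the user's criteria and
    has the fewest vehicle-reported hazards; None if nothing fits.
    Field names are illustrative assumptions."""
    def satisfies(route):
        return (route["start"] == criteria["start"]
                and route["end"] == criteria["end"]
                and route["length"] <= criteria["max_length"])

    def hazard_count(route):
        return sum(1 for rep in sensor_reports
                   if rep["route"] == route["name"] and rep["hazard"])

    viable = [r for r in candidates if satisfies(r)]
    return min(viable, key=hazard_count)["name"] if viable else None

# Illustrative data: two home-to-park candidates; vehicles have
# reported a hazard on the riverside route.
routes = [
    {"name": "riverside", "start": "home", "end": "park", "length": 5.0},
    {"name": "mainstreet", "start": "home", "end": "park", "length": 4.0},
]
reports = [{"route": "riverside", "hazard": True}]
wish = {"start": "home", "end": "park", "max_length": 6.0}
```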
  • systems and methods described herein provide an ability to control autonomous vehicle (or car) behavior outside a road network by defining an area management process.
  • An area operator/manager may control an autonomous vehicle inside a specified area.
  • Many AVs execute a pre-defined plan or mission. Navigational maps do not indicate where and when an AV may drive, especially in dynamic environments, so an AV cannot determine this on its own. Autonomous vehicles have information about dynamic objects within an AV's sensor range; however, the ability to adapt to dynamic changes in the area is missing. If a destination location is set outside of a road network (for example, in a market zone, pedestrian zone, parking facility, or temporary parking area) without any fixed markings, an AV may not operate safely without external assistance.
  • Systems and methods for an area management process are described herein, in which an area manager may send instructions directly to vehicles (or cars) that indicate whether a vehicle is allowed to enter an area and which route and parking place may be selected. Systems and methods described herein also enable control of vehicle departure times and routes out of an area. Thus, for example, congestion may be avoided after a public event, when all cars try to leave the area at the same time.
  • An area management process may be based on real-time monitoring of an area, detection of pedestrians, and other dynamic objects, using either fixed or mobile monitoring sensors and/or the available autonomous vehicles sensor data.
  • Systems and methods described herein enable an AV to enter zones outside road networks where traffic rules are not exact and the area is crowded with people, other vulnerable road users (VRUs), and other activities. Such areas are dynamically changing and difficult to cover with traditional navigation maps that are updated to include dynamic events and alternative routes.
  • Systems and methods described herein use micro traffic management. Using such systems and methods, AVs are automatically controlled inside such a managed area, and the safety and functionality of AVs is improved. AVs may be driven to pick up passengers and packages. Large parking facilities (such as airports, amusement parks, and shopping malls) benefit from better control of parking with less guiding personnel. Costs may be reduced by not hiring and training workers to run large temporary parking areas during large events that occur only for a short time period. With a vehicle control API, AVs may enter managed areas and be controlled by area management processes.
  • FIG. 1 is a schematic block diagram of the device and service levels of an exemplary communication system.
  • FIG. 2 is an example mobile application system map showing possible user routes.
  • FIG. 3 is a schematic plan view of a parking lot showing coverage areas and blind spots.
  • FIG. 4 is a schematic plan view of a parking lot showing coverage areas and blind spots.
  • FIG. 5 is a message flow diagram illustrating messages exchanged during an exemplary process of scouting and selecting a user route.
  • FIG. 6 is a message flow diagram illustrating messages exchanged while a user is traversing a selected route in an exemplary embodiment.
  • FIG. 7 is a schematic diagram illustrating an example interaction with a mobile device application and a smartwatch application.
  • FIG. 8 is a schematic plan view of a parking lot showing coverage areas and blind spots for safety metric calculations.
  • FIG. 9 is an example of a route map.
  • FIG. 10 is a schematic block diagram of the functional architecture of an exemplary embodiment.
  • FIG. 11 is a schematic perspective view of an example user interface for planning and accepting a route with the ability to dynamically control route planning.
  • FIG. 12 is a schematic perspective view of exemplary smart watch user interfaces.
  • FIG. 13 is a message sequence diagram illustrating a process for using sensor information of oncoming cars in selection of a user's route.
  • FIG. 14 is a message sequence diagram demonstrating a process for using car sensors in selecting a route that suits user-given preferences.
  • FIG. 15 is a message sequence diagram that shows the remainder of the process shown in FIG. 14 for using car sensors in selecting a route that suits user-given preferences.
  • FIG. 16 is a message sequence diagram showing examples of exceptional events that might occur when traveling a route, such as re-planning of the route due to a user deviation and a user-initiated request for meeting the car at a fixed location.
  • FIG. 17 is a message sequence diagram showing examples of exceptional events that might occur when traveling a route, such as requests for an accompanied route at a specific location.
  • FIG. 18 is a message sequence diagram showing the remainder of an example shown in FIG. 17 of an exceptional event that may occur when traveling a route: a request for an accompanied route at a specific location.
  • FIG. 19 is a schematic block diagram of system interfaces between an AV and an area manager and related devices.
  • FIG. 20 is a plan view schematic of an environment where an AV may be controlled by an area manager.
  • FIG. 21 is one embodiment of a message sequencing diagram for an area access process.
  • FIG. 22 is one embodiment of a message sequencing diagram for an area access process for operating without a fixed surveillance system.
  • FIG. 23 is a plan view schematic of a temporary parking area with virtual parking spots and virtual lanes.
  • FIG. 24 illustrates an exemplary wireless transmit/receive unit (WTRU) and, for example, may be used as a primary terminal (e.g., smartphone), a user device, or a connected vehicle computing system (which may be capable of running a route planning application) in some embodiments.
  • FIG. 25 illustrates an exemplary network entity which may be used as a route planning service or a smart space security service in some embodiments.
  • FIG. 1 shows an overall view of an exemplary system 100.
  • the diagram differentiates between service-level 101 and device-level 102 processes within a user's mobile device 103, smart space 105, and connected vehicle 106.
  • a PSSM application 108 may handle the primary interface with the user via the user's primary device 116.
  • the PSSM application 108 may run on top of the mobile device's 103 operating system, device drivers, and application handlers.
  • each system entity uses a communication network 104 to communicate with the other entities.
  • the device-level 102 processes may include low-level sensors, communication systems (e.g., Wi-Fi) 124, and the person ID database 125.
  • a smart space security manager 117 may contain service-level 101 modules, such as a security data manager 118, walking path planning (route calculation) 119, risk calculation 120, security scanning service 121, person identification service 122, and user location and tracking service 123.
  • a connected vehicle's 106 device-level 102 processes may include a vehicle sensor system 130 and a wireless communication unit (or system) 131.
  • Service-level 101 processes may include a PSSM application's 126 security scan service 127, person recognition service 128, and user location and tracking service 129.
  • FIG. 2 shows an exemplary embodiment of a mobile device user interface 200.
  • the system presents alternative routes with current and estimated security and surveillance coverage values (e.g., by visualizing them on a map).
  • the figure illustrates how the PSSM application may present multiple routes to the user 202.
  • An example PSSM application displays two routes in white over a satellite view of the area, between the user's current location 202 and a car 208.
  • the PSSM application indicates the security level and percentage of surveillance 204, 206 for each route.
  • the system provides route selection with the possibility of increasing surveillance or security coverage (e.g., by using additional automated vehicles or connected cars).
  • FIG. 3 is a schematic plan view of a parking lot outside of a smart space in an exemplary use case 300.
  • the user walks a route (or walking path) 307 from a smart space building 304 towards a destination (an AV 318).
  • the route 307 contains two blind spots 305, 306.
  • Trees 308, 309 block the building security cameras 301, 302, 303 at the parking area's edges.
  • the sensor systems of the vehicles 314, 317 parked in the middle of the lot fail to reach these blind spots 305, 306 as shown by the vehicle short-range communication regions 319, 320.
  • the destination vehicle's 318 short-range communications region 321 also fails to cover either blind spot 305, 306.
  • the smart space security service requests an autonomous vehicle 312, 313 to move along a route 311 to one end of the parking lot to provide coverage for a blind spot.
  • the requested location for the autonomous vehicle 312, 313 may be determined based on a calculation that maximizes sensor coverage of a route.
  • the smart space security service instructs a connected vehicle 315, 316 to move along a route 310 to the other end of the lot to provide coverage for the second blind spot.
  • the user's walking route 307 obtains full security monitoring and tracking coverage.
  • Some embodiments use the terms autonomous vehicle (AV) and connected vehicle interchangeably.
  • FIG. 4 is a schematic plan view of a parking lot outside of a smart space in another use case 400.
  • a user leaves a smart space 401 to walk to his or her parked car 2 (409).
  • the smart space contains two security monitoring cameras with views 404, 405 of the parking lot.
  • the triangles 404, 405 show the field of view 404, 405 of the two cameras centered on Door A 402 and Door B 403 exits.
  • the user's car 2 (409) contains perception sensors with a field of view indicated by a triangle 411 emanating from the back of the vehicle 409.
  • the circle 413 centered around car 2 (409) depicts the range of short-range communication (e.g., V2P) from the vehicle 409.
  • the circle 406 emanating from the smart space 401 shows the range of short-range communication (e.g., Wi-Fi) from the smart space 401.
  • the dashed line 408 illustrates the planned walking route 408 from door B to the vehicle 409.
  • the path 408 contains one blind spot 407 unreachable by sensor systems.
  • the path 408 is fully covered with short-range communication tracking (short-range communication ranges).
  • the vehicle 410 scans the area with the in-vehicle sensors (and sensor FOV 412) while driving.
  • a smart space camera 405 tracks the user until the user is out of range. Tracking continues via connected vehicle systems until the user enters the vehicle.
  • FIGs. 5 and 6 illustrate an exemplary personal security support process 500, 600 for when a user walks from a smart space building to a destination.
  • Exemplary embodiments are performed in systems in which the user has the PSSM application installed and running on the mobile user's device 501, and in which connected vehicles have the PSSM application installed and running.
  • Connected vehicles 503 in an exemplary embodiment have environment perception sensor systems capable of detecting and classifying people, objects, and events.
  • Connected vehicles 503 in exemplary embodiments further have short-range communication capabilities (e.g., Wi-Fi or V2P) relaying such data to the smart space security service 502 and a user's device 501.
  • the smart space security service 502 in this example retains a list of IDs for people and vehicles recognized within the smart space (e.g., office building personnel or other people identifiable by the smart space security service 502).
  • the smart space security service 502 may also identify (e.g., via personnel tags, mobile devices, and camera-based recognition systems) and track movements of people in the smart space and parking areas.
  • the smart space security service 502 may have real-time location information about PSSM users and connected vehicles in the area.
  • the system is capable of tracking, in real time, the location of security personnel and of performing security scans from vehicles. Such vehicles may be parked in the area or may have entered or left the area recently (e.g., in the last hour).
  • the smart space security service 502 has access to these vehicles and is permitted to use these vehicles for risk calculations.
  • An exemplary personal security support process for when a user walks from a smart space building to a destination comprises several steps.
  • a user launches the PSSM application 504 on the user's (mobile) device 501 and requests a safe walking route 505.
  • the route starts with the smart space exit location and ends with the destination, which may be, for example, another smart space building, a vehicle, or the same smart space building.
  • the smart space security service 502 identifies possible walking routes from the smart space exit to the destination.
  • the planning process 506 takes into account security monitoring coverage of the area.
  • the system searches for connected vehicles 507 in the area.
  • the smart space security service 502 determines which connected vehicles are located along possible walking routes and may provide additional security scanning and tracking coverage 511.
  • the smart space security service 502 sends a Start Security Scan 508 message to one or more vehicles to perform a security scan 509.
  • the smart space security service instructs the selected vehicle(s) to use their sensors to start scanning.
  • the vehicle(s) determine whether one or more people are detected in the area and exchange communication messages with the smart space security service to identify the detected person(s) 510.
  • the Vehicle Security Scan Response 512 includes a description of the area scanned by the vehicle (via the field-of-view of its sensors) and the coverage provided by its short-range communication, a list of detected person ID numbers, a list of alerts (e.g., unidentified persons and other potentially alarming events detected), and related data (e.g., location and images of unidentified people).
  • the smart space security service initiates a security scan.
  • the smart space security service performs the security scanning of the area to search for people and vehicles.
  • the smart space security service obtains identification of the detected people and vehicles. Scanning continues until the user reaches the destination.
  • the smart space security service checks IDs (people and vehicles) 513 detected by the connected vehicle(s) and by the smart space's security monitoring devices against the list of IDs (people and vehicles) authorized by the smart space security service.
  • the smart space security service compiles all the security scan data together 514.
  • the smart space security service collects data for all areas covered with security scans performed by the smart space security service, connected vehicles, and short-range communication devices.
  • the smart space security service also gathers the results from security scans, person identifications made by the smart space security service and connected vehicles, and other available information sources (e.g., the current location of the security personnel) in relation to possible walking routes.
  • the smart space security service calculates the risk 515 of each route.
  • risk assessment ranks the walking routes for safety, calculates the coverage for the security monitoring and user tracking, calculates the potential risks, and generates alerts to the user if warranted.
  • a safety metric is determined for each route.
  • the safety metric may be, for example, a percentage of the respective route that has sensor coverage or a length of a portion of the respective route that has sensor coverage.
  • the safety metric takes into consideration additional factors, such as the presence or absence of unidentified individuals along the route, the quality of lighting along the route, and/or sensor data received from vehicles along the route.
  • a safety metric may be communicated to a user's device 501.
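As a minimal sketch of the safety metric described above — the fraction (or percentage) of a route's length that has sensor coverage — coverage can be represented as intervals of distance along the route, merged before summing. The function name and interval representation are illustrative, not part of the disclosure:

```python
def coverage_fraction(route_length_m: float, covered_intervals) -> float:
    """Safety metric: fraction of the route length that has sensor coverage.
    `covered_intervals` are (start_m, end_m) distances along the route;
    overlapping intervals are merged so covered stretches are not double-counted."""
    merged = []
    for start, end in sorted(covered_intervals):
        # Clamp each interval to the route; skip intervals entirely off-route.
        start, end = max(0.0, start), min(route_length_m, end)
        if end <= start:
            continue
        if merged and start <= merged[-1][1]:
            merged[-1][1] = max(merged[-1][1], end)   # extend the previous interval
        else:
            merged.append([start, end])
    covered = sum(end - start for start, end in merged)
    return covered / route_length_m if route_length_m else 0.0
```

The additional factors mentioned above (unidentified individuals, lighting quality, vehicle sensor data) could then be applied as adjustments to this base value.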
  • Based on the risk calculations, the smart space security service generates and sends a Security Report 516 to the user's device.
  • the Security Report 516 may include information about areas covered with security monitoring and tracking, areas not covered (blind spots), areas capable of additional coverage via connected vehicles, alerts (including data and media), and suggested walking routes. This information is presented to the user.
  • the smart space security service 502 and connected vehicles 503 may use short-range communication systems to relay identity information of detected persons in the surveillance area and to perform user security tracking.
  • smart space personnel may have the PSSM application installed on their mobile devices.
  • the mobile device sends a Smart Space Exit message, which includes current location, heading, and speed of the mobile device, and the user's person ID.
  • the smart space security service receives person IDs and maintains a database of known people and their IDs.
  • the smart space may also maintain a database of information related to people and vehicles not matching an entry in the smart space ID registry (or database). This database of unmatched people and vehicles may be used to determine whether to send an alert message to a user's device.
  • the mobile device may send this message, for example, via V2P communication (802.11p), Wi-Fi, or other short-range communication types.
  • the messaging range varies according to the environment and exemplary embodiment used, but the range typically falls within 100 to 1600 feet.
  • when the vehicle performing security scanning and identification of people receives a person identity message, the vehicle notes the location and person ID of the detected person.
  • the vehicle uses the same method for user tracking. If the vehicle detects a person in the area during a security scan but the vehicle fails to receive a person identity message from that person, then the vehicle will record the location (and may capture an image of the person) and raise an unidentified person alert.
  • the exemplary personal security support process continues with selection of a route.
  • the user selects a route 517, and the PSSM application on the user's device 501 sends the selected route 518 to the smart space security service 502.
  • the smart space security service 502 identifies blind spots on the route 519.
  • the smart space security service 502 may send Coverage Support Requests 520 to autonomous vehicles to drive through an area or park in a specified location and direction to perform surveillance and tracking. Also, an autonomous vehicle may be requested to drive to a specific location and direction to perform such surveillance and tracking.
  • a smart space security service 502 may send a Coverage Support Request 520 to a connected vehicle to get additional surveillance and tracking data.
  • a smart space 502 may send route information to a manually-driven vehicle to perform surveillance in areas only if the manually-driven vehicle is parked in the area or driven in the area (e.g., entering or leaving the area).
  • Manually-driven vehicles may also record sensor readings while driving in an area and, if requested, provide a recording of an area if recorded recently.
  • a smart space may send information to an autonomous vehicle (AV) to cause the AV to provide sensor coverage for a blind spot in a route.
  • a smart space may determine which portions of a route lack sensor coverage and determine a ranked list from largest to smallest based on size of those sensor gaps in the route.
  • a smart space may send information to an AV to move to an optimal location to provide a maximum coverage area of a sensor gap. For a sensor that measures objects with a radially-transmitted signal, an optimal location for an AV may be in the center of a sensor gap.
  • the AVs may be directed to locations such that overlap of the two sensor coverage areas is minimized while providing sensor data that covers a maximal amount of the sensor gap.
  • An optimal location for an AV based on this maximal coverage area calculation is communicated to an affected AV.
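The gap-ranking and placement steps above can be sketched as follows, assuming each sensor gap is expressed as a (start, end) interval along the route. For a radially-transmitting sensor, the midpoint of a gap maximizes the portion of that gap the sensor can cover. Function names are hypothetical:

```python
def rank_gaps(gaps):
    """Return sensor gaps ranked from largest to smallest, as the smart
    space does when deciding which blind spots to cover first."""
    return sorted(gaps, key=lambda g: g[1] - g[0], reverse=True)

def av_target_position(gap):
    """For a sensor that measures objects with a radially-transmitted
    signal, the center of the gap is the optimal AV location."""
    start, end = gap
    return (start + end) / 2.0
```

With two AVs available, the second would be placed to minimize overlap with the first AV's coverage while maximizing the remaining uncovered portion of the gap, as described above.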
  • the exemplary personal security support process continues further as the user traverses the route.
  • the process entails several steps as the user walks from the smart space to the destination.
  • the smart space security service 502 begins security tracking.
  • the security tracking may begin when the user exits the smart space and starts walking towards the destination.
  • the smart space security service 502 may use various methods to track the user (e.g., security monitoring (camera) sensors and location updates periodically sent from the user's mobile device).
  • the smart space security service 502 informs vehicles 503 on the walking route to start security tracking.
  • the vehicles 503 use perception sensors and communication systems to track the user.
  • the smart space security service 502 provides the user with the best potential walking route.
  • the connected vehicle 503 estimates the user's direction of approach.
  • the vehicle 503 starts sending to the smart space security service 502 the user identification and location information.
  • the system updates the security tracking status.
  • when a system activates security tracking from the smart space security service or a connected vehicle, the system periodically sends a status message to the user's mobile device.
  • the user's mobile device receives tracking status information (e.g., where the user is in relation to the areas covered) and the ID of the vehicle supporting the security tracking.
  • the smart space security service 502 instructs the vehicle 503 to stop security scanning and tracking.
  • the exemplary personal security support process continues as the user reaches the destination.
  • the PSSM application sends a message to the smart space security service when the user arrives at the destination.
  • the smart space security service may then stop the security scanning and tracking of the user.
  • FIG. 5 illustrates an exemplary set of messages that may be exchanged when a user indicates a desire to walk from the smart space to a destination.
  • the user requests information on a safe route from the smart space to a destination.
  • the user executes the PSSM application on the primary device (capable of short-range communication, such as V2P or Wi-Fi) and enters the destination.
  • the smart space security service 502 calculates one or more walking paths between the smart space and the destination and begins evaluating the possible routes.
  • the smart space security service 502 issues a security scan request to available local connected vehicles 503. These vehicles respond with data from multiple sources, such as vehicle cameras, LIDAR, and thermal sensors.
  • the returned report identifies security risks (e.g., obstacles, a dark path, and any alarming alerts), the person ID of any identified person on the route (which may be identified using, for example, P2V or RFID communications), and available data on unidentified people (e.g., location and image of the person).
  • the smart space security service 502 and one or more connected vehicles 503 repeat this process on a continual basis until the user reaches the destination or is out of range of the connected vehicles.
  • the smart space security service 502 looks up person IDs in a smart space database to match detected people with known identities and to flag unidentified people as an alert.
  • a database may be used to store information about unidentified people and vehicles.
  • a safety metric is calculated based on whether information about unidentified people and vehicles matches an entry in a database of known identities.
  • a safety metric is decreased if information about unidentified people or vehicles matches an entry in a database of unidentified people and vehicles.
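The two database checks above — matching detections against known identities, and against previously flagged unidentified people and vehicles — might adjust the safety metric as in this illustrative sketch (the penalty values and function name are assumptions, not taken from the disclosure):

```python
def adjust_safety_metric(base_metric, detected_ids, known_ids, flagged_ids,
                         unknown_penalty=0.1, flagged_penalty=0.25):
    """Decrease the metric for each detection not found in the known-ID
    database, and decrease it further for detections matching the database
    of previously flagged (unidentified) people and vehicles."""
    metric = base_metric
    for det in detected_ids:
        if det in flagged_ids:
            metric -= flagged_penalty      # known problem entry: larger penalty
        elif det not in known_ids:
            metric -= unknown_penalty      # unidentified: smaller penalty
    return max(0.0, metric)
```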
  • the smart space security service 502 updates the security profile and sends the user a report about the route. This report may include, for example, alerts and risk assessments of the possible routes.
  • the user selects a route, and the PSSM communicates to the smart space security service 502 the particular choice, along with the user ID. If the smart space security service 502 detects a blind spot along or near the route, then the smart space security service 502 may instruct a nearby connected vehicle 503 (if available) to drive towards the blind spot to increase surveillance coverage or to perform a security scan.
  • FIG. 6 illustrates an exemplary embodiment 600 for a user to traverse a route.
  • the user starts walking the route from a smart space building to a destination 604.
  • the user's device 601 sends periodic location updates 606 to the smart space security service 602.
  • the smart space security service 602 also may track a user's progress 607 via sensors (e.g., beacons and cameras) and determine if a user starts traversing a different route.
  • the smart space security service 602 instructs 608 vehicles 603 along the route to begin security scans 609.
  • An autonomous vehicle may be requested to drive through an area or park at a specified location and direction to do surveillance and tracking, or may be requested to drive to a specified location and direction to do the surveillance and tracking 605.
  • a smart space security service 602 may send a request to a connected vehicle to get additional surveillance and tracking data.
  • a smart space 602 may send route information to a manually-driven vehicle to perform surveillance in areas only if the manually-driven vehicle is parked in the area or driven in the area (e.g., entering or leaving the area).
  • Manually-driven vehicles may also record sensor readings while driving in an area and, if requested, provide a recording of an area if recorded recently.
  • a plurality of routes that extend from a starting location to a destination location may be calculated and communicated to a user device 601.
  • the starting location may be a midpoint location along a route or a location not associated with a route.
  • Security information, such as security metrics and alerts, may be communicated to a user device 601.
  • a user may select one of the plurality of routes and a user device 601 associated with the user may send a message to a smart space 602 to indicate the user's selected route.
  • the smart space security service's requests 608 may include the user ID, route identification, and route direction. These vehicles 603 respond to the smart space security service 602 with the user ID and location upon the vehicle's detection of the user 610.
  • the smart space security service may send a Security Tracking Status 611 to a user's device 601 to provide security status information to the user.
  • the PSSM notifies the user 615 about the vehicle's detection of the user. If the user passes by a vehicle, the smart space security service 602 may instruct 612 the vehicle 603 to stop scanning 613.
  • a connected vehicle 603 sends a User Passed Response 614 to the smart space security service 602 if scanning is stopped. If the user arrives at and enters the destination 618, the user may stop the security assistance by sending the user ID 616 to the smart space security service 602, which stops tracking 617. The user may receive a report detailing any detected people located at or near the destination.
  • a smart space 602 and/or a connected vehicle 603 may detect and identify people in an area using, for example, RFID tags or keys, usual personnel movement patterns (e.g., from an exit to a parking location), mobile device tracking and identification, and camera-based identification of people (e.g., facial recognition).
  • the user may request extra safety or surveillance actions or a vehicle pickup (which may include sending to the smart space security service the user ID and exact location).
  • Autonomous vehicles may provide security scanning while driving.
  • a smart space security service may instruct an autonomous vehicle to scan a specific area, which may include locations smart space sensors fail to reach.
  • a smart space security service may also instruct a vehicle to travel to a particular location and direction to park for pick-up or drop-off to optimize sensor coverage.
  • a security service as described herein may be used by a driver of a vehicle arriving at a smart space.
  • the security service may recommend or select an available parking space to the arriving user based at least in part on the availability of a safe walking route from the parking space to the smart space.
  • the smart space security service may instruct the vehicle to navigate to the selected parking space.
  • the parking space may be selected at least in part based on weather conditions, e.g., favoring a parking space that is nearer to a covered route when it is raining.
  • the security service may select the safest available parking place by analyzing smart space security monitoring coverage and availability of connected vehicles (which provide additional security scanning capability). A walking route calculation may take into account routes with shelter for bad weather.
  • the smart space security service may use weather parameter sensors to record the temperature or the presence of rain, snow, hail, clouds, fog, humidity, wind, and the like.
  • the user's device may request additional security support when the user enters or exits the parking lot or parking location of the vehicle.
  • FIG. 5 illustrates an exemplary messaging process for scouting one or more routes before the user begins to traverse a route.
  • FIG. 6 illustrates an exemplary messaging process when the user traverses a route.
  • the messages shown in FIGs. 5 and 6 may be used in a messaging protocol used to support exemplary embodiments described in this specification. Each message is described below. It should be understood that embodiments need not be limited to the use of the specific messages and fields of messages described below, and the messages described below may not be employed in all embodiments. It should further be understood that additional messages such as acknowledgements and the like may be employed in an exemplary protocol but are not described here in detail.
  • a Smart Space Exit Walking Route Request may be sent by a user's device to start the route planning process by requesting a walking route.
  • the User Information field may include at least the user ID.
  • the Destination field contains the destination address (or coordinates) for the walking route. If the user is walking to his or her car, then this field reconfirms to the smart space security service the location of the vehicle.
  • the smart space security service sends a Start Security Scan Request to a connected vehicle to begin a security scan.
  • the vehicle and the smart space security service may use short-range communication systems and a smart space database to identify detected people.
  • the V2P communication may send Collected Person ID messages that contain a detected person's location, speed, direction, person ID (if known), and the vehicle ID of the vehicle that detected the person.
  • a connected vehicle may send a Start Security Scan Response to a smart space security service in response to a Start Security Scan Request message.
  • This message includes fields for the area, the communication coverage, the person ID(s), and the alert ID.
  • the Area field provides a description of the area covered (e.g., via corner coordinates) by the vehicle security scan. If the vehicle is stationary, then the area is the field-of-view of the vehicle sensors. If the vehicle is moving, then the area is the entire area scanned while driving.
  • the Communication Coverage field provides an estimate of the area covered by short-range communication (e.g., V2P or Wi-Fi communication). The system uses such an estimate for tracking a pedestrian.
  • the Person ID field lists person ID(s) and locations of anyone identified by the vehicle.
  • the Alert field contains a list of potential security or safety concerns detected. For example, this field may contain unidentified persons, their locations, and associated images (which may be taken by the vehicle camera). The Alert field may also contain descriptions of obstacles, an indication of darkness, and a description of other potential risks (such as high speed vehicles (e.g., an electric bike) on the sidewalk or walking area).
  • a Security Report message combines the content of the Start Security Scan Response message with other sources of information to report data from the smart space security service to the user's device.
  • the security report includes fields for security data, routes, alerts, and additional information.
  • the Security Data field lists at least four categories of areas: areas covered by security monitoring, areas covered by security tracking, areas not covered (blind spots), and blind spots which may be covered by vehicle scans. Each of these areas may be reported as a list of corner coordinates.
  • the Routes field provides a description of suggested walking route(s) and may be reported as a list of waypoint coordinates. This field also reports the percentage of each route covered with security monitoring and tracking devices.
• the Alert field reports security alerts.
  • the Additional Information field includes the number, identity, and location of identified persons (e.g., personnel) currently present on the route as well as the number, identity, location, and next task (e.g., handle a security alert) of security personnel on the route.
• a user's device may send a Route Selection message to the smart space security service to communicate the user's selected route.
  • the message includes the user ID and the selected route's ID.
  • the smart space security service may send a Coverage Support Request to a connected vehicle to scan areas.
  • This message includes area, location, and direction fields.
  • the Area field lists the area through which the smart space security service requests the vehicle to drive and to perform a security scan. This field may be reported, for example, as corner coordinates.
  • the Location field lists the location of the desired parking space (e.g., as coordinates).
  • the Direction field contains the desired heading of the vehicle (e.g., as degrees from north).
  • a Location Update message may be sent by a user's device to the smart space security service to provide updates on the location of the user.
  • the message includes the user ID and his or her location, which may be reported as coordinates.
  • a smart space security service may send a User Walking message to a connected vehicle that indicates that the user has started traversing the route.
  • the message includes the user ID (the ID of the person to be tracked) and information identifying the route (the most likely route the user will traverse).
  • the route may be reported as a list of waypoint coordinates.
  • a User Walking Response may be sent by a connected vehicle to a smart space security service when the user comes within range of the vehicle.
  • the message includes a user ID and a current location of a user (which may be reported as coordinates).
• the smart space security service may send a Security Tracking Status periodically to a user's device to provide a security status of a route.
  • This message includes fields for tracking status and vehicle ID(s).
  • the Tracking Status field details the status of the security monitoring (e.g., the user's location within the security monitoring coverage map).
  • the field also provides the status of the security tracking, which includes the user's device short-range communication connection type.
  • the Vehicle ID field lists the vehicle identifications (e.g., license plate numbers) currently supporting security tracking.
  • the smart space security service may send a User Passed Connected Vehicle message to a connected vehicle when the user passes by the connected vehicle.
  • the message contains a field for the user ID and a field for the connected vehicle's ID.
  • a connected vehicle may send a User Passed Connected Vehicle Response to the smart space security service in response to a User Passed Connected Vehicle message. This response message indicates that the connected vehicle will stop scanning for security issues related to the present walking route.
  • a user's device may send a User Reached Destination message to a smart space security service when a user reaches a destination.
  • the message contains a field for the user ID.
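Taken together, the messages above suggest a simple protocol vocabulary. The following Python sketch models a few of them as plain data structures; the class and field names are illustrative assumptions drawn from the field descriptions above, not part of the claimed protocol:

```python
from dataclasses import dataclass, field
from typing import List, Tuple

Coordinate = Tuple[float, float]  # (latitude, longitude)

@dataclass
class ExitWalkingRouteRequest:
    """Sent by the user's device to start the route planning process."""
    user_id: str
    destination: Coordinate  # destination address or coordinates

@dataclass
class SecurityReport:
    """Sent by the smart space security service to the user's device."""
    monitored_areas: List[List[Coordinate]] = field(default_factory=list)  # corner coordinates
    tracked_areas: List[List[Coordinate]] = field(default_factory=list)
    blind_spots: List[List[Coordinate]] = field(default_factory=list)
    routes: List[List[Coordinate]] = field(default_factory=list)           # waypoint lists
    alerts: List[str] = field(default_factory=list)

@dataclass
class RouteSelection:
    """Sent by the user's device to report the chosen route."""
    user_id: str
    route_id: int
```

Real implementations would add acknowledgements and the remaining messages (Start Security Scan, Location Update, and so on) in the same style.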
  • FIG. 7 shows an exemplary system 700 for interfacing with a user via a user's device 706, 712.
  • a system may dynamically provide to the user through his or her personal device(s) the status of surveillance and security data. See an example status report on the device 706 shown on the left side of FIG. 7.
  • the example status report contains two routes, which may be selected by route buttons 701. For this example, extra surveillance was brought along the route, as selected by a button 702.
  • the user's device also displays a status 703 with percentages of the route that had surveillance 704 and security 705.
  • FIG. 7 shows a dynamic route security assessment 708 as seen on a smart watch 709, for example, along with tactile or auditory feedback 707.
• After a user arrives at the destination, a system generates a detailed report 710. The user may submit the report for system improvements or other purposes by pressing a button 711. See the right side of FIG. 7.
  • FIG. 8 shows an exemplary embodiment 800 for calculating a safety metric for a route.
  • the system evaluates the amount of coverage for security monitoring and user tracking and calculates a safety metric based on this assessment.
  • the calculation of the safety metric may be influenced by several positive factors, the presence of which may increase the value of the safety metric.
  • Positive factors may include length of the route covered with security monitoring (e.g., cameras), length of the route covered with tracking functionality (e.g., communication devices, cameras, and sensors), availability of recent security scans of blind spots (e.g., images captured by connected vehicles while driving), availability of additional connected vehicles to cover blind spots, the number of known persons on the route, and the number of security personnel on the route.
  • the calculation of the safety metric may further be influenced by several negative factors, the presence of which may decrease the value of the safety metric. Negative factors may include the length of the route not covered (blind spots), the number of unidentified people on the route, the length of the route not illuminated by good lights when traveling at night, and the presence of possible alerts (e.g., potential security or safety concerns detected or other risks) on the route. Also, the walking distance may be treated as a neutral factor if the route is short, and the walking distance may be a negative factor if the route is long.
  • the safety metric for each potential walking route is ranked from 0 to 100, where 0 is completely unsafe and 100 is very safe.
  • a route completely covered with security monitoring and tracking with no blind spots, unidentified people, security alerts, or dark sections would receive a safety metric of 100.
  • the level of security monitoring and tracking provides the basis for the risk metric calculation.
  • FIG. 8 demonstrates an example scenario for a risk metric calculation.
  • the safety metric is calculated as an average of the percentage of security monitoring coverage and the percentage of tracking coverage. For example, if the security monitoring coverage is 87.5% and the tracking coverage is 92.5%, the value of the safety metric is 90.
  • the safety metric may be adjusted based on other available information. For example, if a route section contains unidentified people, no security personnel, and no identified people, then the safety metric is reduced. If one or more of the unidentified people are in a dark area, then the safety metric is reduced even more. For example, if a route is divided into ten sections, one embodiment reduces the safety metric by ten points for each section containing an unidentified person. If one or more of these unidentified people is in a dark area, then the safety metric is reduced by twenty points. Additionally, each alert (e.g., a potential security or safety concern or other detected risk) reduces the safety metric by twenty points.
  • the initial safety metric minus the adjustments is calculated for each potential route and presented to the user as the final safety metric.
  • the safety metric may factor in risk assessments made for other users.
  • the calculation of the safety metric may take into consideration data obtained from external sources (e.g., crowdsourced data or big data methods).
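The calculation described above can be captured in a short function. This is a minimal sketch using the example penalty values from the text (ten points per section containing an unidentified person, twenty if that person is in a dark area, twenty per alert); the parameter names are illustrative:

```python
def safety_metric(monitoring_pct, tracking_pct,
                  unidentified_sections=0,
                  unidentified_dark_sections=0,
                  alert_count=0):
    """Return a 0-100 safety metric for a candidate walking route.

    The base score is the average of the security-monitoring and
    tracking coverage percentages; the penalties follow the example
    values given in the text.
    """
    score = (monitoring_pct + tracking_pct) / 2.0
    score -= 10 * unidentified_sections       # unidentified person, lit section
    score -= 20 * unidentified_dark_sections  # unidentified person, dark section
    score -= 20 * alert_count                 # each active security alert
    return max(0.0, min(100.0, score))
```

With the coverage figures from the worked example (87.5% monitoring, 92.5% tracking) and no adjustments, the function returns 90.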
  • FIG. 8 is a schematic plan view of a parking lot outside of a smart space in an exemplary embodiment.
  • the user walks a route towards the destination.
  • the route (or walking path 815) is broken into eight equal segments, delineated by points labeled with the letters A through I.
  • Point A is the front door of the smart space 801, and point I is next to the target 809.
  • Smart space security cameras 802, 803 cover the segments extending from point A through point C.
  • the security camera ranges 804, 805 and the directions of each camera's viewing angle are known values.
  • An autonomous vehicle 806, 807 is directed along a route 813 to a new space to provide additional coverage between points C and E based on the autonomous vehicle's 807 short-range communication region 810.
  • the autonomous vehicle's 807 cameras and RADAR and LIDAR systems may be used to provide additional surveillance, tracking, and monitoring coverage.
  • the range and direction of those systems are known values.
  • a parked autonomous vehicle 808 provides coverage between points E and G based on the parked vehicle's 808 short-range communication region 811.
  • the parked autonomous vehicle's 808 cameras and RADAR and LIDAR systems provide coverage.
  • the RADAR and LIDAR system components are not shown in FIG. 8 due to size.
  • transceivers for RADAR and LIDAR systems may be mounted on the roof of a vehicle, but other embodiments may mount them in different locations.
  • the range and direction of those systems are known values.
  • a blind spot 814 exists between points G and H.
  • the target vehicle 809 provides coverage between points H and I based on the target vehicle's 809 short-range communication region 812.
  • the vehicle's cameras and RADAR and LIDAR systems provide coverage.
  • the range and direction of those systems are known values.
• sections C to E and E to G receive section scores of -. Section H to I is fully covered by the target vehicle, so it receives a score of -.
  • the blind spot for section G to H receives a score of 0.
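The per-segment bookkeeping of FIG. 8 can be illustrated with a small sketch. Marking each of the eight segments as covered or not (the blind spot between G and H being the only gap) yields the kind of coverage percentage that serves as the basis of the metric; the segment labels follow the figure description, and everything else is an assumption:

```python
# Coverage of the eight walking-path segments in FIG. 8 (True = covered)
segments = {
    "A-B": True,  "B-C": True,   # smart space security cameras 802, 803
    "C-D": True,  "D-E": True,   # autonomous vehicle 807 directed to a new space
    "E-F": True,  "F-G": True,   # parked autonomous vehicle 808
    "G-H": False,                # blind spot 814
    "H-I": True,                 # target vehicle 809
}

def coverage_percent(segments):
    """Percentage of route segments covered by monitoring or tracking."""
    covered = sum(1 for ok in segments.values() if ok)
    return 100.0 * covered / len(segments)
```

Seven of the eight segments are covered, giving 87.5%, which matches the monitoring coverage figure used in the earlier worked example.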
• A user, Christina, studies at a university and frequently has evening badminton practices. After practice, she walks alone through the university campus to her apartment. She installs the PSSM application on her mobile phone. One evening, Christina leaves the university sports hall and plans a walking route back to her apartment. She launches the PSSM application and sets the destination. The application updates the latest security information from the smart space security service and provides three walking routes with security details from which to choose.
  • the shortest route contains a security alert (several unidentified persons in a dark alley), so she skips this route.
  • the second route lacks any security alerts but security monitoring covers only 60% of the route while security tracking (communication) covers 80% of the route.
• the second route contains several blind spots which smart spaces or connected vehicles cannot currently cover. Therefore, Christina chooses the longest route, which provides security monitoring of 85% of the route and security tracking of 95% of the route.
  • the third route contains two security blind spots close to parking areas in the apartment.
  • a security report indicates several identified persons (e.g., students or personnel) walking in this area. The same report mentions an autonomous vehicle and a connected vehicle available to cover the blind spots and provide security monitoring and tracking.
  • the PSSM application on her mobile phone shows security tracking activity.
  • the PSSM also indicates that the short-range communication and tracking link is connected to the sports hall smart space and that a connected vehicle is parked in the parking area. While Christina walks, the application shows the security tracking status, and it beeps when she gets to a blind spot and loses security tracking. While in the blind spot, Christina speeds up her walking until a car comes around the corner. Her phone reports that security tracking connected again with this security surveillance- supporting car. Christina arrives at her apartment, and the PSSM application on her phone shuts down.
• A user, Maria, leaves her office late at night. She launches the PSSM application on her smartphone to see the security situation.
  • the PSSM application shows her a map of the walking route from her office building to her car in the personnel parking lot.
  • Her smartphone PSSM application displays one active, potential security alert in the area. She opens the alert message and sees that one of the cars, which just entered the parking area, detected an unidentified person between two parked vehicles.
  • the application also informs Maria that security personnel have been dispatched to identify the person. Maria decides to wait until security personnel assess the situation.
• After a while, Maria checks the security status via the PSSM application and discovers that the situation has returned to normal. Smart space and vehicle camera monitoring cover 85% of the suggested walking path to the car, while communication tracking covers 100% of the route.
  • Maria walks to the parking garage door, and the PSSM application reports the location of her car and the recommended walking route to it. The application shows that security tracking is active and that the communication link connected to the car and to the smart space.
• Once Maria arrives safely at the car, she starts to drive home. Maria notices that her car's PSSM application performs security scanning while she drives out of the parking lot. The car and cell phone PSSM applications shut down when Maria's car leaves the parking area.
• A user, Liz, leaves her office late at night. She starts the PSSM application on her smartphone. Liz instructs her autonomous car to pick her up at the office building's rear exit. Her smartphone application indicates that the car will be at the exit in 5 minutes. Together, the smartphone and the smart space security service perform personal security and safety monitoring as the autonomous vehicle approaches. After 3 minutes, the smartphone application beeps when it receives a security alert. Liz opens the alert message and sees that the car detected an unidentified person between parked vehicles. The application shows an image of the situation. The PSSM also shows an indoor map of the areas covered during the security scan. Liz decides to take another exit and summons the car to the front door of the office.
  • the PSSM application reports to Liz where the car is waiting.
  • the application reports that walking path surveillance covered the complete route.
  • the PSSM also indicates that personal security and safety tracking is active and that the communication link connected to the car and to the office smart space.
  • the PSSM application on her mobile device switches to security tracking mode.
• When Liz reaches the car, she instructs it to drive her home.
  • FIG. 9 shows an example screen shot 900 of an exercise route 902 displayed on a map 904 within an application running on a user's device.
  • An example exercise route 902 is shown as a set of thick, bold lines, while the other streets are shown as thin lines.
  • a user 906 is currently at a hotel on the corner of North Avenue and First Street.
  • an exercise route 902 is created that starts at the user's current location 906 of North and First.
  • the example route goes south down First Street to Tenth Avenue and continues to follow a T-shaped route through a portion of a city. The route returns to the starting location with a portion of the route along North Avenue.
  • FIG. 10 is a functional block diagram of an exemplary embodiment 1000 of a system for communicating route planning data to a user device 1002 (such as a bike computer 1012, wrist device 1014, or primary device 1016).
  • the device level 1040 includes physical devices, such as vehicle sensors 1068, user devices 1002, vehicles 1054, and other physical hardware. These items include both sensors that measure a route's conditions and devices that interface directly with the user.
  • the device level 1040 also includes other physical devices, such as base stations, building cameras, and other physical items used to survey potential routes.
  • Device level 1040 items may communicate by wide area communication networks 1042 and V2P and/or Bluetooth networks 1044.
  • the service level 1030 items may include user device 1002 components of a route planning application 1004, such as route preference selection and control 1006, vehicle interaction 1008, and user tracking services 1010.
  • the service level 1030 items may also include route planning service 1022 components of a route planning application 1004, such as a route planner 1026, map data and map interface 1028, and a vehicle command module 1032.
  • the service level 1030 items may also include route planning service 1022 components, such as sensor information sharing service 1034, sensor information processing service 1036, and external APIs 1038.
  • External services 1046 may include service level 1030 components such as a weather service 1048, a map data service 1050, and traffic information 1052.
  • the service level 1030 items may also include route planning application 1004 vehicle 1054 components, such as a route reporting module 1058, a driving route module 1060, a vehicle provider 1062, a sensor data interface 1064, and a sensor access module 1066.
  • a route planning service 1022 may include user profiles, augmented map data, and route history 1024.
• the information level 1020 may also include a vehicle 1054 component of map vehicle data 1056.
  • the information level 1020 elements relate to profile-type data, such as user profiles, vehicle profiles, general map data, route history, and augmented map data, for example.
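The layering of FIG. 10 can be hinted at with a composition sketch in which service-level modules operate over information-level data. All names here are illustrative stand-ins for the modules in the figure, not an implementation of them:

```python
class RoutePlanner:
    """Service-level module: computes a route (placeholder logic)."""
    def plan(self, start, end, preferences):
        return [start, end]

class VehicleCommandModule:
    """Service-level module: requests scans from connected vehicles."""
    def dispatch_scan(self, area):
        return {"area": area, "status": "requested"}

class RoutePlanningService:
    """Cloud-side service composing FIG. 10 service-level modules
    over information-level data (user profiles, route history)."""
    def __init__(self):
        self.route_planner = RoutePlanner()
        self.vehicle_commands = VehicleCommandModule()
        self.route_history = []  # information-level data

    def plan_route(self, start, end, preferences=None):
        route = self.route_planner.plan(start, end, preferences or {})
        self.route_history.append(route)
        return route
```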
  • FIG. 11 shows a sample user interface (UI) 1100 within a vehicle for visualizing a planned (and accepted) route and for providing control for dynamic route planning.
  • the UI 1100 may be shared with other stakeholders (e.g., a coach or personal trainer).
• the picture shows two optional routes for selection by the user: one shown with a glow (which may be, e.g., a red glow) and a solid line 1111, and the other with two solid lines 1107; an exclamation point appears in a route description 1120.
  • a route may be planned or set up using static data before exercising or traversing the route. Data may have already been received via autonomous vehicles that were on the scouting route.
  • the user interface (UI) 1100 displays the details of the route with specific notation of segment details and alternatives.
  • a route starts with the word "Start" 1101.
  • the first route segment 1102 is asphalt with a headwind.
  • the second route segment 1103 is gravel and hot but roofed.
  • the second segment ends with a pick-up/re-plan point 1104.
  • the user is picked up by an AV for the third route segment 1105 due to the segment being sunny, crowded, and with stops.
• the segment 1105 ends at a point marked "Get off" 1106.
  • the user has a choice of two routes.
  • the first route includes a fast tailwind route segment 1107.
• the second route includes one segment 1111 with lane changes and a second segment 1112 with potholes that is traveled by car from a pick-up/re-plan point 1113 to a get off point 1108.
  • the two routes join back together for a route segment 1109 through a cold tunnel that ends at a point marked "Stop" 1110.
  • a screen of the user interface displays the proposed route, potential routing issues, and relevant route data (such as weather and road conditions, as well as route type, elevation data, and other desired information).
  • the top of the screen 1117 shows the start point ("Hotel") and end point ("Office") of the route.
  • Different colors may be used along with a key to convey various information. For example, one segment of the route may be colored red, and a key may indicate that the red segment is paved with asphalt and is experiencing a headwind. Another segment of the route may be colored yellow, and the key may indicate that the yellow segment is roofed, is hot, and is covered with gravel.
  • Another segment of the route may be colored green, and the key may indicate that the green segment is experiencing a tailwind and requires lane changes or crossing the street from one sidewalk to another one on the opposite side.
  • Another segment of the route may be colored blue, and the key may indicate that the blue segment is experiencing a tailwind and is a very fast segment.
  • Dashed lines may be used to show sections of the route where the vehicle transports the user from one location to another location. Short dashed lines are portions traversed by vehicle due to potholes, while long dashed lines are portions traversed by vehicle due to crowds or other obstructions.
  • a series of solid black circles show asphalt route sections that also have a headwind.
  • An alternating series of black circles and dashes show gravel route sections that also have a roof over the route.
  • the map also labels pick-up and drop-off points along the route.
  • the user interface allows the user to turn on or off the options of vehicle pick-up 1115 and dynamic re-plan 1116. In the upper right corner, the user has the option to accept 1118 the route as shown or to re-plan 1119 any portion of it.
• After a vehicle finishes the route check, the system provides a user interface for planning a route, visualizing a route's details, setting controls for a route, and accepting the route.
  • a system may perform the route check dynamically. Such a method skips the initial pre-check of the entire route but moves the vehicle in front of the user by a preset distance and reports conditions as encountered. For example, the vehicle travels a block or two in front of the user (depending on the user's speed) and updates the user on upcoming conditions.
  • the user interface may show the overall view of the planned route with the ability to update the start, stop, pick-up, or break locations. Changes to the route trigger re-planning or dynamic planning.
  • the user interface illustrates main sections of the route and labels road and weather conditions, road width, the presence of traffic and parked cars, and the presence of tunnels, roofed areas, and open areas.
  • the Route Planning Service may provide additional route sections if alternates meet the user's set criteria.
  • the user interface allows the user to modify the route to swap in an alternative route if one exists.
  • FIG. 12 shows a series of exemplary notifications that may be provided to a user.
  • the system provides notifications to the user via a personal device, such as a smartwatch 1201, 1202, 1203, 1204, 1205, 1206, 1207, 1208 or a cell phone.
• Notifications may include, for example, "change to other side of the road" 1209, "increase (or decrease) your speed" 1213, "immediate request for car sent" 1215, "pick-up due to safety/bad route (accept/decline)" 1210, 1211, "drink/snack/break (accept/decline)" 1214, or warnings of loose gravel 1216 / potholes 1212 down the road.
  • the accept/decline feature allows the user to agree with the system's suggestion by accepting the message or to reject the suggestion by declining the message.
  • the user may call the car or make alterations to the route.
  • the user interface may utilize visual and tactile interaction modalities 1217, 1218, 1219, 1220, 1221, 1222, 1223, 1224 for notifying the user.
  • Button presses or voice response commands received from the user notify the system of the user's awareness of the notification.
• The application, for example, defaults to triggering the car to come to the user with a simple button press on the user's smartwatch unless the user's preferences state otherwise.
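The accept/decline handling described above might look like the following sketch; the notification strings, return values, and preference key are assumptions for illustration:

```python
def handle_notification(notification, user_response, preferences=None):
    """Resolve an accept/decline notification from the system.

    Returns the action the system takes next; a declined suggestion
    leaves the current route unchanged.
    """
    preferences = preferences or {"button_press_summons_car": True}
    if user_response == "accept":
        # e.g. "pick-up due to safety/bad route (accept/decline)"
        if notification.startswith("pick-up") and preferences["button_press_summons_car"]:
            return "summon_car"
        return "apply_suggestion"
    return "keep_current_route"
```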
  • sensor information collected by oncoming cars is collected, processed, and used to guide the route of a user.
• One such embodiment is illustrated in FIG. 13.
  • the method of FIG. 13 may be implemented in a setting in which an appropriate version of the Route Planning Application is installed on the user's device, the nearby cars have a sensor access module installed, and sensors are available (e.g., rear-view, side-view, and front-view cameras).
  • a system starts by collecting route data. This process includes the steps of activating collection of route data, querying the RPS for sensor data collection parameters, and beginning collection of route data.
  • Activating collection of route data begins after the user 1302 starts 1320 the route planning application 1306 and activates vehicles nearby (e.g., in a range of 3 miles) that have a sensor access module to collect route data by using vehicle sensors.
  • the route planning application 1306 on the user's device 1304 sends a Road/Street Data Collection Request 1322 to the route planning application service 1310, which then delivers to nearby vehicles 1312 the request 1324 for the sensor access modules.
  • the route planning service 1310 is based in the cloud 1308.
  • the sensor access module 1314 sends a Sensor Data Collection Parameters Request 1326 to the route planning service 1310.
• the route planning service 1310 sends a Sensor Data Collection Parameters Response 1328 that replies with the type of sensor data sought.
  • the user 1302 may be interested in events that may be identified from video streams.
• nearby cars that have a sensor access module 1314 in continuous sensor data delivery mode may continually deliver Sensor Information Notifications 1332 to a sensor information sharing service 1316.
  • the delivery of sensor information may depend on sensor data collection parameters queried in a Sensor Data Collection Parameters Request 1326.
  • the vehicle's sensor access module sends a vehicle data key notification 1334 to the user's route planning application 1306.
  • the user's route planning application 1306 may send the vehicle data key notification 1336 to the sensor data processing service 1318.
  • Utilizing the shared sensor data includes the steps of fetching sensor data, analyzing sensor data, enhancing route data, and notifying the user.
  • the sensor data processing service uses the sensor data key and sends a Sensor Information Request 1338 to the sensor information sharing service 1316.
  • the sensor information sharing service 1316 responds with the collected vehicle sensor data in a Sensor Information Response 1340.
  • the sensor data processing service 1318 analyzes the sensor information and extracts street/road events (e.g., road, bike lane, or sidewalk blocked, traffic jam, and traffic accident) that may cause changes to the user's route.
  • the sensor data processing service 1318 sends a Street/Road Event Notification 1344 to the route planning application 1306.
• the application 1306 uses the identified street/road events to update the route plan. To notify the user of the new sensor data, the route planning application 1306 sends a Route Update Notification message 1346 to the user 1302 (which may occur by displaying the message on a user interface). The user's device 1304 notifies the user 1302 about the recognized street/road events and shows an updated route for the user.
  • oncoming vehicle sensor data may be reported to a user using the following steps.
  • the user executes a route planning application on his or her device. Vehicles providing sensor data to this service are registered with the application (which occurs in the RPS cloud backend). The user requests sensor data regarding the road/street.
  • the application (or associated RPS cloud service(s)) distributes this request to vehicles. Vehicles query the service about the parameter set and what sort of sensor data (for instance, front or back camera data) the user needs. Vehicles now stay in a constant data collection mode. Vehicles provide the sensor data to the application.
  • the application aggregates all the data from all vehicles in the processing engine and filters it down to street/road events that may cause changes to the user's route.
  • Vehicles provide a key to enable the user to access the sensor data.
  • the user sends the key to the application to access the specific vehicle information.
  • the application then provides sensor information for the specific route to the user's device. Based on this data, the user's route may be updated, and the user's device is notified of the change.
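The key-mediated data sharing in the steps above can be sketched as follows; the class, key, and event names are illustrative, not part of the specification:

```python
class SensorInformationSharingService:
    """Stores vehicle sensor data, retrievable only with the vehicle data key."""
    def __init__(self):
        self._store = {}

    def publish(self, vehicle_data_key, sensor_data):
        self._store[vehicle_data_key] = sensor_data

    def fetch(self, vehicle_data_key):
        return self._store.get(vehicle_data_key, [])

def extract_road_events(sensor_data):
    """Filter raw sensor records down to events that may change the route."""
    blocking = {"road_blocked", "bike_lane_blocked", "traffic_jam", "traffic_accident"}
    return [record for record in sensor_data if record.get("event") in blocking]

# Flow: a vehicle publishes data under its key; the processing service
# fetches it with the key forwarded by the user's device and analyzes it.
sharing = SensorInformationSharingService()
sharing.publish("vehicle-42-key", [
    {"location": (60.17, 24.94), "event": "traffic_jam"},
    {"location": (60.18, 24.95), "event": "clear"},
])
events = extract_road_events(sharing.fetch("vehicle-42-key"))
```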
  • FIG. 13 offers an example of how to retrieve vehicle sensor data from oncoming vehicles and report it to the user.
  • the protocol may include the following messages and data fields.
  • a Road Data Collection Request may include a user's location.
  • the sensor capabilities field may list desired sensor data.
• a Sensor Data Collection Parameters Response may list the sensor configuration parameters (e.g., how many video frames should be recorded per second) and which sensors will provide sensor data.
  • a Sensor Information Notification a sensor information field may list information provided by a vehicle's sensors.
  • the vehicle data key field may enable access to a vehicle's sensor data.
  • a Sensor Information Request may use a vehicle data key field.
• the sensor information may be the data provided by a vehicle's sensor.
  • the road events field may list identified events present on the road.
  • the locations for road events field lists the locations for the identified road events.
  • a Route Update Notification provides details of a route update.
  • the updated route field lists the route that has been produced based on the latest sensor data.
  • the road items/events field lists road events that have been identified, such as: trash, people loitering, sidewalk condition, pets, cardboard boxes, roofed sidewalks, micro weather, and the width of sidewalks.
  • the locations for road events field lists locations for the identified road events.
  • the message also includes the number of street crossings on the route and optionally how many street crossings contain traffic lights.
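The Route Update Notification fields above might be modeled as a simple message type; the field names mirror the description, but the concrete schema is an assumption.

```python
from dataclasses import dataclass
from typing import List, Optional, Tuple

@dataclass
class RouteUpdateNotification:
    """Illustrative container for the Route Update Notification fields above."""
    updated_route: List[Tuple[float, float]]     # waypoints as (lat, lon)
    road_events: List[str]                       # e.g. "trash", "pets", "micro weather"
    event_locations: List[Tuple[float, float]]   # one location per identified event
    street_crossings: int                        # number of crossings on the route
    crossings_with_traffic_lights: Optional[int] = None  # optional per the text
```

Keeping `road_events` and `event_locations` the same length preserves the one-location-per-event pairing implied by the description.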
  • the process for route selection may be based on static and dynamic information about the user's preferences and actual information on road conditions and other observations along the route, which may be obtained via vehicle sensors or other available sensors.
  • An exemplary route selection process, illustrated in FIGs. 14 and 15, is described below.
  • the appropriate version of the route planning application is installed on the user's device 1402.
• the vehicle is coupled with the application on both the user's personal device and the Route Planning Service (RPS) 1404 cloud service.
  • the user should authorize the RPS 1404 to use the vehicle and its sensors to detect dynamic conditions along potential routes during route selection.
  • the user enters dynamic preferences for route selection on his primary device.
• the preferences may include, e.g., maximum duration of the route on a given method of transportation (such as cycling), maximum allowed distance or duration of the route, maximum allowed distance to the starting or another location along the route, or minimum lighting level along the route.
  • the user may set whether the car should check the entire route beforehand or whether the vehicle should make piecewise checks in front of the user. For piecewise checks, the user may set maximum distance allowed for the vehicle to be in front of the user.
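The dynamic preferences above, including the choice between checking the entire route beforehand and piecewise checks in front of the user, could be captured in a small container like this; all field names and defaults are illustrative.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class RoutePreferences:
    """Hypothetical container for the dynamic route preferences above."""
    transport: str = "cycling"                       # method of transportation
    max_duration_min: Optional[int] = None           # max duration on that transport
    max_distance_to_start_km: Optional[float] = None # max distance to start/other location
    min_lighting_level: Optional[float] = None       # 0.0 (dark) .. 1.0 (fully lit)
    scout_mode: str = "full"                         # "full" = entire route beforehand
    max_lead_distance_m: Optional[int] = None        # for "piecewise": max vehicle lead
```

For piecewise checks, `scout_mode="piecewise"` would be paired with a `max_lead_distance_m` limiting how far ahead the vehicle may drive.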
  • FIG. 14 shows one embodiment 1400 of a route selection process.
  • the user activates the route search on his or her primary device and the user's device 1402 sends a Route Planning Request 1410 to a route planning service 1404.
  • the RPS 1404 uses a selection of inputs to generate potential routes for the user, including: (1) dynamic parameters 1408 given by the user, including user location; (2) previously recorded user data and preferences such as fitness level and workout history; (3) known routes near the user's location, including map data for the routes, such as road inclination and previously gathered data on the routes, such as road conditions, the presence and condition of sidewalks, and the presence of light posts; and (4) past and forthcoming weather information and any weather related historical information.
• After finishing the initial route search, the service generates 1412 a list of candidate routes.
  • the RPS 1404 instructs a vehicle to scout the route candidates.
  • a Vehicle Use Permission Request 1414 is sent by the RPS 1404 to a user's device 1402 and a Vehicle Use Permission Response 1416 is sent back granting permission to use a vehicle.
• the RPS 1404 sends a Vehicle Use Permission Request 1418 to a route planning application 1406, which checks the grant, determines availability, and checks vehicle fuel and battery status 1420, and sends back a Vehicle Use Permission Response 1422 based on those checks.
  • the RPS 1404 also may send a Sensor Capability Request 1424 to a route planning application 1406 and receive a Sensor Capability Response 1426 in return.
  • the RPS 1404 determines reporting requirements 1428 and sends Section Scouting Requests 1430.
  • One exemplary process includes the methods of sending a Section Scouting Request 1430 for the first candidate route, scouting each remaining candidate route or requesting to scout a new route, sending the candidate routes to the user for selection, selecting a candidate route by the user, issuing a start notification to the user, and updating the route map.
  • RPS 1404 requests the vehicle to observe a section of a candidate route.
  • An exemplary request 1430 contains section coordinates, report content descriptions, and the scouting type. Section coordinates may be specified as GPS waypoints for the start and end points.
  • the report 1434 content description contains factors to include in the route section report, such as: sidewalk information (presence or absence, condition such as potholes, and sidewalk width); road conditions (e.g.
  • the scouting type field contains whether the vehicle should scout only the given/shortest route from section beginning to section end, or whether it should scout all possible routes shorter than a given threshold.
  • the vehicle may send a route section report 1434.
  • the report 1434 may contain information on one or more alternative routes between given endpoints.
• the report may contain: section information, such as route coordinates; information specified in the report content description, as well as any hazards recognized by the vehicle, such as slippery spots, puddles of water on sidewalks, or loose gravel on the road after a steep decline; and potential blockages, such as cars parked on sidewalks.
  • the vehicle may scout successive sections until done or request to scout another route if the first route is done. Based on the route section report(s), RPS 1404 may instruct the vehicle to scout the next section along the overall route, or if none of the reported routes for the section fulfill the criteria, request the vehicle to scout another route for the section.
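The scout-next-section-or-try-another-route loop described above can be sketched as follows; the four callables stand in for the RPS/vehicle message exchanges and are assumptions, not part of the described interface.

```python
def scout_route(sections, scout_section, meets_criteria, alternatives):
    """Sketch of the scouting loop: scout each section in turn; if the report
    fails the criteria, try alternative routes for that section before giving up.

    scout_section(section) -> report dict
    meets_criteria(report) -> bool
    alternatives(section)  -> iterable of alternative sections to try
    """
    reports = []
    for section in sections:
        report = scout_section(section)
        if not meets_criteria(report):
            for alt in alternatives(section):
                report = scout_section(alt)
                if meets_criteria(report):
                    break
            else:
                return None  # no acceptable route found for this section
        reports.append(report)
    return reports
```

The `for...else` gives up on the whole route only when every alternative for a section fails, mirroring the "request the vehicle to scout another route for the section" step.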
• After finishing scouting, or after performing enough scouting to enable the user to select a route, the vehicle sends 1438 candidate route alternatives to the user for selection.
  • the user selects 1440 the preferred route section(s) from the given alternatives.
  • the RPS 1404 updates map and route information 1442 based on the user's route selection.
  • the RPS 1404 sends a Section Scouting Request 1444 to a route planning application 1406 and receives a Section Scouting Report 1446 in return.
  • the service issues a start permission 1450 to the user after it has determined that enough route has been covered 1448 so that the rest of the route may be updated 1452 as the user moves along the route.
  • the decision is based on, e.g., historical information on user's speed, route profile (road inclination / workout level), and weather conditions.
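The start-permission decision above amounts to checking that scouting can stay ahead of the user. A minimal sketch, assuming constant speeds and distances in kilometers (the real decision also weighs route profile and weather):

```python
def enough_route_covered(scouted_km, total_km, user_kmh, scout_kmh):
    """Start is permitted if the vehicle can finish scouting the remaining
    route before the user reaches the already-scouted frontier."""
    if scouted_km >= total_km:
        return True  # entire route already scouted
    time_user_to_frontier = scouted_km / user_kmh          # hours
    time_scout_remaining = (total_km - scouted_km) / scout_kmh
    return time_scout_remaining <= time_user_to_frontier
```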
  • the service updates a map of the route on the user's device as more route sections are scouted 1454.
  • the map contains information about the planned and confirmed sections of the exercise course, including static information such as road inclination, calorie burning rate, as well as timestamped, dynamic information, such as roadside lighting levels and any observations along the route (e.g., "wet cardboard detected 6 minutes ago").
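Timestamped dynamic observations such as "wet cardboard detected 6 minutes ago" could be rendered with a small helper like this (an illustrative formatting function, not part of the protocol):

```python
import time

def describe_observation(event, observed_at, now=None):
    """Format a timestamped dynamic observation for the route map,
    e.g. 'wet cardboard detected 6 minutes ago'."""
    now = time.time() if now is None else now
    minutes = int((now - observed_at) // 60)
    return f"{event} detected {minutes} minutes ago"
```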
  • FIG. 15 continues one embodiment 1500 of the process shown in FIG. 14.
  • the user starts traversing the route 1508.
  • the user's device 1502 may send a Start Indication 1510 to the route planning service 1504.
  • the user may, for example, walk, run, cycle, skateboard, roller blade, or hoverboard along the given route.
• the vehicle keeps scouting for new sections and alternatives, and the route planning service 1504 may receive a Section Scouting Report 1516 from a route planning application 1506. New or updated information may be received or determined by the route planning service 1504 and sent to a user's device 1502 in a Route Update 1512.
  • Location Updates 1514 are sent from a user's device 1502 to the route planning service 1504.
  • the route planning service 1504 may update a map and route information 1518 and send a Route Update 1520 to the user's device 1502.
  • exceptions may occur or be detected and cause changes to communication between the vehicle, user and RPS 1504. Exceptions may include the following conditions, among others: the user changes the route, the user wants to meet the car at some point along the route, or the user wants to know the fastest route to some location.
  • the service may inform the user of the route deviation and ask if he wants a new route scouted and/or if he wants the car to follow close by if the route is not entirely known. If the user wants to meet the car at some point along the route, the meeting point may be pre-planned or ad-hoc, and the meeting point may be moving (such as where the car meets a moving user). The service may also automatically command the car to meet at a point along a new route, such as if the user changes route. If the user wants to know the fastest route to some location, then the car may precede the user by a short distance, be available for fast pickup if needed, and report any observations directly to the user's device. When the vehicle precedes the user in close proximity, the vehicle may communicate directly with the user's device.
  • An exemplary method for how to update a route dynamically starts with the user providing his or her specific parameters (such as calorie burn) for a route to the route planning service (which may also use historic data, known routes, and weather forecast, among other things), and the user authorizing the use of the vehicle for scouting.
  • the route planning service queries the vehicle's sensor capabilities to see if it satisfies the route safety reporting requirements for the specific section (which is reported using GPS coordinates).
  • the vehicle sends a scouting report for the specific route segment (which includes the presence of sidewalks and potholes, the lighting levels, the inclination, and the presence of parked cars blocking the road).
  • the user receives several route choices for the desired starting and ending location.
  • the user picks a choice for the route segment and responds to the route planning service.
  • the service updates the route map instructions.
  • the application sends to the vehicle a section scouting request (which is communicated based on the user's coordinates).
  • the route planning service sends the user updated vehicle sensor and scouting data for the confirmed and planned route sections.
  • the application may access extra scouting reports based on the user's location, speed, and direction. The user may also want to meet the vehicle at particular location.
  • the user sends the meeting request to the RPS, and it locates the appropriate vehicle.
  • the vehicle responds with its location and estimated time of arrival (ETA).
  • the user asks for the fastest route to a destination (such as to home or a hotel).
  • RPS updates the route and calculates the best meeting spot to pick up the vehicle.
  • RPS passes to the vehicle the meeting location and the calculated route.
  • the application constantly updates the location of the vehicle and the user as they progress toward the meeting location.
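The best-meeting-spot calculation above could, under simple assumptions, pick the candidate waypoint where the user's and vehicle's estimated arrival times differ the least. The function below is a sketch with an injected distance function, not the described implementation.

```python
def best_meeting_spot(waypoints, user_pos, vehicle_pos, user_kmh, vehicle_kmh, dist):
    """Pick the waypoint where the user's and vehicle's ETAs are closest.

    `dist(a, b)` is any distance function over positions (an assumption);
    speeds are in km/h and distances in km, so ETAs come out in hours.
    """
    def eta_h(pos, speed, wp):
        return dist(pos, wp) / speed

    return min(
        waypoints,
        key=lambda wp: abs(eta_h(user_pos, user_kmh, wp)
                           - eta_h(vehicle_pos, vehicle_kmh, wp)),
    )
```

With 1-D positions for illustration, a slow user starting at 0 and a fast vehicle at 10 meet best at the waypoint nearest the user.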
  • the messaging protocol of FIGs. 14 and 15 shows an example of how to update a route dynamically.
  • the protocol comprises the following messages and data fields:
  • a user may send a Route Start Indication 1510 if beginning to traverse a route. It may include a location field that holds the user's current location.
  • a Route Planning Request may include information about the dynamic parameters for the desired route. This information may be used by the RPS in addition to static user profile parameters.
• the dynamic parameters may include, for example, a time parameter, a location parameter, a movement type parameter, a duration parameter, and a route parameter, among others.
  • the time parameter may indicate a specific time for a planned departure along a route, or it may indicate, for example, that departure should take place as soon as possible.
• the location parameter may indicate the user's current location.
  • the movement type parameter may indicate the user's desired means of traversing the route. For example, the movement type parameter may indicate that the user wishes to run, jog, walk, cycle, skate, or travel by car, van, or truck, among other alternatives.
  • Route criteria include specific requirements for the route. Examples of route criteria comprise the lighting level (e.g., "entirely lit", or “single light-post outages allowed"), route overlap, usage activity, and road criteria.
  • Route overlap includes permission for a route to overlap on itself (e.g., expressed as a maximum allowed percentage of overlap with 0% for no overlap).
  • Usage activity includes preference for routes with or without other people, and their activities (such as, "50% of roads along the route must have cyclist activity", “prefer routes with high jogging activity”, or “prefer routes with low pedestrian activity”).
• Road criteria include both cycling and walking/running criteria for a road, such as a road's surface material and its condition and the presence of sidewalks (e.g., "100% of route must have sidewalks").
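Route criteria like lighting level, overlap, and sidewalk coverage could be checked against a candidate route as sketched below; the metric names (`lit_fraction`, `overlap_pct`, `sidewalk_fraction`) are assumptions standing in for whatever the service actually measures.

```python
def route_meets_criteria(route, criteria):
    """Check a candidate route (dict of measured metrics) against route
    criteria like those described above. All field names are illustrative."""
    # "entirely lit" requires the full route to be lit
    if criteria.get("entirely_lit") and route["lit_fraction"] < 1.0:
        return False
    # route overlap expressed as a maximum allowed percentage (0 = no overlap)
    if route["overlap_pct"] > criteria.get("max_overlap_pct", 100.0):
        return False
    # e.g. "100% of route must have sidewalks" -> min_sidewalk_fraction = 1.0
    if route["sidewalk_fraction"] < criteria.get("min_sidewalk_fraction", 0.0):
        return False
    return True
```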
  • the RPS may send a Vehicle Use Permission Request to request permission to use the user-specified vehicle for route scouting.
  • the RPS may request the use of the vehicle to chart a larger area of the region than immediately needed for route selection.
  • the vehicle use permission request may include a vehicle ID as an identifier for the vehicle to use.
  • the request may include a permission type parameter.
  • the permission type (e.g., "for route scouting use only" or "allow extended range scouting for service improvement purposes") may also contain an indication of how much extended scouting is allowed (e.g., 10% of route length).
  • a Vehicle Use Permission Response is a response to a Vehicle Use Permission Request.
  • the response may include a vehicle identifier.
  • the response may further include a permission grant parameter.
  • the permission grant may contain a key parameter and a permission type parameter (e.g., "allow extended range scouting for service improvement purposes"). In case the car rejects the use, the permission type may also be "not allowed” with an optional explanation, such as "low battery.”
  • a Sensor Capability Request may be used as a request containing the required sensor capabilities field that defines a list of capabilities whose support is requested, including names (e.g., "sidewalk detection” or “road slipperiness detection”) for the required capabilities.
  • a Sensor Capability Response may be used as a response providing an indication of available support for requested capabilities.
  • the response may include the capability name and the support level (e.g., "not supported”, “partially supported”, or “fully supported”).
  • a Section Scouting Request may be used as a request to provide sensor reports for a given road section.
  • a coordinates parameter may be provided to identify a scoutable section (e.g., as start and endpoint definitions as street and intersection names or GPS coordinates or, alternatively, as a starting area and ending area where the start and endpoints must be, and between which section candidates are scouted).
  • a report contents field may be provided. The report contents field is a list of sensor events to report, such as "road slipperiness events", “road inclination”, “lighting level", “runner activity”.
  • the scouting type lists instructions for the car for route scanning (e.g., "shortest only” or “exhaustive”).
  • a Section Scouting Report message may be provided as a response to a Section Scouting Request, containing events for each sensor event type requested.
  • the section scouting report may include a section coordinates parameter.
  • the section coordinates may provide the start and endpoint and any waypoints along the scouted route, as street/intersection names or GPS coordinates.
  • the report may also contain event list parameters.
  • the event list contains event types (e.g., "lighting level").
  • the event report lists noteworthy conditions, such as "dead light post". Event coordinates identify a particular location, such as via GPS coordinates.
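Putting the Section Scouting Report fields together, a payload might look like the following; the exact wire format is not specified in the text, so this structure is illustrative.

```python
# Illustrative Section Scouting Report payload; field names mirror the
# description above, but the concrete encoding is an assumption.
section_scouting_report = {
    "section_coordinates": {
        "start": (60.1699, 24.9384),        # GPS start point
        "end": (60.1719, 24.9414),          # GPS end point
        "waypoints": [(60.1708, 24.9399)],  # waypoints along the scouted route
    },
    "event_list": [
        {"event_type": "lighting level",
         "event_report": "dead light post",
         "event_coordinates": (60.1708, 24.9399)},
        {"event_type": "road slipperiness events",
         "event_report": "icy patch",
         "event_coordinates": (60.1712, 24.9405)},
    ],
}
```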
  • the route planning service may send a Select Section Request to communicate several options to a user, based on scouted routes.
  • the message contains a list of candidate sections.
  • the candidate ID field defines identifiers for the candidate section in the list.
  • the route is a representation of the route of the section as, e.g., GPS waypoints.
  • the distance is the length of the section.
  • the duration is an estimate of the duration of the section for the movement type.
  • the event list contains the same field information as it does in the Section Scouting Report message.
  • An additional information field may be provided to include information about the section added by the service based on, e.g., user profile, such as calorie burn.
  • a Select Section Response may be sent to communicate a user's selected section.
  • the message may contain the candidate ID, which may be blank if the user does not accept any of the candidates (in which case, the service may scout for further alternatives).
  • a Start Permission Indication may be sent to a user's device to communicate permission to start an exercise.
  • the exercise start permission may be given to the user when route planning is estimated to be finished before the user reaches the end of the route. No specific parameters are required.
  • the RPS may send a Route Update Indication to a route planning application to communicate changes to a route, e.g., when the user embarks on the route, as new sections of the route have been confirmed, or if the user requests a route update.
  • the route description field is a description of the route as waypoints, in two parts.
  • the confirmed portion is the section of the route that has already been scouted, while the planned section is the unconfirmed part of the route.
  • the route events field is a list of events and warnings for the route, including the event type (e.g., "dead light post", or "cardio workout section") and event coordinates (e.g., as GPS coordinates).
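The two-part route description (confirmed vs. planned) and the route events field might be modeled as follows; names are illustrative.

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class RouteUpdateIndication:
    """Illustrative Route Update Indication: a route in two parts plus events."""
    confirmed: List[Tuple[float, float]]   # already-scouted waypoints
    planned: List[Tuple[float, float]]     # unconfirmed remainder of the route
    route_events: List[dict] = field(default_factory=list)  # type + coordinates

    def full_route(self):
        """Waypoints in traversal order: confirmed portion first, then planned."""
        return self.confirmed + self.planned
```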
  • FIGs. 16, 17, and 18 display some examples of exceptional events that may occur when traversing a route, such as a re-planning of the route due to a user deviation from the plan 1608, a user-initiated request to meet a car at a fixed location, and a request for accompanied route at a specific location.
  • the protocol includes the following messages and data fields:
  • the user's device may send a Location Update 1812 to update a user's location (e.g., as GPS coordinates), speed, and heading.
  • the user may send a Route Update Request 1612, 1710, 1732 to request a new route while traversing the route.
  • the message includes a location (the user's current location), a heading (the direction the user wishes to go), and optional waypoints (specify user waypoints along the route).
  • the user may send a Vehicle Meeting Request 1636, 1638, 1716, 1720, 1738, 1742 to request that the vehicle meet him or her along the route.
  • the meet type field is set to "parked" for meeting at a predefined location, "moving" for a request for the car to meet the moving user, or “precede” if the car is requested to move to and hold a position a short distance ahead of the user.
  • the meet location shows candidate coordinates for a parked meet point, or the user's current location and a planned route for a moving or preceding to or from the meet point.
  • a Vehicle Meeting Response 1642, 1644, 1724, 1726, 1746, 1748 may be sent in response to a Vehicle Meeting Request 1636, 1638, 1716, 1720, 1738, 1742.
  • the vehicle location is the current location of the vehicle.
  • the meet point, for a parked meet point contains updated coordinates determined by the vehicle and/or the RPS, depending on, e.g., available parking locations near the requested meet point.
  • the meet point contains the coordinates of the estimated point along the user's route where the car will meet the user.
  • a Vehicle Location Broadcast 1810 may be sent to broadcast a vehicle's location.
  • the vehicle ID field identifies the vehicle.
  • the message also broadcasts the vehicle's location, speed, and heading.
  • FIG. 16 shows an embodiment of messaging 1600 that may occur if a user deviates from a route 1608.
  • a user's device 1602 receives an indication from a user of an agreement to a new route planning 1610.
• a Route Update Request 1612 is sent to a route planning service 1604, which updates a route 1614 and responds with a Route Update 1616.
  • the RPS 1604 sends a Section Scouting Request 1618 to a route planning application 1606, which drives the vehicle (or car) along a section and generates a report of the requested information based on sensor data 1620.
  • the report may be sent to RPS in a Section Scouting Report 1622.
  • RPS 1604 selects section candidates 1624, and sends a Section Selection Request 1626 to the user's device 1602.
  • the user's device sends back a Section Selection Response 1628, and the RPS 1604 updates a route 1630.
  • the updated route may be sent 1632 to a user's device 1602.
• In another scenario, a user may deviate from a route by requesting to meet the vehicle 1634; upon receiving such an indication from the user, the user's device may send a Meeting Request 1636 to the RPS 1604.
  • the RPS 1604 sends a Meeting Request 1638 to a route planning application 1606, which determines the nearest location allowed and an estimated time of arrival (ETA) 1640 and sends back a Meeting Response 1642.
  • the RPS 1604 sends a Meeting Response 1644 to the user's device 1602, which displays to the user the meeting location and ETA 1646.
  • FIG. 17 shows one embodiment of messaging 1700 if a user requests an accompanied fast route to a location 1708.
  • a user's device 1702 sends a Route Update Request 1710 to an RPS 1704, which updates a route 1712 and sends back a Route Update Response 1714.
  • a user's device 1702 sends a Vehicle Meeting Request 1716 to an RPS 1704, which estimates a meeting point 1718.
• the RPS 1704 may send a Vehicle Meeting Request 1720 to a route planning application.
  • RPS 1704 sends a Vehicle Meeting Response 1726 to the user's device 1702, which displays location and ETA for the meeting location 1728.
  • FIG. 18 shows one embodiment 1800 of a vehicle near a meeting point 1808.
  • a route planning application 1806 sends a Vehicle Location Broadcast 1810 to a user's device 1802.
  • the user's device may send a Location Update 1812 to the route planning application 1806, which may be used to determine how far apart the vehicle is from the user.
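Determining how far apart the vehicle and the user are from their reported GPS coordinates is typically done with the haversine great-circle formula; a standard sketch:

```python
from math import asin, cos, radians, sin, sqrt

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two (lat, lon) points,
    e.g. a Vehicle Location Broadcast and a user's Location Update."""
    R = 6371000.0  # mean Earth radius in meters
    dlat = radians(lat2 - lat1)
    dlon = radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * R * asin(sqrt(a))
```

The result could drive the "stays a little ahead of the user" behavior, e.g. by keeping the distance near a target such as 50 yards.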
  • a car stays a little ahead of the user, detecting route conditions 1814.
  • a user's device reports location periodically 1816.
  • the route planning application 1806 sends a route update 1818 to a user's device 1802, which alerts a user on route events 1820.
  • the route selection parameters may include, for example, the following parameters.
  • parameters may include: the lighting level, the presence of broken lights, the presence of dark alleys, weather (e.g. wind: strength, direction, and gusts; sunlight: cloudiness, effect on the user; rain; snow) including from both safety and convenience viewpoints, allowance of route overlapping, the presence of slippery/icy patches, puddles of water, or flooded areas, a road's profile (including its inclination and surface material).
  • the following route selection parameters may be more pertinent (but not limited to just pedestrians): street side structures, the number/type of buildings, the presence of roofed areas, the number of trees, which includes their effect on weather (such as shade and protection from wind and rain), sidewalk parameters (such as their presence, condition, and need to switch from side to side), the number and activity of pedestrians observed along the route (jogging, walking with bags in their hands, standing, or waiting at traffic lights), traffic light cycles (short vs.
  • the following route selection parameters may be more pertinent (but not limited to just cyclists): the number of cyclists and their activity level (cycling fast, slow, alone or with someone), road conditions (potholes, road surface roughness, presence of loose gravel, sloped/non-sloped curbs at intersections), roadside blocks (e.g., legally or illegally parked vehicles, vehicle size, and blocked view in road crossings), and the same exercise-related parameters pertinent to pedestrian routes.
  • the following route selection parameters may be more pertinent (but not limited to just motorists): presence and type of parked vehicles along the route (e.g., lone vans), road blocks (such as natural disaster conditions, parked vehicles, demonstrations, rioting, and traffic jams), road conditions (broken sections, such as due to earthquake), gatherings of people, stationary people at corners, and the length of visibility (long open roads vs. narrow winding roads with lots of parked vehicles or the presence of dark alleys).
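One plausible way to combine route selection parameters like these is a weighted score over normalized metrics; the weights, metric names, and candidate routes below are purely illustrative.

```python
def score_route(metrics, weights):
    """Weighted sum over normalized route metrics (each 0..1, higher = better).
    Metrics missing from a candidate contribute nothing to its score."""
    return sum(weight * metrics.get(name, 0.0) for name, weight in weights.items())

# Hypothetical candidates scored on lighting, sidewalk coverage, and visibility.
candidates = {
    "canal road": {"lighting": 0.9, "sidewalks": 1.0, "visibility": 0.7},
    "back alley": {"lighting": 0.3, "sidewalks": 0.5, "visibility": 0.2},
}
weights = {"lighting": 0.5, "sidewalks": 0.3, "visibility": 0.2}
best = max(candidates, key=lambda name: score_route(candidates[name], weights))
```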
  • Vehicle sensors may report several types of vehicle data content, which may be used for route planning support (RPS) calculations.
• static information includes road type (such as asphalt, gravel, or dirt), road surface conditions (including suitability for various exercise types, such as uneven gravel road or rough asphalt), road width, the presence of sidewalks and their width, the need to change sides of the road often and places to change sides (due to, for example, illumination, security, or a need to maintain a certain speed), the presence of buildings (including the type of buildings, their height, and their closeness to the road), other service data (crowdsourced and open data), which may be fetched from, for example, base stations, pico- and femto-cells, and Facebook with respect to car location to aid planning, and the presence of resting, snack, and drink areas, which may be used with or without a car (such as drink breaks in a car to avoid certain areas or environments).
  • the use of resting, snack, and/or drink areas might require part of the route to be changed to accommodate the use of such areas
  • Dynamic sensor information includes, for example: road conditions (such as the presence of pot holes, road cracks, or slippery conditions, which generates a recommendation for special shoes, bike tires, or types of exercise); parked vehicles and other road blocks, including the types of parked vehicles (e.g., car, van, or truck); illumination (dark/bright spots, overall illumination, and a recommendation for clothing color/lights); micro weather (e.g., presence and level of wind, heat, or sun along certain parts of the route), including head/tail wind, direct sunshine, or heavy rain; time combined with other information; traffic (vehicle) information: intensity (high, moderate, or low level), presence of traffic jams, and number of vehicles in front of buildings; traffic light operation (e.g., via Vehicle-to-Infrastructure (V2I)): cycle lengths, whether it is operating properly, current state; the effect of buildings on local weather conditions (e.g., the presence of a tunnel, including wind direction, wind intensity, and comparison with overall wind), including the ability
  • Pedestrian/cyclist information might necessitate route adjustments.
• Music recommendations are based on the current environment features (not just speed and/or training objectives).
• Weather data, for example, is used if the user desires to avoid certain weather conditions or to use as many roofed areas as possible.
• a cloud service, such as the RPS, gives the car an area to scout for routes between a starting area and an ending area.
  • the vehicle reports all possible routes or just the best route from the starting to ending point.
• Alternatively, a vehicle may perform the scouting and route selection independently of external services by exchanging all the necessary information with the phone application. With the vehicle user's permission, the collected data may be used to update larger-area map data to maintain accurate information on routes. This larger-area data may be used by other users for route planning or other services.
  • the user might drive through the route in a vehicle during or after route scouting.
  • the vehicle's human-machine interface shows information considered in route selection and provide means for interacting with route selection, such as the ability to include or reject a particular route.
  • the system uses this approach when it lacks knowledge of road conditions and traffic flow (such as after an earthquake or a hurricane); vehicles map passable routes for road traffic and include an estimate of the treacherousness level.
  • the system uses this approach for scouting in front of a motorcade by looking for suspicious roadside objects and potential locations where the speed of the motorcade might require an adjustment.
  • the application uses Emma's workout history and additional preferences to define the length of the route for the trip at Emma's cycling speed, and the application makes a preliminary list of candidates for the route.
  • the application instructs the vehicle to scout the candidate routes and to present new routes in case it finds better ones.
  • the vehicle also records accurate and updated information of the routes, such as lighting levels, the presence and condition of sidewalks, pedestrian activity along the routes, road inclination, and potentially slippery spots along the route.
  • the vehicle looks for road sections that fit Emma's fitness profile for a cardio workout.
  • the vehicle reports its findings to the application.
  • the vehicle collects enough route data to allow Emma to start cycling.
  • the confirmed part of the route is shown in green on a map on her phone, and the yet unconfirmed portion is shown in gray.
  • Emma's wrist device helps her make the correct turns along the route.
• When Emma reaches a street section with a fairly steep incline, her wrist device indicates this portion is the cardio workout section.
  • the watch instructs her to cycle the street up and down three times.
  • Emma decides to deviate from the planned route by going to a road by a canal. She instructs either her phone or her watch to have her vehicle meet her at the other end of the road.
  • Emma arrives at the car and takes a water break in the car.
  • Emma decides to return to the hotel using the shortest possible route. She hits the "take me home” button on her wrist device. The application indicates to her the shortest route, and Emma calls for the car to precede her along the route. Emma's wrist device instructs her to follow the car, which stays 50 yards in front of Emma. She cycles to her hotel, feeling safe that in case of any trouble, her car is one button press away from her.
  • Exemplary embodiments described herein enhance safety and personal security by integrating smart spaces and connected vehicles together with security tracking of the user before and while walking. Exemplary embodiments also extend security monitoring coverage by leveraging the availability of connected vehicles, enabling connected vehicles to cover surveillance of blind spots. Exemplary embodiments may perform safety tracking without equipment beyond the user's mobile device and connected vehicles. The system takes into account the current safety situation (e.g., the location of security personnel or police in the area) and dynamic environmental variations (e.g., illumination of the walking path, such as broken street lamps). The system provides user interfaces before, during, and after a user traverses a route. As described herein, a mobile device or primary terminal is a device associated with a user.
  • FIG. 19 is a system diagram of the architecture 1900 of the area management and access code exchange process.
  • geographic locations may be managed by an area manager that controls autonomous and controlled vehicles.
  • a managed area may be an area with a large number of pedestrians, for example, that may use locally maintained rules for vehicle control and use.
  • FIG. 19 shows an example set of messaging interfaces for an AV 1901 about to enter a managed area.
  • an AV 1901 sends a destination route request 1908 to a routing application (which may be local to the AV 1901 or based in the cloud 1902).
  • Managed areas along a vehicle route may be identified by a routing application sending a route request 1909 to a map database 1903 and receiving managed areas route information 1910.
  • Access requests (or announcements) 1912 may be sent to area managers 1904 responsible for an identified managed area.
  • Area managers 1904 may be distributed over a network. Because area rules 1919 may be maintained locally via area surveillance 1906, the update process may run in real time, and restrictions may also be set in real time (immediately).
  • an area manager 1904 may respond to an access request 1912 by sending the requested access code 1913 to the AV 1901 via the cloud 1902.
  • the access code may be valid only within a certain timespan to prevent too-early or too-late arrival. Therefore, an access request 1912 may include an estimate of the arrival time, which may be calculated from a route plan in some embodiments. For example, traffic congestion may delay the arrival of an AV, causing its access code to expire. In this case, the AV may send a renewal request for the access code.
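A minimal sketch (in Python, with hypothetical names; the embodiments do not prescribe a data format) of how an area manager might enforce such a validity window:

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class AccessCode:
    """Hypothetical structure for an access code issued by an area manager."""
    code: str
    valid_from: datetime   # earliest accepted arrival time
    valid_until: datetime  # latest accepted arrival time

def is_code_valid(access: AccessCode, arrival: datetime) -> bool:
    """Accept the code only inside its validity window."""
    return access.valid_from <= arrival <= access.valid_until

# Example: code issued for an estimated arrival, with a 10-minute tolerance
eta = datetime(2017, 4, 5, 12, 0)
code = AccessCode("A-0042", eta - timedelta(minutes=10), eta + timedelta(minutes=10))
print(is_code_valid(code, eta))                          # on-time arrival -> True
print(is_code_valid(code, eta + timedelta(minutes=30)))  # delayed -> False
```

A delayed AV whose code fails this check would then issue a renewal request rather than reusing the stale code.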
  • an AV 1901 may send an entrance announcement message 1914 with an access code to an entrance 1907.
  • the AV 1901 may also send 1914, 1915 an authentication code to an area manager 1904 via an entrance 1907 and receive 1916, 1917 route instructions.
  • An area manager 1904 may perform a code check identification 1908 to verify an authentication code.
  • This code exchange secures communication between an area manager 1904 and an AV 1901.
  • an AV 1901 receives instructions and commands only from an area manager 1904. If access is delayed, an AV may receive waiting instructions 1917, 1918, for example to drive to a certain location for waiting. Otherwise, an AV may block streets or entrances from other traffic.
  • AV movement inside a restricted area may be controlled using a local map and route information that indicates allowed routes and parking spots inside the restricted area.
  • an area manager 1904 may control the order in which vehicles leave the area.
  • Area management may have an ability to clear a managed area. For example, this ability may be used by emergency vehicles in case of fire.
  • restricted areas may use a management system which controls AV traffic.
  • An AV may send a message to a management system before entering an area. Some areas may be managed by an area operator. This management task may be outsourced to dedicated service providers, for example.
  • An AV 1901 may communicate with other vehicles (or cars) in an area and may receive from an area manager 1904 which roads are blocked, where to enter, or where to park.
  • a management system may send mapping data to an AV to support route and driving trajectory planning. Some large public events (such as concerts or fairs) may set up temporary parking areas in fields or open areas without any markings.
  • An automated parking manager may send guidance messages to AVs to communicate parking spots and control AVs inside a parking area.
  • Area management may rely on sensors (such as surveillance cameras, LiDAR systems, and other types of perception sensors), which may detect the number of pedestrians or other uncontrolled traffic in an area.
  • Temporary areas may use supervising vehicles, for example Unmanned Aerial Vehicles (UAVs) or drone units, to supervise an area.
  • An area manager 1904 may store records 1905 of vehicles inside an area.
  • An AV may change a planning module's mode to enable vehicles (or cars) to drive in automated mode in pedestrian zones.
  • a behavior planning module may lower an AV's driving speed and reduce safety margins.
  • an AV 1901 may turn a corner at slower speeds with reduced distance margins to other vehicles.
  • an AV 1901 may receive driving instructions (or behavior rules) and follow those instructions in accordance with messages received from an area manager 1904 to achieve goals set by the area manager 1904 and a behavior planning module.
  • FIG. 20 is a plan view schematic of an environment 2000 showing an AV's vehicle routes.
  • both the shortest route planned by an AV 2012 and a new route 2014 may contain waypoints 2016, 2018, which are intermediate points along a route.
  • FIG. 21 is one embodiment 2100 of a message sequencing diagram for an area access process.
  • An AV 2102 may send a route request 2112 to a routing application (local or cloud-based) with a destination and an AV's current location.
  • a route description is generated.
  • information on which areas are managed and which area manager 2106 is responsible for management may be included in a route database. Areas are identified with an identification number or other appropriate code. With an identification code, an area manager may be determined using a cloud-based database; several web-based technologies are available for this purpose. The estimated arrival time at a managed area may be determined from a route description.
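The arrival-time estimate can be derived directly from a route description, for example by accumulating per-segment travel times. The sketch below assumes the route is available as (length, speed) pairs; the function name and representation are illustrative, not from the specification:

```python
def estimated_arrival(departure_time_s: float, segments) -> float:
    """Estimate arrival time by summing travel time over route segments.

    segments: iterable of (length_m, speed_m_per_s) pairs along the route.
    """
    t = departure_time_s
    for length_m, speed_m_per_s in segments:
        t += length_m / speed_m_per_s
    return t

# A 1 km segment at 10 m/s plus a 500 m segment at 5 m/s adds 200 s of travel
print(estimated_arrival(0.0, [(1000.0, 10.0), (500.0, 5.0)]))  # -> 200.0
```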
  • a route planner 2104 may send 2114 the route and associated area managers to an AV 2102.
  • a vehicle identification code together with an estimated arrival time may be sent 2116 to an area manager 2106 to obtain 2120 an access code.
  • An access code is not a permission but is used to identify an arriving vehicle 2102.
  • An area manager adds 2118 an arriving vehicle 2102 to a database. Information on arriving vehicles is used to estimate the number of vehicles in the area at a certain time in the future and to plan management actions.
  • When an AV 2102 arrives at an area, the AV sends an access request 2116 to an area manager 2106.
  • This access request 2116 may include a vehicle's own access code and a previously received area access code. Codes may be used to secure communication between an area manager 2106 and an AV 2102. Inside an area, an AV may receive commands and instructions only from an authenticated area manager 2106.
  • An area manager 2106 monitors an area with surveillance sensors 2110, such as surveillance cameras, LiDAR systems, or other types of perception sensors. If such sensors 2110 are not available, an area manager 2106 may receive information from an AV's sensors when an AV is inside an area. For example, an AV may detect the presence of pedestrians from cell phone activity or V2P (vehicle-to-pedestrian) communication. Static area maps may be loaded 2122 with usable static routes for routing inside an area. Dynamic objects may be added to a map 2126 for final routing. Using sensors 2110, pedestrians and other traffic may be detected 2124 by area surveillance 2108 in real-time and added to a map 2126.
  • area management may be performed automatically and update a map 2130.
  • the number of AVs may be monitored, and movements of AVs may be controlled to maintain safety.
  • AVs may be controlled to ensure activities inside an area are not disturbed.
  • a virtual map may be created, and surveillance sensors 2110 may act as mobile units (like quadcopters or similar devices) to avoid fixed installations.
  • an area manager 2106 sends instructions 2138 to the AV 2102.
  • Instructions 2138 may include status of the entrance. If immediate access is not possible, an AV may receive an instruction to wait to be guided to a specific waiting area. In some cases, for example, access may be declined if an area is already closed or full.
  • an area manager 2106 creates 2136 and sends 2138 drive instructions to the AV 2102. Examples may be seen in FIGs. 21 (market case) and 23 (parking). The instruction set may be planned using dynamic map data collected from area surveillance 2108 together with database data. The route may be described as segments, each containing waypoints, a speed limit, a driving priority, and actions.
  • Actions may be stopping points, turns, and other information not described by waypoints.
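The segment-based route description above can be represented as a simple data structure. The field names below are assumptions based on the listed contents (waypoints, speed limit, priority, actions), not a prescribed format:

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class ActionPoint:
    """An action not captured by waypoints alone, e.g. a stop point or a turn."""
    location: Tuple[float, float]  # (latitude, longitude)
    action: str                    # e.g. "stop", "turn_left", "pedestrian_crossing"

@dataclass
class RouteSegment:
    segment_id: int
    waypoints: List[Tuple[float, float]]  # ordered (latitude, longitude) pairs
    speed_limit: float                    # allowed speed on this segment
    priority: int                         # driving priority vs. other segments
    actions: List[ActionPoint] = field(default_factory=list)

# One segment with a stop action at its final waypoint
seg = RouteSegment(1, [(60.2010, 24.9010), (60.2025, 24.9032)], 3.0, 0,
                   [ActionPoint((60.2025, 24.9032), "stop")])
print(seg.actions[0].action)  # -> stop
```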
  • a destination may be either a location where an AV wants to go (an AV's goal point) or a spot that an area manager 2106 reserves for the AV, such as a preferred parking spot.
  • An AV 2102 may send location (or position) updates 2140 and status information (such as moving or parked) to an area manager 2106.
  • An area manager 2106 may update a database 2142 accordingly.
  • An area manager 2106 may also update a map 2144 and update instruction 2146 based on a location and action update 2140.
  • Because GNSS may be inaccurate and have availability limitations, an area surveillance system may send position information to an AV 2102.
  • Location data may be sent within an instruction update 2148.
  • A database entry contains an identification number, priority, current location, status, and access time.
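One plausible encoding of such a database entry, together with the add/remove bookkeeping performed at steps 2118 and 2156, is sketched below. The field names follow the text; the registry class and its methods are hypothetical:

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Dict, Tuple

@dataclass
class VehicleRecord:
    vehicle_id: str                # identification number
    priority: int
    location: Tuple[float, float]  # current (latitude, longitude)
    status: str                    # "moving", "avoiding", "still", or "parked"
    access_time: datetime

class AreaDatabase:
    """Minimal registry of vehicles currently known to an area manager."""
    def __init__(self) -> None:
        self.records: Dict[str, VehicleRecord] = {}

    def add(self, record: VehicleRecord) -> None:   # e.g. step 2118
        self.records[record.vehicle_id] = record

    def remove(self, vehicle_id: str) -> None:      # e.g. step 2156
        self.records.pop(vehicle_id, None)

db = AreaDatabase()
db.add(VehicleRecord("AV-2102", 1, (60.2, 24.9), "moving",
                     datetime(2017, 4, 5, 12, 0)))
print(len(db.records))  # -> 1
db.remove("AV-2102")
print(len(db.records))  # -> 0
```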
  • An area manager 2106 may monitor an AV 2102 to determine if given instructions are being followed. Because an area changes dynamically, an area manager 2106 may send update instructions 2148.
  • An AV 2102 may also send a list of detected objects and measured clearances to an area manager 2106 to improve dynamic map content.
  • an AV may send a departure request 2150 to an area manager 2106.
  • An area manager 2106 may send an acknowledgement to a request and remove 2156 an AV (or vehicle) 2102 from a database. If departure is managed or planned 2152 (which may occur to avoid congestion), an area manager 2106 may send departure instructions (or an instruction update) 2154 to an AV 2102 (which may be an instruction to wait).
  • FIG. 22 is one embodiment 2200 of a message sequencing diagram for an area access process for operating without a fixed surveillance system.
  • An AV 2202 may send a location and destination route request 2212 to a route planner 2204.
  • a route planner 2204 may send a route and associated area managers response 2214 to an AV 2202.
  • An access request 2216 may be sent to an area manager 2206.
  • An area manager 2206 may add 2218 the AV 2202 to a database and respond with an access code message 2220.
  • FIG. 22 is different from FIG. 21 in that an area manager 2206 may receive 2222 detected objects from AV sensors 2208. An area manager 2206 also may receive 2224 detected objects from pedestrian sensors 2210. An area manager 2206 may update a map 2226 based on detected objects.
  • An AV 2202 may send to an area manager 2206 an entrance request with an access code and vehicle access code 2228. An area manager 2206 may verify an access code 2230, create instructions 2232, and send instructions 2234 to an AV 2202. An AV 2202 may send a location and action update 2236 to an area manager 2206. An area manager 2206 may update a database 2238, update a map 2240, update instructions 2242, and send back updated instructions 2244.
  • An AV 2202 may send a departure request 2246 to an area manager 2206 upon preparing to leave an area.
  • An area manager 2206 may process a departure request (departure planning) 2248 and send departure instructions (or an instruction update) 2250 to an AV 2202.
  • An area manager 2206 may also remove an AV 2202 from a database 2252.
  • One embodiment of a route request 2112, 2212 may include fields for a current location (latitude and longitude coordinates) and a destination (address, latitude coordinate, and longitude coordinate).
  • One embodiment of a route response 2114, 2214 may include fields for a dedicated route description based on vehicle control system preferences and a list of managed areas, which may include an area identification code and an estimated arrival time.
  • One embodiment of an access request 2116, 2216 may include fields for a vehicle ID, an estimated arrival time, a destination, and vehicle dimensions.
  • One embodiment of an access code response 2120, 2220 may include fields for access status (with values for allowed, delayed, and denied), an access code, an optional estimated delay, and an optional wait area.
  • an entrance request 2132, 2228 may include fields for an access code, a vehicle ID, and a vehicle access code.
  • vehicle instructions 2138, 2234 may include fields for access status (with values for allowed, delayed, and denied), an access code with optional estimated delay and optional wait area sub-fields, an entrance location (latitude and longitude coordinates), a list of road segments (with sub-fields for segment ID, waypoints (latitude and longitude coordinates), allowed speed, action points (stop points and pedestrian crossings), and priority (compared to other route segments)), and a parking spot (latitude coordinate, longitude coordinate, a heading, and a parking spot ID).
  • a location and action update 2140, 2236 may include fields for a vehicle ID, a location (latitude coordinate, a longitude coordinate, a heading, and a speed), vehicle status (with values for moving, avoiding, still, or parked), a list of detected objects, and detection of a clearance ahead.
  • One embodiment of an instruction update 2148, 2244 may include fields for a detected location (latitude coordinate, longitude coordinate, and heading), a list of road segments (with sub-fields for segment ID, waypoints (latitude and longitude coordinates), allowed speed, action points (stop points and pedestrian crossings), and priority (compared to other route segments)), and a parking spot (latitude coordinate, longitude coordinate, a heading, and a parking spot ID).
  • a detected location may be a relative location correction to prevent or to correct for a GNSS error.
  • a departure request 2150, 2246 may include fields for a vehicle ID and a departure request time.
  • One embodiment of a departure instruction 2154, 2250 may include fields for departure status (with values for allowed, delayed, and denied), an optional estimated departure time, a departure point location (latitude and longitude coordinates), and a list of route segments (with sub-fields for segment ID, waypoints (latitude and longitude coordinates), allowed speed, action points (stop points and pedestrian crossings), and priority (compared to other route segments)).
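The message formats above could be serialized, for example, as JSON. The snippet below illustrates a vehicle-instructions message using the fields listed in the text; the key names and the choice of JSON are assumptions, since the embodiments do not prescribe an encoding:

```python
import json

# Hypothetical JSON encoding of a vehicle-instructions message (2138, 2234)
instructions = {
    "access_status": "allowed",  # allowed | delayed | denied
    "entrance_location": {"lat": 60.2000, "lon": 24.9000},
    "road_segments": [{
        "segment_id": 1,
        "waypoints": [[60.2010, 24.9010], [60.2025, 24.9032]],
        "allowed_speed": 3.0,
        "action_points": [{"type": "stop_point", "lat": 60.2025, "lon": 24.9032}],
        "priority": 0,
    }],
    "parking_spot": {"lat": 60.2030, "lon": 24.9050, "heading": 90.0,
                     "spot_id": "P-17"},
}

message = json.dumps(instructions)   # wire format sent to the AV
decoded = json.loads(message)        # the AV's view of the message
print(decoded["road_segments"][0]["segment_id"])  # -> 1
```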
  • FIG. 23 is one embodiment 2300 of a plan view schematic of a temporary parking area with virtual parking spots and virtual lanes.
  • An AV receives instructions to avoid detected pedestrians 2301 and follow a virtual route 2305 described in instructions (which may be sent as segments 2308 with route waypoints 2313) towards a destination 2318.
  • Virtual routes 2305 may have priorities, such as routes with a priority 0 (2306) or a priority 1 (2307).
  • Virtual routes 2305 may have action points 2309, such as a turn left action point 2310, a stop and turn right action point 2311, or a turn left action point 2312.
  • FIG. 23 shows an example of an uneven column created by a manually-parked vehicle.
  • An original route 2314 is adjusted with waypoints that avoid the misaligned vehicle.
  • FIG. 23 also shows an example of when a pedestrian may be blocking a route.
  • An AV detects a narrow clearance 2315 and avoids such a route if a pedestrian remains.
  • Virtual parking spots have IDs 2317 and may be specified by latitude, longitude, and heading 2316.
  • Parking areas may have a passenger drop zone 2302 and a passenger pickup zone 2303. These zones 2302, 2303 may be near an entry point 2304. By using such zones, AVs in parking areas may be handled without humans. This methodology also may make drivers leaving or picking up cars more comfortable.
  • When discharging large areas, priority may be given to certain vehicles, for example buses, taxis, emergency vehicles, and maintenance vehicles.
  • automated vehicles may be moved or stopped to yield right of way to top priority vehicles.
  • Manually-driven vehicles (or cars) (M) may generate uneven rows.
  • AVs may be used to show manual drivers how to park (see the middle columns of FIG. 23). AVs may be parked in rows leaving parking space between them (upper 2x5 parking area) or used as corner vehicles (lower 2x3 parking area).
  • One exemplary embodiment of a method described herein comprises: receiving a request for an automated vehicle to travel to a location within a controlled access area; determining the automated vehicle is authorized to enter the controlled access area; determining a location of the automated vehicle; determining a plurality of vehicle movements of the automated vehicle to travel to the location within the controlled access area; communicating to the automated vehicle the plurality of vehicle movements to travel to the location within the controlled access area; and determining that the automated vehicle has traveled to the location within the controlled access area.
  • Another embodiment further comprises updating a map database based on objects detected by area surveillance devices and sensors. Another embodiment comprises determining a location of an automated vehicle by receiving GPS coordinates from the automated vehicle. Another embodiment further comprises controlling the departure of the automated vehicle (which may be controlled based on vehicle congestion information).
  • One exemplary embodiment of a method described herein comprises: receiving a request for an automated vehicle to park in a controlled access parking area; determining a starting location of the automated vehicle; determining a location within the controlled access parking area to park the automated vehicle; determining a plurality of vehicle movements to move the automated vehicle from the starting location to the determined parking location; communicating to the automated vehicle the plurality of vehicle movements to move the automated vehicle to the determined parking location; and determining that the automated vehicle has moved to the determined parking location.
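The parking method above can be sketched end to end. Everything here (the class name, the trivial first-free-spot policy, the two-step movement plan) is an illustrative assumption, not the claimed method:

```python
from typing import Dict, List, Tuple

Location = Tuple[float, float]  # (latitude, longitude)

class ParkingAreaManager:
    """Toy area manager for a controlled access parking area."""
    def __init__(self, spots: List[Location]) -> None:
        self.free_spots = list(spots)
        self.parked: Dict[str, Location] = {}

    def handle_parking_request(self, vehicle_id: str, start: Location):
        spot = self.free_spots.pop(0)  # determine a parking location
        # determine the vehicle movements from the starting location to the spot
        movements = [("drive", start, spot), ("park", spot)]
        return spot, movements         # communicated to the AV

    def confirm_parked(self, vehicle_id: str, spot: Location) -> None:
        self.parked[vehicle_id] = spot  # AV has moved to the determined spot

mgr = ParkingAreaManager([(60.2030, 24.9050), (60.2031, 24.9052)])
spot, plan = mgr.handle_parking_request("AV-7", (60.1990, 24.8990))
mgr.confirm_parked("AV-7", spot)
print(mgr.parked["AV-7"] == (60.2030, 24.9050))  # -> True
```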
  • Exemplary embodiments disclosed herein are implemented using one or more wired and/or wireless network nodes, such as a wireless transmit/receive unit (WTRU) or other network entity.
  • FIG. 24 is a system diagram of an exemplary WTRU 2402, which may be employed as, for example, a user device on which a route planning application is installed or a vehicle computing system on which a route planning application is installed.
  • the WTRU 2402 may include a processor 2418, a communication interface 2419 including a transceiver 2420, a transmit/receive element 2422, a speaker/microphone 2424, a keypad 2426, a display/touchpad 2428, a non-removable memory 2430, a removable memory 2432, a power source 2434, a global positioning system (GPS) chipset 2436, and sensors 2438.
  • the WTRU 2402 may include any sub-combination of the foregoing elements while remaining consistent with an embodiment.
  • the processor 2418 may be a general purpose processor, a special purpose processor, a conventional processor, a digital signal processor (DSP), a plurality of microprocessors, one or more microprocessors in association with a DSP core, a controller, a microcontroller, Application Specific Integrated Circuits (ASICs), Field Programmable Gate Array (FPGAs) circuits, any other type of integrated circuit (IC), a state machine, and the like.
  • the processor 2418 may perform signal coding, data processing, power control, input/output processing, and/or any other functionality that enables the WTRU 2402 to operate in a wireless environment.
  • the processor 2418 may be coupled to the transceiver 2420, which may be coupled to the transmit/receive element 2422. While FIG. 24 depicts the processor 2418 and the transceiver 2420 as separate components, the processor 2418 and the transceiver 2420 may be integrated together in an electronic package or chip.
  • the transmit/receive element 2422 may be configured to transmit signals to, or receive signals from, a base station over the air interface 2416.
  • the transmit/receive element 2422 may be an antenna configured to transmit and/or receive RF signals.
  • the transmit/receive element 2422 may be an emitter/detector configured to transmit and/or receive IR, UV, or visible light signals, as examples.
  • the transmit/receive element 2422 may be configured to transmit and receive both RF and light signals.
  • the transmit/receive element 2422 may be configured to transmit and/or receive any combination of wireless signals.
  • the WTRU 2402 may include any number of transmit/receive elements 2422. More specifically, the WTRU 2402 may employ MIMO technology.
  • WTRU 2402 may include two or more transmit/receive elements 2422 (e.g., multiple antennas) for transmitting and receiving wireless signals over the air interface 2416.
  • the transceiver 2420 may be configured to modulate the signals that are to be transmitted by the transmit/receive element 2422 and to demodulate the signals that are received by the transmit/receive element 2422.
  • the WTRU 2402 may have multi-mode capabilities.
  • the transceiver 2420 may include multiple transceivers for enabling the WTRU 2402 to communicate via multiple RATs, such as UTRA and IEEE 802.11, as examples.
  • the processor 2418 of the WTRU 2402 may be coupled to, and may receive user input data from, the speaker/microphone 2424, the keypad 2426, and/or the display/touchpad 2428 (e.g., a liquid crystal display (LCD) display unit or organic light-emitting diode (OLED) display unit).
  • the processor 2418 may also output user data to the speaker/microphone 2424, the keypad 2426, and/or the display/touchpad 2428.
  • the processor 2418 may access information from, and store data in, any type of suitable memory, such as the non-removable memory 2430 and/or the removable memory 2432.
  • the non-removable memory 2430 may include random-access memory (RAM), read-only memory (ROM), a hard disk, or any other type of memory storage device.
  • the removable memory 2432 may include a subscriber identity module (SIM) card, a memory stick, a secure digital (SD) memory card, and the like.
  • the processor 2418 may access information from, and store data in, memory that is not physically located on the WTRU 2402, such as on a server or a home computer (not shown).
  • the processor 2418 may receive power from the power source 2434, and may be configured to distribute and/or control the power to the other components in the WTRU 2402.
  • the power source 2434 may be any suitable device for powering the WTRU 2402.
  • the power source 2434 may include one or more dry cell batteries (e.g., nickel-cadmium (NiCd), nickel-zinc (NiZn), nickel metal hydride (NiMH), lithium-ion (Li-ion), and the like), solar cells, fuel cells, and the like.
  • the processor 2418 may also be coupled to the GPS chipset 2436, which may be configured to provide location information (e.g., longitude and latitude) regarding the current location of the WTRU 2402.
  • the WTRU 2402 may receive location information over the air interface 2416 from a base station and/or determine its location based on the timing of the signals being received from two or more nearby base stations.
  • the WTRU 2402 may acquire location information by way of any suitable location-determination method while remaining consistent with an embodiment.
  • the processor 2418 may further be coupled to other peripherals 2438, which may include one or more software and/or hardware modules that provide additional features, functionality and/or wired or wireless connectivity.
  • the peripherals 2438 may include sensors such as an accelerometer, an e-compass, a satellite transceiver, a digital camera (for photographs or video), a universal serial bus (USB) port, a vibration device, a television transceiver, a hands free headset, a Bluetooth® module, a frequency modulated (FM) radio unit, a digital music player, a media player, a video game player module, an Internet browser, and the like.
  • FIG. 25 depicts an exemplary network entity 2590 that may be used within a communication system, for example as a route planning service in some embodiments.
  • network entity 2590 includes a communication interface 2592, a processor 2594, and non-transitory data storage 2596, all of which are communicatively linked by a bus, network, or other communication path 2598.
  • Communication interface 2592 may include one or more wired communication interfaces and/or one or more wireless-communication interfaces. With respect to wired communication, communication interface 2592 may include one or more interfaces such as Ethernet interfaces, as an example. With respect to wireless communication, communication interface 2592 may include components such as one or more antennae, one or more transceivers/chipsets designed and configured for one or more types of wireless (e.g., LTE) communication, and/or any other components deemed suitable by those of skill in the relevant art. And further with respect to wireless communication, communication interface 2592 may be equipped at a scale and with a configuration appropriate for acting on the network side— as opposed to the client side— of wireless communications (e.g., LTE communications, Wi-Fi communications, and the like). Thus, communication interface 2592 may include the appropriate equipment and circuitry (perhaps including multiple transceivers) for serving multiple mobile stations, UEs, or other access terminals in a coverage area.
  • Processor 2594 may include one or more processors of any type deemed suitable by those of skill in the relevant art, some examples including a general-purpose microprocessor and a dedicated DSP.
  • Data storage 2596 may take the form of any non-transitory computer-readable medium or combination of such media, some examples including flash memory, read-only memory (ROM), and random-access memory (RAM) to name but a few, as any one or more types of non- transitory data storage deemed suitable by those of skill in the relevant art may be used.
  • data storage 2596 contains program instructions 2597 executable by processor 2594 for carrying out various combinations of the various network-entity functions described herein.
  • Examples of computer-readable storage media include a read-only memory (ROM), a random-access memory (RAM), a register, cache memory, semiconductor memory devices, magnetic media such as internal hard disks and removable disks, magneto-optical media, and optical media such as CD-ROM disks and digital versatile disks (DVDs).
  • a processor in association with software may be used to implement a radio frequency transceiver for use in a WTRU, UE, terminal, base station, RNC, or any host computer.
  • modules include hardware (e.g., one or more processors, one or more microprocessors, one or more microcontrollers, one or more microchips, one or more application-specific integrated circuits (ASICs), one or more field programmable gate arrays (FPGAs), one or more memory devices) deemed suitable by those of skill in the relevant art for a given implementation.
  • hardware e.g., one or more processors, one or more microprocessors, one or more microcontrollers, one or more microchips, one or more application-specific integrated circuits (ASICs), one or more field programmable gate arrays (FPGAs), one or more memory devices
  • Each described module may also include instructions executable for carrying out the one or more functions described as being carried out by the respective module, and those instructions may take the form of or include hardware (hardwired) instructions, firmware instructions, software instructions, and/or the like, and may be stored in any suitable non-transitory computer-readable medium or media, such as the media commonly referred to as RAM or ROM.

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Traffic Control Systems (AREA)

Abstract

Systems and methods are provided for enhancing pedestrian safety, particularly in or near sensor-equipped smart spaces. A pedestrian indicates to a security service of a smart space that he wishes to walk from a starting location to a destination. The service identifies a plurality of pedestrian routes to the destination. The service also receives, from nearby connected vehicles, information indicating which areas are covered by the sensors of those vehicles. For each route, the system calculates a security metric based at least in part on the proportion of the route covered by the vehicle sensors. The system informs the pedestrian of the different routes and their associated security metrics. In some embodiments, the system may direct an autonomous vehicle to reposition itself to improve sensor coverage along a selected route.

Description

METHOD AND SYSTEM FOR AUTONOMOUS VEHICLE SENSOR ASSISTED SELECTION OF ROUTE WITH RESPECT TO DYNAMIC ROUTE CONDITIONS
CROSS-REFERENCE TO RELATED APPLICATIONS
[0001] The present application is a non-provisional filing of, and claims benefit under 35 U.S.C. §119(e) from, U.S. Provisional Patent Application Serial No. 62/318,585, entitled "Method and System for Autonomous Vehicle Sensor Assisted Selection of Route with Respect to Dynamic Route Conditions," filed April 5, 2016, and U.S. Provisional Patent Application Serial No. 62/329,572, entitled "Methods and Systems to Enhance Personal Security by Using Smart Space and Connected Vehicles," filed April 29, 2016, the entireties of which are incorporated herein by reference.
BACKGROUND
[0002] In large campuses, buildings, and complexes, such as libraries, malls, offices, apartments, colleges, and stadiums, security is a concern in outdoor spaces and parking lots. Existing solutions such as full-time, physical guards or comprehensive video surveillance are expensive. Moreover, in most cases, such methods provide only limited coverage.
[0003] If a user, such as an exerciser, traveler, or tourist, plans an activity involving a route (such as shopping, walking, running, biking, miscellaneous exercise, or other activity) in an unfamiliar area, the user might want to know the best/safest/most convenient route or may want to dynamically change the route while the user is on that route already. Such a user might also wonder about the condition of the road, whether it is paved, whether there are (roofed) sidewalks, whether there is a headwind, how hard it is raining, whether there are puddles, and whether there are dark alleys, slippery roads, icy patches, steep hills, road blockages, pets, drunken crowds, or trash on the ground.
[0004] Planning a pedestrian or cycling route may be demanding, especially if a person (e.g., a traveler or tourist) planning the route lacks knowledge of the region or the condition of the roads, or if the planner has a desire to try out new routes, even in a fairly well-known region. Things to consider when selecting a route include, for example: safety (e.g., lighting or dark alleys), convenience (e.g., willingness to face a strong headwind or rain), conditions of the roads (e.g., surface of roads, sidewalks, or other lanes), type of neighborhood (e.g., cleanliness, suspicious people loitering), and suitability of the route for other personal preferences (e.g., exercise style, fitness level, and number of road crossings and stops). Even if the purpose is not exercise (such as a walk to a shopping district in an unfamiliar city), pedestrians and cyclists care about safety and convenience concerns. The user might also want to know what route to take. The user might also wonder if there is anything along the route that should be avoided or otherwise taken into account.
[0005] Drivers of motor vehicles care about planning a route according to dynamic elements as well. For example, drivers seeking safe travel of a vehicle or motorcade through city streets, or desiring road and traffic conditions during rush hour or following a natural disaster (e.g., an earthquake, hurricane, or flood), require up-to-date information on dynamic elements along the route. Such data cannot be determined from static sources. Crowdsourcing the data via services such as Waze may not be feasible, reliable, or fast enough in cases where the safety of the route for human drivers cannot be guaranteed (e.g., after an earthquake). Moreover, apps and services do not report small or unusual items (such as trash, objects, miscellaneous items, events, or people) that are important to drivers, cyclists, and walkers.
[0006] Users typically plan routes for running or cycling based on ad-hoc routes selected after looking at a static map or according to crowdsourced information provided by services such as Strava, MapMyRun, and GPSies. The services allow other users of the service to report their exercise routes and other optional parameters, such as elevation or workout profile (e.g., calorie burn rate, total time used). As a consequence, users typically select routes from existing, reported routes. The list of possible selections might be small or non-existent in areas with fewer active users. Also, reported routes might not suit a user's desired exercise profile or personal taste. Furthermore, current solutions lack the capability to make route modifications, such as adding preferred points of interest or preferred road sections along the route.
[0007] Areas exist in some urban environments where driving a car is normally prohibited. Sometimes these restrictions are overridden in certain circumstances. For example, driving a vehicle in such areas may be convenient to transport people or cargo.
[0008] Autonomous functions are generally designed to operate best in well-mapped and structured areas, and autonomous vehicles (AV) are designed to obey traffic rules. However, some vehicle uses exist where accurate maps are not available. Examples include areas with changing pedestrian traffic, a temporary parking lot in a field, or a market square with many retail vendors.
[0009] The same problem may exist for areas around a store, which may prevent an autonomous vehicle from reaching the front door to pick up elderly passengers. An AV may be programmed to override the rules in special cases where overrides are allowed and there are no safety issues. This approach may not make sense for broader area events or venues. An AV may block other traffic inside an area, or an AV may be unable to enter an area because pedestrians are blocking the whole area. Rerouting may not solve the problem because alternative routes do not exist. If a sudden clearance of an area is required (such as due to a fire or other hazard), an AV may not detect such an event and may block traffic. Manually driven cars may be controlled by giving orders to a driver, but a driverless car may not have such an option. Therefore, unwanted access of AVs may be difficult to control if there is no method to communicate such messages between a restricted area and AVs.
[0010] Areas set aside for pedestrians present unpredictable driving environments. Also, areas where small children are playing soccer or areas reserved for bicyclists may present problems from people not noticing unexpected vehicles. Some streets or areas may be restricted to only emergency vehicles (such as a fire lane) where clear access needs to be maintained.
[0011] In temporary parking areas, vehicles may be guided manually without strict lanes to follow. Such an environment may be difficult for an autonomous vehicle that needs a digital IT infrastructure to receive commands from guidance personnel. However, standardizing the format of instructions for different vehicle manufacturers remains a problem.
[0012] Deployment of autonomous vehicles may start with an assumption that certain common rules and traffic instructions are available. However, pedestrian zones may not be well-mapped and may include many types of vulnerable road users (e.g., bicyclists and pedestrians) who do not expect any cars driving in the area. Such areas may have special requirements for autonomous vehicle control units. Restricted areas may not be configured to send instructions to autonomous vehicles on how to enter an area or drive to a front door, for example. In many cases, getting an autonomous taxi to a front door may be a service where people may pay extra.
[0013] US patent application 2016/0231746 presents a system and method to operate an automated vehicle by using concurrence estimation with pedestrian cell-phones and bicycle and road infrastructure sensors.
[0014] US patent 8,509,982 offers a zone driving system where a roadgraph may include a network of information such as roads, lanes, intersections, and connections between these features, as well as geographic zones associated with particular rules. In one example, the rules may require an autonomous vehicle to alert a driver that the vehicle is approaching a zone.
[0015] US patent 8,688,306 describes systems and methods to limit use of autonomous or semi-autonomous vehicles based on user permission data. The vehicle may drop off a passenger at a predefined destination, and the permission data may be used to limit the ability of an occupant to change a vehicle's route completely or by some maximum deviation value.
[0016] US patent application 2016/0125736 presents a method to improve parking space identification in autonomous driving using identification information and parking space status information.
[0017] US patent application 2015/0353080 offers an automatic parking system where a vehicle is controlled automatically along a traveling path to a monitored parking space. If a contact determination unit determines that a vehicle makes contact with an obstacle, then the vehicle is stopped and parked in a position for removal of the obstacle.
[0018] US patent application 2015/0367845 describes a parking assist apparatus that parks a vehicle by generating a set of traveling routes and associated vehicle speeds, selecting a route based on object information, and controlling a vehicle to the selected parking spot.
[0019] US patent application 2014/00465506 presents a method of autonomous movement of a vehicle in a parking area by an external, stationary control device located in or near a parking area. Impending or actual collisions with other vehicles are detected by a vehicle sensor and in response the control device makes a behavioral decision, such as triggering an alert signal or maneuvering the vehicle.
SUMMARY
Enhancing Personal Security Using Smart Spaces and Connected Vehicles
[0020] Exemplary embodiments described herein relate to the field of personal security around a smart space (e.g., a campus area, mall, or office building(s), and their parking areas) and connected vehicles with various sensor and communication systems. For example, some embodiments described herein address the problem of how to increase the personal safety and security of a user walking between a smart space area and his or her vehicle.
[0021] Embodiments of systems and methods described herein use connected vehicles and smart space sensor systems to provide dynamic surveillance and guidance capabilities. An exemplary embodiment operates to enhance personal safety and security for people that use the outdoor space (including blind spots) for entry, exit, or outdoor walking for any other purpose. This specification describes new techniques for enhancing the personal security and safety of a user before and while the user walks in a smart space (such as a campus or office area and parking or other area) or between a smart space and a connected vehicle.
[0022] In exemplary embodiments, a smart space security service provides information on the current security situation, coverage of the security surveillance (fixed and vehicle based), and authorized persons and vehicles identified in the area. The system may route connected vehicles entering and exiting the area to cover blind-spots. Autonomous vehicles may be driven to cover the blind spot (perform a security-scanning drive). The system provides a user safe walking path suggestions with detailed information about current security conditions, surveillance and tracking coverage, route blind-spots, and potential risks. Safety tracking is provided for the user while walking in a smart space area. The system communicates the user's position to the smart space security service and to connected vehicles (via vehicle-to-person (V2P) communications).
[0023] An exemplary embodiment involves interactions among a primary terminal (which may be a mobile device), a smart space security service (which may be implemented using software on a computer server, for example), and at least one connected vehicle. The primary terminal may be configured to provide a user positioning service, a Personal Security and Safety Monitoring (PSSM) application, and a user interface. The user positioning service tracks the user's position (e.g., via GPS location information). Indoor locations may be tracked by the smart space security service.
[0024] The PSSM application presents ID information, security tracking information, and route and alert information. The application provides the user or vehicle ID when communicating with the smart space security service or the connected vehicle. The security tracking information includes the current user security tracking status. The route and alert information includes the walking route(s) along with additional alert information, such as security risks, safety risks, and safety metrics.
[0025] The user interface provides an application interface, an interaction management module, and a communication unit. The application interface enables application and service modules to use the user interface to present the application's or service module's information and to control that information. The interaction management module implements the application interface and controls the UI device's input units and output units. The communication unit provides wide area wireless communication systems (e.g., cellular systems) and short-range communication systems (e.g., Wi-Fi V2P communication).
[0026] The smart space (e.g., a campus, office building, or mall), including entry/exit doors, entry/exit areas, and parking areas, may be provided with a smart space security service that includes components such as a smart space security manager, a smart space sensing and communication system, and a smart space person ID registry. The smart space security manager may include components such as a smart space security data manager, a security scanning service, a route planner, a security risk calculator, a person ID service, and a user location and tracking service.
[0027] The smart space security data manager handles the smart space security service's current safety and security situation data. The security scanning service oversees the security and safety scanning of the area by the smart space's sensor systems. The security scanning service uses the coverage limits of the smart space sensing system and the additional security scans performed by connected vehicles. The service sends security scanning requests to connected vehicles and receives reports from connected vehicles. The route planner calculates the safest walking route for the user based on current security conditions, security monitoring, and tracking coverage. The security risk calculator gathers information from the smart space's resources and from security scans and person identifications performed by vehicles and the smart space. The security risk calculator uses this gathered information to discern potential risks and raise security alerts, which may occur if a safety metric is below a safety threshold. For some embodiments, safety metrics may be calculated based on sensor data measured by a smart space sensor or an AV sensor. The person identification service identifies detected people by using the smart space person ID registry (or a database of information related to identities of people).
The user location and tracking service locates the user (e.g., obtains the user's location from connected vehicles or short-range communications with the user's device) and tracks the user while he or she walks to the destination.
[0028] For one embodiment, an updated safety metric is calculated for a portion of a route between a location of a user device and a user destination location. This updated safety metric may be communicated to the user device. For one embodiment, a safety alert message is sent to a user device if an updated safety metric is below a safety threshold.
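For illustration only (not part of the disclosure), a minimal Python sketch of one way such a coverage-based safety metric and threshold alert could work; the segment representation, the 0.8 default threshold, and all function names are assumptions:

```python
# Hypothetical sketch: the remaining route is a list of (length_m, covered)
# segments, where `covered` indicates whether any smart-space or vehicle
# sensor currently observes the segment. The safety metric is the covered
# fraction of the remaining route length; an alert message is produced
# when the updated metric drops below a safety threshold.

def safety_metric(segments):
    """Return the fraction of remaining route length covered by sensors."""
    total = sum(length for length, _ in segments)
    if total == 0:
        return 1.0  # nothing left to walk; treat as fully safe
    covered = sum(length for length, is_covered in segments if is_covered)
    return covered / total

def check_alert(segments, threshold=0.8):
    """Return an alert message for the user device, or None if within limits."""
    metric = safety_metric(segments)
    if metric < threshold:
        return f"safety alert: coverage {metric:.0%} below {threshold:.0%}"
    return None

remaining = [(120, True), (40, False), (90, True)]  # (meters, covered?)
print(round(safety_metric(remaining), 2))
print(check_alert(remaining, threshold=0.9))
```

With the sample segments, 210 of 250 meters are covered, so the metric is 0.84 and an alert fires against a 90% threshold but not against the 80% default.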
[0029] The smart space sensing and communication system may use a variety of available monitoring sensors (e.g., camera feeds) to detect potential risks (e.g., objects, people, and conditions) in the smart space surroundings (or a vicinity of a pedestrian route). The smart space also includes user tracking capabilities with the sensor and short-range communication systems. The smart space ID registry (or database) provides identification of all personnel, people, and vehicles authorized by the smart space management. The registry may also include authorized guests and visitors in the smart space area.
[0030] An exemplary connected vehicle may include components such as a PSSM application, a vehicle sensor system, and a communication unit. The PSSM application (e.g., an application running in the vehicle terminal) may provide services such as a security scanning service, a user location and tracking service, and a person recognition service. The security scanning service performs security and safety scanning of the area with the in-vehicle sensor system. The user location and tracking service locates the user (e.g., obtains the user's location from the smart space sensors or calculates the user's location using vehicle sensors and short-range communications) and tracks the user while he or she walks to the destination. The person recognition service detects people (e.g., using V2P communication and/or environment perception systems), ascertains their person identification numbers, and provides those numbers to the smart space security service.
[0031] In an exemplary embodiment, the vehicle sensor system uses perception monitoring sensors (e.g., visual cameras, thermal cameras, and LIDARs) to detect potential risks (e.g., objects, people, and conditions). The communication unit provides wide area wireless communication (e.g., cellular) systems and short-range communication (e.g., Wi-Fi and V2P) systems between the vehicle, the smart space security service, and the user's primary terminal device.
Planning an Exercise Route Using Dynamic Route Conditions
[0032] For some exemplary embodiments, a Route Planning Service (RPS) may collect data from one or more vehicles for use with planning an exercise route. A user uses the service to determine a route. The user has the primary terminal and other devices (such as a wrist watch, a bike, or a head-mounted device) that contain the route planning application and wide-area wireless communication systems (e.g., cellular systems). The route planning application provides functionality and user interfaces for: (1) connecting the user profile, the cloud service, and the scouting vehicle to one another; (2) providing input for preferences and requirements for the route, such as the desire to stay within a given distance to the starting point; (3) showing the route map on the primary device; (4) requesting and controlling the vehicle usage along the route; and (5) providing input and output via the wrist watch, handheld devices, or other devices (such as route instructions and buttons for calling the car).
[0033] In an exemplary embodiment, the RPS runs as a cloud service and includes a route planning application, a sensor data processing service, a sensor information sharing service, and information storage.
[0034] The route planning application plans the route and may include a route planning module, a map data maintenance module, a vehicle command module, and an external interface module. The route planning module determines a suitable route based on the user's profile, user input, static route information (such as maps), and dynamic route information. The map data maintenance module augments and updates stored map data with information obtained from the vehicle and other sources. The vehicle command module may perform tasks including: determining vehicle sensor capability; performing sensor fusion and detection, where multiple sensor inputs are fused together to produce more reliable detections than any single sensor alone; determining vehicle availability for scouting routes; instructing routes for the vehicle scouting; parsing reports sent by the vehicle in order to provide up-to-date information on the current route candidate and to update map data; and relaying meeting requests from the user to the vehicle, both static (such as meet at a fixed location) and dynamic (such as the car meeting a moving user). The external interface module accesses external services, such as maps, traffic information, and weather services.
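As an illustrative sketch only (field names, weights, and the scoring scheme are assumptions, not part of the disclosure), the route planning module's combination of user preferences with static and dynamic route information might resemble a simple scoring function over candidate routes:

```python
# Hypothetical sketch: each candidate route carries static attributes
# (length, paved) and dynamic attributes freshly reported by scouting
# vehicles (lighting, hazard count). The planner scores each candidate
# against the user's profile and proposes the highest-scoring route.

def score_route(route, prefs):
    score = 0.0
    # Penalize deviation from the user's preferred distance.
    score -= abs(route["length_km"] - prefs["target_km"]) * prefs["km_weight"]
    if prefs.get("prefer_paved") and route["paved"]:
        score += 1.0
    if route["lit"]:
        score += prefs.get("lighting_weight", 0.5)
    score -= route["hazards"] * 2.0  # trash, puddles, icy patches, etc.
    return score

candidates = [
    {"name": "park loop", "length_km": 5.2, "paved": True,
     "lit": True, "hazards": 0},
    {"name": "river path", "length_km": 4.8, "paved": False,
     "lit": False, "hazards": 2},
]
prefs = {"target_km": 5.0, "km_weight": 1.0, "prefer_paved": True}
best = max(candidates, key=lambda r: score_route(r, prefs))
print(best["name"])
```

A real planner would draw the dynamic fields (lighting, hazards) from parsed vehicle scouting reports rather than literals, but the selection step reduces to a comparison of this kind.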
[0035] The RPS's Sensor Data Processing Service is capable of extracting route events based on sensor data. Such events may include the presence of trash, cardboard boxes, loitering persons, pets or other animals, and roofed sidewalks. The RPS's Sensor Information Sharing Service enables the sharing of vehicle sensor information between the user and nearby vehicles. The RPS's information storage module stores user preferences and profiles, augmented map data, and stored routes. User preferences and profiles contain at least preferred route parameters for a given movement type (such as inclination, paved vs. non-paved routes, and a preference for sidewalks). Optionally, user preferences and profiles might include other information, such as user fitness level and exercise history. Augmented map data is formed from static map data and dynamic information obtained from vehicles, such as road conditions (potholes, surface material, surface roughness, and traction) and other road information (such as the presence and condition of sidewalks, the presence and type of roadside lighting, and the availability of locations for shelter from weather). Stored routes serve as a basis for creating new routes.
[0036] The vehicle links to the primary terminal (such as a cell phone) and the RPS. The vehicle may be provided with a route planning application, a map database, vehicle sensor data, and a sensor access module. The route planning application may include a route reporting module, a driving route module, and a vehicle information provider. The route reporting module reduces and reports sensor data into events along the route as requested by the RPS. The driving route module establishes and executes a driving plan along the route (where the autonomous vehicle is able to drive the route) as requested by the RPS. The vehicle information provider gives information about the vehicle, such as which sensor information the vehicle supports and the vehicle's ability to fulfill route scouting requests in light of fuel and battery levels. The digital map navigation database augments the vehicle's own sensor information. Vehicle sensor data includes access to data such as 3D imaging via cameras or LiDAR, thermal camera data, road roughness, ambient lighting level, rain sensor, traction control, electronic stability control (ESC), antilock braking system (ABS), and other derived events. The sensor access module enables activation of a vehicle's sensors and the sharing of car sensor data with other users.
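For illustration only, the route reporting module's reduction of raw sensor readings into discrete route events might look like the following sketch; the reading fields, thresholds, and event labels are assumptions rather than values from the disclosure:

```python
# Hypothetical sketch: each raw reading is a dict keyed by sensor channel,
# taken at a position along the scouted route. The reporting module maps
# readings to reportable events (the kind of items the RPS forwards to the
# user, such as potholes or unlit sections).

def extract_events(readings):
    """Reduce raw sensor readings along a route to (position, event) pairs."""
    events = []
    for r in readings:
        if r.get("roughness", 0) > 0.7:        # suspension/roughness channel
            events.append((r["position"], "pothole"))
        if r.get("ambient_light", 1.0) < 0.2:  # ambient lighting sensor
            events.append((r["position"], "dark_section"))
        if r.get("rain_intensity", 0) > 0.5:   # rain sensor
            events.append((r["position"], "heavy_rain"))
    return events

readings = [
    {"position": 120, "roughness": 0.9, "ambient_light": 0.8},
    {"position": 340, "roughness": 0.1, "ambient_light": 0.1},
]
print(extract_events(readings))
```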
[0037] Embodiments disclosed herein may provide various benefits. Users may have confidence in a selected route because a vehicle has recently confirmed the route. A static route service lacks such up-to-date data. Another potential benefit is the ability to check the route for unsafe or unknown conditions, such as following a natural disaster. Exemplary embodiments also use real-time weather and road conditions for selecting a route. A further benefit includes the ability to reposition the vehicle near the user to provide added security and convenience (such as if it suddenly starts raining).
[0038] Exemplary systems and methods use data received from numerous vehicle sensors (e.g., cameras, LiDAR, radar, vehicle suspension data, and gyros) along the route to provide dynamic data to the user before and during travel on the route (using, for example, V2C, C2V, V2P, and P2V communication), giving the user a sense of trust, safety, and accuracy during the trip.
[0039] One embodiment of a method of determining conditions along a route may comprise: registering vehicles for ad hoc sensor duty; receiving location information for the registered vehicles; receiving information regarding at least a portion of a potential route of a user; in response to a determination, based upon the received location information, that at least one registered vehicle is parked or driving at a position with visibility of at least a portion of the planned route of the user, sending, to at least one registered autonomous vehicle (parked or driving) at a position with visibility of the route, information for causing the vehicle to collect data from a sensor of a selected modality; receiving sensor data from at least one registered vehicle (parked or driving) at a position with visibility of the route; and sending information regarding conditions on at least a portion of the potential route derived from the sensor data of at least one registered vehicle (parked or driving) at a position with visibility of the route. For one embodiment, a route may be a pedestrian route along sidewalks, where vehicles have visibility of the sidewalks from non-sidewalk locations. For one embodiment, a method may further comprise deploying an autonomous vehicle to fill a gap for a portion of a potential route not covered by sensors.
[0040] One embodiment of a method of determining route conditions may comprise registering vehicles for sensor data collection, receiving location information for registered vehicles, receiving information regarding at least a portion of a potential route of a user, and determining that at least one registered vehicle has visibility of a portion of a potential user route. For one embodiment, determining that at least one registered vehicle has visibility of a portion of a potential user route may comprise sending, to at least one registered vehicle at a position with visibility of the route, information to cause the vehicle to collect data from a sensor of a selected modality, receiving sensor data from at least one registered vehicle at a position with visibility of the route, and sending information regarding conditions on at least a portion of the potential user route derived from the sensor data of at least one registered vehicle at a position with visibility of the route. For one embodiment, a potential user route may include sidewalks, and vehicles may have visibility of the sidewalks from non-sidewalk locations. For one embodiment, a method of determining route conditions may further comprise determining that at least a portion of a potential user route lacks sensor data and deploying a vehicle to collect data to overcome the lack of sensor data. For one embodiment, information regarding conditions on at least a portion of a potential user route may be sent to a user's terminal device and to a cloud service. For one embodiment, the collecting of sensor data may be done for one or more potential user routes prior to a user starting to traverse a route. For one embodiment, the collecting of sensor data may be done dynamically, with a vehicle traveling a configurable distance ahead of the user as the user traverses the route.
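A minimal sketch of the visibility and gap-filling determination described above, under simplifying assumptions that are not part of the disclosure: segments are reduced to midpoints on a plane, and a vehicle "has visibility" of a segment when the midpoint falls within a circular sensor range.

```python
# Hypothetical sketch: route segments are represented by their midpoints
# (x, y) in meters, and each registered vehicle by (x, y, sensor_range).
# Segments no vehicle can see are the coverage gaps to which a scouting
# vehicle could be deployed.
import math

def visible(vehicle, point):
    vx, vy, sensor_range = vehicle
    return math.hypot(vx - point[0], vy - point[1]) <= sensor_range

def coverage_gaps(segments, vehicles):
    """Return midpoints of route segments that no registered vehicle can see."""
    return [p for p in segments if not any(visible(v, p) for v in vehicles)]

segments = [(0, 0), (50, 0), (100, 0)]     # segment midpoints (meters)
vehicles = [(10, 20, 40), (95, -10, 30)]   # (x, y, sensor range in meters)
print(coverage_gaps(segments, vehicles))
```

In the example, the first vehicle covers the segment at the origin and the second covers the far end, leaving the middle segment as the gap a deployed vehicle would be sent to cover. A production system would use actual sensor fields of view and occlusion rather than circular ranges.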
[0041] For one embodiment, an apparatus may be configured to use computer-readable memory storing software instructions for a user route planning application, wherein such instructions are capable of calculating potential user routes, maintaining route map data in a database, instructing one or more vehicles to collect data for such potential user routes, interfacing with external modules to obtain external map, traffic information, and weather service data, and reporting route data to the user. For one embodiment, instructions for commanding of one or more vehicles may comprise instructions to perform operations including determining vehicle sensor capability, performing sensor fusion and detection, determining vehicle availability for scouting of routes, instructing vehicles to perform scouting for a particular route, parsing vehicle scouting reports to update map data and determine if alternate route candidates exist, and sending user-vehicle meeting requests to a vehicle and a user. For one embodiment, instructions for reporting of route data to a user may comprise instructions to perform operations including: displaying potential routes on a primary terminal device, displaying route conditions for each potential user route on a primary terminal device, allowing a user to pick a route from available choices or to name a new route, enabling a user to modify a route after starting to traverse a route, displaying updated route conditions on a primary terminal device, and interfacing with secondary user devices.
[0042] For one embodiment, an apparatus may be configured to use computer-readable memory holding software instructions for a vehicle route planning application, wherein such instructions may be capable of planning vehicle routes, updating a map database, interfacing with a sensor access module to enable vehicle sensors, and reading vehicle sensor data. For one embodiment, instructions for planning of vehicle routes may comprise a route reporting module, a driving route module, and a vehicle information provider.
[0043] For one embodiment of methods described herein, a method may comprise receiving, from a user, an indication of one or more criteria for a route, identifying at least one portion of a route satisfying the criterion, collecting data from a vehicle traversing the identified portion, and proposing a route to the user based at least in part on the collected data. For one embodiment, criterion for a route may include a starting point. For one embodiment, criterion for a route may include an end point. For one embodiment, criterion for a route may include a length. For one embodiment, criterion for a route may include an overlap percent. For one embodiment, where a vehicle is an autonomous vehicle, a method may further comprise instructing an autonomous vehicle to traverse the identified portion. For one embodiment, a method may further comprise identifying a vehicle traversing an identified portion, wherein data is collected from the identified vehicle. For one embodiment, a method may further comprise identifying a plurality of vehicles traversing an identified portion, wherein data is collected from a plurality of vehicles. For one embodiment, the user may be a non-motorized road user. For one embodiment, the user may be a pedestrian. For one embodiment, the user may be a runner. For one embodiment, the user may be a cyclist. For one embodiment, collected data may include data on road hazards.
For one embodiment, collected data may include data on litter. For one embodiment, collected data may include data on crowding. For one embodiment, collected data may include data on temperature. For one embodiment, collected data may include data on inclination. For one embodiment, collected data may include data on surface quality. For one embodiment, collected data may include data on the user. For one embodiment, collected data may include data regarding at least two routes satisfying the criteria and may further comprise displaying collected data to a user and receiving a user selection of one of the routes.
[0044] For one embodiment, a system may comprise a processor and a computer-readable memory storing instructions operative to perform functions including receiving, from a user, an indication of one or more criteria for a route, identifying at least one portion of a route satisfying the criterion, collecting data from a vehicle traversing an identified portion, and proposing a route to a user based at least in part on collected data.
Planning an Autonomous Vehicle Route Outside a Road Network
[0045] For some embodiments, systems and methods described herein provide an ability to control autonomous vehicle (or car) behavior outside a road network by defining an area management process. An area operator/manager may control an autonomous vehicle inside a specified area.
[0046] Many AVs execute a pre-defined plan or mission. The lack of capability to determine where and when an AV may drive, especially in dynamic environments, is a shortcoming because such information is not included in navigational maps. Autonomous vehicles have information about dynamic objects within an AV's sensor range. However, an ability to adapt to dynamic changes in the area is missing. If a destination location is set outside of a road network (for example, in a market zone, pedestrian zone, parking facility, or temporary parking area) without any fixed markings, an AV may not operate safely without external assistance.
[0047] Systems and methods for an area management process are described herein, where an area manager may send instructions directly to vehicles (or cars), if a vehicle is allowed to enter in an area, that indicate which route and parking place may be selected. Systems and methods described herein also enable control of vehicle departure times and routes out of an area. Thus, for example, congestion may be avoided after a public event, when all cars try to leave the area at the same time. An area management process may be based on real-time monitoring of an area, detection of pedestrians, and other dynamic objects, using either fixed or mobile monitoring sensors and/or the available autonomous vehicles sensor data.
[0048] Systems and methods described herein enable an AV to enter zones outside road networks where traffic rules are not exact and the area is crowded with people, other vulnerable road users (VRUs), and other activities. Such areas change dynamically and are difficult to cover with traditional navigation maps, which are not updated to reflect dynamic events and alternative routes. Systems and methods described herein use micro traffic management. Using such systems and methods, AVs are automatically controlled inside such a managed area, and the safety and functionality of AVs is improved. AVs may be driven to pick up passengers and packages. Large parking facilities (such as airports, amusement parks, and shopping malls) benefit from better control of parking with less guiding personnel. Costs may be reduced by not hiring and training workers to run large temporary parking areas during large events that occur only for a short time period. With a vehicle control API, AVs may enter managed areas and be controlled by area management processes.
BRIEF DESCRIPTION OF THE DRAWINGS
[0049] FIG. 1 is a schematic block diagram of the device and service levels of an exemplary communication system.
[0050] FIG. 2 is an example mobile application system map showing possible user routes.
[0051] FIG. 3 is a schematic plan view of a parking lot showing coverage areas and blind spots.
[0052] FIG. 4 is a schematic plan view of a parking lot showing coverage areas and blind spots.
[0053] FIG. 5 is a message flow diagram illustrating messages exchanged during an exemplary process of scouting and selecting a user route.
[0054] FIG. 6 is a message flow diagram illustrating messages exchanged while a user is traversing a selected route in an exemplary embodiment.
[0055] FIG. 7 is a schematic diagram illustrating an example interaction with a mobile device application and a smartwatch application.
[0056] FIG. 8 is a schematic plan view of a parking lot showing coverage areas and blind spots for safety metric calculations.
[0057] FIG. 9 is an example of a route map.
[0058] FIG. 10 is a schematic block diagram of the functional architecture of an exemplary embodiment.
[0059] FIG. 11 is a schematic perspective view of an example user interface for planning and accepting a route with the ability to dynamically control route planning.
[0060] FIG. 12 is a schematic perspective view of exemplary smart watch user interfaces.
[0061] FIG. 13 is a message sequence diagram illustrating a process for using sensor information of oncoming cars in selection of a user's route.
[0062] FIG. 14 is a message sequence diagram demonstrating a process for using car sensors in selecting a route that suits user-given preferences.
[0063] FIG. 15 is a message sequence diagram that shows the remainder of the process shown in FIG. 14 for using car sensors in selecting a route that suits user-given preferences.
[0064] FIG. 16 is a message sequence diagram showing examples of exceptional events that might occur when traveling a route, such as re-planning of the route due to a user deviation and a user-initiated request for meeting the car at a fixed location.
[0065] FIG. 17 is a message sequence diagram showing examples of exceptional events that might occur when traveling a route, such as requests for an accompanied route at a specific location.
[0066] FIG. 18 is a message sequence diagram showing the remainder of an example shown in FIG. 17 of an exceptional event that may occur when traveling a route: a request for an accompanied route at a specific location.
[0067] FIG. 19 is a schematic block diagram of system interfaces between an AV and an area manager and related devices.
[0068] FIG. 20 is a plan view schematic of an environment where an AV may be controlled by an area manager.
[0069] FIG. 21 is one embodiment of a message sequencing diagram for an area access process.
[0070] FIG. 22 is one embodiment of a message sequencing diagram for an area access process for operating without a fixed surveillance system.
[0071] FIG. 23 is a plan view schematic of a temporary parking area with virtual parking spots and virtual lanes.
[0072] FIG. 24 illustrates an exemplary wireless transmit/receive unit (WTRU) that may be used, for example, as a primary terminal (e.g., smartphone), a user device, or a connected vehicle computing system (which may be capable of running a route planning application) in some embodiments.
[0073] FIG. 25 illustrates an exemplary network entity which may be used as a route planning service or a smart space security service in some embodiments.
DETAILED DESCRIPTION
Enhancing Personal Security Using Smart Spaces and Connected Vehicles
[0074] Some embodiments relate to enhancing personal security using smart spaces and connected vehicles, which may be used, for example, when a user walks between a smart space building and a user's vehicle. FIG. 1 shows an overall view of an exemplary system 100. The diagram differentiates between service-level 101 and device-level 102 processes within a user's mobile device 103, smart space 105, and connected vehicle 106. Within a mobile device 103, a PSSM application 108 may handle the primary interface with the user via the user's primary device 116. The PSSM application 108 may run on top of a mobile device's 103 operating system device drivers and application handlers. In the exemplary embodiment, each system entity uses a communication network 104 to communicate with the other entities. Within the smart space 105, the device-level 102 processes may include low-level sensors, communication systems (e.g., Wi-Fi) 124, and the person ID database 125. A smart space security manager 117 may contain service-level 101 modules, such as a security data manager 118, walking path planning (route calculation) 119, risk calculation 120, security scanning service 121, person identification service 122, and user location and tracking service 123. A connected vehicle's 106 device-level 102 processes may include a vehicle sensor system 130 and a wireless communication unit (or system) 131. Service-level 101 processes may include a PSSM application's 126 security scan service 127, person recognition service 128, and user location and tracking service 129.
[0075] FIG. 2 shows an exemplary embodiment of a mobile device user interface 200. In the planning phase, the system presents alternative routes with current and estimated security and surveillance coverage values (e.g., by visualizing them on a map). The figure illustrates how the PSSM application may present multiple routes to the user 202. An example PSSM application displays two routes between the user's current location 202 and a car 208, drawing each route in white over a satellite view of the area. Next to each route, the PSSM application indicates the security level and percentage of surveillance 204, 206 for each route. The system provides route selection with the possibility of increasing surveillance or security coverage (e.g., by using additional automated vehicles or connected cars).
[0076] For some embodiments, assessed security (unidentified people or vehicles) and surveillance state (coverage area) for a pedestrian's route choices to his or her destination are based on a risk calculation on smart space and connected vehicle scans.
[0077] FIG. 3 is a schematic plan view of a parking lot outside of a smart space in an exemplary use case 300. The user walks a route (or walking path) 307 from a smart space building 304 towards a destination (an AV 318). The route 307 contains two blind spots 305, 306. Trees 308, 309 block the building security cameras 301, 302, 303 at the parking area's edges. The sensor systems of the vehicles 314, 317 parked in the middle of the lot fail to reach these blind spots 305, 306, as shown by the vehicle short-range communication regions 319, 320. The destination vehicle's 318 short-range communications region 321 also fails to cover either blind spot 305, 306. The smart space security service requests an autonomous vehicle 312, 313 to move along a route 311 to one end of the parking lot to provide coverage for a blind spot. The requested location for the autonomous vehicle 312, 313 may be determined based on a calculation that maximizes sensor coverage of a route. The smart space security service instructs a connected vehicle 315, 316 to move along a route 310 to the other end of the lot to provide coverage for the second blind spot. As a result, the user's walking route 307 obtains full security monitoring and tracking coverage. Some embodiments use the terms autonomous vehicle (AV) and connected vehicle interchangeably.
[0078] FIG. 4 is a schematic plan view of a parking lot outside of a smart space in another use case 400. A user leaves a smart space 401 to walk to his or her parked car 2 (409). The smart space contains two security monitoring cameras with views 404, 405 of the parking lot. The triangles show the fields of view 404, 405 of the two cameras centered on the Door A 402 and Door B 403 exits. The user's car 2 (409) contains perception sensors with a field of view indicated by a triangle 411 emanating from the back of the vehicle 409. The circle 413 centered around car 2 (409) depicts the range of short-range communication (e.g., V2P) from the vehicle 409. The circle 406 emanating from the smart space 401 shows the range of short-range communication (e.g., Wi-Fi) from the smart space 401. The dashed line 408 illustrates the planned walking route 408 from door B to the vehicle 409. The path 408 contains one blind spot 407 unreachable by sensor systems. The path 408 is fully covered with short-range communication tracking (short-range communication ranges). As another vehicle 410 leaves the parking area via the parking exit 414, the vehicle 410 scans the area with the in-vehicle sensors (and sensor FOV 412) while driving. As the user starts walking to the destination car 409, a smart space camera 405 tracks the user until the user is out of range. Tracking continues via connected vehicle systems until the user enters the vehicle.
[0079] FIGs. 5 and 6 illustrate an exemplary personal security support process 500, 600 for when a user walks from a smart space building to a destination. Exemplary embodiments are performed in systems in which the user has the PSSM application installed and running on the user's mobile device 501, and in which connected vehicles have the PSSM application installed and running. Connected vehicles 503 in an exemplary embodiment have environment perception sensor systems capable of detecting and classifying people, objects, and events. Connected vehicles 503 in exemplary embodiments further have short-range communication capabilities (e.g., Wi-Fi or V2P) for relaying such data to the smart space security service 502 and a user's device 501. The smart space security service 502 in this example retains a list of IDs for people and vehicles recognized within the smart space (e.g., office building personnel or other people identifiable by the smart space security service 502). The smart space security service 502 may also identify (e.g., via personnel tags, mobile devices, and camera-based recognition systems) and track movements of people in the smart space and parking areas. The smart space security service 502 may have real-time location information about PSSM users and connected vehicles in the area. In exemplary embodiments, the system is capable of tracking the location of security personnel in real time and of performing security scans from vehicles. Such vehicles may be parked in the area or may have entered or left the area recently (e.g., in the last hour). In exemplary embodiments, the smart space security service 502 has access to these vehicles and is permitted to use these vehicles for risk calculations.
[0080] An exemplary personal security support process for when a user walks from a smart space building to a destination comprises several steps. A user launches the PSSM application 504 on the user's (mobile) device 501 and requests a safe walking route 505. The route starts at the smart space exit location and ends at the destination, which may be, for example, another smart space building, a vehicle, or the same smart space building. The smart space security service 502 identifies possible walking routes from the smart space exit to the destination. The planning process 506 takes into account security monitoring coverage of the area. The system searches for connected vehicles 507 in the area. The smart space security service 502 determines which connected vehicles are located along possible walking routes and may provide additional security scanning and tracking coverage 511. The smart space security service 502 sends a Start Security Scan 508 message to one or more vehicles, which perform a security scan 509. The smart space security service instructs the selected vehicle(s) to use their sensors to start scanning. The vehicle(s) determine whether one or more people are detected in the area and exchange communication messages with the smart space security service to identify the detected person(s) 510. The Vehicle Security Scan Response 512 includes a description of the area scanned by the vehicle (via the field-of-view of its sensors) and the coverage provided by its short-range communication, a list of detected person ID numbers, a list of alerts (e.g., unidentified persons and other potentially alarming events detected), and related data (e.g., location and images of unidentified people). Vehicles continue scanning the area and sending messages to the smart space security service. The smart space security service initiates a security scan and performs the security scanning of the area to search for people and vehicles.
The smart space security service obtains identification of the detected people and vehicles. Scanning continues until the user reaches the destination. The smart space security service checks IDs (people and vehicles) 513 detected by the connected vehicle(s) and by the smart space's security monitoring devices against the list of IDs (people and vehicles) authorized by the smart space security service. The smart space security service compiles all the security scan data together 514. The smart space security service collects data for all areas covered with security scans performed by the smart space security service, connected vehicles, and short-range communication devices. The smart space security service also gathers the results from security scans, person identifications made by the smart space security service and connected vehicles, and other available information sources (e.g., the current location of the security personnel) in relation to possible walking routes.
[0081] The smart space security service calculates the risk 515 of each route. For one embodiment, risk assessment ranks the walking routes for safety, calculates the coverage for the security monitoring and user tracking, calculates the potential risks, and generates alerts to the user if warranted. In some embodiments, a safety metric is determined for each route. The safety metric may be, for example, a percentage of the respective route that has sensor coverage or a length of a portion of the respective route that has sensor coverage. In some embodiments, the safety metric takes into consideration additional factors, such as the presence or absence of unidentified individuals along the route, the quality of lighting along the route, and/or sensor data received from vehicles along the route. For some embodiments, a safety metric may be communicated to a user's device 501.
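The percentage-coverage safety metric described above might be computed along the following lines. This is a minimal sketch, not taken from the application: the route is modeled as a list of waypoints, each sensor's footprint as a circle, and each segment is tested at its midpoint for simplicity; the function and parameter names are assumptions.

```python
import math

def coverage_safety_metric(route, coverage_regions):
    """Safety metric: percentage of route length covered by at least one
    sensor.  `route` is a list of (x, y) waypoints; `coverage_regions` is
    a list of (center, radius) circles approximating camera and vehicle
    sensor coverage (an assumed model)."""
    def covered(point):
        return any(math.dist(point, center) <= radius
                   for center, radius in coverage_regions)

    total = covered_len = 0.0
    for (x1, y1), (x2, y2) in zip(route, route[1:]):
        seg = math.dist((x1, y1), (x2, y2))
        total += seg
        # Cheap approximation: test the segment midpoint for coverage.
        if covered(((x1 + x2) / 2, (y1 + y2) / 2)):
            covered_len += seg
    return 100.0 * covered_len / total if total else 0.0
```

A route scoring 50.0, for example, has half its length inside some sensor's footprint; as the paragraph notes, lighting quality and the presence of unidentified individuals may also be folded into the metric.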
[0082] Based on the risk calculations, the smart space security service generates and sends to the user's device a Security Report 516. The Security Report 516 may include information about areas covered with security monitoring and tracking, areas not covered (blind spots), areas capable of additional coverage via connected vehicles, alerts (including data and media), and suggested walking routes. This information is presented to the user.
[0083] The smart space security service 502 and connected vehicles 503 may use short-range communication systems to relay identity information of detected persons in the surveillance area and to perform user security tracking. For example, smart space personnel may have the PSSM application installed on their mobile devices. When a user exits the smart space, the mobile device sends a Smart Space Exit message, which includes the current location, heading, and speed of the mobile device, and the user's person ID. The smart space security service receives person IDs and maintains a database of known people and their IDs. For some embodiments, the smart space may also maintain a database of information related to people and vehicles not matching an entry in the smart space ID registry (or database). This database of unmatched people and vehicles may be used to determine whether to send an alert message to a user's device. The mobile device may send this message, for example, via V2P communication (802.11p), Wi-Fi, or other short-range communication types. The messaging range varies according to the environment and exemplary embodiment used, but the range typically falls within 100 to 1600 feet. When the vehicle performing security scanning and identification of people receives a person identity message, the vehicle will note the location and person ID of the detected person. The vehicle uses the same method for user tracking. If the vehicle detects a person in the area during a security scan but the vehicle fails to receive a person identity message from that person, then the vehicle will record the location (and may capture an image of the person) and raise an unidentified person alert.
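The vehicle-side matching behavior described above can be sketched as follows. The message fields follow the paragraph (person ID, location, heading, speed), but the class name, the matching radius, and the alert format are illustrative assumptions.

```python
import math
from dataclasses import dataclass

@dataclass
class SmartSpaceExitMsg:
    """Fields follow the Smart Space Exit message; names are assumptions."""
    person_id: str
    location: tuple   # (x, y) position of the sender
    heading: float    # degrees
    speed: float      # m/s

def classify_detections(detected_locations, identity_msgs, match_radius=5.0):
    """Pair each person sensed by the vehicle with a received identity
    message; anyone without a nearby message becomes an unidentified-person
    alert.  The matching radius is an illustrative threshold."""
    known, alerts = [], []
    for loc in detected_locations:
        msg = next((m for m in identity_msgs
                    if math.dist(loc, m.location) <= match_radius), None)
        if msg is not None:
            known.append((msg.person_id, loc))
        else:
            alerts.append({"type": "unidentified_person", "location": loc})
    return known, alerts
```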
[0084] The exemplary personal security support process continues with selection of a route. The user selects a route 517, and the PSSM application on the user's device 501 sends the selected route 518 to the smart space security service 502. The smart space security service 502 identifies blind spots on the route 519. The smart space security service 502 may send Coverage Support Requests 520 to autonomous vehicles to drive through an area or park in a specified location and direction to perform surveillance and tracking. Also, an autonomous vehicle may be requested to drive to a specific location and direction to perform such surveillance and tracking. For a connected vehicle (which may be a manually-driven vehicle), a smart space security service 502 may send a Coverage Support Request 520 to a connected vehicle to get additional surveillance and tracking data. For one embodiment, a smart space 502 may send route information to a manually-driven vehicle to perform surveillance in areas only if the manually-driven vehicle is parked in the area or driven in the area (e.g., entering or leaving the area). Manually-driven vehicles may also record sensor readings while driving in an area and, if requested, provide a recording of an area if recorded recently.
[0085] For one embodiment, a smart space may send information to an autonomous vehicle (AV) to cause the AV to provide sensor coverage for a blind spot in a route. For one embodiment, a smart space may determine which portions of a route lack sensor coverage and determine a ranked list from largest to smallest based on size of those sensor gaps in the route. A smart space may send information to an AV to move to an optimal location to provide a maximum coverage area of a sensor gap. For a sensor that measures objects with a radially-transmitted signal, an optimal location for an AV may be in the center of a sensor gap. If, for example, two AVs with sensors that have equal-sized coverage areas are directed to cover a sensor gap, the AVs may be directed to locations such that overlap of the two sensor coverage areas is minimized while providing sensor data that covers a maximal amount of the sensor gap. An optimal location for an AV based on this maximal coverage area calculation is communicated to an affected AV.
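One possible reading of the gap-ranking and placement logic above, as a sketch: sensor gaps are modeled one-dimensionally along the route (an assumed simplification), a single radial-sensor AV is centered in a gap, and wider gaps get two AVs spaced so their coverage circles overlap as little as possible. Function and parameter names are hypothetical.

```python
def plan_coverage_support(gaps, sensor_radius):
    """Rank route sensor gaps largest-first and propose AV park positions.

    `gaps` is a list of (start, end) distances along the route that lack
    sensor coverage.  Returns a list of ((start, end), positions) tuples.
    A single AV is placed at the gap center; a gap wider than one AV's
    radial coverage gets two AVs, each centered over half the gap so the
    overlap of their coverage circles is minimized."""
    plans = []
    for start, end in sorted(gaps, key=lambda g: g[1] - g[0], reverse=True):
        length = end - start
        if length <= 2 * sensor_radius:
            positions = [(start + end) / 2]                    # one AV suffices
        else:
            positions = [start + length / 4, start + 3 * length / 4]
        plans.append(((start, end), positions))
    return plans
```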
[0086] In some embodiments, the exemplary personal security support process continues further as the user traverses the route. The process entails several steps as the user walks from the smart space to the destination. The smart space security service 502 begins security tracking. The security tracking may begin when the user exits the smart space and starts walking towards the destination. The smart space security service 502 may use various methods to track the user (e.g., security monitoring (camera) sensors and location updates periodically sent from the user's mobile device). As the user walks to the destination, the smart space security service 502 informs vehicles 503 on the walking route to start security tracking. The vehicles 503 use perception sensors and communication systems to track the user. The smart space security service 502 provides the user with the best potential walking route. The connected vehicle 503 estimates the user's direction of approach. When the user comes within range of a connected vehicle 503, the vehicle 503 starts sending to the smart space security service 502 the user identification and location information. The system updates the security tracking status. When a system activates security tracking from the smart space security service or a connected vehicle, the system periodically sends a status message to the user's mobile device. The user's mobile device receives tracking status information (e.g., where the user is in relation to the areas covered) and the ID of the vehicle supporting the security tracking. When a user passes a connected vehicle 503 used to support user tracking, the smart space security service 502 instructs the vehicle 503 to stop security scanning and tracking.
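The tracking handoff between vehicles along the route could be simulated along these lines. Vehicle IDs, positions, and the communication range are illustrative; a real system would combine perception sensors with short-range communication rather than simple distance checks.

```python
import math

def tracking_sources(user_positions, vehicles, comm_range=30.0):
    """For each reported user position, list which vehicles are within
    short-range communication distance and thus able to track the user.
    `vehicles` maps a vehicle ID to its (x, y) parking position."""
    log = []
    for position in user_positions:
        in_range = sorted(vid for vid, vpos in vehicles.items()
                          if math.dist(position, vpos) <= comm_range)
        log.append((position, in_range))
    return log
```

In the example below, tracking hands off from one vehicle to the next as the user walks, with an uncovered stretch in between, which corresponds to a blind spot in the terminology used above.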
[0087] The exemplary personal security support process continues as the user reaches the destination. The PSSM application sends a message to the smart space security service when the user arrives at the destination. The smart space security service may then stop the security scanning and tracking of the user.
[0088] FIG. 5 illustrates an exemplary set of messages that may be exchanged when a user indicates a desire to walk from the smart space to a destination. The user requests information on a safe route from the smart space to a destination. The user executes the PSSM application on the primary device (capable of short-range communication, such as V2P or Wi-Fi) and enters the destination. The smart space security service 502 calculates one or more walking paths between the smart space and the destination and begins evaluating the possible routes. The smart space security service 502 issues a security scan request to available local connected vehicles 503. These vehicles respond with data from multiple sources, such as vehicle cameras, LIDAR, and thermal sensors. The returned report identifies security risks (e.g., obstacles, a dark path, and any alarming alerts), the person ID of any identified person on the route (which may be identified using, for example, P2V or RFID communications), and available data on unidentified people (e.g., location and image of the person). The smart space security service 502 and one or more connected vehicles 503 repeat this process on a continual basis until the user reaches the destination or is out of range of the connected vehicles. The smart space security service 502 looks up person IDs in a smart space database to match detected people with known identities and to flag unidentified people as an alert.
[0089] For some embodiments, a database may be used to store information about unidentified people and vehicles. For some embodiments, a safety metric is calculated based on whether information about unidentified people and vehicles matches an entry in a database of known identities. For some embodiments, a safety metric is decreased if information about unidentified people or vehicles matches an entry in a database of unidentified people and vehicles.
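A sketch of how the two database checks described above might adjust a route's safety metric; the penalty weights and the function signature are assumptions made for illustration, not values from the application.

```python
def adjust_safety_metric(base_metric, observed_ids, known_ids, flagged_ids,
                         unknown_penalty=5.0, flagged_penalty=15.0):
    """Lower the safety metric for each observed person or vehicle that is
    not in the known-identity database; lower it further if the observation
    matches the database of previously flagged unidentified people and
    vehicles.  Penalty weights are illustrative."""
    metric = base_metric
    for obs_id in observed_ids:
        if obs_id in known_ids:
            continue                      # recognized: no penalty
        if obs_id in flagged_ids:
            metric -= flagged_penalty     # previously flagged entity
        else:
            metric -= unknown_penalty     # first-time unknown
    return max(metric, 0.0)
```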
[0090] The smart space security service 502 updates the security profile and sends the user a report about the route. This report may include, for example, alerts and risk assessments of the possible routes. The user selects a route, and the PSSM communicates to the smart space security service 502 the particular choice, along with the user ID. If the smart space security service 502 detects a blind spot along or near the route, then the smart space security service 502 may instruct a nearby connected vehicle 503 (if available) to drive towards the blind spot to increase surveillance coverage or to perform a security scan.

[0091] Similarly, FIG. 6 illustrates an exemplary embodiment 600 for a user to traverse a route.
The user starts walking the route from a smart space building to a destination 604. The user's device 601 sends periodic location updates 606 to the smart space security service 602. The smart space security service 602 also may track a user's progress 607 via sensors (e.g., beacons and cameras) and determine if a user starts traversing a different route. The smart space security service 602 instructs 608 vehicles 603 along the route to begin security scans 609. An autonomous vehicle may be requested to drive through an area or park at a specified location and direction to do surveillance and tracking, or may be requested to drive to a specified location and direction to do the surveillance and tracking 605. For a connected vehicle (which may be a manually-driven vehicle), a smart space security service 602 may send a request to a connected vehicle to get additional surveillance and tracking data. For one embodiment, a smart space 602 may send route information to a manually-driven vehicle to perform surveillance in areas only if the manually-driven vehicle is parked in the area or driven in the area (e.g., entering or leaving the area). Manually-driven vehicles may also record sensor readings while driving in an area and, if requested, provide a recording of an area if recorded recently. For one embodiment, a plurality of routes that extend from a starting location to a destination location may be calculated and communicated to a user device 601. For some embodiments, the starting location may be a midpoint location along a route or a location not associated with a route. Security information, such as security metrics and alerts, may be communicated to a user device 601. A user may select one of the plurality of routes, and a user device 601 associated with the user may send a message to a smart space 602 to indicate the user's selected route.
[0092] The smart space security service's requests 608 may include the user ID, route identification, and route direction. These vehicles 603 respond to the smart space security service 602 with the user ID and location upon the vehicle's detection of the user 610. The smart space security service may send a Security Tracking Status 611 to a user's device 601 to provide security status information to the user. The PSSM notifies the user 615 about the vehicle's detection of the user. If the user passes by a vehicle, the smart space security service 602 may instruct 612 the vehicle 603 to stop scanning 613. A connected vehicle 603 sends a User Passed Response 614 to the smart space security service 602 if scanning is stopped. If the user arrives at and enters the destination 618, the user may stop the security assistance by sending the user ID 616 to the smart space security service 602, which stops tracking 617. The user may receive a report detailing any detected people located at or near the destination.
[0093] While the user is walking the route, several optional steps may be performed. During the walk, if a security risk changes, the user receives (e.g., via a beep or haptic alert) an alert (such as a message warning the user of the presence of an unidentified person). The user may change routes as a result. For one embodiment, RFID tags or keys, usual personnel movement patterns (e.g., from an exit to a parking location), mobile device tracking and identification, and camera-based identification of people (e.g., facial recognition) represent a few exemplary options that a smart space 602 and/or a connected vehicle 603 may use for detecting and identifying people in an area.
[0094] At any point along the route, the user may request extra safety or surveillance actions or a vehicle pickup (which may include sending to the smart space security service the user ID and exact location). Autonomous vehicles may provide security scanning while driving. A smart space security service may instruct an autonomous vehicle to scan a specific area, which may include locations smart space sensors fail to reach. A smart space security service may also instruct a vehicle to travel to a particular location and direction to park for pick-up or drop-off to optimize sensor coverage.
[0095] In some embodiments, a security service as described herein may be used by a driver of a vehicle arriving at a smart space. The security service may recommend or select an available parking space for the arriving user based at least in part on the availability of a safe walking route from the parking space to the smart space. In the case of an autonomous vehicle, the smart space security service may instruct the vehicle to navigate to the selected parking space. In some embodiments, the parking space may be selected at least in part based on weather conditions, e.g., favoring a parking space that is nearer to a covered route when it is raining. The security service may select the safest available parking place by analyzing smart space security monitoring coverage and the availability of connected vehicles (which provide additional security scanning capability). A walking route calculation may take into account routes with shelter for bad weather. The smart space security service may use weather parameter sensors to record the temperature or the presence of rain, snow, hail, clouds, fog, humidity, wind, and the like.
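The weather-aware parking selection described above could be sketched as follows; the scoring weights and the dictionary keys are invented for illustration and are not specified in the application.

```python
def choose_parking_space(spaces, raining=False, rain_weight=20.0):
    """Pick the parking space whose walking route scores best.

    `spaces` is a list of dicts with assumed keys: 'id',
    'route_safety' (a 0-100 safety metric for the walking route) and
    'route_covered' (whether the route is sheltered).  When it is
    raining, sheltered routes receive a bonus."""
    def score(space):
        bonus = rain_weight if raining and space["route_covered"] else 0.0
        return space["route_safety"] + bonus
    return max(spaces, key=score)["id"]
```

In dry weather the space with the safest route wins outright; in rain, a slightly less safe but sheltered route can be preferred, matching the behavior described in the paragraph.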
[0096] Also, the user's device may request additional security support when the user enters or exits the parking lot or parking location of the vehicle.
[0097] FIG. 5 illustrates an exemplary messaging process for scouting one or more routes before the user begins to traverse a route. FIG. 6 illustrates an exemplary messaging process when the user traverses a route. The messages shown in FIGs. 5 and 6 may be used in a messaging protocol used to support exemplary embodiments described in this specification. Each message is described below. It should be understood that embodiments need not be limited to the use of the specific messages and fields of messages described below, and the messages described below may not be employed in all embodiments. It should further be understood that additional messages such as acknowledgements and the like may be employed in an exemplary protocol but are not described here in detail.
[0098] A Smart Space Exit Walking Route Request may be sent by a user's device to start the route planning process by requesting a walking route. The User Information field may include at least the user ID. The Destination field contains the destination address (or coordinates) for the walking route. If the user is walking to his or her car, then this field reconfirms to the smart space security service the location of the vehicle.
[0099] The smart space security service sends a Start Security Scan Request to a connected vehicle to begin a security scan. The vehicle and the smart space security service may use short- range communication systems and a smart space database to identify detected people. The V2P communication may send Collected Person ID messages that contain a detected person's location, speed, direction, person ID (if known), and the vehicle ID of the vehicle that detected the person.
[0100] A connected vehicle may send a Start Security Scan Response to a smart space security service in response to a Start Security Scan Request message. This message includes fields for the area, the communication coverage, the person ID(s), and the alert ID. The Area field provides a description of the area covered (e.g., via corner coordinates) by the vehicle security scan. If the vehicle is stationary, then the area is the field-of-view of the vehicle sensors. If the vehicle is moving, then the area is the entire area scanned while driving. The Communication Coverage field provides an estimate of the area covered by short-range communication (e.g., V2P or Wi-Fi communication). The system uses such an estimate for tracking a pedestrian. The Person ID field lists person ID(s) and locations of anyone identified by the vehicle. The Alert field contains a list of potential security or safety concerns detected. For example, this field may contain unidentified persons, their locations, and associated images (which may be taken by the vehicle camera). The Alert field may also contain descriptions of obstacles, an indication of darkness, and a description of other potential risks (such as high speed vehicles (e.g., an electric bike) on the sidewalk or walking area).
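To make the field layout concrete, the Start Security Scan Response described above could be represented as in the following sketch. This is an illustration only; the class name, types, and sample coordinates are assumptions, not a wire format defined by this disclosure.

```python
from dataclasses import dataclass, field
from typing import List, Tuple

Coordinate = Tuple[float, float]  # (latitude, longitude)

@dataclass
class StartSecurityScanResponse:
    """Illustrative encoding of the Start Security Scan Response fields."""
    area: List[Coordinate]                    # corner coordinates of the scanned area
    communication_coverage: List[Coordinate]  # estimated short-range communication area
    person_ids: List[Tuple[str, Coordinate]]  # (person ID, location) pairs
    alerts: List[str] = field(default_factory=list)  # detected security/safety concerns

resp = StartSecurityScanResponse(
    area=[(60.100, 24.900), (60.100, 24.910), (60.110, 24.910), (60.110, 24.900)],
    communication_coverage=[(60.098, 24.898), (60.112, 24.912)],
    person_ids=[("person-17", (60.105, 24.905))],
    alerts=["unidentified person near (60.108, 24.907)"],
)
```

A moving vehicle would report the entire scanned area in `area`, per the description above, rather than a single field of view.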
[0101] A Security Report message combines the content of the Start Security Scan Response message with other sources of information to report data from the smart space security service to the user's device. The security report includes fields for security data, routes, alerts, and additional information. The Security Data field lists at least four categories of areas: areas covered by security monitoring, areas covered by security tracking, areas not covered (blind spots), and blind spots which may be covered by vehicle scans. Each of these areas may be reported as a list of corner coordinates. The Routes field provides a description of suggested walking route(s) and may be reported as a list of waypoint coordinates. This field also reports the percentage of each route covered with security monitoring and tracking devices. The Alert field reports security alerts
(including data and media) and provides coordinates to pinpoint an accurate location. The Additional Information field includes the number, identity, and location of identified persons (e.g., personnel) currently present on the route as well as the number, identity, location, and next task (e.g., handle a security alert) of security personnel on the route.
[0102] A user's device may send a Route Selection message to the smart space security service to communicate the user's selected route. The message includes the user ID and the selected route's ID.
[0103] The smart space security service may send a Coverage Support Request to a connected vehicle to scan areas. This message includes area, location, and direction fields. The Area field lists the area through which the smart space security service requests the vehicle to drive and to perform a security scan. This field may be reported, for example, as corner coordinates. The Location field lists the location of the desired parking space (e.g., as coordinates). The Direction field contains the desired heading of the vehicle (e.g., as degrees from north).
[0104] A Location Update message may be sent by a user's device to the smart space security service to provide updates on the location of the user. The message includes the user ID and his or her location, which may be reported as coordinates.
[0105] A smart space security service may send a User Walking message to a connected vehicle that indicates that the user has started traversing the route. The message includes the user ID (the ID of the person to be tracked) and information identifying the route (the most likely route the user will traverse). The route may be reported as a list of waypoint coordinates.
[0106] A User Walking Response may be sent by a connected vehicle to a smart space security service when the user comes within range of the vehicle. The message includes a user ID and a current location of a user (which may be reported as coordinates).
[0107] The smart space security service may send a Security Tracking Status periodically to a user's device to provide a security status of a route. This message includes fields for tracking status and vehicle ID(s). The Tracking Status field details the status of the security monitoring (e.g., the user's location within the security monitoring coverage map). The field also provides the status of the security tracking, which includes the user's device short-range communication connection type. The Vehicle ID field lists the vehicle identifications (e.g., license plate numbers) currently supporting security tracking. [0108] The smart space security service may send a User Passed Connected Vehicle message to a connected vehicle when the user passes by the connected vehicle. The message contains a field for the user ID and a field for the connected vehicle's ID.
[0109] A connected vehicle may send a User Passed Connected Vehicle Response to the smart space security service in response to a User Passed Connected Vehicle message. This response message indicates that the connected vehicle will stop scanning for security issues related to the present walking route.
[0110] A user's device may send a User Reached Destination message to a smart space security service when a user reaches a destination. The message contains a field for the user ID.
[0111] FIG. 7 shows an exemplary system 700 for interfacing with a user via a user's device 706, 712. When the user is walking a route, a system may dynamically provide to the user through his or her personal device(s) the status of surveillance and security data. See an example status report on the device 706 shown on the left side of FIG. 7. The example status report contains two routes, which may be selected by route buttons 701. For this example, extra surveillance was brought along the route, as selected by a button 702. The user's device also displays a status 703 with percentages of the route that had surveillance 704 and security 705.
[0112] The middle of FIG. 7 shows a dynamic route security assessment 708 as seen on a smart watch 709, for example, along with tactile or auditory feedback 707. After a user arrives at the destination, a system generates a detailed report 710. The user may submit the report for system improvements or other purposes by pressing a button 711. See the right side of FIG. 7.
[0113] FIG. 8 shows an exemplary embodiment 800 for calculating a safety metric for a route. In this embodiment, the system evaluates the amount of coverage for security monitoring and user tracking and calculates a safety metric based on this assessment. The calculation of the safety metric may be influenced by several positive factors, the presence of which may increase the value of the safety metric. Positive factors may include length of the route covered with security monitoring (e.g., cameras), length of the route covered with tracking functionality (e.g., communication devices, cameras, and sensors), availability of recent security scans of blind spots (e.g., images captured by connected vehicles while driving), availability of additional connected vehicles to cover blind spots, the number of known persons on the route, and the number of security personnel on the route. The calculation of the safety metric may further be influenced by several negative factors, the presence of which may decrease the value of the safety metric. Negative factors may include the length of the route not covered (blind spots), the number of unidentified people on the route, the length of the route not illuminated by good lights when traveling at night, and the presence of possible alerts (e.g., potential security or safety concerns detected or other risks) on the route. Also, the walking distance may be treated as a neutral factor if the route is short, and the walking distance may be a negative factor if the route is long.
[0114] For one embodiment, the safety metric for each potential walking route is scored from 0 to 100, where 0 is completely unsafe and 100 is very safe. A route completely covered with security monitoring and tracking with no blind spots, unidentified people, security alerts, or dark sections would receive a safety metric of 100. The level of security monitoring and tracking provides the basis for the safety metric calculation. FIG. 8 demonstrates an example scenario for a safety metric calculation. In one embodiment, the safety metric is calculated as an average of the percentage of security monitoring coverage and the percentage of tracking coverage. For example, if the security monitoring coverage is 87.5% and the tracking coverage is 92.5%, the value of the safety metric is 90.
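The averaging step in the preceding paragraph can be sketched as a small function; the function name is a hypothetical illustration, not part of the disclosure.

```python
def base_safety_metric(monitoring_pct: float, tracking_pct: float) -> float:
    """Average of security monitoring and tracking coverage percentages."""
    return (monitoring_pct + tracking_pct) / 2.0

# Reproduces the worked example: 87.5% monitoring and 92.5% tracking -> 90
print(base_safety_metric(87.5, 92.5))  # 90.0
```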
[0115] The safety metric may be adjusted based on other available information. For example, if a route section contains unidentified people, no security personnel, and no identified people, then the safety metric is reduced. If one or more of the unidentified people are in a dark area, then the safety metric is reduced even more. For example, if a route is divided into ten sections, one embodiment reduces the safety metric by ten points for each section containing an unidentified person. If one or more of these unidentified people is in a dark area, then the safety metric is reduced by twenty points. Additionally, each alert (e.g., a potential security or safety concern or other detected risk) reduces the safety metric by twenty points. The initial safety metric minus the adjustments is calculated for each potential route and presented to the user as the final safety metric. The safety metric may factor in risk assessments made for other users. The calculation of the safety metric may take into consideration data obtained from external sources (e.g., crowdsourced data or big data methods).
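The adjustments described above might be applied as in the following sketch. The interpretation that a dark-area sighting deducts twenty points in place of the ten-point deduction for that section, and the clamping of the result to the 0-100 range, are assumptions made for illustration.

```python
def adjusted_safety_metric(base, sections, num_alerts):
    """Apply example deductions: -10 per section with an unidentified person,
    -20 instead if that person is in a dark area, and -20 per active alert."""
    score = base
    for has_unknown, unknown_in_dark in sections:
        if unknown_in_dark:
            score -= 20
        elif has_unknown:
            score -= 10
    score -= 20 * num_alerts
    return max(0.0, min(100.0, score))

# Base metric of 90 with one lit sighting, one dark sighting, and one alert:
print(adjusted_safety_metric(90.0, [(True, False), (True, True)], 1))  # 40.0
```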
[0116] FIG. 8 is a schematic plan view of a parking lot outside of a smart space in an exemplary embodiment. The user walks a route towards the destination. The route (or walking path 815) is broken into eight equal segments, delineated by points labeled with the letters A through I. Point A is the front door of the smart space 801, and point I is next to the target 809. Smart space security cameras 802, 803 cover the segments extending from point A through point C. The security camera ranges 804, 805 and the directions of each camera's viewing angle are known values. An autonomous vehicle 806, 807 is directed along a route 813 to a new space to provide additional coverage between points C and E based on the autonomous vehicle's 807 short-range communication region 810. The autonomous vehicle's 807 cameras and RADAR and LIDAR systems may be used to provide additional surveillance, tracking, and monitoring coverage. The range and direction of those systems are known values. A parked autonomous vehicle 808 provides coverage between points E and G based on the parked vehicle's 808 short-range communication region 811. The parked autonomous vehicle's 808 cameras and RADAR and LIDAR systems provide coverage. The RADAR and LIDAR system components are not shown in FIG. 8 due to size. For one embodiment, transceivers for RADAR and LIDAR systems may be mounted on the roof of a vehicle, but other embodiments may mount them in different locations. The range and direction of those systems are known values. A blind spot 814 exists between points G and H. The target vehicle 809 provides coverage between points H and I based on the target vehicle's 809 short-range communication region 812. The vehicle's cameras and RADAR and LIDAR systems provide coverage. The range and direction of those systems are known values.
[0117] Under FIG. 8's example, coverage estimates for the route are calculated based on known camera and system ranges. Between points A and C, the area is fully covered by smart space cameras, so that section receives a score of 2/8. Likewise, sections C to E and E to G each receive section scores of 2/8. Section H to I is fully covered by the target vehicle, so it receives a score of 1/8. The blind spot for section G to H receives a score of 0. Adding all the sections together produces a safety metric of 7/8 (= 2/8 + 2/8 + 2/8 + 0 + 1/8), or 87.5%.
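The per-segment scoring in FIG. 8's example amounts to the fraction of equal-length route segments that have sensor coverage, as in this sketch (the function name and boolean encoding are illustrative):

```python
def route_coverage_pct(segment_covered):
    """Percentage of equal-length route segments with sensor coverage."""
    return 100.0 * sum(segment_covered) / len(segment_covered)

# Segments A-B through H-I; only the G-H blind spot lacks coverage.
segments = [True, True, True, True, True, True, False, True]
print(route_coverage_pct(segments))  # 87.5
```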
[0118] In an exemplary use case in which some embodiments may be implemented, a user, Christina, studies at a university, and she frequently has evening badminton practices. After practice, she walks alone through the university campus to her apartment. She installs the PSSM application on her mobile phone. One evening, Christina leaves the university sports hall and plans a walking route back to her apartment. She launches the PSSM application and sets the destination. The application updates the latest security information from the smart space security service and provides three walking routes with security details from which to choose. The shortest route contains a security alert (several unidentified persons in a dark alley), so she skips this route. The second route lacks any security alerts, but security monitoring covers only 60% of the route while security tracking (communication) covers 80% of the route. The second route contains several blind spots which smart spaces or connected vehicles cannot currently cover. Therefore, Christina considers the longest (third) route, which provides security monitoring of 85% of the route and security tracking of 95% of the route. The third route contains two security blind spots close to the apartment's parking areas. A security report indicates several identified persons (e.g., students or personnel) walking in this area. The same report mentions an autonomous vehicle and a connected vehicle available to cover the blind spots and provide security monitoring and tracking. Christina accepts this third route due to its additional vehicle-based security. [0119] Christina leaves the sports hall and starts walking to her apartment via the longer route. The PSSM application on her mobile phone shows security tracking activity. The PSSM also indicates that the short-range communication and tracking link is connected to the sports hall smart space and that a connected vehicle is parked in the parking area.
While Christina walks, the application shows the security tracking status, and it beeps when she gets to a blind spot and loses security tracking. While in the blind spot, Christina speeds up her walking until a car comes around the corner. Her phone reports that security tracking connected again with this security surveillance- supporting car. Christina arrives at her apartment, and the PSSM application on her phone shuts down.
[0120] Consider a second use case. A user, Maria, leaves her office late at night. She launches the PSSM application on her smartphone to see the security situation. The PSSM application shows her a map of the walking route from her office building to her car in the personnel parking lot. Her smartphone PSSM application displays one active, potential security alert in the area. She opens the alert message and sees that one of the cars, which just entered the parking area, detected an unidentified person between two parked vehicles. The application also informs Maria that security personnel have been dispatched to identify the person. Maria decides to wait until security personnel assess the situation.
[0121] After a while, Maria checks the security status via the PSSM application and discovers that the situation returned to normal. Smart space and vehicle camera monitoring cover 85% of the suggested walking path to the car, while communication tracking covers 100% of the route. Maria walks to the parking garage door, and the PSSM application reports the location of her car and the recommended walking route to it. The application shows that security tracking is active and that the communication link connected to the car and to the smart space. When Maria arrives safely at the car, she starts to drive home. Maria notices that her car's PSSM application performs security scanning while she drives out of the parking lot. The car and cell phone PSSM applications shut down when Maria's car leaves the parking area.
[0122] In a third use case, a user, Liz, leaves her office late at night. She starts the PSSM application on her smartphone. Liz instructs her autonomous car to pick her up at the office building's rear exit. Her smartphone application indicates that the car will be at the exit in 5 minutes. Together, the smartphone and the smart space security service perform personal security and safety monitoring as the autonomous vehicle approaches. After 3 minutes, the smartphone application beeps when it receives a security alert. Liz opens the alert message and sees that the car detected an unidentified person between parked vehicles. The application shows an image of the situation. The PSSM also shows an indoor map of the areas covered during the security scan. Liz decides to take another exit and summons the car to the front door of the office. Liz walks to the front door, and the PSSM application reports to Liz where the car is waiting. The application reports that walking path surveillance covered the complete route. The PSSM also indicates that personal security and safety tracking is active and that the communication link connected to the car and to the office smart space. When Liz steps out of the building, the PSSM application on her mobile device switches to security tracking mode. When Liz reaches the car, she instructs the car to drive her home.
Planning an Exercise Route Using Dynamic Route Conditions
[0123] Some embodiments provide systems and methods for a user to plan an exercise route with dynamic route conditions. FIG. 9 shows an example screen shot 900 of an exercise route 902 displayed on a map 904 within an application running on a user's device. An example exercise route 902 is shown as a set of thick, bold lines, while the other streets are shown as thin lines. For this example, a user 906 is currently at a hotel on the corner of North Avenue and First Street. Using systems and methods described in more detail below, an exercise route 902 is created that starts at the user's current location 906 of North and First. The example route goes south down First Street to Tenth Avenue and continues to follow a T-shaped route through a portion of a city. The route returns to the starting location with a portion of the route along North Avenue.
[0124] FIG. 10 is a functional block diagram of an exemplary embodiment 1000 of a system for communicating route planning data to a user device 1002 (such as a bike computer 1012, wrist device 1014, or primary device 1016). The device level 1040 includes physical devices, such as vehicle sensors 1068, user devices 1002, vehicles 1054, and other physical hardware. These items include both sensors that measure a route's conditions and devices that interface directly with the user. The device level 1040 also includes other physical devices, such as base stations, building cameras, and other physical items used to survey potential routes. Device level 1040 items may communicate by wide area communication networks 1042 and V2P and/or Bluetooth networks 1044. The service level 1030 items may include user device 1002 components of a route planning application 1004, such as route preference selection and control 1006, vehicle interaction 1008, and user tracking services 1010. The service level 1030 items may also include route planning service 1022 components of a route planning application 1004, such as a route planner 1026, map data and map interface 1028, and a vehicle command module 1032. The service level 1030 items may also include route planning service 1022 components, such as sensor information sharing service 1034, sensor information processing service 1036, and external APIs 1038. External services 1046 may include service level 1030 components such as a weather service 1048, a map data service 1050, and traffic information 1052. The service level 1030 items may also include route planning application 1004 vehicle 1054 components, such as a route reporting module 1058, a driving route module 1060, a vehicle provider 1062, a sensor data interface 1064, and a sensor access module 1066. For the information level 1020, a route planning service 1022 may include user profiles, augmented map data, and route history 1024. 
The information level 1020 may also include a vehicle 1054 component of map vehicle data 1056. The components as shown in FIG.
10 offer one example configuration. The information level 1020 elements relate to profile-type data, such as user profiles, vehicle profiles, general map data, route history, and augmented map data, for example.
[0125] FIG. 11 shows a sample user interface (UI) 1100 within a vehicle for visualizing a planned (and accepted) route and for providing control for dynamic route planning. The UI 1100 may be shared with other stakeholders (e.g., a coach or personal trainer). The picture shows two optional routes for selection by the user, shown with a glow (which may be, e.g., a red glow) and one solid line 1111 or two solid lines 1107 and an exclamation point in a route description 1120. A route may be planned or set up using static data before exercising or traversing the route. Data may have already been received via autonomous vehicles that were on the scouting route. The user interface (UI) 1100 displays the details of the route with specific notation of segment details and alternatives.
[0126] For the example of FIG. 11, a route starts with the word "Start" 1101. The first route segment 1102 is asphalt with a headwind. The second route segment 1103 is gravel and hot but roofed. The second segment ends with a pick-up/re-plan point 1104. The user is picked up by an AV for the third route segment 1105 due to the segment being sunny, crowded, and with stops. The segment 1105 ends at a point marked "Get off" 1106. The user has a choice of two routes. The first route includes a fast tailwind route segment 1107. The second route includes one segment 1111 with lane changes and a second segment 1112 with potholes that is traveled by car from a pick-up/re-plan point 1113 to a get-off point 1108. The two routes join back together for a route segment 1109 through a cold tunnel that ends at a point marked "Stop" 1110.
[0127] A screen of the user interface displays the proposed route, potential routing issues, and relevant route data (such as weather and road conditions, as well as route type, elevation data, and other desired information). The top of the screen 1117 shows the start point ("Hotel") and end point ("Office") of the route. Different colors may be used along with a key to convey various information. For example, one segment of the route may be colored red, and a key may indicate that the red segment is paved with asphalt and is experiencing a headwind. Another segment of the route may be colored yellow, and the key may indicate that the yellow segment is roofed, is hot, and is covered with gravel. Another segment of the route may be colored green, and the key may indicate that the green segment is experiencing a tailwind and requires lane changes or crossing the street from one sidewalk to another one on the opposite side. Another segment of the route may be colored blue, and the key may indicate that the blue segment is experiencing a tailwind and is a very fast segment. Dashed lines may be used to show sections of the route where the vehicle transports the user from one location to another location. Short dashed lines are portions traversed by vehicle due to potholes, while long dashed lines are portions traversed by vehicle due to crowds or other obstructions. A series of solid black circles show asphalt route sections that also have a headwind. An alternating series of black circles and dashes show gravel route sections that also have a roof over the route. The map also labels pick-up and drop-off points along the route. The user interface allows the user to turn on or off the options of vehicle pick-up 1115 and dynamic re-plan 1116. In the upper right corner, the user has the option to accept 1118 the route as shown or to re-plan 1119 any portion of it.
[0128] After a vehicle finishes the route check, the system provides a user interface for planning a route, visualizing a route's details, setting controls for a route, and accepting the route. Alternatively, a system may perform the route check dynamically. Such a method skips the initial pre-check of the entire route but moves the vehicle in front of the user by a preset distance and reports conditions as encountered. For example, the vehicle travels a block or two in front of the user (depending on the user's speed) and updates the user on upcoming conditions.
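For the piecewise (dynamic) check, the lead distance the scouting vehicle keeps ahead of the user could be derived from the user's speed, as in the sketch below. The base lead of 150 m and the 60-second look-ahead are illustrative assumptions; the text specifies only that the distance ahead depends on the user's speed.

```python
def scout_lead_distance(user_speed_mps, base_lead_m=150.0, look_ahead_s=60.0):
    """Distance (meters) the scout vehicle stays ahead of the user:
    roughly the ground covered in look_ahead_s, with a minimum floor."""
    return max(base_lead_m, user_speed_mps * look_ahead_s)

print(scout_lead_distance(3.0))  # 180.0 (a 3 m/s runner)
print(scout_lead_distance(1.0))  # 150.0 (slow walker: the floor applies)
```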
[0129] The user interface may show the overall view of the planned route with the ability to update the start, stop, pick-up, or break locations. Changes to the route trigger re-planning or dynamic planning. The user interface illustrates main sections of the route and labels road and weather conditions, road width, the presence of traffic and parked cars, and the presence of tunnels, roofed areas, and open areas. The Route Planning Service may provide additional route sections if alternates meet the user's set criteria. The user interface allows the user to modify the route to swap in an alternative route if one exists.
[0130] FIG. 12 shows a series of exemplary notifications that may be provided to a user. The system provides notifications to the user via a personal device, such as a smartwatch 1201, 1202, 1203, 1204, 1205, 1206, 1207, 1208 or a cell phone. Notifications may include, for example, "change to other side of the road" 1209, "increase (or decrease) your speed" 1213, "immediate request for car sent" 1215, "pick-up due to safety/bad route (accept/decline)" 1210, 1211, "drink/snack/break (accept/decline)" 1214 or warnings of loose gravel 1216 / potholes 1212 down the road. The accept/decline feature allows the user to agree with the system's suggestion by accepting the message or to reject the suggestion by declining the message. The user may call the car or make alterations to the route. The user interface may utilize visual and tactile interaction modalities 1217, 1218, 1219, 1220, 1221, 1222, 1223, 1224 for notifying the user. Button presses or voice response commands received from the user notify the system of the user's awareness of the notification. The application, for example, defaults to triggering the car to come to the user with a simple button press of the user's smartwatch unless the user's preferences state otherwise.
[0131] In an exemplary embodiment, sensor information collected by oncoming cars is collected, processed, and used to guide the route of a user. One such embodiment is illustrated in FIG. 13. The method of FIG. 13 may be implemented in a setting in which an appropriate version of the Route Planning Application is installed on the user's device, the nearby cars have a sensor access module installed, and sensors are available (e.g., rear-view, side-view, and front-view cameras).
[0132] In this exemplary embodiment 1300, a system starts by collecting route data. This process includes the steps of activating collection of route data, querying the RPS for sensor data collection parameters, and beginning collection of route data.
[0133] Activating collection of route data begins after the user 1302 starts 1320 the route planning application 1306 and activates vehicles nearby (e.g., in a range of 3 miles) that have a sensor access module to collect route data by using vehicle sensors. The route planning application 1306 on the user's device 1304 sends a Road/Street Data Collection Request 1322 to the route planning application service 1310, which then delivers to nearby vehicles 1312 the request 1324 for the sensor access modules. For some embodiments, the route planning service 1310 is based in the cloud 1308.
[0134] To start querying the route planning service 1310 for sensor data collection parameters, the sensor access module 1314 sends a Sensor Data Collection Parameters Request 1326 to the route planning service 1310. The route planning service 1310 sends a Sensor Data Collection Parameters Response 1328 that replies with the type of sensor data sought. For example, the user 1302 may be interested in events that may be identified from video streams.
[0135] To begin continuous collection of route data 1330, nearby cars that have a sensor access module 1314 in continuous sensor data delivery mode may continually deliver Sensor Information Notifications 1332 to a sensor information sharing service 1316. The delivery of sensor information may depend on sensor data collection parameters queried in a Sensor Data Collection Parameters Request 1326.
[0136] To deliver to the user an oncoming vehicle's data key, the vehicle's sensor access module sends a vehicle data key notification 1334 to the user's route planning application 1306. The user's route planning application 1306 may send the vehicle data key notification 1336 to the sensor data processing service 1318.
[0137] Utilizing the shared sensor data includes the steps of fetching sensor data, analyzing sensor data, enhancing route data, and notifying the user. When fetching sensor data, the sensor data processing service uses the sensor data key and sends a Sensor Information Request 1338 to the sensor information sharing service 1316. The sensor information sharing service 1316 responds with the collected vehicle sensor data in a Sensor Information Response 1340. In analyzing sensor data 1342, the sensor data processing service 1318 analyzes the sensor information and extracts street/road events (e.g., road, bike lane, or sidewalk blocked, traffic jam, and traffic accident) that may cause changes to the user's route. To enhance the route plan, the sensor data processing service 1318 sends a Street/Road Event Notification 1344 to the route planning application 1306. The application 1306 uses the identified street/road events to update the route plan. To notify the user of the new sensor data, the route planning application 1306 sends a Route Update Notification message 1346 to the user 1302 (which may occur by displaying the message on a user interface). The user's device 1304 notifies the user 1302 about the recognized street/road events and shows an updated route for the user.
[0138] In some embodiments, oncoming vehicle sensor data may be reported to a user using the following steps. The user executes a route planning application on his or her device. Vehicles providing sensor data to this service are registered with the application (which occurs in the RPS cloud backend). The user requests sensor data regarding the road/street. The application (or associated RPS cloud service(s)) distributes this request to vehicles. Vehicles query the service about the parameter set and what sort of sensor data (for instance, front or back camera data) the user needs. Vehicles now stay in a constant data collection mode. Vehicles provide the sensor data to the application. The application aggregates all the data from all vehicles in the processing engine and filters it down to street/road events that may cause changes to the user's route. Vehicles provide a key to enable the user to access the sensor data. The user sends the key to the application to access the specific vehicle information. The application then provides sensor information for the specific route to the user's device. Based on this data, the user's route may be updated, and the user's device is notified of the change.
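The aggregation-and-filtering step in the sequence above might look like the following sketch, where reports from all vehicles are merged and reduced to the events on the user's route. The function name, the (event, segment) data shape, and the sample street names are assumptions for illustration.

```python
def filter_route_events(vehicle_reports, route_segments):
    """Merge per-vehicle (event, segment) reports, keeping only events
    that fall on a segment of the user's planned route."""
    events = []
    for report in vehicle_reports:
        for event, segment in report:
            if segment in route_segments:
                events.append((event, segment))
    return events

reports = [
    [("traffic jam", "First St"), ("traffic accident", "Oak Ave")],
    [("sidewalk blocked", "Tenth Ave")],
]
print(filter_route_events(reports, {"First St", "Tenth Ave"}))
# [('traffic jam', 'First St'), ('sidewalk blocked', 'Tenth Ave')]
```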
[0139] FIG. 13's messaging protocol offers an example of how to retrieve vehicle sensor data from oncoming vehicles and report it to the user. The protocol may include the following messages and data fields. A Road Data Collection Request may include a user's location. For a Sensor Data Collection Parameters Request, the sensor capabilities field may list desired sensor data. A Sensor Data Collection Parameters Response may list the sensor configuration parameters (e.g., how many video frames should be recorded per second) and which sensors will provide sensor data. For a Sensor Information Notification, a sensor information field may list information provided by a vehicle's sensors. For a Vehicle Data Key Notification, the vehicle data key field may enable access to a vehicle's sensor data. A Sensor Information Request may use a vehicle data key field. For a Sensor Information Response, the sensor information may be the data provided by a vehicle's sensor. For a Street/Road Event Notification, the road events field may list identified events present on the road. The locations for road events field lists the locations for the identified road events. A Route Update Notification provides details of a route update. The updated route field lists the route that has been produced based on the latest sensor data. The road items/events field lists road events that have been identified, such as: trash, people loitering, sidewalk condition, pets, cardboard boxes, roofed sidewalks, micro weather, and the width of sidewalks. The locations for road events field lists locations for the identified road events. The message also includes the number of street crossings on the route and optionally how many street crossings contain traffic lights.
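As one concrete, purely illustrative reading of the Route Update Notification fields listed above (the key names and sample values are assumptions, not a defined schema):

```python
route_update_notification = {
    "updated_route": [(60.100, 24.900), (60.105, 24.905), (60.110, 24.910)],  # waypoints
    "road_events": ["loose gravel", "pothole"],               # identified road items/events
    "event_locations": [(60.103, 24.903), (60.108, 24.908)],  # one location per event
    "street_crossings": 3,                                    # crossings on the route
    "crossings_with_traffic_lights": 2,                       # optional field
}

# Sanity check: each reported event has a corresponding location.
assert len(route_update_notification["road_events"]) == \
       len(route_update_notification["event_locations"])
```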
[0140] The process for route selection may be based on static and dynamic information about the user's preferences and actual information on road conditions and other observations along the route, which may be obtained via vehicle sensors or other available sensors. An exemplary route selection process, illustrated in FIGs. 14 and 15, is described below.
[0141] Prior to starting the process, the appropriate version of the route planning application is installed on the user's device 1402. The vehicle is coupled with the application on both the user's personal device and the Route Planning Service (RPS) 1404 cloud service. The user should authorize the RPS 1404 to use the vehicle and its sensors to detect dynamic conditions along potential routes during route selection.
[0142] To select preferences for the route search, the user enters dynamic preferences for route selection on his primary device. The preferences may include, e.g., the maximum duration of the route for a given method of transportation (such as cycling), the maximum allowed distance or duration of the route, the maximum allowed distance to the starting point or another location along the route, or the minimum lighting level along the route. A more comprehensive list of user-provided parameters is provided below. Also, the user may set whether the car should check the entire route beforehand or whether the vehicle should make piecewise checks in front of the user. For piecewise checks, the user may set the maximum distance the vehicle is allowed to be in front of the user.
[0143] FIG. 14 shows one embodiment 1400 of a route selection process. To search for a route, the user activates the route search on his or her primary device and the user's device 1402 sends a Route Planning Request 1410 to a route planning service 1404. The RPS 1404 uses a selection of inputs to generate potential routes for the user, including: (1) dynamic parameters 1408 given by the user, including user location; (2) previously recorded user data and preferences such as fitness level and workout history; (3) known routes near the user's location, including map data for the routes, such as road inclination and previously gathered data on the routes, such as road conditions, the presence and condition of sidewalks, and the presence of light posts; and (4) past and forthcoming weather information and any weather related historical information. After finishing the initial route search, the service generates 1412 a list of candidate routes.
[0144] To perform dynamic route scouting, the RPS 1404 instructs a vehicle to scout the route candidates. For one embodiment, a Vehicle Use Permission Request 1414 is sent by the RPS 1404 to a user's device 1402, and a Vehicle Use Permission Response 1416 is sent back granting permission to use a vehicle. The RPS 1404 sends a Vehicle Use Permission Request 1418 to a route planning application 1406, which sends back a Vehicle Use Permission Response 1422 based on a check of the grant, a determination of availability, and a check of vehicle fuel and battery status 1420. The RPS 1404 also may send a Sensor Capability Request 1424 to the route planning application 1406 and receive a Sensor Capability Response 1426 in return. The RPS 1404 determines reporting requirements 1428 and sends Section Scouting Requests 1430.
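The permission and capability checks described above can be sketched as two small functions. This is an illustrative reading of the exchange: the field names, battery threshold, and support levels are assumptions, not part of the disclosure.

```python
# Sketch of the checks behind the Vehicle Use Permission Response (1420)
# and the Sensor Capability Request/Response exchange (1424/1426).
# Thresholds and field names are illustrative assumptions.

def vehicle_use_permitted(grant, battery_pct, min_battery=20):
    """Mirror the 1420 checks: grant validity and fuel/battery status."""
    if not grant.get("granted"):
        return False, "not allowed"
    if battery_pct < min_battery:
        return False, "low battery"
    return True, "ok"

def capabilities_supported(required, available):
    """Answer a Sensor Capability Request with a support level per name."""
    return {name: available.get(name, "not supported") for name in required}

ok, reason = vehicle_use_permitted({"granted": True}, battery_pct=55)
support = capabilities_supported(
    ["sidewalk detection", "road slipperiness detection"],
    {"sidewalk detection": "fully supported"},
)
```

A vehicle with a valid grant and sufficient battery is usable; any requested capability the vehicle does not advertise is reported back as "not supported".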
[0145] One exemplary process includes the methods of sending a Section Scouting Request 1430 for the first candidate route, scouting each remaining candidate route or requesting to scout a new route, sending the candidate routes to the user for selection, selecting a candidate route by the user, issuing a start notification to the user, and updating the route map.
[0146] To send a Section Scouting Request 1430 for a route, the RPS 1404 requests the vehicle to observe a section of a candidate route. An exemplary request 1430 contains section coordinates, report content descriptions, and the scouting type. Section coordinates may be specified as GPS waypoints for the start and end points. The report 1434 content description contains factors to include in the route section report, such as: sidewalk information (presence or absence, condition such as potholes, and sidewalk width); road conditions (e.g., potholes, surface material, and roughness); lighting levels along the route, such as missing or broken lights or no lights at all; pedestrian information, such as the number of pedestrians, cyclists, or joggers detected along the route; road inclination, which may affect the safety of cycling and the level of exercise or calorie burning; and significant roadside elements, such as parked cars blocking the sidewalk or bike lane, or areas that may be used for protection from rain. The scouting type field specifies whether the vehicle should scout only the given/shortest route from section beginning to section end, or whether it should scout all possible routes shorter than a given threshold.

[0147] To scout a candidate route, the vehicle scouts 1432 the route section by driving along the section and observing the route according to the report content description. The vehicle may send a route section report 1434. The report 1434 may contain information on one or more alternative routes between given endpoints. The report may contain: section information, such as route coordinates; information specified in the report content description, as well as any hazards recognized by the vehicle, such as slippery spots, puddles of water on sidewalks, or loose gravel on the road after a steep decline; and potential blockages, such as cars parked on sidewalks.
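The request and report structures above can be sketched as data classes. The field names track the description in the text; the Python representation and sample coordinates are assumptions for illustration.

```python
# Sketch of a Section Scouting Request (1430) and a matching route
# section report (1434). Coordinates and event strings are illustrative.
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class SectionScoutingRequest:
    start: Tuple[float, float]                 # section start GPS waypoint
    end: Tuple[float, float]                   # section end GPS waypoint
    report_contents: List[str]                 # factors to include in report
    scouting_type: str = "shortest only"       # or "exhaustive"

@dataclass
class SectionScoutingReport:
    coordinates: List[Tuple[float, float]]     # scouted route waypoints
    hazards: List[str] = field(default_factory=list)
    blockages: List[str] = field(default_factory=list)

request = SectionScoutingRequest(
    start=(60.170, 24.940), end=(60.175, 24.950),
    report_contents=["sidewalk information", "lighting levels"],
)
report = SectionScoutingReport(
    coordinates=[request.start, (60.172, 24.945), request.end],
    hazards=["loose gravel after steep decline"],
    blockages=["car parked on sidewalk"],
)
```

The report echoes the section waypoints and adds the hazards and blockages the vehicle observed while driving the section.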
[0148] After scouting the first section, the vehicle may scout successive sections until done or request to scout another route if the first route is done. Based on the route section report(s), RPS 1404 may instruct the vehicle to scout the next section along the overall route, or if none of the reported routes for the section fulfill the criteria, request the vehicle to scout another route for the section.
[0149] After finishing scouting or performing enough scouting to enable the user to select a route, the vehicle sends 1438 candidate route alternatives to the user for selection. The user selects 1440 the preferred route section(s) from the given alternatives. The RPS 1404 updates map and route information 1442 based on the user's route selection. The RPS 1404 sends a Section Scouting Request 1444 to a route planning application 1406 and receives a Section Scouting Report 1446 in return. The service issues a start permission 1450 to the user after it has determined that enough of the route has been covered 1448 so that the rest of the route may be updated 1452 as the user moves along the route. The decision is based on, e.g., historical information on the user's speed, the route profile (road inclination / workout level), and weather conditions.
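One way to read the start-permission decision at 1448 is as a timing comparison: the start is permitted once the vehicle can finish scouting the remainder of the route before the user runs out of confirmed route. The function below is a sketch under that assumption; the speeds and distances are illustrative.

```python
# Sketch of the start-permission decision (1448/1450): permit the start
# once scouting the remainder is expected to finish before the user
# reaches the end of the already-scouted portion. Values are illustrative.

def may_start(scouted_km, remaining_km, user_kmh, scout_kmh):
    """True if scouting the rest finishes before the user exhausts the
    confirmed route (based on, e.g., historical user speed)."""
    if remaining_km <= 0:
        return True
    time_user_on_scouted = scouted_km / user_kmh   # hours of confirmed route
    time_to_scout_rest = remaining_km / scout_kmh
    return time_to_scout_rest <= time_user_on_scouted

# A cyclist at 18 km/h with 6 km confirmed; the car scouts at 30 km/h.
permitted = may_start(scouted_km=6, remaining_km=9, user_kmh=18, scout_kmh=30)
```

With 6 km confirmed, the user has 20 minutes of route while the car needs only 18 minutes to scout the remaining 9 km, so the start is permitted; with only 1 km confirmed it would not be.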
[0150] The service updates a map of the route on the user's device as more route sections are scouted 1454. The map contains information about the planned and confirmed sections of the exercise course, including static information such as road inclination, calorie burning rate, as well as timestamped, dynamic information, such as roadside lighting levels and any observations along the route (e.g., "wet cardboard detected 6 minutes ago").
[0151] FIG. 15 continues one embodiment 1500 of the process shown in FIG. 14. The user starts traversing the route 1508. The user's device 1502 may send a Start Indication 1510 to the route planning service 1504. The user may, for example, walk, run, cycle, skateboard, roller blade, or hoverboard along the given route. The vehicle keeps scouting for new sections and alternatives, and the route planning service 1504 may receive a Section Scouting Report 1516 from a route planning application 1506. New or updated information may be received or determined by the route planning service 1504 that is sent to a user's device 1502 in a Route Update 1512. As a user's location changes, Location Updates 1514 are sent from a user's device 1502 to the route planning service 1504. The route planning service 1504 may update a map and route information 1518 and send a Route Update 1520 to the user's device 1502.
[0152] While traversing the route, exceptions may occur or be detected and cause changes to communication between the vehicle, user and RPS 1504. Exceptions may include the following conditions, among others: the user changes the route, the user wants to meet the car at some point along the route, or the user wants to know the fastest route to some location.
[0153] If the user changes route, then the service may inform the user of the route deviation and ask if he wants a new route scouted and/or if he wants the car to follow close by if the route is not entirely known. If the user wants to meet the car at some point along the route, the meeting point may be pre-planned or ad-hoc, and the meeting point may be moving (such as where the car meets a moving user). The service may also automatically command the car to meet at a point along a new route, such as if the user changes route. If the user wants to know the fastest route to some location, then the car may precede the user by a short distance, be available for fast pickup if needed, and report any observations directly to the user's device. When the vehicle precedes the user in close proximity, the vehicle may communicate directly with the user's device.
[0154] An exemplary method for updating a route dynamically starts with the user providing his or her specific parameters (such as calorie burn) for a route to the route planning service (which may also use historic data, known routes, and weather forecasts, among other things), and the user authorizing the use of the vehicle for scouting. The route planning service then queries the vehicle's sensor capabilities to see if it satisfies the route safety reporting requirements for the specific section (which is reported using GPS coordinates). The vehicle sends a scouting report for the specific route segment (which includes the presence of sidewalks and potholes, the lighting levels, the inclination, and the presence of parked cars blocking the road). The user then receives several route choices for the desired starting and ending location. The user picks a choice for the route segment and responds to the route planning service. The service updates the route map instructions. Next, the application sends to the vehicle a section scouting request (which is communicated based on the user's coordinates). The vehicle reports back with the scouting report (which includes items such as hazards, blockages, and cyclist-jogger activity) based on observations for that section after driving along the section. Then, the route planning service sends the user updated vehicle sensor and scouting data for the confirmed and planned route sections. After the user starts traveling the route, the application may access extra scouting reports based on the user's location, speed, and direction. The user may also want to meet the vehicle at a particular location. This communication is done directly via Vehicle-to-Pedestrian (V2P) or Pedestrian-to-Vehicle (P2V) if the vehicle is in range, or via a cellular network otherwise. The user sends the meeting request to the RPS, and it locates the appropriate vehicle. The vehicle responds with its location and estimated time of arrival (ETA).
The user asks for the fastest route to a destination (such as to home or a hotel). RPS updates the route and calculates the best meeting spot to pick up the vehicle. RPS passes to the vehicle the meeting location and the calculated route. The application constantly updates the location of the vehicle and the user as they progress toward the meeting location.
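The "best meeting spot" calculation mentioned above can be sketched as minimizing the waiting time at candidate pickup points along the updated route. This is one plausible reading; the waypoint names, distances, and speeds below are assumptions for illustration.

```python
# Sketch of the RPS meeting-spot calculation: pick the waypoint where
# the user and the vehicle arrive closest together in time, so neither
# waits long. All values are illustrative.

def best_meeting_spot(waypoints, user_kmh, vehicle_kmh):
    """waypoints: list of (name, user_km, vehicle_km), i.e. distances
    from the current user and vehicle positions to each candidate."""
    def wait_time(wp):
        _, user_km, vehicle_km = wp
        return abs(user_km / user_kmh - vehicle_km / vehicle_kmh)
    return min(waypoints, key=wait_time)[0]

spots = [
    ("canal bridge", 1.0, 4.0),    # user 1 km away, vehicle 4 km away
    ("hotel corner", 2.0, 2.5),
]
meet = best_meeting_spot(spots, user_kmh=15, vehicle_kmh=40)
```

Here the canal bridge wins: the user needs 4 minutes and the vehicle 6, a 2-minute wait, versus a roughly 4-minute wait at the hotel corner. The RPS would then pass the chosen location and route to the vehicle, as described above.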
[0155] The messaging protocol of FIGs. 14 and 15 shows an example of how to update a route dynamically. The protocol comprises the following messages and data fields:
[0156] A user may send a Route Start Indication 1510 if beginning to traverse a route. It may include a location field that holds the user's current location.
[0157] A Route Planning Request may include information about the dynamic parameters for the desired route. This information may be used by the RPS in addition to static user profile parameters. The dynamic parameters may include, for example, a time parameter, a location parameter, a movement type parameter, a duration parameter, and a route parameter, among others. The time parameter may indicate a specific time for a planned departure along a route, or it may indicate, for example, that departure should take place as soon as possible. The location parameter may indicate the user's current location. The movement type parameter may indicate the user's desired means of traversing the route. For example, the movement type parameter may indicate that the user wishes to run, jog, walk, cycle, skate, or travel by car, van, or truck, among other alternatives. The duration is the desired (maximum) length of the route. Route criteria include specific requirements for the route. Examples of route criteria comprise the lighting level (e.g., "entirely lit" or "single light-post outages allowed"), route overlap, usage activity, and road criteria. Route overlap includes permission for a route to overlap on itself (e.g., expressed as a maximum allowed percentage of overlap, with 0% for no overlap). Usage activity includes a preference for routes with or without other people, and their activities (such as "50% of roads along the route must have cyclist activity", "prefer routes with high jogging activity", or "prefer routes with low pedestrian activity"). Road criteria include both cycling and walking/running criteria for a road, such as a road's surface material and its condition and the presence of sidewalks (e.g., "100% of route must have sidewalks").
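The route criteria above can be sketched as a screening function over route candidates. The dictionary keys and sample values below are assumptions chosen to match the quoted examples ("no overlap", "entirely lit", "100% of route must have sidewalks"), not fields defined by the disclosure.

```python
# Sketch: screen a candidate route against the dynamic route criteria
# from a Route Planning Request. Field names are illustrative.

def meets_criteria(candidate, criteria):
    if candidate["overlap_pct"] > criteria.get("max_overlap_pct", 0):
        return False                       # too much route overlap
    if criteria.get("lighting") == "entirely lit" and not candidate["fully_lit"]:
        return False                       # lighting requirement not met
    if candidate["sidewalk_pct"] < criteria.get("min_sidewalk_pct", 0):
        return False                       # sidewalk coverage too low
    return True

criteria = {"max_overlap_pct": 0,          # "no overlap"
            "lighting": "entirely lit",
            "min_sidewalk_pct": 100}       # "100% of route must have sidewalks"
good = {"overlap_pct": 0, "fully_lit": True, "sidewalk_pct": 100}
bad = {"overlap_pct": 10, "fully_lit": True, "sidewalk_pct": 100}
```

A candidate that overlaps itself is rejected outright under the 0% overlap criterion, while one that satisfies all three checks remains in the candidate list.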
[0158] The RPS may send a Vehicle Use Permission Request to request permission to use the user-specified vehicle for route scouting. The RPS may request the use of the vehicle to chart a larger area of the region than immediately needed for route selection. The vehicle use permission request may include a vehicle ID as an identifier for the vehicle to use. The request may include a permission type parameter. The permission type (e.g., "for route scouting use only" or "allow extended range scouting for service improvement purposes") may also contain an indication of how much extended scouting is allowed (e.g., 10% of route length).
[0159] A Vehicle Use Permission Response is a response to a Vehicle Use Permission Request. The response may include a vehicle identifier. The response may further include a permission grant parameter. The permission grant may contain a key parameter and a permission type parameter (e.g., "allow extended range scouting for service improvement purposes"). In case the car rejects the use, the permission type may also be "not allowed" with an optional explanation, such as "low battery."
[0160] A Sensor Capability Request may be used as a request containing the required sensor capabilities field that defines a list of capabilities whose support is requested, including names (e.g., "sidewalk detection" or "road slipperiness detection") for the required capabilities.
[0161] A Sensor Capability Response may be used as a response providing an indication of available support for requested capabilities. The response may include the capability name and the support level (e.g., "not supported", "partially supported", or "fully supported").
[0162] A Section Scouting Request may be used as a request to provide sensor reports for a given road section. A coordinates parameter may be provided to identify a scoutable section (e.g., as start and endpoint definitions as street and intersection names or GPS coordinates or, alternatively, as a starting area and ending area where the start and endpoints must be, and between which section candidates are scouted). A report contents field may be provided. The report contents field is a list of sensor events to report, such as "road slipperiness events", "road inclination", "lighting level", "runner activity". The scouting type lists instructions for the car for route scanning (e.g., "shortest only" or "exhaustive").
[0163] A Section Scouting Report message may be provided as a response to a Section Scouting Request, containing events for each sensor event type requested. The section scouting report may include a section coordinates parameter. The section coordinates may provide the start and endpoint and any waypoints along the scouted route, as street/intersection names or GPS coordinates. The report may also contain event list parameters. The event list contains event types (e.g., "lighting level"). The event report lists noteworthy conditions, such as "dead light post". Event coordinates identify a particular location, such as via GPS coordinates.
[0164] The route planning service may send a Select Section Request to communicate several options to a user, based on scouted routes. The message contains a list of candidate sections. The candidate ID field defines identifiers for the candidate section in the list. The route is a representation of the route of the section as, e.g., GPS waypoints. The distance is the length of the section. The duration is an estimate of the duration of the section for the movement type. The event list contains the same field information as it does in the Section Scouting Report message. An additional information field may be provided to include information about the section added by the service based on, e.g., user profile, such as calorie burn.
[0165] A Select Section Response may be sent to communicate a user's selected section. The message may contain the candidate ID, which may be blank if the user does not accept any of the candidates (in which case, the service may scout for further alternatives).
[0166] A Start Permission Indication may be sent to a user's device to communicate permission to start an exercise. The exercise start permission may be given to the user when route planning is estimated to be finished before the user reaches the end of the route. No specific parameters are required.
[0167] The RPS may send a Route Update Indication to a route planning application to communicate changes to a route, e.g., when the user embarks on the route, as new sections of the route have been confirmed, or if the user requests a route update. The route description field is a description of the route as waypoints, in two parts. The confirmed portion is the section of the route that has already been scouted, while the planned section is the unconfirmed part of the route. The route events field is a list of events and warnings for the route, including the event type (e.g., "dead light post", or "cardio workout section") and event coordinates (e.g., as GPS coordinates).
[0168] FIGs. 16, 17, and 18 display some examples of exceptional events that may occur when traversing a route, such as a re-planning of the route due to a user deviation from the plan 1608, a user-initiated request to meet a car at a fixed location, and a request for an accompanied route to a specific location. The protocol includes the following messages and data fields:
[0169] The user's device may send a Location Update 1812 to update a user's location (e.g., as GPS coordinates), speed, and heading.
[0170] The user may send a Route Update Request 1612, 1710, 1732 to request a new route while traversing the route. The message includes a location (the user's current location), a heading (the direction the user wishes to go), and optional waypoints (specify user waypoints along the route).
[0171] The user may send a Vehicle Meeting Request 1636, 1638, 1716, 1720, 1738, 1742 to request that the vehicle meet him or her along the route. The meet type field is set to "parked" for meeting at a predefined location, "moving" for a request for the car to meet the moving user, or "precede" if the car is requested to move to and hold a position a short distance ahead of the user. The meet location field holds candidate coordinates for a parked meet point or, for a moving or preceding meet, the user's current location and a planned route to or from the meet point.
[0172] A Vehicle Meeting Response 1642, 1644, 1724, 1726, 1746, 1748 may be sent in response to a Vehicle Meeting Request 1636, 1638, 1716, 1720, 1738, 1742. The vehicle location is the current location of the vehicle. The meet point, for a parked meet point, contains updated coordinates determined by the vehicle and/or the RPS, depending on, e.g., available parking locations near the requested meet point. For a moving or preceding meet point, the meet point contains the coordinates of the estimated point along the user's route where the car will meet the user.
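The handling of the three meet types can be sketched as follows. The resolution logic (snapping a parked meet point to the nearest available parking spot, echoing the requested point otherwise) is an assumption that matches the description above, and all coordinates are illustrative.

```python
# Sketch of resolving a Vehicle Meeting Request into the meet point
# returned in the Vehicle Meeting Response. Logic and coordinates are
# illustrative assumptions.

def resolve_meet_point(meet_type, requested, parking_spots=None):
    """Return the meet point the vehicle/RPS would answer with."""
    if meet_type == "parked":
        # Snap to the nearest available parking location, if any are known.
        spots = parking_spots or [requested]
        return min(spots,
                   key=lambda s: (s[0] - requested[0]) ** 2
                               + (s[1] - requested[1]) ** 2)
    # "moving" / "precede": meet along the user's route; here we simply
    # return the requested point as the estimated interception point.
    return requested

point = resolve_meet_point(
    "parked", requested=(60.170, 24.940),
    parking_spots=[(60.171, 24.941), (60.160, 24.930)],
)
```

For a parked meet, the response carries the updated coordinates of the nearest usable parking location; for a moving or preceding meet, it carries the estimated point along the user's route.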
[0173] A Vehicle Location Broadcast 1810 may be sent to broadcast a vehicle's location. The vehicle ID field identifies the vehicle. The message also broadcasts the vehicle's location, speed, and heading.
[0174] FIG. 16 shows an embodiment of messaging 1600 that may occur if a user deviates from a route 1608. A user's device 1602 receives an indication from a user of an agreement to a new route planning 1610. A Route Update Request 1612 is sent to a route planning service 1604, which updates a route 1614 and responds with a Route Update 1616. The RPS 1604 sends a Section Scouting Request 1618 to a route planning application 1606, which drives the vehicle (or car) along a section and generates a report of the requested information based on sensor data 1620. The report may be sent to the RPS in a Section Scouting Report 1622. The RPS 1604 selects section candidates 1624 and sends a Section Selection Request 1626 to the user's device 1602. The user's device sends back a Section Selection Response 1628, and the RPS 1604 updates a route 1630. The updated route may be sent 1632 to a user's device 1602.
[0175] Another scenario where a user may deviate from a route may occur if a user's device receives from the user a request to meet the vehicle 1634; the user's device may then send a Meeting Request 1636 to the RPS 1604. The RPS 1604 sends a Meeting Request 1638 to a route planning application 1606, which determines the nearest allowed location and an estimated time of arrival (ETA) 1640 and sends back a Meeting Response 1642. The RPS 1604 sends a Meeting Response 1644 to the user's device 1602, which displays to the user the meeting location and ETA 1646.
[0176] FIG. 17 shows one embodiment of messaging 1700 if a user requests an accompanied fast route to a location 1708. A user's device 1702 sends a Route Update Request 1710 to an RPS 1704, which updates a route 1712 and sends back a Route Update Response 1714. A user's device 1702 sends a Vehicle Meeting Request 1716 to an RPS 1704, which estimates a meeting point 1718. The RPS 1704 may send a Vehicle Meeting Request 1720 to a route planning application 1706, which determines an ETA 1722 and responds with a Vehicle Meeting Response 1724. The RPS 1704 sends a Vehicle Meeting Response 1726 to the user's device 1702, which displays location and ETA for the meeting location 1728.
[0177] FIG. 18 shows one embodiment 1800 of a vehicle near a meeting point 1808. A route planning application 1806 sends a Vehicle Location Broadcast 1810 to a user's device 1802. The user's device may send a Location Update 1812 to the route planning application 1806, which may be used to determine how far apart the vehicle is from the user. For some embodiments, a car stays a little ahead of the user, detecting route conditions 1814. For some embodiments, a user's device reports location periodically 1816. The route planning application 1806 sends a route update 1818 to a user's device 1802, which alerts a user on route events 1820.
[0178] Depending on the objective of the route selection and the transportation method used, the user may enter various parameters to take into account in route selection. The route selection parameters may include, for example, the following parameters. For all routes, parameters may include: the lighting level, the presence of broken lights, the presence of dark alleys, weather (e.g. wind: strength, direction, and gusts; sunlight: cloudiness, effect on the user; rain; snow) including from both safety and convenience viewpoints, allowance of route overlapping, the presence of slippery/icy patches, puddles of water, or flooded areas, a road's profile (including its inclination and surface material).
[0179] For pedestrian routes (including, for example, rollerblading, skateboarding, and new transportation methods, such as hoverboarding), the following route selection parameters may be more pertinent (but not limited to just pedestrians): street side structures, the number/type of buildings, the presence of roofed areas, the number of trees, which includes their effect on weather (such as shade and protection from wind and rain), sidewalk parameters (such as their presence, condition, and need to switch from side to side), the number and activity of pedestrians observed along the route (jogging, walking with bags in their hands, standing, or waiting at traffic lights), traffic light cycles (short vs. long and cycle length), blocked sidewalks (parked vehicles (legally and illegally) and other obstructions), the duration of the route, exercise-related parameters (such as the desired number of calories to be burned on the route, the workout type and level (for example, "low", "medium", or "high" intensity; aerobic vs. anaerobic sections), warmup and cooldown sections at the beginning and end), and the presence of roadside areas, such as long flights of stairs that may be used for cardio workouts.
[0180] For cyclist routes, the following route selection parameters may be more pertinent (but not limited to just cyclists): the number of cyclists and their activity level (cycling fast, slow, alone or with someone), road conditions (potholes, road surface roughness, presence of loose gravel, sloped/non-sloped curbs at intersections), roadside blocks (e.g., legally or illegally parked vehicles, vehicle size, and blocked view in road crossings), and the same exercise-related parameters pertinent to pedestrian routes.
[0181] For motorist routes, the following route selection parameters may be more pertinent (but not limited to just motorists): presence and type of parked vehicles along the route (e.g., lone vans), road blocks (such as natural disaster conditions, parked vehicles, demonstrations, rioting, and traffic jams), road conditions (broken sections, such as due to earthquake), gatherings of people, stationary people at corners, and the length of visibility (long open roads vs. narrow winding roads with lots of parked vehicles or the presence of dark alleys).
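The parameter lists above (for all routes, pedestrians, cyclists, and motorists) can be combined into a single comparable score per candidate route. The weighted-sum sketch below is one plausible approach; the weights and feature names are assumptions for illustration, not values from the disclosure.

```python
# Sketch: fold the route-selection parameters listed above into one
# score per candidate so candidates can be ranked. Weights and feature
# names are illustrative assumptions.

def score_route(features, weights):
    """Higher is better; features without a weight contribute nothing."""
    return sum(weights.get(name, 0.0) * value
               for name, value in features.items())

weights = {"lighting_level": 2.0,      # well-lit routes preferred
           "sidewalk_coverage": 1.5,
           "slippery_patches": -3.0}   # hazards penalized

routes = {
    "riverside": {"lighting_level": 0.9, "sidewalk_coverage": 1.0,
                  "slippery_patches": 1},
    "main street": {"lighting_level": 1.0, "sidewalk_coverage": 0.8,
                    "slippery_patches": 0},
}
best = max(routes, key=lambda name: score_route(routes[name], weights))
```

The slippery patch penalty outweighs the riverside route's sidewalk advantage, so the well-lit, hazard-free main street route ranks first. In practice the weights would vary with the transportation method, per the pedestrian, cyclist, and motorist parameter lists above.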
[0182] Vehicle sensors may report several types of vehicle data content, which may be used for Route Planning Service (RPS) calculations. Examples of static information include: road type (such as asphalt, gravel, or dirt); road surface conditions (including suitability for various exercise types, such as an uneven gravel road or rough asphalt); road width; the presence of sidewalks and their width; the need to change sides of the road often, and places to change sides (due to, for example, illumination, security, or a need to maintain a certain speed); the presence of buildings (including the type of buildings, their height, and their closeness to the road); other service data (crowdsourced and open data), which may be fetched from, for example, base stations, pico- and femto-cells, and Facebook with respect to the car's location to aid planning; and the presence of resting, snack, and drink areas, which may be used with or without a car (such as taking drink breaks in the car to avoid certain areas or environments). The use of resting, snack, and/or drink areas might require part of the route to be changed to accommodate the use of such areas.
[0183] Dynamic sensor information includes, for example: road conditions (such as the presence of potholes, road cracks, or slippery conditions, which generates a recommendation for special shoes, bike tires, or types of exercise); parked vehicles and other road blocks, including the types of parked vehicles (e.g., car, van, or truck); illumination (dark/bright spots, overall illumination, and a recommendation for clothing color/lights); micro weather (e.g., presence and level of wind, heat, or sun along certain parts of the route), including head/tail wind, direct sunshine, or heavy rain; time combined with other information; traffic (vehicle) information: intensity (high, moderate, or low level), presence of traffic jams, and number of vehicles in front of buildings; traffic light operation (e.g., via Vehicle-to-Infrastructure (V2I)): cycle lengths, whether it is operating properly, current state; the effect of buildings on local weather conditions (e.g., the presence of a tunnel, including wind direction, wind intensity, and comparison with overall wind), including the ability to block rain and the effects on illumination (direct sunlight and/or shade); and information on the presence of pedestrians/cyclists. Pedestrian/cyclist information might necessitate route adjustments. Music recommendations are based on the current environment features (not just speed and/or training objectives). Weather data, for example, is used if the user desires to avoid such conditions or if the user desires to use as many roofed areas as possible.
[0184] Exemplary embodiments of the present disclosure include some of the following examples. A cloud service, such as RPS, gives the car an area to scout for routes between a starting area and an ending area. The vehicle reports all possible routes or just the best route from the starting to ending point. A vehicle also does the scouting/route selection independently of external services by exchanging all the necessary information with the phone application. With the vehicle user's permission, the collected data is used to update larger area map data to maintain accurate information on routes. This larger area data is used by other users for route planning or other services.
[0185] The user might drive through the route in a vehicle during or after route scouting. The vehicle's human-machine interface (HMI) shows information considered in route selection and provides means for interacting with route selection, such as the ability to include or reject a particular route. The system uses this approach when it lacks knowledge of road conditions and traffic flow (such as after an earthquake or a hurricane); vehicles map passable routes for road traffic and include an estimate of the treacherousness level. Similarly, the system uses this approach for scouting in front of a motorcade by looking for suspicious roadside objects and potential locations where the speed of the motorcade might require an adjustment.
[0186] Consider the following scenario for use of an exemplary embodiment disclosed herein. Emma took a business trip to an unfamiliar city. She rented an autonomous car for the trip. In the evening, she wants to go cycling on a hotel bike, but she does not know which route would best suit her desired exercise level. Also, she lacks knowledge of possible routes, such as the presence and condition of the roads, the amount of vehicle and pedestrian traffic, the level of roadside lighting, and the presence of possible slippery conditions or other hazards along the route. Emma also wishes to stay fairly close to her hotel during the trip.
[0187] She activates the exercise route planning application on her phone and inputs her preferences for today's exercise. She wants to cycle for 45 minutes with one section for a cardio workout. The route must be well lit, be in good condition, and include cyclists along the route to provide an indication of the routes that locals use. Emma also wishes to stay within 1.5 miles of her hotel and is thus willing to let the route overlap on itself.

[0188] She authorizes the application to use her car to scout a cycling route. The service confirms that the car has enough sensor capability to provide missing information along the route. Emma hits the "start" button on the application. A timer starts to count down from 10 minutes to indicate when enough of the route has been covered so that the rest of the route may be determined during the 45-minute exercise.
[0189] In addition to the preferences Emma just gave, the application uses Emma's workout history and additional preferences to define the length of the route for the trip at Emma's cycling speed, and the application makes a preliminary list of candidates for the route. The application instructs the vehicle to scout the candidate routes and to present new routes in case it finds better ones. The vehicle also records accurate and updated information of the routes, such as lighting levels, the presence and condition of sidewalks, pedestrian activity along the routes, road inclination, and potentially slippery spots along the route. The vehicle looks for road sections that fit Emma's fitness profile for a cardio workout. The vehicle reports its findings to the application.
[0190] After 7 minutes, the vehicle has collected enough route data to allow Emma to start cycling. The confirmed part of the route is shown in green on a map on her phone, and the as-yet unconfirmed portion is shown in gray. Emma's wrist device helps her make the correct turns along the route. When Emma reaches a street section with a fairly steep incline, her wrist device indicates this portion is the cardio workout section. The watch instructs her to cycle the street up and down three times. Emma decides to deviate from the planned route by going to a road by a canal. She instructs either her phone or her watch to have her vehicle meet her at the other end of the road. Emma arrives at the car and takes a water break in the car.
[0191] Emma decides to return to the hotel using the shortest possible route. She hits the "take me home" button on her wrist device. The application indicates to her the shortest route, and Emma calls for the car to precede her along the route. Emma's wrist device instructs her to follow the car, which stays 50 yards in front of Emma. She cycles to her hotel, feeling safe that in case of any trouble, her car is one button press away from her.
[0192] Multiple embodiments for implementing security surveillance for vehicles and buildings are described herein. Some mobile applications provide means to enhance the personal security of the user, e.g., by using a smartphone video call while walking, but these systems do not take advantage of connected vehicles or smart spaces for advance security scanning.
[0193] Exemplary embodiments described herein enhance safety and personal security by integrating smart spaces and connected vehicles together with security tracking of the user before and while walking. Exemplary embodiments also enhance security monitoring coverage by leveraging the availability of connected vehicles. Such leveraging enables connected vehicles to cover surveillance of blind spots. Exemplary embodiments may perform safety tracking without additional equipment beyond the user's mobile device and connected vehicles. This system takes into account the current safety situation (e.g., the location of security personnel or police in the area) and dynamic environmental variations (e.g., illumination of the walking path, such as broken street lamps). The system provides user interfaces before, during, and after a user traverses a route. As described herein, a mobile device or primary terminal is a device associated with a user.
Planning an Autonomous Vehicle Route Outside a Road Network
[0194] Some embodiments comprise using an autonomous vehicle in an area outside a road network, such as a downtown pedestrian market or a parking garage. FIG. 19 is a system diagram of the architecture 1900 of the area management and access code exchange process. For some embodiments, geographic locations may be managed by an area manager that controls autonomous and controlled vehicles. A managed area may be an area with a large number of pedestrians, for example, that may use locally maintained rules for vehicle control and use.
[0195] FIG. 19 shows an example set of messaging interfaces for an AV 1901 about to enter a managed area. For one embodiment, an AV 1901 sends a destination route request 1908 to a routing application (which may be local to the AV 1901 or based in the cloud 1902). Managed areas along a vehicle route may be identified by a routing application sending a route request 1909 to a map database 1903 and receiving managed areas route information 1910. Access requests (or announcements) 1912 may be sent to area managers 1904 responsible for an identified managed area. Area managers 1904 may be distributed over a network. Because area rules 1919 may be maintained locally via area surveillance 1906, an update process may be real time, and restrictions may also be set in real time (immediately).
[0196] If access is not available, rerouting may be performed and passengers informed that the destination may not be reached. Otherwise, an area manager 1904 may respond to an access request 1912 by sending the requested access code 1913 to the AV 1901 via the cloud 1902. The access code may be valid only within a certain timespan to prevent arrival that is too early or too late. Therefore, an access request 1912 may include an estimate of the arrival time, which may be calculated from a route plan for some embodiments. For example, traffic congestion may delay the arrival of an AV, and an access code may expire. In this case, an AV may send a renewal request for the access code.
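The timed-validity behavior of the access code described above can be sketched as follows. This is a minimal illustration only; the `AccessCode` class, its fields, and the `arrival_check` helper are names assumed for the example, not structures disclosed by the embodiment:

```python
from dataclasses import dataclass
import time

@dataclass
class AccessCode:
    """Illustrative access code issued by an area manager."""
    code: str
    valid_from: float   # epoch seconds: earliest allowed arrival
    valid_until: float  # epoch seconds: latest allowed arrival

    def is_valid(self, now=None):
        # The code is usable only inside its validity window.
        now = time.time() if now is None else now
        return self.valid_from <= now <= self.valid_until

def arrival_check(access, estimated_arrival):
    """Return 'ok' if the estimated arrival falls inside the validity
    window; otherwise 'renew', prompting the AV to request a fresh code
    (e.g., after a congestion delay)."""
    return "ok" if access.is_valid(estimated_arrival) else "renew"
```

In this sketch, an AV whose estimated arrival slips past `valid_until` would receive `"renew"` and send a renewal request rather than presenting an expired code at the entrance.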
[0197] Upon arrival, an AV 1901 may send an entrance announcement message 1914 with an access code to an entrance 1907. The AV 1901 may also send 1914, 1915 an authentication code to an area manager 1904 via an entrance 1907 and receive 1916, 1917 route instructions. An area manager 1904 may perform a code check identification 1908 to verify an authentication code. This code exchange secures communication between an area manager 1904 and an AV 1901. For some embodiments, an AV 1901 receives instructions and commands only from an area manager 1904. If access is delayed, an AV may receive waiting instructions 1917, 1918, for example to drive to a certain location for waiting. Otherwise, an AV may block streets or entrances from other traffic.
[0198] For some embodiments, AV movement inside a restricted area may be controlled using a local map and route information that indicates allowed routes and parking spots inside the restricted area. In temporary parking areas, an area manager 1904 may control the order in which vehicles leave the area. Area management may have an ability to clear a managed area. For example, this ability may be used by emergency vehicles in case of fire.
[0199] For some embodiments, restricted areas may use a management system which controls AV traffic. An AV may send a message to a management system before entering an area. Some areas may be managed by an area operator. This management task may be outsourced to dedicated service providers, for example. An AV 1901 may communicate with other vehicles (or cars) in an area and may receive from an area manager 1904 information on which roads are blocked, where to enter, or where to park. A management system may send mapping data to an AV to support route and driving trajectory planning. Some large public events (such as concerts or fairs) may set up temporary parking areas in fields or open areas without any markings. An automated parking manager may send guidance messages to AVs to communicate parking spots and control AVs inside a parking area.
[0200] Area management may rely on sensors (such as surveillance cameras, LiDAR systems, and other types of perception sensors), which may detect the number of pedestrians or other uncontrolled traffic in an area. Temporary areas may use supervising vehicles to supervise an area, for example with Unmanned Aerial Vehicle (UAV) or drone units. An area manager 1904 may store records 1905 of vehicles inside an area.
[0201] An AV may change a planning module's mode to enable vehicles (or cars) to drive in automated mode in pedestrian zones. A behavior planning module may lower an AV's driving speed and reduce safety margins. For example, an AV 1901 may turn a corner at slower speeds with reduced distance margins to other vehicles. For some embodiments, an AV 1901 may receive driving instructions (or behavior rules) and follow those instructions in accordance with messages received from an area manager 1904 to achieve goals set by an area manager 1904 and a behavior planning module.

[0202] FIG. 20 is a plan view schematic of an environment 2000 in which an AV's vehicle routes 2010 may be controlled by an area manager. An AV approaches a destination (or goal) 2022, but the shortest route 2012 is blocked by an event 2006, and the area is full of people 2008. The area management system detects the crowd 2008. Without systems and methods as described herein, an AV may attempt to enter through the event area 2006. An AV may block the street because an AV obstacle avoidance system prevents the AV from driving through the crowd 2008. To avoid this situation, an area management system sends instructions to an AV to use a new route 2014 to another entrance 2020 of the market area 2004. If safe access is not possible at the time of arrival, an AV may receive instructions to drive to a waiting area 2002, where the AV does not block other traffic. For some embodiments, both the shortest route planned by an AV 2012 and a new route 2014 (which may be planned by an area manager) may contain waypoints 2016, 2018, which are intermediate points along a route.
[0203] FIG. 21 is one embodiment 2100 of a message sequencing diagram for an area access process. An AV 2102 may send a route request 2112 to a routing application (local or cloud-based) with a destination and the AV's current location. A route description is generated. To determine whether an AV 2102 has to operate in any managed areas, information on which areas are managed and which area manager 2106 is responsible for each may be included in a route database. Areas are identified with an identification number or other appropriate code. With an identification code, an area manager may be determined using a cloud-based database; several web-based technologies are available for this purpose. The estimated arrival time at a managed area may be determined from a route description. A route planner 2104 may send 2114 the route and associated area managers to the AV 2102. A vehicle identification code together with an estimated arrival time may be sent 2116 to an area manager 2106 to obtain 2120 an access code. An access code is not a permission but is used to identify an arriving vehicle 2102. An area manager adds 2118 an arriving vehicle 2102 to a database. Information on arriving vehicles is used to estimate the number of vehicles in the area at a certain time in the future and to plan management actions.
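The lookup step described above, in which managed areas along a route are matched to their responsible area managers, might be sketched as follows. The database shape, endpoint strings, and function names are illustrative assumptions only:

```python
# Hypothetical mapping of area identification codes to the endpoints of
# the area managers responsible for them (cloud-based in some embodiments).
AREA_MANAGER_DB = {
    "market-04": "https://area-manager.example/market-04",
    "garage-11": "https://area-manager.example/garage-11",
}

def managed_areas_on_route(route, db=AREA_MANAGER_DB):
    """Return (area_id, manager_endpoint, eta) for every managed area
    the route description crosses, in route order. Segments outside any
    managed area carry no 'area_id' and are skipped."""
    hits = []
    for segment in route:
        area = segment.get("area_id")
        if area in db:
            hits.append((area, db[area], segment["eta"]))
    return hits
```

An AV (or its routing application) could then send an access request, including its vehicle identification code and the estimated arrival time, to each manager returned by this lookup.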
[0204] When an AV 2102 arrives at an area, the AV sends an access request 2116 to an area manager 2106. This access request 2116 may include a vehicle's own access code and a previously received area access code. Codes may be used to secure communication between an area manager 2106 and an AV 2102. Inside an area, an AV may receive commands and instructions only from an authenticated area manager 2106.
[0205] An area manager 2106 monitors an area with surveillance sensors 2110, such as surveillance cameras, LiDAR systems, or other types of perception sensors. If such sensors 2110 are not available, an area manager 2106 may receive information from an AV's sensors when the AV is inside an area. For example, an AV may detect the presence of pedestrians from cell phone activity or V2P (vehicle-to-pedestrian) communication. Static area maps may be loaded 2122 with usable static routes for routing inside an area. Dynamic objects may be added to a map 2126 for final routing. Using sensors 2110, pedestrians and other traffic may be detected 2124 by area surveillance 2108 in real time and added to a map 2126. Based on real-time data (such as map update information 2128), area management may be performed automatically and update a map 2130. Thus, the number of AVs may be monitored, and movements of AVs may be controlled to maintain safety. Also, AVs may be controlled to ensure activities inside an area are not disturbed. In the case of temporary facilities, such as parking, a virtual map may be created, and surveillance sensors 2110 may act as mobile units (like quadcopters or similar devices) to avoid fixed installations.
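The map-maintenance step described above, overlaying dynamically detected objects onto the static route map before final routing, can be sketched as follows. The route and detection record shapes are assumptions made for illustration:

```python
def build_dynamic_map(static_routes, detections):
    """Overlay real-time detections on the static route map and filter
    out any static route that a detected object currently blocks.
    Simplified model: a detection blocks a route when it sits on one of
    the route's waypoints and is flagged as blocking (e.g., a pedestrian
    standing in a lane); the field names are illustrative assumptions."""
    blocked = {d["waypoint"] for d in detections if d.get("blocking")}
    return [r for r in static_routes
            if not (set(r["waypoints"]) & blocked)]
```

In this sketch, the area manager would rebuild the usable-route list each time area surveillance (or an AV's own sensors, when fixed sensors are unavailable) reports a new set of detections.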
[0206] As a response to an entrance request 2132, an area manager 2106 sends instructions 2138 to the AV 2102. Instructions 2138 may include the status of the entrance. If immediate access is not possible, an AV may receive an instruction to wait or to be guided to a specific waiting area. In some cases, for example, access may be declined if an area is already closed or full. When access is allowed following verification 2134 of an access code, an area manager 2106 creates 2136 and sends 2138 drive instructions to the AV 2102. Examples may be seen in FIGs. 20 (market case) and 23 (parking). The instruction set may be planned using dynamic map data collected from area surveillance 2108 and database data. The route may be described as segments which contain waypoints, a speed limit, a driving priority, and actions. Actions may be stopping points, turns, and other information not described by waypoints. A destination may be either a location where an AV wants to go (an AV's goal point) or a spot that an area manager 2106 reserves for an AV, such as a preferred parking spot.
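The instruction-creation decision above (allowed, delayed to a waiting area, or declined) can be sketched as a single function. The statuses mirror the prose; the function and parameter names are assumptions for the example:

```python
def create_instructions(entrance_open, area_full, segments, wait_area=None):
    """Sketch of the area manager's response to an entrance request.
    - If the area is closed or full, access is declined.
    - If immediate access is not possible, the AV is told to wait,
      optionally at a specific waiting area.
    - Otherwise drive instructions are returned as route segments
      (each containing waypoints, a speed limit, priority, and actions)."""
    if area_full:
        return {"status": "denied"}
    if not entrance_open:
        return {"status": "delayed", "wait_area": wait_area}
    return {"status": "allowed", "segments": segments}
```

A delayed response here corresponds to the waiting instructions 2138 in the figure, directing the AV to a location where it does not block streets or entrances.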
[0207] During the visit time, an AV 2102 may send location (or position) updates 2140 and status information (such as moving or parked) to an area manager 2106. An area manager 2106 may update a database 2142 accordingly. An area manager 2106 may also update a map 2144 and update instructions 2146 based on a location and action update 2140. Because GNSS may be inaccurate and have availability limitations, an area surveillance system may send position information to an AV 2102. Because objects inside an area may be tracked, an AV's current location may be detected by the system. Location data may be sent within an instruction update 2148. A database entry contains the identification number, priority, current location, status, and access time. An area manager 2106 may monitor an AV 2102 to determine if given instructions are being followed. Because an area changes dynamically, an area manager 2106 may send update instructions 2148. An AV 2102 may also send a list of detected objects and measured clearances to an area manager 2106 to improve dynamic map content.
[0208] When an AV prepares to leave an area, it may send a departure request 2150 to an area manager 2106. An area manager 2106 may send an acknowledgement of the request and remove 2156 the AV (or vehicle) 2102 from a database. If departure is managed or planned 2152 (which may occur to avoid congestion), an area manager 2106 may send departure instructions (or an instruction update) 2154 to the AV 2102 (which may be an instruction to wait).
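The departure flow above can be sketched as follows; the in-memory database and the congestion flag are stand-ins assumed for illustration (an actual embodiment would consult area surveillance and the vehicle database):

```python
def handle_departure(vehicle_id, vehicle_db, congested):
    """Sketch of an area manager's departure handling: if departure is
    being managed to avoid congestion, instruct the AV to wait;
    otherwise acknowledge, remove the vehicle's record, and allow it
    to leave."""
    if congested:
        # Departure is planned/managed: the AV keeps its record and waits.
        return {"status": "delayed", "instruction": "wait"}
    vehicle_db.pop(vehicle_id, None)  # remove the vehicle from the area database
    return {"status": "allowed"}
```

Once the record is removed, the area manager's estimate of how many vehicles occupy the area at a given future time (used for management planning) no longer counts the departed AV.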
[0209] FIG. 22 is one embodiment 2200 of a message sequencing diagram for an area access process for operating without a fixed surveillance system. An AV 2202 may send a location and destination route request 2212 to a route planner 2204. A route planner 2204 may send a route and associated area managers response 2214 to an AV 2202. An access request 2216 may be sent to an area manager 2206. An area manager 2206 may add 2218 the AV 2202 to a database and respond with an access code message 2220.
[0210] FIG. 22 is different from FIG. 21 in that an area manager 2206 may receive 2222 detected objects from AV sensors 2208. An area manager 2206 also may receive 2224 detected objects from pedestrian sensors 2210. An area manager 2206 may update a map 2226 based on detected objects. An AV 2202 may send to an area manager 2206 an entrance request with an access code and vehicle access code 2228. An area manager 2206 may verify an access code 2230, create instructions 2232, and send instructions 2234 to an AV 2202. An AV 2202 may send a location and action update 2236 to an area manager 2206. An area manager 2206 may update a database 2238, update a map 2240, update instructions 2242, and send back updated instructions 2244. An AV 2202 may send a departure request 2246 to an area manager 2206 upon preparing to leave an area. An area manager 2206 may process a departure request (departure planning) 2248 and send departure instructions (or an instruction update) 2250 to an AV 2202. An area manager 2206 may also remove an AV 2202 from a database 2252.
[0211] One embodiment of a route request 2112, 2212 may include fields for a current location (latitude and longitude coordinates) and a destination (address, latitude coordinate, and longitude coordinate). One embodiment of a route response 2114, 2214 may include fields for a dedicated route description based on vehicle control system preferences and a list of managed areas, which may include an area identification code and an estimated arrival time. One embodiment of an access request 2116, 2216 may include fields for a vehicle ID, an estimated arrival time, a destination, and vehicle dimensions. One embodiment of an access code response 2120, 2220 may include fields for access status (with values for allowed, delayed, and denied), an access code, an optional estimated delay, and an optional wait area. One embodiment of an entrance request 2132, 2228 may include fields for an access code, a vehicle ID, and a vehicle access code. One embodiment of vehicle instructions 2138, 2234 may include fields for access status (with values for allowed, delayed, and denied), an access code with optional estimated delay and optional wait area sub-fields, an entrance location (latitude and longitude coordinates), a list of road segments (with sub-fields for segment ID, waypoints (latitude and longitude coordinates), allowed speed, action points (stop points and pedestrian crossings), and priority (compared to other route segments)), and a parking spot (latitude coordinate, longitude coordinate, a heading, and a parking spot ID). One embodiment of a location and action update 2140, 2236 may include fields for a vehicle ID, a location (latitude coordinate, a longitude coordinate, a heading, and a speed), vehicle status (with values for moving, avoiding, still, or parked), a list of detected objects, and detection of a clearance ahead. 
One embodiment of an instruction update 2148, 2244 may include fields for a detected location (latitude coordinate, longitude coordinate, and heading), a list of road segments (with sub-fields for segment ID, waypoints (latitude and longitude coordinates), allowed speed, action points (stop points and pedestrian crossings), and priority (compared to other route segments)), and a parking spot (latitude coordinate, longitude coordinate, a heading, and a parking spot ID). A detected location may be a relative location correction to prevent or to correct for a GNSS error. One embodiment of a departure request 2150, 2246 may include fields for a vehicle ID and a departure request time. One embodiment of a departure instruction 2154, 2250 may include fields for departure status (with values for allowed, delayed, and denied), an optional estimated departure time, a departure point location (latitude and longitude coordinates), and list of route segments (with sub-fields for segment ID, waypoints (latitude and longitude coordinates), allowed speed, action points (stop points and pedestrian crossings), and priority (compared to other route segments)).
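Two of the message layouts described above can be sketched as typed records. The field names mirror the prose; the concrete types and the use of Python dataclasses are assumptions for illustration, not a disclosed encoding:

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class AccessRequest:
    """Sketch of the access request 2116, 2216."""
    vehicle_id: str
    estimated_arrival: str          # e.g., an ISO 8601 timestamp
    destination: Tuple[float, float]  # (latitude, longitude)
    dimensions: Tuple[float, float, float]  # (length, width, height), assumed meters

@dataclass
class AccessCodeResponse:
    """Sketch of the access code response 2120, 2220."""
    status: str                      # "allowed" | "delayed" | "denied"
    access_code: Optional[str] = None
    estimated_delay: Optional[float] = None  # optional, e.g., seconds
    wait_area: Optional[Tuple[float, float]] = None  # optional wait location
```

The remaining messages (route response, entrance request, vehicle instructions, location and action update, instruction update, departure request, and departure instruction) could be modeled the same way, with the list-of-road-segments sub-structure shared between vehicle instructions and instruction updates.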
[0212] FIG. 23 is one embodiment 2300 of a plan view schematic of a temporary parking area with virtual parking spots and virtual lanes. An AV receives instructions to avoid detected pedestrians 2301 and follow a virtual route 2305 described in instructions (which may be sent as segments 2308 with route waypoints 2313) towards a destination 2318. Virtual routes 2305 may have priorities, such as routes with a priority 0 (2306) or a priority 1 (2307). Virtual routes 2305 may have action points 2309, such as a turn left action point 2310, a stop and turn right action point 2311, or a turn left action point 2312.
[0213] FIG. 23 shows an example of an uneven column created by a manually parked vehicle. The original route 2314 is deviated with waypoints that avoid the misaligned vehicle. FIG. 23 also shows an example of a pedestrian blocking a route. An AV detects a narrow clearance 2315 and avoids such a route if the pedestrian remains. Virtual parking spots have IDs 2317 and may be specified by latitude and longitude coordinates and a heading 2316.
[0214] Parking areas may have a passenger drop zone 2302 and a passenger pickup zone 2303. These zones 2302, 2303 may be near an entry point 2304. By using such zones, AVs in parking areas may be handled without humans. This methodology also may make drivers leaving or picking up cars more comfortable.
[0215] When discharging large areas, priority may be given to certain vehicles, for example buses, taxis, emergency vehicles, and maintenance vehicles. In these cases, automated vehicles may be moved or stopped to yield the right of way to top-priority vehicles. Manually driven vehicles (or cars) (M) may generate uneven rows. AVs may be used to show manual drivers how to park (see the middle columns of FIG. 23). AVs may be parked in rows leaving parking space between them (upper 2x5 parking area) or used as corner vehicles (lower 2x3 parking area).
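The priority-based discharge order can be sketched as a simple sort. The vehicle classes and their relative ranks are assumptions for the example (the embodiment names buses, taxis, emergency vehicles, and maintenance vehicles as possible priority classes without fixing an order):

```python
# Illustrative priority ranking: lower number = leaves earlier.
PRIORITY = {"emergency": 0, "bus": 1, "taxi": 2, "maintenance": 3, "private": 9}

def discharge_order(vehicles):
    """Sort vehicles leaving a managed area so higher-priority classes
    exit first; unknown classes fall back to the lowest priority."""
    return sorted(vehicles, key=lambda v: PRIORITY.get(v["class"], 9))
```

An area manager applying such an ordering would hold or move lower-priority AVs (the "moved or stopped" behavior above) until the higher-priority vehicles have cleared the exits.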
[0216] One exemplary embodiment of a method described herein comprises: receiving a request for an automated vehicle to travel to a location within a controlled access area; determining the automated vehicle is authorized to enter the controlled access area; determining a location of the automated vehicle; determining a plurality of vehicle movements of the automated vehicle to travel to the location within the controlled access area; communicating to the automated vehicle the plurality of vehicle movements to travel to the location within the controlled access area; and determining that the automated vehicle has traveled to the location within the controlled access area.
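The recited steps can be illustrated end to end with a short sketch. The request shape, the authorization set, and the injected `planner` callable are all assumptions made for the example, not limitations of the method:

```python
def handle_travel_request(request, authorized_vehicles, planner):
    """Illustrative walk-through of the recited steps: check that the
    automated vehicle is authorized for the controlled access area,
    determine its location, determine the vehicle movements needed to
    reach the target location, and return them for communication to
    the vehicle."""
    if request["vehicle_id"] not in authorized_vehicles:
        return {"status": "denied"}
    start = request["current_location"]          # determined vehicle location
    movements = planner(start, request["target_location"])
    return {"status": "allowed", "movements": movements}
```

The final recited step, determining that the vehicle has traveled to the location, would follow from the location updates the vehicle sends while executing the returned movements.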
[0217] Another embodiment further comprises updating a map database based on objects detected by area surveillance devices and sensors. Another embodiment comprises determining a location of an automated vehicle by receiving GPS coordinates from the automated vehicle. Another embodiment further comprises controlling the departure of the automated vehicle (which may be controlled based on vehicle congestion information).
[0218] One exemplary embodiment of a method described herein comprises: receiving a request for an automated vehicle to park in a controlled access parking area; determining a starting location of the automated vehicle; determining a location within the controlled access parking area to park the automated vehicle; determining a plurality of vehicle movements to move the automated vehicle from the starting location to the determined parking location; communicating to the automated vehicle the plurality of vehicle movements to move the automated vehicle to the determined parking location; and determining that the automated vehicle has moved to the determined parking location.

Network Architecture
[0219] Exemplary embodiments disclosed herein are implemented using one or more wired and/or wireless network nodes, such as a wireless transmit/receive unit (WTRU) or other network entity.
[0220] FIG. 24 is a system diagram of an exemplary WTRU 2402, which may be employed, for example, as a user device on which a route planning application is installed or as a vehicle computing system on which a route planning application is installed. As shown in FIG. 24, the WTRU 2402 may include a processor 2418, a communication interface 2419 including a transceiver 2420, a transmit/receive element 2422, a speaker/microphone 2424, a keypad 2426, a display/touchpad 2428, a non-removable memory 2430, a removable memory 2432, a power source 2434, a global positioning system (GPS) chipset 2436, and sensors 2438. The WTRU 2402 may include any sub-combination of the foregoing elements while remaining consistent with an embodiment.
[0221] The processor 2418 may be a general purpose processor, a special purpose processor, a conventional processor, a digital signal processor (DSP), a plurality of microprocessors, one or more microprocessors in association with a DSP core, a controller, a microcontroller, Application Specific Integrated Circuits (ASICs), Field Programmable Gate Array (FPGAs) circuits, any other type of integrated circuit (IC), a state machine, and the like. The processor 2418 may perform signal coding, data processing, power control, input/output processing, and/or any other functionality that enables the WTRU 2402 to operate in a wireless environment. The processor 2418 may be coupled to the transceiver 2420, which may be coupled to the transmit/receive element 2422. While FIG. 24 depicts the processor 2418 and the transceiver 2420 as separate components, the processor 2418 and the transceiver 2420 may be integrated together in an electronic package or chip.
[0222] The transmit/receive element 2422 may be configured to transmit signals to, or receive signals from, a base station over the air interface 2416. For example, in one embodiment, the transmit/receive element 2422 may be an antenna configured to transmit and/or receive RF signals. In another embodiment, the transmit/receive element 2422 may be an emitter/detector configured to transmit and/or receive IR, UV, or visible light signals, as examples. In yet another embodiment, the transmit/receive element 2422 may be configured to transmit and receive both RF and light signals. The transmit/receive element 2422 may be configured to transmit and/or receive any combination of wireless signals. [0223] In addition, although the transmit/receive element 2422 is depicted in FIG. 24 as a single element, the WTRU 2402 may include any number of transmit/receive elements 2422. More specifically, the WTRU 2402 may employ MIMO technology. Thus, in one embodiment, the
WTRU 2402 may include two or more transmit/receive elements 2422 (e.g., multiple antennas) for transmitting and receiving wireless signals over the air interface 2416.
[0224] The transceiver 2420 may be configured to modulate the signals that are to be transmitted by the transmit/receive element 2422 and to demodulate the signals that are received by the transmit/receive element 2422. As noted above, the WTRU 2402 may have multi-mode capabilities. Thus, the transceiver 2420 may include multiple transceivers for enabling the WTRU 2402 to communicate via multiple RATs, such as UTRA and IEEE 802.11, as examples.
[0225] The processor 2418 of the WTRU 2402 may be coupled to, and may receive user input data from, the speaker/microphone 2424, the keypad 2426, and/or the display/touchpad 2428 (e.g., a liquid crystal display (LCD) display unit or organic light-emitting diode (OLED) display unit). The processor 2418 may also output user data to the speaker/microphone 2424, the keypad 2426, and/or the display/touchpad 2428. In addition, the processor 2418 may access information from, and store data in, any type of suitable memory, such as the non-removable memory 2430 and/or the removable memory 2432. The non-removable memory 2430 may include random-access memory (RAM), read-only memory (ROM), a hard disk, or any other type of memory storage device. The removable memory 2432 may include a subscriber identity module (SIM) card, a memory stick, a secure digital (SD) memory card, and the like. In other embodiments, the processor 2418 may access information from, and store data in, memory that is not physically located on the WTRU 2402, such as on a server or a home computer (not shown).
[0226] The processor 2418 may receive power from the power source 2434, and may be configured to distribute and/or control the power to the other components in the WTRU 2402. The power source 2434 may be any suitable device for powering the WTRU 2402. As examples, the power source 2434 may include one or more dry cell batteries (e.g., nickel-cadmium (NiCd), nickel-zinc (NiZn), nickel metal hydride (NiMH), lithium-ion (Li-ion), and the like), solar cells, fuel cells, and the like.
[0227] The processor 2418 may also be coupled to the GPS chipset 2436, which may be configured to provide location information (e.g., longitude and latitude) regarding the current location of the WTRU 2402. In addition to, or in lieu of, the information from the GPS chipset 2436, the WTRU 2402 may receive location information over the air interface 2416 from a base station and/or determine its location based on the timing of the signals being received from two or more nearby base stations. The WTRU 2402 may acquire location information by way of any suitable location-determination method while remaining consistent with an embodiment.
[0228] The processor 2418 may further be coupled to other peripherals 2438, which may include one or more software and/or hardware modules that provide additional features, functionality and/or wired or wireless connectivity. For example, the peripherals 2438 may include sensors such as an accelerometer, an e-compass, a satellite transceiver, a digital camera (for photographs or video), a universal serial bus (USB) port, a vibration device, a television transceiver, a hands free headset, a Bluetooth® module, a frequency modulated (FM) radio unit, a digital music player, a media player, a video game player module, an Internet browser, and the like.
[0229] FIG. 25 depicts an exemplary network entity 2590 that may be used within a communication system, for example as a route planning service in some embodiments. As depicted in FIG. 25, network entity 2590 includes a communication interface 2592, a processor 2594, and non-transitory data storage 2596, all of which are communicatively linked by a bus, network, or other communication path 2598.
[0230] Communication interface 2592 may include one or more wired communication interfaces and/or one or more wireless-communication interfaces. With respect to wired communication, communication interface 2592 may include one or more interfaces such as Ethernet interfaces, as an example. With respect to wireless communication, communication interface 2592 may include components such as one or more antennae, one or more transceivers/chipsets designed and configured for one or more types of wireless (e.g., LTE) communication, and/or any other components deemed suitable by those of skill in the relevant art. And further with respect to wireless communication, communication interface 2592 may be equipped at a scale and with a configuration appropriate for acting on the network side (as opposed to the client side) of wireless communications (e.g., LTE communications, Wi-Fi communications, and the like). Thus, communication interface 2592 may include the appropriate equipment and circuitry (perhaps including multiple transceivers) for serving multiple mobile stations, UEs, or other access terminals in a coverage area.
[0231] Processor 2594 may include one or more processors of any type deemed suitable by those of skill in the relevant art, some examples including a general-purpose microprocessor and a dedicated DSP.
[0232] Data storage 2596 may take the form of any non-transitory computer-readable medium or combination of such media, some examples including flash memory, read-only memory (ROM), and random-access memory (RAM) to name but a few, as any one or more types of non-transitory data storage deemed suitable by those of skill in the relevant art may be used. As depicted in FIG. 25, data storage 2596 contains program instructions 2597 executable by processor 2594 for carrying out various combinations of the various network-entity functions described herein.
[0233] Although features and elements are described above in particular combinations, one of ordinary skill in the art will appreciate that each feature or element may be used alone or in any combination with the other features and elements. In addition, the methods described herein may be implemented in a computer program, software, or firmware incorporated in a computer-readable medium for execution by a computer or processor. Examples of computer-readable storage media include, but are not limited to, a read-only memory (ROM), a random-access memory (RAM), a register, cache memory, semiconductor memory devices, magnetic media such as internal hard disks and removable disks, magneto-optical media, and optical media such as CD-ROM disks and digital versatile disks (DVDs). A processor in association with software may be used to implement a radio frequency transceiver for use in a WTRU, UE, terminal, base station, RNC, or any host computer.
[0234] Note that various hardware elements of one or more of the described embodiments are referred to as "modules" that carry out (perform or execute) various functions that are described herein in connection with the respective modules. As used herein, a module includes hardware (e.g., one or more processors, one or more microprocessors, one or more microcontrollers, one or more microchips, one or more application-specific integrated circuits (ASICs), one or more field programmable gate arrays (FPGAs), one or more memory devices) deemed suitable by those of skill in the relevant art for a given implementation. Each described module may also include instructions executable for carrying out the one or more functions described as being carried out by the respective module, and those instructions may take the form of or include hardware (hardwired) instructions, firmware instructions, software instructions, and/or the like, and may be stored in any suitable non-transitory computer-readable medium or media, such as the media commonly referred to as RAM or ROM.

Claims

1. A method comprising: receiving, from a plurality of vehicles, information regarding sensor capabilities and coverage areas; receiving, from a user device, an indication of a starting location and a destination location; determining information regarding a first route from the received starting location to the received destination location; determining information regarding areas of the first route without sensor coverage from the plurality of vehicles; sending to the user device the information regarding the first route; and responsive to a determination that a user associated with the user device intends to traverse the first route from the starting location to the destination location, sending, to at least one autonomous vehicle, information for causing the at least one autonomous vehicle to relocate to a location where sensor coverage will be provided for at least a portion of the areas of the first route without sensor coverage from the plurality of vehicles.
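As a compact illustration of the overall flow recited in claim 1, the following sketch models the route as a one-dimensional interval and represents each vehicle's sensor coverage as sub-intervals along it. All names (`router`, `dispatcher`, the report schema) are hypothetical conveniences for illustration only, not part of the disclosure:

```python
def uncovered_segments(route_len, covered):
    """Merge covered intervals and return the uncovered gaps on [0, route_len]."""
    gaps, cursor = [], 0.0
    for start, end in sorted(covered):
        if start > cursor:
            gaps.append((cursor, start))
        cursor = max(cursor, end)
    if cursor < route_len:
        gaps.append((cursor, route_len))
    return gaps

def plan_and_cover(vehicle_reports, start, dest, router, dispatcher):
    """Orchestrate claim 1: route the user, find coverage gaps, dispatch an AV.

    vehicle_reports: iterable of (vehicle_id, coverage_segments) pairs, where
    coverage_segments are (start_m, end_m) intervals along the route.
    router and dispatcher stand in for the routing and AV-control services.
    """
    route = router.route(start, dest)                 # determine the first route
    covered = [seg for _, segs in vehicle_reports for seg in segs]
    gaps = uncovered_segments(route.length, covered)  # areas without sensor coverage
    if gaps:
        dispatcher.relocate_to_cover(gaps[0])         # send an AV toward a gap
    return route, gaps
```

The interval merge in `uncovered_segments` is the only nontrivial step; the rest is plumbing between the routing and dispatch services.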
2. The method of claim 1, further comprising: receiving sensor data from a plurality of vehicles in a vicinity of the first route; calculating a safety metric for the first route based at least in part on the received sensor data; and communicating to the user device the safety metric for the first route.
3. The method of claim 2, further comprising sending a safety alert message to the user device if the safety metric for the first route is below a safety threshold.
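To illustrate claims 2 and 3, the following is a minimal sketch of a safety-metric computation with a threshold alert. The report schema, the weighting scheme, and the default threshold are all hypothetical; the claims do not prescribe a particular metric:

```python
from dataclasses import dataclass

@dataclass
class SensorReport:
    """One vehicle's observation of the route vicinity (hypothetical schema)."""
    lighting: float         # 0.0 (dark) .. 1.0 (well lit)
    traffic_density: float  # 0.0 (empty) .. 1.0 (congested)
    hazard_detected: bool

def safety_metric(reports):
    """Combine per-vehicle reports into one route-level score in [0, 1].

    Hypothetical weighting: good lighting raises the score, heavy traffic
    lowers it, and any detected hazard halves a report's contribution.
    """
    if not reports:
        return 0.0  # no coverage at all: treat as unknown/unsafe
    score = 0.0
    for r in reports:
        s = 0.6 * r.lighting + 0.4 * (1.0 - r.traffic_density)
        if r.hazard_detected:
            s *= 0.5
        score += s
    return score / len(reports)

def maybe_alert(user_device, reports, threshold=0.5):
    """Communicate the metric (claim 2); alert if it falls below threshold (claim 3)."""
    metric = safety_metric(reports)
    user_device.send({"type": "safety_metric", "value": metric})
    if metric < threshold:
        user_device.send({"type": "safety_alert", "metric": metric})
    return metric
```

`user_device.send` stands in for whatever messaging channel the system uses to reach the user device.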
4. The method of claim 1, wherein the starting location is within a smart space building that contains at least one sensor measuring data used for calculating a safety metric of the first route.
5. The method of claim 1, further comprising: receiving information related to a person detected in a vicinity of the first route; determining if the received information related to the person detected in the vicinity of the first route matches an entry in a database of information related to identities of people; communicating to the user device the presence of the detected person; and communicating to the user device whether the information related to the detected person matches an entry in the database of information related to identities of people.
6. The method of claim 1, wherein sending, to at least one autonomous vehicle, information for causing the at least one autonomous vehicle to relocate to a location where sensor coverage will be provided comprises: determining a largest segment of the first route without sensor coverage; determining a coverage area for each of a plurality of sensors of a first at least one autonomous vehicle; selecting from the plurality of sensors of the first at least one autonomous vehicle the sensor with the largest coverage area; determining an optimal location of the first at least one autonomous vehicle to maximize coverage area for the selected sensor; and communicating to the first at least one autonomous vehicle to go to the determined optimal location that will maximize coverage area for the selected sensor.
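In a simplified one-dimensional model, the repositioning steps of claim 6 reduce to the following sketch. The segment and sensor representations (gaps as metre intervals, sensors as a name-to-radius mapping) are illustrative assumptions:

```python
def largest_gap(gaps):
    """Pick the longest uncovered route segment; each gap is (start_m, end_m)."""
    return max(gaps, key=lambda g: g[1] - g[0])

def best_sensor(sensors):
    """Pick the vehicle sensor with the largest coverage area.

    sensors: dict mapping sensor name -> coverage radius in metres (hypothetical).
    """
    return max(sensors, key=sensors.get)

def relocation_target(gaps, sensors):
    """Return (position, sensor): place the widest-coverage sensor at the
    midpoint of the largest gap, which maximizes the covered portion in
    this simplified 1-D model."""
    start, end = largest_gap(gaps)
    sensor = best_sensor(sensors)
    return (start + end) / 2.0, sensor
```

For example, with gaps at 0-50 m and 120-400 m and a lidar outranging a camera, the AV would be sent to the 260 m mark with the lidar selected.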
7. The method of claim 1, further comprising: determining a plurality of routes that extend from the starting location to the destination location; calculating a safety metric for each of the plurality of routes that extend from the starting location to the destination location; communicating to the user device location information for each of the plurality of routes that extend from the starting location to the destination location; communicating to the user device the safety metric for each of the plurality of routes that extend from the starting location to the destination location; and receiving from the user device an indication of a selection of the first route from the plurality of routes that extend from the starting location to the destination location.
8. The method of claim 1, further comprising: receiving from the user device an indication of the location of the user device; determining a plurality of routes that extend from the location of the user device to the destination location; calculating a safety metric for each of the plurality of routes that extend from the location of the user device to the destination location; communicating to the user device location information for each of the plurality of routes that extend from the location of the user device to the destination location; communicating to the user device the safety metric for each of the plurality of routes that extend from the location of the user device to the destination location; receiving from the user device an indication of a selection of a second route from the plurality of routes that extend from the location of the user device to the destination location; determining information regarding the second route from the location of the user device to the destination location; sending to the user device the information regarding the second route; and responsive to a determination that the user associated with the user device intends to traverse the second route from the location of the user device to the destination location, sending, to at least one autonomous vehicle, information for causing the at least one autonomous vehicle to relocate to a location where sensor coverage will be provided for at least a portion of the areas of the second route without sensor coverage from the plurality of vehicles.
9. The method of claim 1, further comprising: receiving sensor data from a plurality of vehicles in a vicinity of the first route; receiving from the user device an indication of the location of the user device; determining if sensor data received from each of the plurality of vehicles in the vicinity of the first route relates to a portion of the first route between the location of the user device and the destination location; communicating, to each of the plurality of vehicles in the vicinity of the first route that are not transmitting sensor data related to the portion of the first route between the location of the user device and the destination location, a message to stop sending sensor data related to the first route.
10. A method comprising: determining that a user associated with a user device intends to traverse a route from a received first user starting location to a first user destination location; receiving a location update from the user device; and sending, to a first autonomous vehicle, information for causing the first autonomous vehicle to transmit sensor data related to at least a portion of the route.
11. The method of claim 10, further comprising: determining that the user has traversed to a portion of the route such that the sensor data measured by the first autonomous vehicle relates only to the portion of the route already traversed by the user; and sending, to the first autonomous vehicle, information for causing the first autonomous vehicle to stop transmitting sensor data related to the route.
12. The method of claim 10, further comprising: instructing a plurality of sensor-equipped vehicles to monitor a route; and in response to identification of a segment of the route that is unmonitored, instructing a sensor-equipped autonomous vehicle to navigate to a position such that a sensor of the sensor-equipped autonomous vehicle monitors at least a portion of the identified segment.
13. The method of claim 10, further comprising: receiving a request from the user device to assign additional surveillance support; and sending to at least one autonomous vehicle, information for causing the at least one autonomous vehicle to relocate to a location where sensor coverage will be provided for at least a portion of the route without sensor coverage from a plurality of vehicles.
14. The method of claim 10, further comprising sending to the user device security tracking status information regarding the route.
15. The method of claim 10, further comprising: determining that the user associated with the user device has reached the destination location; and sending information to one or more autonomous vehicles transmitting sensor data related to at least a portion of the route to stop transmitting sensor data related to at least a portion of the route.
16. The method of claim 10, further comprising: receiving sensor data from one or more vehicles transmitting sensor data related to at least a portion of the route; determining if received sensor data relates to a previously undetected vehicle; and sending, to the user device, an update comprising information related to the previously undetected vehicle.
17. The method of claim 10, further comprising: calculating an updated safety metric for a portion of the route between the location of the user device and the first user destination location; and sending the updated safety metric to the user device.
18. The method of claim 17, further comprising sending a safety alert message to the user device if the updated safety metric for the route is below a safety threshold.
19. The method of claim 10, further comprising: receiving a location update from an autonomous vehicle regarding a user associated with a user device; determining what portion of the route remains between the user and the destination location and what portion of the route has already been traversed by the user; and sending information to one or more autonomous vehicles transmitting sensor data related to the portion of the route that has already been traversed by the user to stop transmitting sensor data related to that portion of the route.
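The pruning recited in claim 19 (and in similar form in claims 9, 11, and 15) can be sketched as follows, again under a hypothetical one-dimensional progress model in which each vehicle's coverage is an interval along the route:

```python
def split_route(route_len, user_progress):
    """Split the route at the user's progress point: (traversed, remaining) intervals."""
    return (0.0, user_progress), (user_progress, route_len)

def vehicles_to_stop(vehicles, user_progress):
    """Return IDs of vehicles whose coverage lies entirely behind the user.

    vehicles: dict mapping vehicle id -> (cover_start_m, cover_end_m) along
    the route (hypothetical). A vehicle still covering any part of the
    remaining route keeps transmitting; the rest can be told to stop.
    """
    return [vid for vid, (start, end) in vehicles.items() if end <= user_progress]
```

For example, a vehicle covering 0-100 m is released once the user passes the 100 m mark, while a vehicle covering 80-200 m keeps transmitting.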
20. A device, comprising: one or more location sensors; a processor; and a non-transitory computer-readable medium storing instructions that are operative, when executed on the processor, to perform the functions of: receiving from a plurality of vehicles, information identifying coverage areas covered by sensors of the respective vehicles; receiving, from a user, information identifying a starting location and a destination location; identifying at least one pedestrian route that extends from the starting location to the destination location, where the pedestrian route includes at least one gap in sensor coverage; sending to the user information identifying the pedestrian route; and in response to an indication that the user intends to traverse the pedestrian route, instructing at least one autonomous vehicle (AV) to relocate to a position such that a coverage area of a sensor of the AV will cover at least a portion of the gap.
PCT/US2017/025007 2016-04-05 2017-03-30 Method and system for autonomous vehicle sensor assisted selection of route with respect to dynamic route conditions WO2017176550A1 (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US201662318585P 2016-04-05 2016-04-05
US62/318,585 2016-04-05
US201662329572P 2016-04-29 2016-04-29
US62/329,572 2016-04-29

Publications (1)

Publication Number Publication Date
WO2017176550A1 2017-10-12

Family

ID=58547846

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2017/025007 WO2017176550A1 (en) 2016-04-05 2017-03-30 Method and system for autonomous vehicle sensor assisted selection of route with respect to dynamic route conditions

Country Status (1)

Country Link
WO (1) WO2017176550A1 (en)

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080312819A1 (en) * 2007-06-12 2008-12-18 Arup Banerjee Pedestrian mapping system
US20090122142A1 (en) * 2007-11-09 2009-05-14 Bruce Douglas Shapley Distributed mobile surveillance system and method
WO2011116476A1 (en) * 2010-03-26 2011-09-29 Feeling Software Inc. Effortless navigation across cameras and cooperative control of cameras
US8509982B2 (en) 2010-10-05 2013-08-13 Google Inc. Zone driving
US8688306B1 (en) 2010-10-05 2014-04-01 Google Inc. Systems and methods for vehicles with limited destination ability
US20150353080A1 (en) 2014-06-06 2015-12-10 Toyota Jidosha Kabushiki Kaisha Automatic parking system
US20150367845A1 (en) 2014-06-19 2015-12-24 Toyota Jidosha Kabushiki Kaisha Parking assist apparatus
US20160125736A1 (en) 2014-10-31 2016-05-05 Toyota Motor Engineering & Manufacturing North America, Inc. Method to improve parking space identification in autonomous driving
US20160231746A1 (en) 2015-02-06 2016-08-11 Delphi Technologies, Inc. System And Method To Operate An Automated Vehicle

Cited By (50)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2019152662A1 (en) * 2018-01-31 2019-08-08 Walmart Apollo, Llc System and method for identifying vehicle delivery locations utilizing scout autonomous vehicles
WO2019152249A1 (en) * 2018-02-02 2019-08-08 Walmart Apollo, Llc Systems and methods for managing last mile deliveries
CN110276463A (en) * 2018-03-14 2019-09-24 丰田自动车株式会社 Information processing system and server
US10896553B2 (en) 2018-03-28 2021-01-19 The Boeing Company Vehicle anomalous behavior detection
EP3547228A1 (en) * 2018-03-28 2019-10-02 The Boeing Company Vehicle anomalous behavior detection
CN110334367A (en) * 2018-03-28 2019-10-15 The Boeing Company Vehicle anomalous behavior detection
WO2020052465A1 (en) * 2018-09-14 2020-03-19 阿里巴巴集团控股有限公司 Road condition information processing system, method, and apparatus
US11174022B2 (en) * 2018-09-17 2021-11-16 International Business Machines Corporation Smart device for personalized temperature control
WO2020064080A1 (en) * 2018-09-24 2020-04-02 Telefonaktiebolaget Lm Ericsson (Publ) Method and apparatus for capturing road conditions
CN109358351A (en) * 2018-09-30 2019-02-19 南京联创北斗技术应用研究院有限公司 Dynamic vehicle-distance measurement method for curved paths based on BeiDou positioning
CN112912694A (en) * 2018-10-18 2021-06-04 传鼎有限公司 Automatically pairing GPS data with a planned travel route for a mobile object
US11215462B2 (en) 2018-10-26 2022-01-04 Here Global B.V. Method, apparatus, and system for location correction based on feature point correspondence
US20210380096A1 (en) * 2018-11-08 2021-12-09 Hitachi Astemo, Ltd. Vehicle control device, route distribution device, vehicle guidance system
US11346685B2 (en) 2018-11-09 2022-05-31 Toyota Motor North America, Inc. Parking exit coordination systems and methods
US10885785B2 (en) 2018-12-04 2021-01-05 At&T Intellectual Property I, L.P. Network-controllable physical resources for vehicular transport system safety
US11821736B2 (en) 2018-12-28 2023-11-21 Faurecia Clarion Electronics Co., Ltd. Method for selecting route, terminal, system for selecting route, and program
WO2020137806A1 (en) * 2018-12-28 2020-07-02 Clarion Co., Ltd. Method for selecting route, terminal, system for selecting route, and program
US11968603B2 (en) 2019-01-10 2024-04-23 Lg Electronics Inc. Device and method for V2X communication
WO2020145438A1 (en) * 2019-01-10 2020-07-16 엘지전자 주식회사 Device and method for v2x communication
CN109936622A (en) * 2019-01-29 2019-06-25 华南理工大学 Unmanned aerial vehicle cluster control method and system based on distributed resource sharing
CN109936622B (en) * 2019-01-29 2021-08-06 华南理工大学 Unmanned aerial vehicle cluster control method and system based on distributed resource sharing
CN113728210A (en) * 2019-02-11 2021-11-30 特斯拉公司 Autonomous and user-controlled vehicle summons to targets
CN111559383B (en) * 2019-02-13 2023-12-05 通用汽车环球科技运作有限责任公司 Method and system for determining Autonomous Vehicle (AV) action based on vehicle and edge sensor data
CN111559383A (en) * 2019-02-13 2020-08-21 通用汽车环球科技运作有限责任公司 Method and system for determining Autonomous Vehicle (AV) motion based on vehicle and edge sensor data
CN111619551A (en) * 2019-02-28 2020-09-04 本田技研工业株式会社 Vehicle control system, vehicle control method, and storage medium
CN111619551B (en) * 2019-02-28 2023-07-07 本田技研工业株式会社 Vehicle control system, vehicle control method, and storage medium
US10957196B2 (en) 2019-04-03 2021-03-23 International Business Machines Corporation Traffic redirection for autonomous vehicles
CN113015887A (en) * 2019-10-15 2021-06-22 谷歌有限责任公司 Navigation directions based on weather and road surface type
US11867519B2 (en) 2019-10-15 2024-01-09 Google Llc Weather and road surface type-based navigation directions
WO2021076099A1 (en) * 2019-10-15 2021-04-22 Google Llc Weather and road surface type-based navigation directions
WO2021102031A1 (en) * 2019-11-18 2021-05-27 Sidewalk Labs LLC Methods, systems, and media for generating and evaluating street grids
US11955000B2 (en) 2019-11-18 2024-04-09 Google Llc Methods, systems, and media for generating and evaluating street grids
CN112987712B (en) * 2019-12-13 2022-05-17 苏州宝时得电动工具有限公司 Autonomous robot, wireless charging docking method and device thereof, and storage medium
WO2021114987A1 (en) * 2019-12-13 2021-06-17 苏州宝时得电动工具有限公司 Autonomous robot, wireless charging and docking method and apparatus therefor, and storage medium
CN112987712A (en) * 2019-12-13 2021-06-18 苏州宝时得电动工具有限公司 Autonomous robot, wireless charging docking method and device thereof, and storage medium
CN111861007A (en) * 2020-07-23 2020-10-30 上海中通吉网络技术有限公司 Express processing method, device and equipment integrating takeout platform and express platform
EP3945394A1 (en) * 2020-07-29 2022-02-02 HERE Global B.V. System/method for indoor vehicle collision prevention
US11885639B2 (en) * 2020-08-10 2024-01-30 Waymo Llc Generating scouting objectives
US11458993B2 (en) * 2020-09-15 2022-10-04 Tusimple, Inc. Detecting a road closure by a lead autonomous vehicle (AV) and updating routing plans for following AVs
DE102020130406A1 (en) 2020-11-18 2022-05-19 Valeo Schalter Und Sensoren Gmbh METHOD AND CONTROL DEVICE FOR CONTROLLING AN AUTONOMOUS VEHICLE
US20220319308A1 (en) * 2021-03-31 2022-10-06 Honda Motor Co., Ltd. Smart traffic assistant systems and methods
WO2023039896A1 (en) * 2021-09-18 2023-03-23 北京小米移动软件有限公司 Wireless sensing method and apparatus, communication device, and storage medium
CN113706737A (en) * 2021-10-27 2021-11-26 北京主线科技有限公司 Road surface inspection system and method based on automatic driving vehicle
CN113706737B (en) * 2021-10-27 2022-01-07 北京主线科技有限公司 Road surface inspection system and method based on automatic driving vehicle
US20230184561A1 (en) * 2021-12-10 2023-06-15 Ford Global Technologies, Llc Systems and methods for providing a monitoring service for a pedestrian
CN114577229B (en) * 2022-01-28 2024-03-12 广州小鹏自动驾驶科技有限公司 Parking route filtering method and device, electronic equipment and storage medium
CN114577229A (en) * 2022-01-28 2022-06-03 广州小鹏自动驾驶科技有限公司 Parking route filtering method and device, electronic equipment and storage medium
WO2024019867A1 (en) * 2022-07-20 2024-01-25 Qualcomm Incorporated Handling privileges while changing geographical regions
CN116321628B (en) * 2023-05-09 2023-08-08 永林电子股份有限公司 Intelligent LED lighting control system for shopping malls
CN116321628A (en) * 2023-05-09 2023-06-23 永林电子股份有限公司 Intelligent LED lighting control system for shopping malls

Similar Documents

Publication Publication Date Title
WO2017176550A1 (en) Method and system for autonomous vehicle sensor assisted selection of route with respect to dynamic route conditions
US20230060762A1 (en) Methods for executing autonomous rideshare requests
US11599123B2 (en) Systems and methods for controlling autonomous vehicles that provide a vehicle service to users
US10395285B2 (en) Selecting vehicle type for providing transport
US9672734B1 (en) Traffic aware lane determination for human driver and autonomous vehicle driving system
JP6857728B2 (en) How to drive autonomously at uncontrolled and controlled intersections
RU2761270C2 (en) System and method for providing transportation
US9494439B1 (en) Autonomous vehicle operated with guide assistance of human driven vehicles
US9983020B2 (en) Vehicle operation device and method
US10962372B1 (en) Navigational routes for autonomous vehicles
US11543824B2 (en) Queueing into pickup and drop-off locations
CN111052198A (en) Parked object detection system
US11884264B2 (en) Driveway maneuvers for autonomous vehicles
US11782439B2 (en) Determining routes for autonomous vehicles
US11651693B2 (en) Passenger walking points in pick-up/drop-off zones
US20210095978A1 (en) Autonomous Navigation for Light Electric Vehicle Repositioning
US20230324192A1 (en) Determining pickup and drop off locations for large venue points of interests
WO2019198449A1 (en) Information provision system, mobile terminal, information provision device, information provision method, and computer program
JP7350501B2 (en) Walking route information presentation system, server, terminal, and walking route information presentation method
Chou et al. CaNPAs: a campus navigation and parking assistant system
WO2020028206A1 (en) Self-driving vehicle systems and methods

Legal Events

NENP: Non-entry into the national phase (Ref country code: DE)

121 EP: the EPO has been informed by WIPO that EP was designated in this application (Ref document number: 17717593; Country of ref document: EP; Kind code of ref document: A1)

122 EP: PCT application non-entry in the European phase (Ref document number: 17717593; Country of ref document: EP; Kind code of ref document: A1)