US20180260635A1 - Intention recognition - Google Patents
Intention recognition
- Publication number
- US20180260635A1 (U.S. application Ser. No. 15/758,330)
- Authority
- US
- United States
- Prior art keywords
- dynamic element
- environment
- dynamic
- vehicle
- contextual cues
- Prior art date
- 2015-09-08
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
- G06V20/58—Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W30/00—Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
- B60W30/08—Active safety systems predicting or avoiding probable or impending collision or attempting to minimise its consequences
- B60W30/095—Predicting travel path or likelihood of collision
- B60W30/0956—Predicting travel path or likelihood of collision the prediction being responsive to traffic or environmental parameters
-
- G06K9/00805—
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W30/00—Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
- B60W30/08—Active safety systems predicting or avoiding probable or impending collision or attempting to minimise its consequences
- B60W30/095—Predicting travel path or likelihood of collision
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0212—Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory
- G05D1/0214—Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory in accordance with safety or protection criteria, e.g. avoiding hazardous areas
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0231—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
- G05D1/0246—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0257—Control of position or course in two dimensions specially adapted to land vehicles using a radar
-
- G06K9/00342—
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/20—Movements or behaviour, e.g. gesture recognition
- G06V40/23—Recognition of whole body movements, e.g. for sport training
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/005—Traffic control systems for road vehicles including pedestrian guidance indicator
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/09—Arrangements for giving variable traffic instructions
- G08G1/0962—Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
- G08G1/09626—Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages where the origin of the information is within the own vehicle, e.g. a local storage device, digital map
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/09—Arrangements for giving variable traffic instructions
- G08G1/0962—Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
- G08G1/0967—Systems involving transmission of highway information, e.g. weather, speed limits
- G08G1/096708—Systems involving transmission of highway information, e.g. weather, speed limits where the received information might be used to generate an automatic action on the vehicle control
- G08G1/096716—Systems involving transmission of highway information, e.g. weather, speed limits where the received information might be used to generate an automatic action on the vehicle control where the received information does not generate an automatic action on the vehicle control
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/09—Arrangements for giving variable traffic instructions
- G08G1/0962—Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
- G08G1/0967—Systems involving transmission of highway information, e.g. weather, speed limits
- G08G1/096733—Systems involving transmission of highway information, e.g. weather, speed limits where a selection of the information might take place
- G08G1/096741—Systems involving transmission of highway information, e.g. weather, speed limits where a selection of the information might take place where the source of the transmitted information selects which information to transmit to each vehicle
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/09—Arrangements for giving variable traffic instructions
- G08G1/0962—Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
- G08G1/0967—Systems involving transmission of highway information, e.g. weather, speed limits
- G08G1/096766—Systems involving transmission of highway information, e.g. weather, speed limits where the system is characterised by the origin of the information transmission
- G08G1/096791—Systems involving transmission of highway information, e.g. weather, speed limits where the system is characterised by the origin of the information transmission where the origin of the information is another vehicle
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/16—Anti-collision systems
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/16—Anti-collision systems
- G08G1/166—Anti-collision systems for active traffic, e.g. moving vehicles, pedestrians, bikes
Definitions
- Entities responsible for the collection, analysis, disclosure, transfer, storage, or other use of such personal data should comply with established privacy policies and/or practices. Such entities should safeguard and secure access to such personal data and ensure that others with access to the personal data also comply. Such entities should implement privacy policies and practices that meet or exceed industry or governmental requirements for maintaining the privacy and security of personal data. For example, an entity should collect users' personal data for legitimate and reasonable uses, and not share or sell the data outside of those legitimate uses. Such collection should occur only after receiving the users' informed consent. Furthermore, third parties can evaluate these entities to certify their adherence to established privacy policies and practices.
- Driving control module 112 can determine a driving route based at least in part upon at least some sensor data generated by one or more sensor devices 116, including position data indicating a geographic position of the vehicle 100, and a world model stored in one or more memory storage devices included in the vehicle 100, one or more remotely-located systems external to the vehicle 100, etc.
- processing sensor data generated by sensor devices included in the vehicle can include identifying the pedestrian as a particular dynamic element in the environment, where the identification can include determining a particular position of the pedestrian in the environment, a particular orientation of the pedestrian, and a particular present velocity, acceleration, etc. of the pedestrian.
- Identifying 506 contextual cues 507B associated with the pedestrian can include identifying situational contextual cues which include a cue indicating that the pedestrian is moving along a sidewalk with a general direction of travel which extends along the sidewalk, a cue indicating that the pedestrian is moving along the sidewalk within a certain threshold velocity associated with jogging activities, a cue indicating that a field of view of the pedestrian includes the sidewalk and the general direction of pedestrian travel, etc.
- a dynamic element intention is generated, where the intention specifies occurrence of the motion of the dynamic element, monitored at 508, subsequent to identification, at 506, of the contextual cues associated with the dynamic element. For example, where a pedestrian is monitored, at 508, to continue moving along a sidewalk subsequent to identification of cues at 506 which indicate that the pedestrian is moving within a velocity window associated with jogging activities, is wearing jogging attire, and has a field of view which includes the sidewalk and the direction of travel, an intention generated at 510 can specify that the pedestrian is predicted to continue moving along the sidewalk subsequent to identification of those cues.
- Targeted signal 750A is generated by a particular set of signal generators 713A included in vehicle 710 and is directed towards vehicle 720 along a particular axis 752A and angle 751A of transmission, such that signal 750A passes through a limited portion of environment 700 in which vehicle 720 passes, so that the signal 750A is received by a limited portion of the elements 720, 730 included in the environment 700.
- ANS 712 can determine a particular axis 752A and angle 751A of a targeted signal to direct to element 720 and can select a particular configuration of signal generators 713A which can generate and transmit the signal 750A along the particular axis 752A and angle 751A.
- the ANS 712 can determine the axis 752A and angle 751A of the signal 750A based on identification of a size, position, velocity, acceleration, etc. of the dynamic element 720 through the environment, a predicted trajectory of the element 720 through the environment, some combination thereof, etc.
- the signal 750A can include information which includes a message which, when received, communicates one or more signals, alerts, messages, etc. to element 720.
- signal 750A can include a message which communicates, to element 720, information regarding traffic conditions, particular dynamic elements, events, etc. associated with one or more portions of environment 700.
- Signal 750A can include information indicating an occurrence of stopped vehicles in a portion of the environment 700 through which vehicle 710 has previously navigated.
- Signal 750A can include information which communicates, to vehicle 720, a driving route along which vehicle 710 is being navigated by ANS 712.
- Signal 750A can include a warning signal.
- one or more of the targeted signals 750A-B includes a visual signal.
- the targeted signal 750A can be generated by one or more visual signal generators 713A, including one or more lights, included in the vehicle 710, where a particular set of visual indicators are activated to generate a particular visual signal 750A which includes one or more instances of information.
- the visual indicators 713A can include at least a portion of one or more headlight assemblies included in the vehicle 710, where a portion of the headlight assemblies can be adjusted to direct a light beam having a particular axis 752A and angle 751A to a particular dynamic element 720 in the environment 700.
- the headlight assemblies can be adjusted to provide a visual signal 750A which includes a variable-intensity beam of light, including a series of light beam pulses, etc.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Automation & Control Theory (AREA)
- Atmospheric Sciences (AREA)
- Life Sciences & Earth Sciences (AREA)
- Remote Sensing (AREA)
- Radar, Positioning & Navigation (AREA)
- Aviation & Aerospace Engineering (AREA)
- Mechanical Engineering (AREA)
- Transportation (AREA)
- Multimedia (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Theoretical Computer Science (AREA)
- Electromagnetism (AREA)
- Health & Medical Sciences (AREA)
- Human Computer Interaction (AREA)
- Social Psychology (AREA)
- Psychiatry (AREA)
- General Health & Medical Sciences (AREA)
- Traffic Control Systems (AREA)
- Navigation (AREA)
Abstract
Some embodiments provide an autonomous navigation system which autonomously navigates a vehicle through an environment based on predicted trajectories of one or more separate dynamic elements through the environment. The system identifies contextual cues associated with a monitored dynamic element, based on features of the dynamic element and actions of the dynamic element relative to various elements of the environment, including motions relative to other dynamic elements. A monitored dynamic element can be associated with a particular intention, which specifies a prediction of dynamic element movement through the environment, based on a correlation between identified contextual cues associated with the monitored dynamic element and a set of contextual cues which are associated with the particular intention. A predicted trajectory of the dynamic element is generated based on an associated intention. A targeted signal can be directed to a target dynamic element based on a predicted trajectory of the dynamic element.
Description
- This application is a 371 of PCT Application No. PCT/US2016/050621, filed Sep. 8, 2016, which claims benefit of priority to U.S. Provisional Patent Application No. 62/215,672, filed Sep. 8, 2015. The above applications are incorporated herein by reference. To the extent that any material in the incorporated application conflicts with material expressly set forth herein, the material expressly set forth herein controls.
- This disclosure relates generally to autonomous navigation of a vehicle, and in particular to an autonomous navigation system which can be included in a vehicle and which navigates the vehicle in an environment which includes various dynamic elements, predicting the trajectories of those dynamic elements by recognizing contextual cues associated with the elements and the environment.
- The rise of interest in autonomous navigation of vehicles, including automobiles, has resulted in a desire to develop autonomous navigation systems which can autonomously navigate (i.e., autonomously “drive”) a vehicle through various routes, including one or more roads in a road network, such as contemporary roads, streets, highways, etc.
- In some cases, autonomous navigation is enabled via an autonomous navigation system (ANS) which can process and respond to detection of various elements in an external environment, including static features (e.g., roadway lanes, road signs, etc.) and dynamic features (present locations of other vehicles in a roadway on which the route extends, present locations of pedestrians, present environmental conditions, roadway obstructions, etc.) along a route in real-time as they are encountered, thereby replicating the real-time processing and driving capabilities of a human being.
- In some cases, autonomous navigation includes navigating a vehicle in response to detection of one or more traffic participants in the environment through which the vehicle is being navigated. For example, where another vehicle is detected ahead of the navigated vehicle and is determined to be moving slower than the navigated vehicle, such that the navigated vehicle is approaching the other vehicle, the navigated vehicle can be slowed or stopped to prevent the vehicle paths intersecting. In another example, where a pedestrian is identified near an edge of the roadway along which the vehicle is being navigated, the vehicle can be slowed or stopped in response to detection of the pedestrian.
- Some embodiments provide an autonomous navigation system which autonomously navigates a vehicle through an environment based on predicted trajectories of one or more separate dynamic elements through the environment. The system identifies contextual cues associated with a monitored dynamic element, based on features of the dynamic element and actions of the dynamic element relative to various elements of the environment, including motions relative to other dynamic elements. A monitored dynamic element can be associated with a particular intention, which specifies a prediction of dynamic element movement through the environment, based on a correlation between identified contextual cues associated with the monitored dynamic element and a set of contextual cues which are associated with the particular intention. A predicted trajectory of the dynamic element is generated based on an associated intention. A targeted signal can be directed to a target dynamic element based on a predicted trajectory of the dynamic element.
- Some embodiments provide an apparatus which includes an autonomous navigation system which can be installed in a vehicle and can autonomously navigate the vehicle through an environment in which the vehicle is located. The autonomous navigation system is configured to identify a set of contextual cues associated with a dynamic element located in the environment, wherein each contextual cue indicates one or more particular features associated with the dynamic element, based on monitoring at least a portion of the environment; associate the dynamic element with a particular set of predicted motions, based on a determination of a correlation between the identified set of contextual cues and a predetermined set of contextual cues which are associated with the particular set of predicted motions; generate a predicted trajectory of the dynamic element through the environment based on the particular set of predicted motions associated with the dynamic element; and generate a set of control commands which, when executed by one or more control elements installed in the vehicle, cause the vehicle to be navigated along a driving route which avoids intersection with the predicted trajectory of the dynamic element.
- Some embodiments provide a method which includes identifying a set of contextual cues associated with a dynamic element located in the environment, wherein each contextual cue indicates one or more particular features associated with the dynamic element, based on monitoring at least a portion of the environment; associating the dynamic element with a particular set of predicted motions, based on a determination of a correlation between the identified set of contextual cues and a predetermined set of contextual cues which are associated with the particular set of predicted motions; generating a predicted trajectory of the dynamic element through the environment based on the particular set of predicted motions associated with the dynamic element; and generating a set of control commands which, when executed by one or more control elements installed in the vehicle, cause the vehicle to be navigated along a driving route which avoids intersection with the predicted trajectory of the dynamic element.
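As a purely illustrative sketch of the claimed steps (the patent does not prescribe data structures or a matching algorithm; every name and the toy cue vocabulary below are assumptions), the cue-to-intention matching and intention-based trajectory prediction could look like this in Python:

```python
from dataclasses import dataclass


@dataclass
class DynamicElement:
    position: tuple      # (x, y) in metres, relative to the vehicle
    velocity: tuple      # (vx, vy) in m/s
    cues: frozenset      # contextual cues identified from sensor data


def match_intention(cues, associations):
    """Return the predicted-motion label whose predetermined cue set
    best overlaps the identified cues (None if nothing correlates)."""
    best, best_overlap = None, 0
    for predetermined_cues, predicted_motion in associations:
        overlap = len(cues & predetermined_cues)
        if overlap > best_overlap:
            best, best_overlap = predicted_motion, overlap
    return best


def predict_trajectory(element, intention, horizon_s=3.0, dt=0.5):
    """Sample predicted future positions; a 'stop_at_curb' intention
    freezes the element instead of extrapolating its present motion."""
    (x, y), (vx, vy) = element.position, element.velocity
    scale = 0.0 if intention == "stop_at_curb" else 1.0
    steps = int(horizon_s / dt)
    return [(x + vx * scale * dt * i, y + vy * scale * dt * i)
            for i in range(1, steps + 1)]


# Hypothetical association table; a pedestrian nearing a crosswalk.
associations = [
    (frozenset({"near_crosswalk", "decelerating"}), "stop_at_curb"),
    (frozenset({"in_crosswalk"}), "continue_crossing"),
]
ped = DynamicElement((12.0, -2.0), (0.0, 1.4),
                     frozenset({"near_crosswalk", "decelerating"}))
print(predict_trajectory(ped, match_intention(ped.cues, associations)))
```

Note how the matched intention overrides simple extrapolation of the pedestrian's present velocity, which is the distinction the detailed description draws below.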
- FIG. 1 illustrates a schematic block diagram of a vehicle which comprises an autonomous navigation system (ANS), according to some embodiments.
- FIG. 2 illustrates an overhead view of an environment in which multiple dynamic elements are located, including a vehicle which is autonomously navigated through the environment and includes a set of sensors which can generate sensor data associated with the dynamic elements and an autonomous navigation system which can determine a particular predicted trajectory of a dynamic element based on contextual cues identified in the sensor data, according to some embodiments.
- FIG. 3 illustrates an overhead view of an environment in which multiple dynamic elements are located, including a vehicle which is autonomously navigated through the environment and includes a set of sensors which can generate sensor data associated with the dynamic elements and an autonomous navigation system which can determine a particular predicted trajectory of a dynamic element based on contextual cues identified in the sensor data, according to some embodiments.
- FIG. 4 illustrates an overhead view of an environment in which multiple dynamic elements are located, including a vehicle which is autonomously navigated through the environment and includes a set of sensors which can generate sensor data associated with the dynamic elements and an autonomous navigation system which can determine a particular predicted trajectory of a dynamic element based on contextual cues identified in the sensor data, according to some embodiments.
- FIG. 5 illustrates generating an intention association between identified contextual cues and subsequent predicted dynamic element movements, according to some embodiments.
- FIG. 6 illustrates autonomously navigating a vehicle through an external environment based on a predicted trajectory of a dynamic element in the external environment, according to some embodiments.
- FIG. 7 illustrates an overhead view of an environment in which multiple dynamic elements are located, including a vehicle which is autonomously navigated through the environment and includes a set of indicators which can generate a set of targeted signals which are transmitted to specific dynamic elements in the external environment, according to some embodiments.
- FIG. 8 illustrates generating targeted signals which are directed to particular target dynamic elements, according to some embodiments.
- FIG. 9 illustrates an example computer system configured to implement aspects of a system and method for autonomous navigation, according to some embodiments.
- This specification includes references to "one embodiment" or "an embodiment." The appearances of the phrases "in one embodiment" or "in an embodiment" do not necessarily refer to the same embodiment. Particular features, structures, or characteristics may be combined in any suitable manner consistent with this disclosure.
- “Comprising.” This term is open-ended. As used in the appended claims, this term does not foreclose additional structure or steps. Consider a claim that recites: “An apparatus comprising one or more processor units . . . .” Such a claim does not foreclose the apparatus from including additional components (e.g., a network interface unit, graphics circuitry, etc.).
- “Configured To.” Various units, circuits, or other components may be described or claimed as “configured to” perform a task or tasks. In such contexts, “configured to” is used to connote structure by indicating that the units/circuits/components include structure (e.g., circuitry) that performs those task or tasks during operation. As such, the unit/circuit/component can be said to be configured to perform the task even when the specified unit/circuit/component is not currently operational (e.g., is not on). The units/circuits/components used with the “configured to” language include hardware—for example, circuits, memory storing program instructions executable to implement the operation, etc. Reciting that a unit/circuit/component is “configured to” perform one or more tasks is expressly intended not to invoke 35 U.S.C. § 112, sixth paragraph, for that unit/circuit/component. Additionally, “configured to” can include generic structure (e.g., generic circuitry) that is manipulated by software and/or firmware (e.g., an FPGA or a general-purpose processor executing software) to operate in manner that is capable of performing the task(s) at issue. “Configure to” may also include adapting a manufacturing process (e.g., a semiconductor fabrication facility) to fabricate devices (e.g., integrated circuits) that are adapted to implement or perform one or more tasks.
- “First,” “Second,” etc. As used herein, these terms are used as labels for nouns that they precede, and do not imply any type of ordering (e.g., spatial, temporal, logical, etc.). For example, a buffer circuit may be described herein as performing write operations for “first” and “second” values. The terms “first” and “second” do not necessarily imply that the first value must be written before the second value.
- “Based On.” As used herein, this term is used to describe one or more factors that affect a determination. This term does not foreclose additional factors that may affect a determination. That is, a determination may be solely based on those factors or based, at least in part, on those factors. Consider the phrase “determine A based on B.” While in this case, B is a factor that affects the determination of A, such a phrase does not foreclose the determination of A from also being based on C. In other instances, A may be determined based solely on B.
- Some embodiments include one or more vehicles in which an autonomous navigation system ("ANS") is included, where the ANS identifies dynamic elements in a common external environment in which the one or more vehicles are located, predicts a trajectory of certain dynamic elements based on contextual cues associated with one or more of the dynamic elements, and autonomously navigates the vehicle, based on the predicted trajectories of the dynamic elements, along a driving route which avoids intersections with the trajectories of the dynamic elements. Autonomously navigating a vehicle along such a driving route includes navigating the vehicle so that the paths of the one or more vehicles do not intersect with the paths of the one or more dynamic elements.
- In some embodiments, the ANS autonomously navigates the vehicle along a driving route via generation of control commands associated with various control elements of the vehicle, where the control commands, when received at the associated control elements, cause the control elements to navigate the vehicle along the driving route.
- In some embodiments, the ANS generates a driving route through an external environment based at least in part upon various static elements and dynamic elements included in the external environment. Static elements can include roadway features, including roadway lanes, curbs, etc., traffic signs and traffic signals, flora, artificial structures, inanimate objects, etc. Dynamic elements can include a time of day, local weather conditions, fauna, traffic participants, etc. in the external environment. Traffic participants can include vehicles, pedestrians, some combination thereof, etc. located in the external environment, including traffic participants located proximate to or in the roadway along which the vehicle is located.
- The ANS, in some embodiments, generates a driving route based on various detected static elements and dynamic elements in the external environment, where the driving route includes a route, via which the vehicle can be navigated through the external environment, which avoids intersection of the vehicle with one or more static elements, dynamic elements, etc. located in the external environment. For example, a driving route through an external environment can include a route which avoids intersection with a static obstacle in the roadway along which the vehicle is being navigated, a traffic participant which includes another vehicle navigating along the roadway in an opposite direction of travel relative to the vehicle, etc.
- In some embodiments, the ANS, to generate a driving route which avoids intersection of the vehicle with various dynamic elements in an external environment, generates predicted trajectories of one or more dynamic elements, including traffic participants, through at least a portion of the external environment. The ANS can generate, for a given dynamic element, including a pedestrian, a predicted trajectory of the dynamic element through the external environment for a particular future amount of elapsed time. A predicted trajectory can include a predicted position, velocity, acceleration, etc. of the dynamic element through the external environment at one or more future points in time. As a result, based on the predicted trajectory of the dynamic element, the ANS can predict a future position, velocity, acceleration, etc. of the dynamic element in the external environment at various future points in time. The ANS can generate a driving route along which the ANS can navigate a vehicle, where the driving route avoids intersection with the dynamic element at any of the future points in time, based on the predicted trajectory of the dynamic element.
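A minimal sketch of such an intersection check, assuming both trajectories are sampled at matching future time points and a fixed clearance distance (all of which are assumptions, not details from the disclosure):

```python
import math


def routes_conflict(route_points, element_points, clearance_m=2.0):
    """Flag a conflict if the vehicle and a dynamic element, sampled at
    the same future time points, ever come within `clearance_m` metres.
    Both arguments are equal-length lists of (x, y) positions."""
    return any(math.hypot(vx - ex, vy - ey) < clearance_m
               for (vx, vy), (ex, ey) in zip(route_points, element_points))


# Vehicle at 8 m/s along the lane centre; pedestrian predicted to hold
# position on the sidewalk, 3 m laterally clear of the lane.
vehicle = [(8.0 * t, 0.0) for t in (0.5, 1.0, 1.5, 2.0)]
pedestrian = [(12.0, -3.0)] * 4
print(routes_conflict(vehicle, pedestrian))   # False: route can proceed
```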
- In some embodiments, the ANS generates a predicted trajectory of a dynamic element through an external environment based on identifying various contextual cues associated with the dynamic element and further identifying an association between the identified contextual cues and a particular dynamic element intention which can be associated with the dynamic element. The various contextual cues can be identified based on processing sensor data, generated by various sensor devices, which includes information associated with various portions of the external environment, dynamic elements, static elements, etc. The association can include a predetermined association between a set of contextual cues and one or more particular dynamic element intentions, and identifying the association can be based on determining a correlation between the identified contextual cues and at least some of the contextual cues included in the association. Based on the correlation, the ANS can identify the one or more dynamic element intentions included in the association and can associate the identified intention with the one or more dynamic elements with which the correlated cues are associated. An intention specifies one or more predicted future motions of an associated dynamic element. The ANS can generate a particular predicted trajectory of the dynamic element based at least in part upon the associated intention.
- In some embodiments, an ANS generates an association between a dynamic element intention and a set of contextual cues based on monitoring motions of a dynamic element in an external environment, which can include identifying various contextual cues associated with the dynamic element, tracking movement of the dynamic element through the environment, generating an intention which specifies the tracked movement of the dynamic element, and generating an association between the generated intention and a set of identified contextual cues associated with the dynamic element.
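One hypothetical way to represent such a learned association (the names and the initial confidence value are invented for illustration; see the confidence discussion that follows):

```python
from dataclasses import dataclass


@dataclass
class IntentionAssociation:
    """Binds a set of identified contextual cues to the motion the
    dynamic element was subsequently tracked performing."""
    cues: frozenset
    predicted_motion: str
    confidence: float = 0.5   # starting value; adjusted over time


def generate_association(identified_cues, tracked_motion):
    """Create an association from cues identified for a dynamic element
    and the movement then observed, per the monitoring flow above."""
    return IntentionAssociation(frozenset(identified_cues), tracked_motion)


assoc = generate_association(
    {"on_sidewalk", "jogging_speed_band", "fov_includes_travel_direction"},
    "continue_along_sidewalk",
)
print(assoc)
```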
- The ANS can revise, refine, change, etc. an association over time based on subsequent monitoring of one or more dynamic elements in one or more external environments. In some embodiments, the ANS associates a confidence value with a particular association and selectively enables the association to be used, in predicting dynamic element trajectories and generating driving routes, based on a determination that the confidence value associated with the association exceeds a predetermined threshold value. The confidence value can be adjusted based on repeated verifications that a dynamic element's actual movements through an environment at least partially correlate to the movements predicted by an intention which is associated with identified contextual cues associated with the dynamic element.
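A minimal sketch of that confidence mechanism, assuming arbitrary reward, penalty, and threshold constants that the disclosure does not specify:

```python
class Association:
    """Stand-in for the IntentionAssociation sketched above."""
    def __init__(self, confidence):
        self.confidence = confidence


def update_confidence(assoc, predicted_motion, observed_motion,
                      reward=0.05, penalty=0.10, enable_threshold=0.8):
    """Nudge confidence after each verification; report whether the
    association may be used for trajectory prediction."""
    if observed_motion == predicted_motion:
        assoc.confidence = min(1.0, assoc.confidence + reward)
    else:
        assoc.confidence = max(0.0, assoc.confidence - penalty)
    return assoc.confidence > enable_threshold


a = Association(0.78)
print(update_confidence(a, "continue_along_sidewalk",
                        "continue_along_sidewalk"))   # True: 0.83 > 0.8
```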
- Autonomously navigating a vehicle through an environment based on dynamic element trajectories which are predicted based on contextual cues and associations between the contextual cues and the predicted trajectories can provide augmented navigation relative to autonomous navigation systems which predict future trajectories of dynamic elements based on tracked prior and present movement of the dynamic elements through the environment. For example, where a dynamic element which includes a pedestrian is observed, via sensor data generated by a vehicle sensor device, to approach a crosswalk across a roadway ahead of the vehicle at a particular rate of speed, the ANS can determine, based on various contextual cues identified from at least the sensor data, that the predicted trajectory of the pedestrian which is associated with the cues includes a trajectory which includes the pedestrian stopping at the edge of the crosswalk rather than continuing through the crosswalk at the observed particular rate of speed. As a result, the ANS can generate a driving route which causes the vehicle to be navigated through the crosswalk without decelerating, thereby providing augmented navigation control relative to a system which predicts the future trajectory of the pedestrian based on the pedestrian's movement towards the crosswalk, as such a system may predict that the pedestrian would continue into the roadway, through the crosswalk, without stopping and may thus command the vehicle to stop. Because the ANS predicts dynamic element trajectories based on contextual cues and associated intentions, rather than extrapolating present motion into future motion, the ANS can provide improved prediction of complex dynamic element trajectories and improved safety and navigation of the vehicle, based on navigating the vehicle based on the improved prediction of dynamic element motion through the external environment.
- In some embodiments, the ANS generates one or more targeted signals which are transmitted through the external environment to one or more targeted elements in the external environment. The one or more targeted signals can include information, also referred to herein as “content”, “signal content”, etc., which is included in the targeted signal based on the element to which the targeted signal is transmitted. A targeted signal can include a signal which is directed to a particular “target” dynamic element in the environment and comprises a signal axis and angle which is focused on the target dynamic element so that the recipients of the targeted signal are at least partially restricted to the target dynamic element. In addition, in some embodiments, the targeted signal includes content which is particularly associated with the target dynamic element, relative to other dynamic elements in the environment. In some embodiments, the content is associated with a present state of the dynamic element, a predicted trajectory of the dynamic element, some combination thereof, etc. In some embodiments, the content comprises a message.
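The signal axis and angle could, for illustration, be derived from simple plane geometry; this sketch assumes 2-D vehicle-relative coordinates and is not taken from the patent:

```python
import math


def signal_axis_and_angle(vehicle_xy, target_xy, target_width_m):
    """Return (axis_deg, angle_deg): the bearing from vehicle to target
    and the beam angle that just spans the target's width at range."""
    dx = target_xy[0] - vehicle_xy[0]
    dy = target_xy[1] - vehicle_xy[1]
    distance = math.hypot(dx, dy)
    axis_deg = math.degrees(math.atan2(dy, dx))
    angle_deg = 2.0 * math.degrees(math.atan2(target_width_m / 2.0, distance))
    return axis_deg, angle_deg


# A pedestrian ~0.6 m wide, 15 m ahead and 4 m to the right:
print(signal_axis_and_angle((0.0, 0.0), (4.0, 15.0), 0.6))
```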
- In some embodiments, the ANS generates the targeted signal directed at a particular target dynamic element based on a determination that the target dynamic element lacks perception of the vehicle in which the ANS is included, where the targeted signal includes content which provides an indication, to the dynamic element, of the presence of the vehicle in the environment. For example, where the dynamic element includes a pedestrian which is determined to be oriented in a direction which results in the vehicle being outside of the field of view of the pedestrian, the ANS can generate a targeted signal to the pedestrian which provides the pedestrian with an indication that the vehicle is proximate to the pedestrian. As a result, the targeted signal augments the perception of the pedestrian, which can augment the safety of the vehicle and the pedestrian by mitigating a risk that the pedestrian will follow a trajectory which intersects with the driving route along which the vehicle is being navigated. In addition, the ANS can predict, based on the transmission of the targeted signal, that the pedestrian will follow a trajectory based at least in part upon the content of the signal, which can include following a trajectory which avoids the vehicle, and can further generate a driving route based on the prediction.
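A rough sketch of one way the "lacks perception" determination could be made, modelling the pedestrian's field of view as a cone about an estimated heading (the 120-degree width and all names are assumptions):

```python
import math


def vehicle_in_field_of_view(ped_xy, ped_heading_deg, vehicle_xy,
                             fov_deg=120.0):
    """Rough check of whether the vehicle falls inside a pedestrian's
    assumed field of view, modelled as a cone about the heading."""
    bearing = math.degrees(math.atan2(vehicle_xy[1] - ped_xy[1],
                                      vehicle_xy[0] - ped_xy[0]))
    offset = (bearing - ped_heading_deg + 180.0) % 360.0 - 180.0
    return abs(offset) <= fov_deg / 2.0


# Pedestrian heading +x (0 deg); vehicle directly behind at (-10, 0):
if not vehicle_in_field_of_view((0.0, 0.0), 0.0, (-10.0, 0.0)):
    print("pedestrian lacks perception of the vehicle: send targeted signal")
```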
- In some embodiments, the targeted signal includes a visual signal. For example, the targeted signal can be generated by one or more visual indicators, including one or more lights, included in the vehicle, where a particular set of visual indicators are activated to generate a particular visual signal. The visual indicators can include at least a portion of one or more headlight assemblies included in the vehicle, where a portion of the headlight assemblies can be adjusted to direct a light beam to a particular dynamic element in the environment. The headlight assemblies can be adjusted to provide a visual signal which includes a variable-intensity beam of light, including a series of light beam pulses, etc.
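For illustration only, such a variable-intensity pulse pattern could be described as a list of (intensity, duration) segments; the constants are invented:

```python
def pulse_pattern(period_s=0.5, duty=0.4, repeats=3):
    """A variable-intensity beam as (relative_intensity, duration_s)
    segments: `repeats` bright pulses, each followed by a dimmed gap."""
    on_s, off_s = period_s * duty, period_s * (1.0 - duty)
    return [seg for _ in range(repeats)
            for seg in ((1.0, on_s), (0.2, off_s))]


print(pulse_pattern())   # three bright 0.2 s pulses with 0.3 s dimmed gaps
```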
- In some embodiments, a targeted signal includes an audio signal which is directed at a particular dynamic element, including a traffic participant, instance of fauna, etc. located in the external environment. The audio signal can be a directional signal which is focused in angle and axis via various known systems and methods of generating targeted audio signals, including one or more of beamforming, ultrasonic modulation, etc., so that the recipient of the audio signal is at least partially restricted to the target dynamic element to which the targeted signal is directed. As a result, the amplitude of the signal can be reduced, relative to a non-targeted audio signal, which can result in reduced disturbances to other dynamic elements in the external environment as a result of the signal transmission. In addition, as a result of the amplitude of the targeted signal being reduced relative to a non-targeted signal, information communicated to the dynamic element via content in the targeted signal can be at least partially precluded from being received and interpreted by other dynamic elements in the environment, thereby providing at least some level of information security to the communication.
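As an illustrative sketch of the beamforming option (delay-and-sum steering is one standard technique; the disclosure names beamforming generally without specifying a method), per-speaker delays can be chosen so emissions arrive at the target in phase:

```python
import math

SPEED_OF_SOUND_M_S = 343.0   # approximate speed of sound in ~20 deg C air


def delay_and_sum_delays(speaker_positions, target_xy):
    """Per-speaker delays (seconds) chosen so that emissions from every
    speaker arrive at the target in phase, steering the beam toward it."""
    distances = [math.hypot(target_xy[0] - sx, target_xy[1] - sy)
                 for sx, sy in speaker_positions]
    farthest = max(distances)
    return [(farthest - d) / SPEED_OF_SOUND_M_S for d in distances]


# Four speakers across a 1.5 m bumper; target 10 m ahead, 4 m to the left:
array = [(-0.75, 0.0), (-0.25, 0.0), (0.25, 0.0), (0.75, 0.0)]
print(delay_and_sum_delays(array, (-4.0, 10.0)))
```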
- As referred to herein, a “driving route” includes a pathway along which a vehicle is navigated. A driving route can extend from a starting location to another separate destination location, extend back to a destination location which is the same as the starting location, etc. A route can extend along one or more various portions of one or more various roadways. For example, a route between a home location and a work location can extend from a home driveway, through one or more residential streets, along one or more portions of one or more avenues, highways, toll ways, etc., and to one or more parking spaces in one or more parking areas. Such routes can be routes which a user repeatedly navigates over time, including multiple times in a given day (e.g., routes between home and work locations may be travelled at least once in a given day).
- Reference will now be made in detail to embodiments, examples of which are illustrated in the accompanying drawings. In the following detailed description, numerous specific details are set forth in order to provide a thorough understanding of the present disclosure. However, it will be apparent to one of ordinary skill in the art that some embodiments may be practiced without these specific details. In other instances, well-known methods, procedures, components, circuits, and networks have not been described in detail so as not to unnecessarily obscure aspects of the embodiments.
- It will also be understood that, although the terms first, second, etc. may be used herein to describe various elements, these elements should not be limited by these terms. These terms are only used to distinguish one element from another. For example, a first contact could be termed a second contact, and, similarly, a second contact could be termed a first contact, without departing from the intended scope. The first contact and the second contact are both contacts, but they are not the same contact.
- The terminology used in the description herein is for the purpose of describing particular embodiments only and is not intended to be limiting. As used in the description and the appended claims, the singular forms “a”, “an” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will also be understood that the term “and/or” as used herein refers to and encompasses any and all possible combinations of one or more of the associated listed items. It will be further understood that the terms “includes,” “including,” “comprises,” and/or “comprising,” when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
- As used herein, the term “if” may be construed to mean “when” or “upon” or “in response to determining” or “in response to detecting,” depending on the context. Similarly, the phrase “if it is determined” or “if [a stated condition or event] is detected” may be construed to mean “upon determining” or “in response to determining” or “upon detecting [the stated condition or event]” or “in response to detecting [the stated condition or event],” depending on the context.
-
FIG. 1 illustrates a schematic block diagram of avehicle 100 which comprises an autonomous navigation system (ANS), according to some embodiments. Some or all of theANS 110 illustrated inFIG. 1 , including one or more of themodules -
Vehicle 100 includes an autonomous navigation system (“ANS”) 110, a set of one ormore sensor devices 116, a set of one ormore control elements 120, a set of one ormore signal devices 117, and a set of one ormore user interfaces 130.Sensor devices 116 include devices which monitor one or more aspects of an external environment in which the vehicle is located. Monitoring an aspect of an external environment can include generating, at the sensor device, sensor data which includes information regarding the aspect of the external environment. For example, asensor device 116 can include one or more of a camera device which generates images of one or more portions of the external environment, a light beam scanning device which generates one or more point clouds of one or more portions of the external environments, a radar device which generates radar data associated with one or more portions of the external environment, etc. Aspects of an external environment which can be monitored include one or more static elements, dynamic elements, etc. included in the environment. For example, asensor device 116 which includes a camera device can capture images of an external environment which includes images of static elements, including roadway lane boundary markers, roadway curbs, inanimate obstacles in the roadway, etc., images of dynamic elements including traffic participants, fauna, ambient environment conditions, weather, etc. - The
control elements 120 included invehicle 100 include various control elements, including actuators, motors, etc. which each control one or more components of the vehicle which cause the vehicle to be navigated through an external environment. For example, acontrol element 120 can include one or more of a braking assembly (also referred to herein interchangeably as a braking system) which applies braking pressure to one or more wheel assemblies of the vehicle to cause the vehicle to be decelerated, a throttle assembly which adjusts the acceleration of thevehicle 100 through an external environment, a steering assembly which adjusts one or more configurations of one or more wheel assemblies which causes the vehicle to be adjustably navigated in one or more various directions through the external environment, etc. A control element can execute one or more various adjustments to navigation of the vehicle based on receipt and execution of one or more various control commands at the control elements from one or more of auser interface 130, theANS 110, etc. - The one or
more user interfaces 130, also referred to herein interchangeably as input interfaces, can include one or more driving control interfaces with which an occupant of thevehicle 100 can interact, such that the driving control interfaces generate control commands which cause one ormore control elements 130 to adjustably navigate thevehicle 100, based on one or more occupant interactions with one or more interfaces 140. In some embodiments, one or more input interfaces 140 included in thevehicle 100 provide one or more instances of information to occupants of the vehicle, including indications of whether the vehicle is being navigated via autonomous driving control of thevehicle 100 byANS 110, whether the vehicle is being navigated to a stop based on implementation of a failure recovery plan at theANS 110, whether at least one failure in theANS 110 is determined to have occurred, etc. -
Vehicle 100 includes at least one set ofsignal devices 117 which are coupled to various portions of an exterior of thevehicle 100 and are configured, individually, collectively, in limited part, etc. to generate one or more various targeted signals which are directed to one or more particular dynamic elements in the external environment. In some embodiments, one or more signal devices are configured to generate at least a portion of a targeted signal which communicates one or more instances of targeted information to the particular dynamic elements, where the one or more instances of targeted information are selected based at least in part upon one or more particular aspects of one or more of thevehicle 100, the one or more dynamic elements in the external environment, etc. For example, a targeted signal can communicate, to a particular target dynamic element, information which indicates a presence of the vehicle in the external environment. In another example, a targeted signal can communicate, to a particular target dynamic element, information which includes a message to the target dynamic element. - In some embodiments, a set of
signal devices 117 included in a vehicle can generate and direct a targeted signal to a particular target dynamic element included in the external environment, so that the signal is directed through a limited portion of the external environment in which the particular target dynamic element is located, and so that perception of the targeted signal is at least partially restricted to the dynamic elements, including the particular target dynamic element, which are located in the limited portion of the external environment through which the targeted signal is directed. - In some embodiments, a targeted signal generated by one or more sets of
signal devices 117 includes an audio signal, and one or more sets of signal devices 117 are configured to generate one or more targeted audio signals. For example, one or more signal devices 117 can include one or more sets of speaker devices. In some embodiments, one or more speaker devices can be adjustably positioned. The audio signal can include one or more of an audio indication, an audio message, a particular sound effect, some combination thereof, etc. The signal can be directed through a particular limited portion of the environment in which the target dynamic element is located via one or more of beamforming, ultrasonic modulation, etc. In some embodiments, vehicle 100 includes multiple audio signal devices 117 located at various portions of the exterior of the vehicle 100, and, to generate a targeted audio signal which is directed at a target dynamic element, a limited selection of the audio signal devices which are located on a portion of the vehicle 100 exterior and are at least partially oriented in a direction towards the target dynamic element can be commanded to collectively generate the targeted audio signal. One or more signal devices 117 can be adjustably positioned to cause the signal devices 117 to be directed towards the target dynamic element, which results in a signal generated by the signal devices being directed towards the target dynamic element.
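As an illustration of selecting a limited set of exterior speaker devices oriented towards a target, here is a minimal sketch, assuming a vehicle-centered coordinate frame and hypothetical speaker records; the field names and the 45-degree off-axis threshold are assumptions for illustration, not values from the disclosure:

```python
import math
from dataclasses import dataclass

@dataclass
class SpeakerDevice:
    name: str
    position: tuple      # (x, y) mounting point on the vehicle exterior, meters
    facing_deg: float    # direction the speaker faces in the vehicle frame

def speakers_toward_target(speakers, target_xy, max_off_axis_deg=45.0):
    """Select the limited set of speakers whose facing direction lies within
    max_off_axis_deg of the bearing from the speaker to the target element."""
    selected = []
    for s in speakers:
        dx = target_xy[0] - s.position[0]
        dy = target_xy[1] - s.position[1]
        bearing = math.degrees(math.atan2(dy, dx))
        off_axis = abs((bearing - s.facing_deg + 180) % 360 - 180)
        if off_axis <= max_off_axis_deg:
            selected.append(s)
    return selected

speakers = [
    SpeakerDevice("front_left_corner", (1.8, 0.8), 45.0),
    SpeakerDevice("front_center", (2.0, 0.0), 0.0),
    SpeakerDevice("front_right_corner", (1.8, -0.8), -45.0),
]
# Pedestrian ahead and to the left of the vehicle: only the devices at least
# partially oriented towards it are commanded to generate the signal.
print([s.name for s in speakers_toward_target(speakers, (10.0, 4.0))])
```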
- In some embodiments, a targeted signal generated by one or more sets of signal devices 117 includes a visual signal, and one or more sets of signal devices 117 are configured to generate one or more targeted visual signals. For example, one or more signal devices 117 can include one or more sets of light generating devices, light indicators, light-emitting diodes (LEDs), headlight assemblies, etc. One or more sets of signal devices 117 which are configured to generate one or more targeted visual signals can be adjustably positioned, so that one or more sets of signal devices 117 can, individually, collectively, etc., be positioned to direct one or more light beams towards one or more target dynamic elements in the external environment. - A message included in a targeted signal can be a warning associated with one or more of the driving route along which the vehicle is being navigated, one or more trajectories via which the dynamic element can navigate through the environment, etc. For example, where a dynamic element includes a pedestrian that is walking along a sidewalk proximate to a
vehicle 100 and in a particular orientation which precludes the vehicle 100 from being within a field of view of the pedestrian, a targeted signal generated by one or more signal devices 117 which is directed to the pedestrian can include an audio warning, to the pedestrian, to avoid turning into the roadway along which the vehicle is being navigated. The message can include a verbal, spoken message. - In some embodiments, one or more instances of personal data can be accessed by
ANS 110. For example, in some embodiments, ANS 110 can process sensor data, generated by one or more sensor devices 116, and, based on personal data including facial recognition data, associated user device detection, etc., identify a dynamic element in the environment as an element associated with a particular individual, user account, etc. In some embodiments, the content included in a targeted signal generated at one or more signal devices 117 includes content generated based on one or more instances of personal data, including personal schedule data, identity data, etc. For example, where a dynamic element is identified at ANS 110 as being a particular individual, the ANS can generate a targeted signal which includes audio content which addresses the dynamic element by name. - Users can benefit from use of personal data by the ANS. For example, the personal data can be used to communicate relevant content to a particular individual identified in the external environment by the ANS. Accordingly, use of such personal data enables users to influence and control delivered content.
- Users can selectively block use of, or access to, personal data. A system incorporating some or all of the technologies described herein can include hardware and/or software that prevents or blocks access to such personal data. For example, the system can allow users to “opt in” or “opt out” of participation in the collection of personal data or portions thereof. Also, users can select not to provide location information, or permit provision of general location information (e.g., a geographic region or zone), but not precise location information.
- Entities responsible for the collection, analysis, disclosure, transfer, storage, or other use of such personal data should comply with established privacy policies and/or practices. Such entities should safeguard and secure access to such personal data and ensure that others with access to the personal data also comply. Such entities should implement privacy policies and practices that meet or exceed industry or governmental requirements for maintaining the privacy and security of personal data. For example, an entity should collect users' personal data for legitimate and reasonable uses, and not share or sell the data outside of those legitimate uses. Such collection should occur only after receiving the users' informed consent. Furthermore, third parties can evaluate these entities to certify their adherence to established privacy policies and practices.
-
ANS 110 includes various modules, including an intention recognition module 111, a driving control module 112, and a signaling module 113. ANS 110 autonomously navigates vehicle 100 along one or more driving routes, based at least in part upon sensor data generated by one or more sensor devices 116. - Driving
control module 112 can determine a driving route based at least in part upon at least some sensor data generated by one or more sensor devices 116, including position data indicating a geographic position of the vehicle 100, and a world model stored in one or more memory storage devices included in the vehicle 100, one or more remotely-located systems external to the vehicle 100, etc. - In some embodiments, the
ANS 110 determines a driving route based at least in part upon occupant interaction with one or more interfaces 130 included in the vehicle 100, including one or more interactions which result in the ANS 110 receiving an occupant-initiated command to navigate the vehicle 100 from a particular location in the external environment, which can include a present location of the vehicle 100 in the external environment, to a particular destination location. In some embodiments, the occupant-initiated command includes a command to navigate the vehicle 100 along a particular occupant-selected driving route. In some embodiments, the ANS 110 receives a driving route from a remotely-located system via one or more communication networks. - In some embodiments, the
module 112 generates one or more sets of control element commands which are communicated to one or more control elements 120 in the vehicle 100 and cause the control elements 120 to navigate the vehicle 100 along a driving route. The module 112 can generate control commands based on a driving route, where the control commands, when executed by the control elements 120, cause the vehicle 100 to be navigated along the driving route. -
ANS 110 includes an intention recognition module 111 which generates a driving route through a portion of an external environment based at least in part upon predicted trajectories of one or more dynamic elements, including one or more traffic participants, instances of fauna, pedestrians, etc., through at least a portion of the external environment. The module 111 can process sensor data, generated by one or more sensor devices 116, and, based on the processing, identify various static elements and dynamic elements in the external environment. - The
module 111 can identify various features of the various static elements and dynamic elements. The various features are referred to herein as contextual cues and include object cues, which are associated with observed aspects of a dynamic element, and situational cues, which are associated with both the dynamic element and various aspects of the environment, including interactions between the dynamic element and various elements in the environment, motions of the dynamic element with regard to various elements in the environment, etc. An object cue associated with a dynamic element can include one or more particular features associated with an appearance of one or more portions of a dynamic element detected via processing of sensor data, one or more particular features associated with various movements and actions of the dynamic element in the environment detected via processing of sensor data, etc. For example, where a pedestrian is detected, via sensor data processing, to be jogging along a sidewalk which extends adjacent to a roadway, object cues which can be identified via sensor data processing can include a type of clothing worn by the pedestrian, a rate of motion by the pedestrian along the sidewalk, an orientation of the pedestrian and direction of travel, a position, velocity, acceleration, etc. of the pedestrian with regard to one or more particular static elements, dynamic elements, etc. in the environment, an interaction by the pedestrian with one or more static elements, dynamic elements, etc. in the environment, etc. In another example, situational cues can include a present set of weather conditions in the environment in which the dynamic element is moving, a position, motion, etc. of the dynamic element relative to one or more particular static elements in the environment, including one or more particular structures in the environment, etc. -
Module 111 can predict a trajectory of a dynamic element, including a traffic participant, through the environment based on identifying various contextual cues associated with the dynamic element and correlating the identified cues with one or more sets of cues included in an association between the sets of cues and one or more particular dynamic element intentions. Correlating the identified cues with cues included in an association can include determining an at least partial match between the identified contextual cues and a set of cues included in the association, where an at least partial match can include a match between the sets of cues which exceeds a certain threshold level. Based on the correlation, module 111 can associate the identified cues with an intention included in the association. - Associating identified cues, associated with a dynamic element, with a particular intention can include associating the particular intention with the dynamic element. A dynamic element intention can include a set of specified actions, motions, etc. which the dynamic element is predicted to take, based on at least partial identification of a particular set of contextual cues associated with the dynamic element. Associating a set of identified contextual cues with a particular intention can result in a prediction, at
module 111, that a dynamic element identified in the environment and associated with the intention will take the set of actions specified in the particular intention.
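A minimal sketch of such threshold-based cue correlation follows; the cue labels, the data structure, and the 0.8 match threshold are assumptions for illustration, not values specified by the disclosure:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class IntentionAssociation:
    cues: frozenset   # the association's set of contextual cue labels
    intention: str    # the set of predicted actions/motions it maps to

def correlates(identified_cues, association, threshold=0.8):
    """An at least partial match: the fraction of the association's cue set
    found among the identified cues must meet the threshold level."""
    matched = len(association.cues & identified_cues)
    return matched / len(association.cues) >= threshold

jogger = IntentionAssociation(
    frozenset({"exercise_clothing", "jogging_speed", "on_sidewalk",
               "fov_along_sidewalk", "moving_along_sidewalk"}),
    intention="continue_along_sidewalk",
)
observed = {"exercise_clothing", "jogging_speed", "on_sidewalk",
            "fov_along_sidewalk", "headphones"}
if correlates(observed, jogger):
    # 4 of the association's 5 cues were identified (0.8 >= threshold),
    # so the dynamic element is associated with the intention.
    print("associate element with:", jogger.intention)
```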
- Based on associating a dynamic element with a particular intention, module 111 can generate a predicted trajectory of the dynamic element through the external environment, where the trajectory indicates a predicted position, velocity, acceleration, etc. of the dynamic element at various future times in the external environment. Based on the predicted trajectory of the dynamic element through the environment, module 111 can generate a driving route along which vehicle 100 can be navigated which results in the vehicle 100 avoiding intersection with the dynamic element navigating according to the predicted trajectory.
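One simple way to realize such a predicted trajectory is a constant-velocity rollout of the element's observed state under the associated intention; the sketch below assumes a 2D vehicle-frame coordinate system and illustrative sampling parameters, not parameters taken from the disclosure:

```python
def predict_trajectory(position, velocity, horizon_s=5.0, dt=0.5):
    """Roll the dynamic element's state forward under the motion its
    associated intention specifies (here: keep moving at the observed
    velocity). Returns (time, x, y) samples of the predicted trajectory."""
    samples = []
    t, (x, y), (vx, vy) = 0.0, position, velocity
    while t <= horizon_s:
        samples.append((t, x, y))
        x, y, t = x + vx * dt, y + vy * dt, t + dt
    return samples

# Pedestrian jogging along a sidewalk parallel to the roadway at 3 m/s:
for t, x, y in predict_trajectory((0.0, 4.0), (3.0, 0.0), horizon_s=2.0):
    print(f"t={t:.1f}s predicted position=({x:.1f}, {y:.1f})")
```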
- ANS 110 includes signaling module 113 which generates commands to various signal devices 117 which cause the various devices 117 to generate one or more targeted signals which are directed at target dynamic elements in the external environment. Generating commands at module 113 can include selecting particular target dynamic elements, determining a direction, angle, and axis of the signal transmission through the external environment, determining the information included in the signal, etc. Module 113 can select a target dynamic element, determine to generate a targeted signal to the target dynamic element, determine the particular content to be communicated to the dynamic element via the targeted signal, etc., based on determinations at one or more of modules 111, 112, etc. of the ANS 110. For example, where module 111 identifies a dynamic element in the external environment which includes a pedestrian approaching the roadway, and where the vehicle 100 is determined to be outside a field of view of the pedestrian, module 113 can respond to the identification and determination at module 111 by determining to generate a targeted signal to the pedestrian, determining a particular axis and angle of the signal transmission which at least partially restricts the signal from being received by dynamic elements other than the pedestrian, selecting a set of devices 117 which can collectively generate a signal with the determined axis and angle, determining the content of the signal, and commanding the selected set of devices 117 to generate the targeted signal which includes the determined content and is transmitted along the determined axis and angle. In the above example, module 113 can determine that the content to include in the targeted signal includes an audio warning message, so that the module 113, in commanding a set of devices 117 to generate the targeted signal to the pedestrian, commands the devices 117 to generate a targeted audio signal which includes a particular warning message to the pedestrian regarding a risk of paths intersecting with the vehicle. -
FIG. 2 illustrates an overhead view of an environment in which multiple dynamic elements are located, including a vehicle which is autonomously navigated through the environment and includes a set of sensors which can generate sensor data associated with the dynamic elements and an autonomous navigation system which can determine a particular predicted trajectory of a dynamic element based on contextual cues identified in the sensor data, according to some embodiments. Vehicle 210 can include any of the embodiments of vehicles included herein, and ANS 212 can include any of the embodiments of ANSs included herein. - In some embodiments, an ANS included in a vehicle processes sensor data, generated by various sensor devices included in the vehicle, and identifies at least one particular dynamic element included in the environment. For example, in the illustrated embodiment, where
vehicle 210 is located in an external environment 200 which includes a roadway 250 with lanes 252, 254 and sidewalks 260A-B extending alongside the roadway 250, sensor devices 213 included in the vehicle 210 can generate sensor data which can be processed by the ANS 212 so that the ANS, as a result, determines that the vehicle 210 is located in lane 252 of the roadway 250 and that a dynamic element 230, which can include a pedestrian, is located at a particular position on the sidewalk 260B and is moving at a particular velocity 232 along the sidewalk 260B. - In some embodiments, an ANS included in a vehicle determines various features of a dynamic element and the external environments in which a dynamic element is located. The ANS can associate these features, referred to as dynamic element contextual cues, with the dynamic element. For example, in the illustrated embodiment shown in
FIG. 2, ANS 212, based on processing data generated by sensors 213, can identify contextual cues including a cue indicating that a dynamic element 230 is located in environment 200 at a certain time of day, a cue indicating that a dynamic element 230 is located in a particular type of environment (e.g., rural, urban, residential area, industrial area, etc.), a proximity of various specific structures (e.g., schools, landmarks, etc.) to the element 230, a geographic position of the element 230 relative to various static elements, dynamic elements, etc. located in the environment 200, etc. In one example, the contextual cues which can be identified by the ANS 212 and associated with element 230 can include an identification that element 230 is remotely located from any crosswalks, bridges, etc. via which the dynamic element 230 might travel across the roadway 250. In another example, ANS 212 can identify a cue that the element 230 is moving through an environment 200 which is a rural environment, etc. -
ANS 212 can determine, based on processing sensor data associated with the dynamic element 230, contextual cues associated with element 230 which indicate one or more of a visual appearance of the dynamic element 230, a type of traffic participant of the dynamic element 230, etc. For example, ANS 212 can identify cues associated with element 230, based on processing sensor data generated by one or more sensors 213, that indicate that element 230 is a pedestrian and that the pedestrian 230 is wearing a particular type of clothing, including a set of clothing associated with exercise. Such determination can include utilizing captured images of the dynamic element 230 and comparing an appearance of the element 230 with representations of various particular articles of clothing. In another example, ANS 212 can identify a cue which indicates that element 230 is a pedestrian who is utilizing a wheelchair for mobility. - In some embodiments, where the
dynamic element 230 is a traffic participant, including a pedestrian, vehicle, etc., the ANS 212 can process sensor data associated with the traffic participant and determine, based on the processing, a cue which indicates a field of view of the traffic participant. As shown, ANS 212 can determine that a present field of view of the dynamic element 230 is as shown at 234. ANS 212 can identify, as a contextual cue, that the field of view 234 is approximately centered on the sidewalk 260B along which the dynamic element 230 is moving 232. The ANS can identify, as a contextual cue, that the dynamic element 230 is moving with a particular velocity 232 and acceleration.
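A field-of-view cue of this kind might be computed as in the following sketch, which models the field of view as a cone about the traffic participant's heading; the 120-degree width and the coordinates are illustrative assumptions, not values from the disclosure:

```python
import math

def in_field_of_view(observer_xy, heading_deg, target_xy, fov_deg=120.0):
    """True if the target (e.g., the vehicle) falls inside the observer's
    field of view, modeled as a cone centered on the observer's heading."""
    bearing = math.degrees(math.atan2(target_xy[1] - observer_xy[1],
                                      target_xy[0] - observer_xy[0]))
    off_center = abs((bearing - heading_deg + 180) % 360 - 180)
    return off_center <= fov_deg / 2

# Pedestrian at the origin, facing along the sidewalk (heading 0 degrees);
# the vehicle approaches from behind at (-20, -4):
print(in_field_of_view((0, 0), 0.0, (-20, -4)))  # False: vehicle unseen
```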
- The ANS can associate identified contextual cues with the dynamic element 230, based on an association of the cues with monitoring the dynamic element. For example, a contextual cue indicating an appearance of a dynamic element 230 can be associated with that element 230 based on the cue being identified based on monitoring the appearance of the dynamic element, and a contextual cue indicating that a dynamic element 230 is moving through a particular type of environment can be associated with the element 230 based on the cue being identified based at least in part upon monitoring the motion of the element 230 through one or more external environments. - Based on the identified contextual cues associated with a dynamic element,
ANS 212 can determine a correlation between at least some of the identified contextual cues and a set of contextual cues which are associated with a particular dynamic element intention. Based on determining the correlation, the ANS 212 can associate the dynamic element 230 with the particular dynamic element intention and can generate a predicted trajectory of the dynamic element 230 based on the associated particular dynamic element intention. - In the illustrated embodiment, ANS 212 can determine that the identified contextual cues associated with the
dynamic element 230 correlate with a set of contextual cues associated with a particular dynamic element intention, where the particular intention specifies that the dynamic element is predicted to continue moving along the sidewalk on which it is presently moving. ANS 212 can associate the particular intention with dynamic element 230 and generate a predicted trajectory 236 of the dynamic element 230 based on the particular intention associated with the dynamic element 230. In some embodiments, ANS 212 can generate various different predicted trajectories 237A-C of the dynamic element 230 based on different contextual cues identified in association with both the dynamic element 230 and the environment 200, different intentions which correlate to the identified cues, etc. - As shown,
ANS 212 generates a driving route 240 along which vehicle 210 is navigated to at least position 241 based at least in part upon the predicted trajectory 236 of the dynamic element 230. Because the predicted trajectory 236 continues along sidewalk 260B, the ANS 212 generates a route 240 which continues along lane 252 without substantial deviation in direction, velocity, etc. In another example, where the predicted trajectory of element 230 is determined to be trajectory 237C, which veers from sidewalk 260B into the roadway lane 252 along which vehicle 210 is presently moving, the ANS 212 can generate a different driving route which navigates the vehicle 210 to avoid intersecting with trajectory 237C, including a route via which vehicle 210 is decelerated, navigates at least partially out of lane 252, some combination thereof, etc. -
FIG. 3 illustrates an overhead view of an environment in which multiple dynamic elements are located, including a vehicle which is autonomously navigated through the environment and includes a set of sensors which can generate sensor data associated with the dynamic elements and an autonomous navigation system which can determine a particular predicted trajectory of a dynamic element based on contextual cues identified in the sensor data, according to some embodiments. Vehicle 310 can include any of the embodiments of vehicles included herein, and ANS 312 can include any of the embodiments of ANSs included herein. - In the illustrated embodiment, where
vehicle 310 is located in an external environment 300 which includes a roadway 350 with lanes 352, 354 and sidewalks 360A-B extending alongside the roadway 350, sensor devices 313 included in the vehicle 310 can generate sensor data which can be processed by the ANS 312 so that the ANS, as a result, determines that the vehicle 310 is located in lane 352 of the roadway 350 and that a dynamic element 330, which can include a pedestrian, is located at a particular position on the sidewalk 360B. - The
ANS 312 can identify, as environmental contextual cues associated with environment 300 based on processing data generated by sensors 313, that environment 300 includes sidewalks 360A-B on opposite adjacent sides of roadway 350 and that a crosswalk 356 extends across the roadway 350 between the sidewalks 360A-B. - The
ANS 312 can further identify dynamic element contextual cues associated with the dynamic element 330 which indicate that the dynamic element is positioned at a stop, that the dynamic element is positioned proximate to the crosswalk 356 which extends across the roadway 350, and that the dynamic element 330 has a field of view 334 which includes the crosswalk 356 and does not include the sidewalk 360B on which dynamic element 330 is located. -
ANS 312 can determine that the set of identified dynamic element cues associated with element 330 correlates with a set of contextual cues included in a particular intention association. Determining a correlation between the sets of contextual cues can include determining that the set of identified contextual cues at least partially matches the set of contextual cues included in the association. The ANS 312 can determine that the association includes an association of the set of contextual cues with a particular dynamic element intention which specifies that, although the dynamic element is presently stationary, the dynamic element is predicted to move onto the crosswalk and across the roadway via the crosswalk. - Based on associating the
dynamic element 330 with a particular dynamic element intention which specifies that, although the dynamic element is presently stationary, the dynamic element is predicted to move onto the crosswalk and across the roadway via the crosswalk, ANS 312 can generate a predicted trajectory 336 of the dynamic element 330 through the environment 300, where the predicted trajectory 336 of the dynamic element 330 passes along crosswalk 356 and across roadway 350, according to the particular dynamic element intention associated with the dynamic element 330. - In some embodiments,
ANS 312, in generating a predicted trajectory of a dynamic element through an environment, selects one of a set of predicted trajectories based on a particular dynamic element intention associated with the dynamic element. A set of trajectories can be associated with a dynamic element based on a dynamic element type associated with the dynamic element. For example, in the illustrated embodiment, a set of trajectories 336, 337A-B can be associated with dynamic element 330 based on dynamic element 330 being a pedestrian, and ANS 312 can select trajectory 336 as the predicted trajectory of dynamic element 330, rather than one of trajectories 337A-B, based on determining a correlation between the trajectory 336 and the particular dynamic element intention which is associated with the dynamic element 330. -
ANS 312 can generate a driving route 341 along which vehicle 310 is navigated based at least in part upon the predicted trajectory 336 of the dynamic element 330 through the environment 300. The driving route can be configured to navigate the vehicle 310 in avoidance of an intersection with the predicted trajectory of the dynamic element 330. In the illustrated embodiment, ANS 312 generates a driving route 341 which, when vehicle 310 is navigated along the route 341, results in the vehicle being decelerated to a position 342 concurrently with the predicted position of dynamic element 330 being at one or more positions along trajectory 336 which are located on a portion of crosswalk 356 within lane 352. - In some embodiments,
route 341 is configured to navigate the vehicle 310 along a pathway which intersects a portion of the trajectory 336 subsequent to the dynamic element travelling through the portion of the trajectory 336, so that the vehicle 310 navigates “behind” the dynamic element 330 as it navigates through the environment 300. For example, ANS 312 can generate a route 341, based on the predicted trajectory 336, which includes decelerating vehicle 310 to a position 342 for a period of time and subsequently accelerating the vehicle, along lane 352, across the pathway of trajectory 336, based on a determination that dynamic element 330 has moved along trajectory 336 out of lane 352 and into at least lane 354.
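The lane-clearance determination in this example can be sketched as follows, under the simplifying assumption that occupancy of lane 352 reduces to a lateral-interval test in a roadway-aligned frame; the bounds and coordinates are illustrative only:

```python
def safe_to_proceed(element_xy, lane_y_bounds):
    """The vehicle held at its decelerated position may accelerate across
    the crosswalk only once the element has moved out of the vehicle's
    lane (here, lane occupancy is a simple lateral-interval test)."""
    lo, hi = lane_y_bounds
    return not (lo <= element_xy[1] <= hi)

lane_352 = (-2.0, 2.0)                          # lateral extent of lane 352, meters
print(safe_to_proceed((30.0, 0.5), lane_352))   # False: still in lane 352
print(safe_to_proceed((30.0, 3.8), lane_352))   # True: crossed into lane 354
```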
- In some embodiments, ANS 312 is configured to continue monitoring various aspects of the environment 300, including movement of the dynamic element 330 through the environment 300, subsequent to generating the predicted trajectory 336 of the element 330 and generating a driving route 341 of the vehicle 310 based thereupon. The ANS 312 can determine whether the dynamic element 330 is moving along the predicted trajectory 336. In some embodiments, ANS 312 responds to a determination that the dynamic element 330 is following a different trajectory than the predicted trajectory by revising the driving route 341 via which vehicle 310 is navigated through the environment. ANS 312 can revise the association between the identified contextual cues, associated with the dynamic element 330 and the environment 300, and a particular intention based on determining that the dynamic element moves along a trajectory which is separate from a predicted trajectory generated based on the particular intention. Such a revision of the association can include adjusting a confidence level associated with the association, adjusting the particular intention to specify a different set of predicted dynamic element movements, some combination thereof, etc.
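A minimal sketch of such a confidence-level revision follows, assuming an illustrative deviation tolerance and adjustment step; the disclosure does not specify how the confidence level is computed, so these values and names are assumptions:

```python
def revise_association(confidence, predicted_xy, observed_xy,
                       tolerance_m=1.0, step=0.1, floor=0.0):
    """Lower the association's confidence level when the dynamic element is
    observed off its predicted trajectory; raise it when the prediction holds.
    Returns the revised confidence and whether a deviation occurred."""
    dx = observed_xy[0] - predicted_xy[0]
    dy = observed_xy[1] - predicted_xy[1]
    deviated = (dx * dx + dy * dy) ** 0.5 > tolerance_m
    if deviated:
        return max(floor, confidence - step), True
    return min(1.0, confidence + step), False

conf = 0.7
conf, deviated = revise_association(conf, (10.0, 4.0), (10.5, 1.0))
print(conf, deviated)  # element left the predicted path: confidence drops to 0.6
```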
- FIG. 4 illustrates an overhead view of an environment in which multiple dynamic elements are located, including a vehicle which is autonomously navigated through the environment and includes a set of sensors which can generate sensor data associated with the dynamic elements and an autonomous navigation system which can determine a particular predicted trajectory of a dynamic element based on contextual cues identified in the sensor data, according to some embodiments. Vehicle 410 can include any of the embodiments of vehicles included herein, and ANS 412 can include any of the embodiments of ANSs included herein. - In the illustrated embodiment, where
vehicle 410 is located in an external environment 400 which includes a roadway 450 with lanes 452, 454 and sidewalks 460A-B extending alongside the roadway 450, sensor devices 413 included in the vehicle 410 can generate sensor data which can be processed by the ANS 412 so that the ANS, as a result, determines that the vehicle 410 is located in lane 452 of the roadway 450. The ANS 412 further determines that static element 420 and dynamic elements 430, 432A-C, 434 are located in the environment 400. Dynamic elements 430, 432A-C can include vehicles located in or alongside roadway 450. Dynamic elements 434 can include pedestrians. Static element 420 can include a particular structure. - The
ANS 412 can identify, as contextual cues associated with one or more of elements 430, 432A-C, 434 based on processing data generated by sensors 413, that the dynamic elements are located proximate to a static element 420 which is a particular structure. For example, the ANS 412 can identify that structure 420 is a school building and that elements 430, 432A-C, 434 are located proximate to the school building. The ANS 412 can identify a contextual cue that the vehicle 410 and elements 430, 432A-C, 434 are located in a school zone based on determining that a geographic position of the vehicle 410, determined via at least one sensor device 413, correlates with a determined geographic location associated with a school zone located in the environment 400, a particular school zone, some combination thereof, etc. The geographic location associated with a school zone, a particular school zone, some combination thereof, etc. can be determined based on information received from a service, system, etc. which is located remotely from the vehicle 410, where the information is received at vehicle 410 via one or more communication networks. The ANS 412 can identify a contextual cue that indicates that one or more of the vehicle 410 and elements 430, 432A-C, 434 are located in a school zone based on identifying that the static element 420 is a school structure associated with a school zone and that one or more of vehicle 410 and dynamic elements 430, 432A-C, 434 are located within a certain proximity of the static element 420. - In some embodiments, the
ANS 412 can identify, as a contextual cue associated with one or more of the dynamic elements 430, 432A-C, 434, based on processing data generated by sensors 413, that the one or more dynamic elements are located in environment 400, within a certain proximity of the school 420, at a time which is included within a particular time period during which classes at the school 420 are dismissed. The identification can be based on determining a present time, determining a set of events associated with the identified school 420, where at least one event includes class dismissal, comparing the present time with a time period associated with the class dismissal event, and determining that the present time is located within the time period associated with the class dismissal event.
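The time-period comparison described here can be sketched as follows; the event table and times are hypothetical placeholders for information that might be received from a remote service, not values from the disclosure:

```python
from datetime import datetime, time

# Hypothetical event table for the identified school; times are illustrative.
SCHOOL_EVENTS = {"class_dismissal": (time(14, 45), time(15, 30))}

def within_event_window(present: datetime, event: str) -> bool:
    """Compare the present time with the time period associated with the
    named event and report whether it falls inside that period."""
    start, end = SCHOOL_EVENTS[event]
    return start <= present.time() <= end

now = datetime(2016, 9, 12, 15, 5)
print(within_event_window(now, "class_dismissal"))  # True: dismissal period
```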
- ANS 412, based on processing sensor data generated by sensors 413, can identify contextual cues associated with at least elements 432 which indicate that dynamic elements 432A-C are vehicles which are stopped proximate to respective sides of the roadway 450, also referred to herein as being “pulled-over” to the sides of the roadway 450, and can further identify contextual cues associated with at least elements 434 which indicate that dynamic elements 434, which include pedestrians, are moving 435 in a general direction away from the static element 420 which is identified as a school building and towards the roadway 450. ANS 412 can identify contextual cues associated with each of the dynamic elements 434 which include identifying the dynamic elements 434 as children. - In some embodiments, dynamic element cues associated with one or more particular dynamic elements in an environment are associated with one or more other separate dynamic elements in the environment. For example,
ANS 412 can associate the dynamic element cues identified based on monitoring the vehicle dynamic elements 430, 432A-C with the children dynamic elements 434, and can further associate the dynamic element cues identified based on monitoring the children dynamic elements 434 with the vehicle dynamic elements 430, 432A-C. - Based on identifying the various contextual cues associated with the various dynamic elements in the
environment 400, ANS 412 can correlate the cues associated with one or more dynamic elements with one or more sets of contextual cues included in a particular intention association which associates the one or more sets of contextual cues with one or more particular dynamic element intentions. In the illustrated embodiment, ANS 412 can associate each of the vehicles 432A-C with a particular dynamic element intention which specifies that the dynamic elements are predicted to remain stopped at the sides of the roadway 450, based on matching at least a portion of the dynamic element cues associated with the elements 432A-C with a set of contextual cues which are themselves associated with the particular dynamic element intention. Based on associating each of the vehicles 432A-C with a particular dynamic element intention which specifies that the dynamic elements are predicted to remain stopped at the sides of the roadway 450, ANS 412 generates a predicted trajectory for each of the vehicles 432A-C which includes the vehicles 432A-C remaining stationary for at least a period of time, as shown in FIG. 4. - In addition,
ANS 412 can associate vehicle 430 with a particular dynamic element intention which specifies that the dynamic element is predicted to pull over to the side of the roadway and stop, based on matching at least a portion of the dynamic element cues associated with the elements 430 with a set of contextual cues which are themselves associated with the particular dynamic element intention. ANS 412 can, based on the intention associated with the vehicle 430, generate a predicted trajectory 431 along which the vehicle 430 passes to the side of the roadway and stops, as shown in FIG. 4. - In some embodiments, a contextual cue associated with a dynamic element indicates movements of the dynamic element relative to movements of other dynamic elements in the environment. As a result of identifying cues which incorporate movements of various other dynamic elements in the environment,
ANS 412 can associate a given dynamic element with an intention which more accurately predicts future movements of the dynamic element through the environment than via generating cues which incorporate movement of the given dynamic element alone. For example, in the illustrated embodiment, if ANS 412 identified that the vehicle 430 is moving slowly along lane 452 in a school zone, during a time of day during which classes are dismissed from school 420 proximate to vehicle 430, without incorporating motions and positions of children 434 and vehicles 432A-C in the environment 400, ANS 412 might associate vehicle 430 with a different intention which specifies that the vehicle 430 is predicted to continue moving along lane 452, rather than pull over to the side of roadway 450. - In addition,
ANS 412 can associate dynamic elements 434 with a particular dynamic element intention which specifies that the dynamic elements 434 are predicted to continue moving into the roadway 450 towards proximate stopped vehicles 432, based on matching at least a portion of the dynamic element cues associated with elements 434 with a set of contextual cues which are themselves associated with the particular dynamic element intention. As a result, ANS 412 can generate predicted trajectories 436 of the elements 434 which pass towards the vehicles 432A-C, off of the sidewalk 460B and into the roadway 450. In some embodiments, ANS 412 identifies contextual cues associated with dynamic elements 434 which indicate movement of the elements 434 relative to particular movements of one or more additional dynamic elements through the environment 400. - In some embodiments, an ANS generates a driving route configured to navigate the vehicle along a pathway which avoids intersection of the vehicle with predicted trajectories of multiple dynamic elements in an environment. As shown,
ANS 412 generates a driving route 441, based on predicted trajectories 431, 436 of the dynamic elements 430, 434, which navigates the vehicle 410 so that, as a result, the vehicle 410 avoids intersections with trajectories 431, 436. ANS 412 can further control navigation of the vehicle 410 along the route to avoid paths intersecting with the various dynamic elements 430, 432A-C, 434. -
FIG. 5 illustrates generating an intention association between identified contextual cues and subsequent predicted dynamic element movements, according to some embodiments. The generating can be implemented by one or more portions of any of the embodiments of ANSs included herein. An ANS can be implemented by one or more computer systems. - At 502, one or more instances of
sensor data 501 are received from one or more sensor devices included in a vehicle. Sensor data can include various instances of information regarding one or more portions of an external environment in which the vehicle is located. For example, sensor data generated by a sensor device can include images, of at least a portion of an external environment, generated by a camera device, a point cloud, of a portion of an external environment, generated by a light beam scanning device, a radar image of an environment generated by a radar device, position information generated by a geographic positioning sensor device, weather data generated by one or more weather sensor devices, ambient light data generated by one or more ambient light sensors, time data generated by one or more chronometers, etc. - At 504, received sensor data is processed. Processing sensor data can include identifying various static elements, dynamic elements, some combination thereof, etc. included in the external environment. For example, where an external environment in which a vehicle is located includes a roadway on which the vehicle is located, a building proximate to the roadway processing sensor data generated by sensor devices included in the vehicle can include identifying the building as a particular static element in the environment, where the identification can include determining a particular position of the building in the environment, a particular distance between the building and the vehicle, a particular type of static element associated with the building, etc. In another example, where the external environment in which the vehicle is located a pedestrian standing ahead of the vehicle adjacent to the roadway, processing sensor data generated by sensor devices included in the vehicle can include identifying the pedestrian as a particular dynamic element in the environment, where the identification can include determining a particular position of the pedestrian in the environment, a particular orientation of the pedestrian, determining a particular present velocity, acceleration, etc. of the pedestrian, etc.
- At 506, one or more contextual cues associated with one or more of the dynamic elements are identified based on processing sensor data at 504. Identifying a contextual cue from processing sensor data of a particular dynamic element, static element, etc. can include associating the identified contextual cue with one or more particular dynamic elements. As a result, the identifying at 506 can result in identifying various dynamic element object
contextual cues 507A and situational contextual cues 507B associated with various dynamic elements in the environment.
- For, example, where a vehicle is moving along a roadway, and where a dynamic element which includes a pedestrian is moving along a sidewalk which extends adjacent to the roadway, identifying 506 object
contextual cues 507A associated with the pedestrian can include identifying object contextual cues which can include a cue indicating that the pedestrian is clothed in exercise-related running attire, a cue indicating that the pedestrian is wearing audio headset gear, etc. Identifying 506 contextual cues 507B associated with the pedestrian can include identifying situational contextual cues which include a cue indicating that the pedestrian is moving along a sidewalk with a general direction of travel which extends along the sidewalk, a cue indicating that the pedestrian is moving along the sidewalk within a certain threshold velocity associated with jogging activities, a cue indicating that a field of view of the pedestrian includes the sidewalk and general direction of pedestrian travel, etc. - In another example, where a vehicle is moving along a roadway which extends proximate to a static element which includes a school building, and dynamic elements in the environment include a vehicle parked at the side of the road proximate to the school building and a child which is moving from the school building towards the roadway, identifying 506
contextual cues 507B associated with the child can include identifying situational contextual cues which can include a cue indicating that the child is proximate to a static element which is a portion of a particular school, a cue indicating that the child is moving away from the static element which is part of a particular school towards the roadway, and a cue indicating that the child is located proximate to the particular school at a time which is within a time period associated with class dismissal at the particular school. Identifying 506 contextual cues 507A associated with the dynamic element which includes the child can include identifying an object contextual cue indicating that the dynamic element is a child and an object contextual cue indicating that the child is wearing a backpack.
processing sensor data 501 generated by one or more sensor devices. For example, where the vehicle is moving along the roadway where a pedestrian is moving along an adjacent sidewalk, the monitoring at 508 includes monitoring subsequent motions of the pedestrian subsequent to identifying the various contextual cues at 506. - At 510, based on monitoring the subsequent dynamic element motion at 508, a dynamic element intention is generated, where the intention specifies occurrence of the motion of the dynamic element, monitored at 508, subsequent to identification, at 506, of the contextual cues associated with the dynamic element. For example, where a pedestrian is monitored, at 508, to continue moving along a sidewalk subsequent to identification of cues at 506 which indicate that the pedestrian is moving within a velocity window associated with jogging activities, is wearing jogging attire, has a field of view which includes the sidewalk and the direction of travel, etc., an intention generated at 510 can specify that the pedestrian is predicted to continue moving along the sidewalk subsequent to identifying the cues identified at 506.
- In another example, where a child is monitored, at 508, to continue moving away from a school structure and out into a roadway, subsequent to identification of cues indicating that the dynamic element is a child, that the child is wearing a backpack, that the child is moving away from a school, that the child is moving away from the school at a time which is within a time period associated with class dismissal at a school, that the child is moving towards a roadway, some combination thereof, etc., an intention can be generated which specifies that the child is predicted to move into the roadway subsequent to identification of the cues at 506.
- At 512, the intention generated at 512 is associated with at least some of the particular cues, identified at 506, which are associated with the dynamic element and upon which the intention is based. As a result, identifying some or all of the particular cues associated with another dynamic element at a future point in time can result in a correlation of the other dynamic element with the associated dynamic element intention.
-
- FIG. 6 illustrates autonomously navigating a vehicle through an external environment based on a predicted trajectory of a dynamic element in the external environment, according to some embodiments. The navigating can be implemented by one or more portions of any of the embodiments of ANSs included herein. An ANS can be implemented by one or more computer systems. - At 602, one or more instances of
sensor data 601 are received from one or more sensor devices included in a vehicle. Sensor data can include various instances of information regarding one or more portions of an external environment in which the vehicle is located. For example, sensor data generated by a sensor device can include images, of at least a portion of an external environment, generated by a camera device, a point cloud, of a portion of an external environment, generated by a light beam scanning device, a radar image of an environment generated by a radar device, position information generated by a geographic positioning sensor device, weather data generated by one or more weather sensor devices, ambient light data generated by one or more ambient light sensors, time data generated by one or more chronometers, etc. - At 604, received sensor data is processed. Processing sensor data can include identifying various static elements, dynamic elements, some combination thereof, etc. included in the external environment. For example, where an external environment in which a vehicle is located includes a roadway on which the vehicle is located, a building proximate to the roadway processing sensor data generated by sensor devices included in the vehicle can include identifying the building as a particular static element in the environment, where the identification can include determining a particular position of the building in the environment, a particular distance between the building and the vehicle, a particular type of static element associated with the building, etc. In another example, where the external environment in which the vehicle is located a pedestrian standing ahead of the vehicle adjacent to the roadway, processing sensor data generated by sensor devices included in the vehicle can include identifying the pedestrian as a particular dynamic element in the environment, where the identification can include determining a particular position of the pedestrian in the environment, a particular orientation of the pedestrian, determining a particular present velocity, acceleration, etc. of the pedestrian, etc.
- At 606, one or more contextual cues associated with one or more of the dynamic elements located in the environment are identified based on processing sensor data at 604. Identifying a contextual cue from processing sensor data of a particular dynamic element, static element, etc. can include associating the identified contextual cue with the particular dynamic element. As a result, the identifying at 606 can result in identifying various dynamic element contextual cues which include object
contextual cues 607A and situational contextual cues 607B associated with one or more various dynamic elements in the environment. - At 610, at least some of the identified
contextual cues 607A, 607B are compared, at the ANS, with one or more various contextual intention associations 608, which can result in determining a correlation between at least some of the cues 607A-B and a set of contextual cues included in at least one particular association 608. Determining a correlation between at least some cues 607A-B identified at 606 and a set of cues included in an association, also referred to herein as determining that at least some cues 607A-B identified at 606 correlate with a set of cues included in an association, can include at least partially matching the cues 607A-B with the set of cues included in the association above a certain threshold level. For example, matching five of the identified cues 607A-B associated with a dynamic element with five cues out of a set of six cues included in a particular association 608 can result in a determination of a correlation of the five identified cues 607A-B with the particular association 608. An association 608 includes a set of contextual cues associated with one or more dynamic elements and a corresponding dynamic element intention, associated with the set of contextual cues, which specifies a prediction of the motion of the one or more dynamic elements through the environment based on the set of contextual cues.
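The five-of-six example above corresponds to a match fraction of about 0.83. A minimal sketch of such a lookup over a store of associations 608 follows; the cue labels and intention names are illustrative assumptions, with the five-of-six threshold taken from the example:

```python
def best_intention(identified_cues, associations, threshold=5/6):
    """Return the intention whose cue set best matches the identified cues,
    provided the match fraction meets the threshold (e.g., five of six)."""
    best, best_score = None, 0.0
    for cue_set, intention in associations:
        score = len(cue_set & identified_cues) / len(cue_set)
        if score >= threshold and score > best_score:
            best, best_score = intention, score
    return best

associations = [
    (frozenset({"child", "backpack", "near_school", "dismissal_time",
                "moving_to_roadway", "vehicles_pulled_over"}),
     "enter_roadway_toward_stopped_vehicle"),
    (frozenset({"adult", "on_sidewalk", "jogging_speed"}),
     "continue_along_sidewalk"),
]
cues = {"child", "backpack", "near_school", "dismissal_time",
        "moving_to_roadway"}            # five of the six cues in the first set
print(best_intention(cues, associations))
```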
- At 612, based on correlating at least some identified cues 607A-B associated with a dynamic element in the environment with a particular association 608, the dynamic element is associated with the dynamic element intention included in the particular association 608. As a result, the intention included in the particular association 608 is determined to be a prediction of the motion of the dynamic element through the environment, based on the at least some identified cues 607A-B.
- At 616, a driving route of the vehicle through the environment is generated based at least in part upon the predicted trajectory of the dynamic element through the environment. The driving route can be a trajectory of the vehicle which avoids intersection with one or more predicted trajectories of one or more dynamic elements through the environment where the vehicle and the one or more dynamic elements are located within a particular threshold proximity distance at one or more given points in time. A driving route can be a route through the environment which avoids navigating the vehicle within a certain distance of any dynamic elements navigating through the environment along predicted trajectories of the various dynamic elements.
-
- FIG. 7 illustrates an overhead view of an environment in which multiple dynamic elements are located, including a vehicle which is autonomously navigated through the environment and includes a set of indicators which can generate a set of targeted signals which are transmitted to specific dynamic elements in the external environment, according to some embodiments. Vehicle 710 can include any of the embodiments of vehicles included herein, and ANS 712 can include any of the embodiments of ANSs included herein. - In some embodiments, a vehicle includes one or more sets of signal generators which can be commanded by an ANS included in the vehicle to generate one or more targeted signals which are directed to one or more particular target elements included in the external environment in which the vehicle is located. A targeted signal can include a signal which is transmitted along a particular selected angle and axis which results in the signal passing through a restricted portion of the external environment in which the target element is located, so that the signal is at least partially restricted from being received at one or more other elements included in the environment. A targeted signal which is directed at a particular target element in the environment can include one or more particular instances of information, including one or more particular messages, which is at least partially associated with the target element. In some embodiments, a targeted signal includes information which is selected, at the ANS included in a vehicle, based at least in part upon a predicted trajectory of the target element through the external environment.
- In some embodiments, separate signal generators, also referred to herein interchangeably as signal devices, included in a vehicle can generate separate targeted signals which are directed to separate target elements in the environment, where the separate targeted signals include separate instances of information which are included in the separate signals based on the separate target elements to which the separate targeted signals are directed.
- In the illustrated embodiment shown in
FIG. 7, a vehicle 710 navigating through environment 700 includes an ANS 712 and separate sets 713A-B of signal generators. The vehicle 710 is being navigated along lane 752 of roadway 750. The environment 700 further includes dynamic elements 720, 730, where element 720 includes a vehicle navigating along lane 754 of roadway 750 and element 730 includes a pedestrian navigating along sidewalk 760 which extends along an edge of the roadway 750 which is proximate to lane 752. - As shown in the illustrated embodiment,
vehicle 710 generates separate targeted signals 750A-B which are separately directed to a separate one of elements 720, 730 in the environment 700. -
Targeted signal 750A is generated by a particular set of signal generators 713A included in vehicle 710 and is directed towards vehicle 720 along a particular axis 752A and angle 751A of transmission, such that signal 750A passes through a limited portion of environment 700 in which vehicle 720 passes, so that the signal 750A is received by a limited portion of the elements located in the environment 700. ANS 712 can determine a particular axis 752A and angle 751A of a targeted signal to direct to element 720 and can select a particular configuration of signal generators 713A which can generate and transmit the signal 750A along the particular axis 752A and angle 751A. ANS 712 can further command the selected signal generators 713A to generate the targeted signal 750A including a particular set of information, also referred to herein as a particular set of content, and directed, along axis 752A and angle 751A, towards element 720. -
ANS 712 can determine the axis 752A and angle 751A of the signal 750A based on identification of a size, position, velocity, acceleration, etc. of the dynamic element 720 through the environment, a predicted trajectory of the element 720 through the environment, some combination thereof, etc. The signal 750A can include information which includes a message which, when received at element 720, communicates one or more signals, alerts, messages, etc. to element 720. For example, in some embodiments, signal 750A can include a message which communicates, to element 720, information regarding traffic conditions, particular dynamic elements, events, etc. associated with one or more portions of environment 700. Signal 750A can include information indicating an occurrence of stopped vehicles in a portion of the environment 700 through which vehicle 710 has previously navigated. Signal 750A can include information which communicates, to vehicle 720, a driving route along which vehicle 710 is being navigated by ANS 712. Signal 750A can include a warning signal.
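One plausible way to derive such an axis and angle, assuming 2D vehicle-frame coordinates and an illustrative beam margin (the disclosure does not specify the computation), is sketched below: the axis is the bearing to the target, and the angle is the arc the target subtends at its current range:

```python
import math

def transmission_axis_and_angle(vehicle_xy, target_xy, target_width_m,
                                margin=1.2):
    """Axis: bearing from the vehicle to the target element. Angle: the arc
    the target subtends at its current range, padded by a small margin so
    the beam covers the element but little of the surrounding environment."""
    dx = target_xy[0] - vehicle_xy[0]
    dy = target_xy[1] - vehicle_xy[1]
    rng = math.hypot(dx, dy)
    axis_deg = math.degrees(math.atan2(dy, dx))
    angle_deg = margin * math.degrees(2 * math.atan2(target_width_m / 2, rng))
    return axis_deg, angle_deg

# Vehicle 720 is roughly 25 m ahead in the adjacent lane, about 2 m wide:
axis, angle = transmission_axis_and_angle((0, 0), (25.0, 3.5), 2.0)
print(f"axis={axis:.1f} deg, beam angle={angle:.1f} deg")
```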
- ANS 712 can determine to generate signal 750A, determine a particular angle 751A and axis 752A of signal 750A, select a particular set of signal generators 713A to generate the signal 750A in a particular generator configuration, determine one or more particular instances of information to include in the signal, command the set of generators 713A to generate the targeted signal, some combination thereof, etc., based at least in part upon one or more of detecting the dynamic element 720 in the environment, generating a predicted trajectory of the dynamic element 720 through the environment 700, some combination thereof, etc. -
- Targeted signal 750B is generated by a particular set of signal generators 713B included in vehicle 710 and is directed towards pedestrian 730 along a particular axis 752B and angle 751B of transmission, such that signal 750B passes through a limited portion of environment 700 in which pedestrian 730 passes, so that the signal 750B is received by a limited portion of the elements in the environment 700. ANS 712 can determine a particular axis 752B and angle 751B of a targeted signal 750B to direct to element 730 and can select a particular configuration of signal generators 713B which can generate and transmit the signal 750B along the particular axis 752B and angle 751B. ANS 712 can further command the selected signal generators 713B to generate the targeted signal 750B including a particular set of information and directed, along axis 752B and angle 751B, towards element 730. ANS 712 can determine the axis 752B and angle 751B of the signal 750B based on identification of a size, position, velocity, acceleration, etc. of the dynamic element 730 through the environment, a predicted trajectory of the element 730 through the environment, some combination thereof, etc. The signal 750B can include information which includes a message which, when received at dynamic element 730, communicates one or more signals, alerts, messages, etc. to element 730. For example, in some embodiments, signal 750B can include a message which communicates, to element 730, information regarding traffic conditions, particular dynamic elements, events, etc. associated with one or more portions of environment 700. Signal 750B can include information which communicates, to pedestrian 730, a driving route along which vehicle 710 is being navigated by ANS 712. Signal 750B can include a warning signal.
- ANS 712 can determine to generate signal 750B, determine a particular angle 751B and axis 752B of signal 750B, select a particular set of signal generators 713B to generate the signal 750B in a particular generator configuration, determine one or more particular instances of information to include in the signal, command the set of generators 713B to generate the targeted signal, some combination thereof, etc. based at least in part upon one or more of detecting the dynamic element 730 in the environment, generating a predicted trajectory of the dynamic element 730 through the environment 700, some combination thereof, etc. For example, as shown, ANS 712 can determine, where dynamic element 730 is a pedestrian, a field of view 734 of the pedestrian 730 and one or more predicted trajectories 732A-B through the environment 700. Based at least in part upon one or more of determining that the vehicle 710 is not within the field of view 734 of the pedestrian 730, determining that one or more predicted trajectories 732A-B of the pedestrian may intersect a driving route along which the vehicle 710 is being navigated, some combination thereof, etc.,
ANS 712 can determine a targeted signal to transmit to pedestrian 730 which includes information which communicates one or more of a warning to the pedestrian 730 of the presence of the vehicle 710 in the environment 700, a message to the pedestrian 730 to refrain from navigating along one or more particular trajectories 732A-B through the environment, etc. For example, the signal 750B can include an audio message which warns the pedestrian 730 to avoid moving into the lane 752 of the roadway 750. As a result, the vehicle 710 can respond to a prediction, at ANS 712, that the pedestrian 730 may move along a trajectory which intersects a driving route of the vehicle 710 by, rather than adjusting the driving route of the vehicle 710 to avoid intersection with the predicted trajectory of the pedestrian, generating a targeted signal which prompts the pedestrian to avoid moving along a trajectory which intersects the present driving route. As a result, the risk of paths intersecting between the vehicle 710 and dynamic elements in the external environment through which the vehicle 710 is being navigated can be reduced, thereby augmenting safety for occupants of vehicle 710 and various dynamic elements in the environment 700.
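The field-of-view and trajectory-intersection reasoning described above might be sketched as follows; the heading/field-of-view representation and the function names are illustrative assumptions, not the disclosed logic.

```python
import math

def in_field_of_view(ped_xy, ped_heading_deg, fov_deg, vehicle_xy):
    """True if the vehicle falls inside the pedestrian's field of view."""
    bearing = math.degrees(math.atan2(vehicle_xy[1] - ped_xy[1],
                                      vehicle_xy[0] - ped_xy[0]))
    # Smallest signed angle between the bearing and the pedestrian's heading.
    offset = (bearing - ped_heading_deg + 180.0) % 360.0 - 180.0
    return abs(offset) <= fov_deg / 2.0

def should_warn(ped_xy, ped_heading_deg, fov_deg, vehicle_xy,
                trajectory_intersects_route):
    # Warn when the vehicle is outside the pedestrian's view AND a predicted
    # pedestrian trajectory intersects the vehicle's driving route.
    return (not in_field_of_view(ped_xy, ped_heading_deg, fov_deg, vehicle_xy)
            and trajectory_intersects_route)

# Pedestrian facing +y (90 degrees) with a 120-degree FOV; vehicle behind them.
print(should_warn((0, 0), 90.0, 120.0, (0, -20), True))  # -> True
```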
- In some embodiments, one or more of the targeted signals 750A-B includes a visual signal. For example, the targeted signal 750A can be generated by one or more visual signal generators 713A, including one or more lights, included in the vehicle 710, where a particular set of visual indicators are activated to generate a particular visual signal 750A which includes one or more instances of information. The visual indicators 713A can include at least a portion of one or more headlight assemblies included in the vehicle 710, where a portion of the headlight assemblies can be adjusted to direct a light beam having a particular axis 752A and angle 751A to a particular dynamic element 720 in the environment 700. The headlight assemblies can be adjusted to provide a visual signal 750A which includes a variable-intensity beam of light, including a series of light beam pulses, etc.
- In some embodiments, one or more of the targeted signals 750A-B includes an audio signal which is directed at a particular dynamic element, including a traffic participant, instance of fauna, etc. located in the external environment. For example, the targeted signal 750B can include an audio signal which can be a directional signal, focused in angle and axis via various known systems and methods of focusing audio signals, including one or more of beamforming, ultrasonic modulation, etc., so that the recipient of the audio signal 750B is at least partially restricted to the particular dynamic element 730 to which the targeted signal 750B is directed. As a result, the amplitude of the signal can be reduced, relative to a non-focused audio signal, which can result in reduced disturbance in the external environment as a result of the signal transmission. In addition, as a result of the amplitude of the signal being reduced, information communicated to the dynamic element via the targeted signal can be at least partially precluded from being received and interpreted by other dynamic elements in the environment, thereby providing at least some information security.
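Of the focusing techniques named above, delay-and-sum beamforming is perhaps the simplest to illustrate: each speaker in a linear array is delayed so the emitted wavefronts reinforce along the desired axis. The array geometry and steering angle below are assumptions for illustration, not a configuration from the disclosure.

```python
import math

SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 degrees C

def steering_delays(speaker_x_positions, steer_angle_deg):
    """Per-element delays (seconds) steering a linear array's main lobe.

    For a plane wave leaving at angle theta from broadside, the element at
    position x must be delayed by x * sin(theta) / c; delays are shifted so
    the smallest one is zero.
    """
    theta = math.radians(steer_angle_deg)
    raw = [x * math.sin(theta) / SPEED_OF_SOUND for x in speaker_x_positions]
    base = min(raw)
    return [d - base for d in raw]

# Four speakers spaced 5 cm apart, beam steered 30 degrees off broadside.
positions = [i * 0.05 for i in range(4)]
print(["%.1f us" % (d * 1e6) for d in steering_delays(positions, 30.0)])
```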
- In some embodiments, ANS 712 selects a type of targeted signal to generate and direct to a particular dynamic element based on one or more various identified contextual cues associated with the particular dynamic element, a position, velocity, acceleration, etc. of one or more of the particular dynamic element and the vehicle 710 through the environment 700, some combination thereof, etc. For example, ANS 712 can determine to generate signal 750A as a visual signal based at least in part upon a determination that dynamic element 720 is a vehicle, that the vehicle 710 is included within a field of view of the vehicle 720, that one or more of vehicles 710, 720 are moving through the environment 700 within a certain range of velocities, that a distance between vehicles 710 and 720 is within a certain threshold distance, some combination thereof, etc. ANS 712 can determine to generate signal 750B as an audio signal based at least in part upon a determination that dynamic element 730 is a pedestrian, that the vehicle 710 is not included within the field of view 734 of the pedestrian 730, that a distance between vehicle 710 and pedestrian 730 is less than a certain threshold distance, some combination thereof, etc.
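A minimal rule set capturing the example cue-based selection above might look like the following; the specific rules and the 50 m default threshold are illustrative assumptions rather than the disclosed logic.

```python
def select_signal_type(element_type, vehicle_in_target_fov,
                       distance_m, threshold_m=50.0):
    """Return 'visual', 'audio', or None for a candidate target element."""
    if element_type == "vehicle" and vehicle_in_target_fov:
        # A vehicle that can see us is a candidate for a visual signal.
        return "visual"
    if (element_type == "pedestrian" and not vehicle_in_target_fov
            and distance_m < threshold_m):
        # A nearby pedestrian who cannot see us gets an audio signal.
        return "audio"
    return None

print(select_signal_type("pedestrian", False, 20.0))  # -> 'audio'
print(select_signal_type("vehicle", True, 80.0))      # -> 'visual'
```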
- In some embodiments, ANS 712 determines to refrain from generating a targeted signal directed to a particular target dynamic element, based at least in part upon one or more of relative proximity, relative velocity, etc. of the target dynamic element and the vehicle in which the ANS 712 is included. For example, where the relative velocity of vehicle 710 and vehicle 720 exceeds a threshold, the value provided by signal 750A may be restricted, as the amount of elapsed time between the signal 750A being received at vehicle 720 and vehicles 710, 720 passing each other may be insufficient for the signal 750A to be processed at vehicle 720.
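The refrain decision can be read as a time-budget check: estimate how long remains before closest approach and compare it against the time a recipient needs to process the signal. The sketch below assumes a hypothetical 2-second processing budget.

```python
def should_refrain(separation_m, closing_speed_mps, min_processing_s=2.0):
    """True when too little time remains for the target to act on a signal."""
    if closing_speed_mps <= 0.0:
        # The elements are not closing on each other; no time pressure.
        return False
    time_to_closest_approach = separation_m / closing_speed_mps
    return time_to_closest_approach < min_processing_s

print(should_refrain(30.0, 25.0))   # 1.2 s remaining -> True, refrain
print(should_refrain(200.0, 20.0))  # 10 s remaining  -> False, signal
```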
- FIG. 8 illustrates generating targeted signals which are directed to particular target dynamic elements, according to some embodiments. The generating can be implemented by one or more portions of any embodiment of an ANS described herein, and an ANS can be implemented by one or more computer systems.
- At 802, a dynamic element is identified in an external environment. The identification can be based at least in part upon processing one or more instances of sensor data generated by one or more sensor devices included in one or more vehicles in the external environment. Identifying the dynamic element can include identifying one or more contextual cues associated with the dynamic element, generating one or more predicted trajectories of the dynamic element through the environment, etc.
- At 804, a determination is made regarding whether to generate a targeted signal which is directed to the identified dynamic element. The determination can be based at least in part upon one or more of: a determination of whether the vehicle in which the ANS implementing determination 804 is included is located within a field of view of the dynamic element; a determination of whether a predicted trajectory of the dynamic element intersects a driving route along which that vehicle is being navigated; a determination regarding whether the dynamic element is moving along a trajectory which approaches one or more elements in the external environment previously detected based on processing sensor data; etc.
- In some embodiments, a determination regarding whether to generate a targeted signal is based at least in part upon a determination regarding whether a present velocity of the vehicle in which the ANS implementing determination 804 is included is greater than a threshold value. For example, if the vehicle is moving at greater than a threshold velocity, a determination can be made to not generate a targeted signal, as the high velocity of the vehicle can at least partially preclude generated signals from being properly received and interpreted by a target dynamic element before the vehicle passes its point of closest proximity to the target dynamic element.
- At 806, if a determination is made to generate a targeted signal which is directed to the dynamic element, a signal type of the signal, and one or more instances of content to be included in the signal, are determined. Various signal types can include audio signals, visual signals, some combination thereof, etc. A signal type of the targeted signal can be based at least in part upon one or more various contextual cues associated with the identified dynamic element, etc. For example, where a contextual cue associated with a dynamic element indicates that the dynamic element is a pedestrian, the signal type determined at 806 can include an audio signal type. In another example, where a contextual cue associated with the dynamic element indicates that the dynamic element is a vehicle, the signal type determined at 806 can include a visual signal type.
- The content of a signal, also referred to herein interchangeably as one or more instances of information included in the signal, can be determined based on one or more of the field of view of the dynamic element, a predicted trajectory of the dynamic element, one or more elements of the external environment, etc. For example, where the field of view of the dynamic element excludes the vehicle in which the ANS implementing determination 806 is included, the signal content determined at 806 can include a signal, message, alert, etc. which is configured to communicate a presence of the vehicle. In another example, where a predicted trajectory of the dynamic element intersects the driving route along which the vehicle in which the ANS implementing determination 806 is included is being navigated, the signal content determined at 806 can include a warning message which is configured to warn the dynamic element to avoid navigating along the predicted trajectory. In another example, where the vehicle in which the ANS implementing determination 806 is included has previously navigated proximate to one or more particular elements in the environment, including one or more traffic accidents, construction zones, stopped vehicles proximate to the roadway, traffic hazards, etc., the signal content determined at 806 can include a message which communicates one or more instances of information regarding the one or more particular elements in the environment.
- At 808, one or more of an axis and angle of the targeted signal is determined, which results in a configuration of the targeted signal such that the targeted signal, when transmitted along the determined axis and angle, passes through a limited portion of the environment in which the dynamic element is located, so that the targeted signal is restricted from being directed to one or more other elements in the environment. At 810, an amplitude of one or more portions of the signal is determined, which can be based in part upon one or more of the relative distance, velocity, acceleration, etc. between the vehicle in which the ANS implementing 810 is located and the dynamic element, a content of the signal, a dynamic element type associated with the dynamic element, etc.
- At 812, one or more sets of signal devices, included in the vehicle, which are configurable to generate the targeted signal which includes the determined content, amplitude, and the determined one or more of the angle and axis, are selected. Selecting one or more sets of signal devices can include determining one or more particular configurations of the signal devices, including orientation, adjustable position, etc., which results in the signal devices being at least partially configured to generate the targeted signal which includes the determined content, amplitude, and the determined one or more of the angle and axis. A set of signal devices can be selected based at least in part upon a determination that the set of signal devices comprises a minimum quantity of signal devices which can generate the targeted signal which includes the determined content, amplitude, and the determined one or more of the angle and axis, etc. At 814, the selected signal devices are commanded to generate the targeted signal, where the commanding can include commanding the one or more selected signal devices to be adjustably positioned, oriented, etc. to generate a signal having the determined angle and axis of the targeted signal.
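Steps 802 through 814 can be read as a single pipeline. The compact, runnable sketch below strings the determinations together; every class, rule, and threshold in it is an illustrative assumption rather than the disclosed implementation.

```python
import math
from dataclasses import dataclass

@dataclass
class Element:
    kind: str              # 'pedestrian' or 'vehicle'
    xy: tuple              # position in vehicle-local coordinates (m)
    width_m: float         # approximate physical width
    sees_vehicle: bool     # is the vehicle inside this element's field of view?
    route_conflict: bool   # does a predicted trajectory intersect our route?

def targeted_signal(element, vehicle_xy, closing_speed_mps):
    dist = math.dist(vehicle_xy, element.xy)
    # 804: skip targets that already see us with no conflict, or targets we
    # will pass before they could process a signal (2 s budget assumed).
    if element.sees_vehicle and not element.route_conflict:
        return None
    if closing_speed_mps > 0 and dist / closing_speed_mps < 2.0:
        return None
    # 806: choose signal type and content from contextual cues.
    sig_type = "audio" if element.kind == "pedestrian" else "visual"
    content = "warning" if element.route_conflict else "presence"
    # 808: axis is the bearing to the target; the angle just spans its width.
    dx, dy = element.xy[0] - vehicle_xy[0], element.xy[1] - vehicle_xy[1]
    axis_deg = math.degrees(math.atan2(dy, dx))
    angle_deg = 2 * math.degrees(math.atan2(element.width_m / 2, dist))
    # 810: scale amplitude with range (clamped). 812/814, device selection
    # and the emit command, would consume this description in a real system.
    amplitude = min(1.0, dist / 50.0)
    return {"type": sig_type, "content": content,
            "axis_deg": round(axis_deg, 1), "angle_deg": round(angle_deg, 1),
            "amplitude": round(amplitude, 2)}

ped = Element("pedestrian", (20.0, 5.0), 0.6, sees_vehicle=False, route_conflict=True)
print(targeted_signal(ped, (0.0, 0.0), closing_speed_mps=5.0))
```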
- FIG. 9 illustrates an example computer system 900 that may be configured to include or execute any or all of the embodiments described above. In different embodiments, computer system 900 may be any of various types of devices, including, but not limited to, a personal computer system, desktop computer, laptop, notebook, tablet, slate, pad, or netbook computer, cell phone, smartphone, PDA, portable media device, mainframe computer system, handheld computer, workstation, network computer, a camera or video camera, a set top box, a mobile device, a consumer device, video game console, handheld video game device, application server, storage device, a television, a video recording device, a peripheral device such as a switch, modem, or router, or in general any type of computing or electronic device.
- Various embodiments of an autonomous navigation system (ANS), as described herein, may be executed in one or more computer systems 900, which may interact with various other devices. Note that any component, action, or functionality described above with respect to FIGS. 1 through 8 may be implemented on one or more computers configured as computer system 900 of FIG. 9, according to various embodiments. In the illustrated embodiment, computer system 900 includes one or more processors 910 coupled to a system memory 920 via an input/output (I/O) interface 930. Computer system 900 further includes a network interface 940 coupled to I/O interface 930, and one or more input/output devices, which can include one or more user interface (also referred to as "input interface") devices. In some cases, it is contemplated that embodiments may be implemented using a single instance of computer system 900, while in other embodiments multiple such systems, or multiple nodes making up computer system 900, may be configured to host different portions or instances of embodiments. For example, in one embodiment some elements may be implemented via one or more nodes of computer system 900 that are distinct from those nodes implementing other elements.
- In various embodiments, computer system 900 may be a uniprocessor system including one processor 910, or a multiprocessor system including several processors 910 (e.g., two, four, eight, or another suitable number). Processors 910 may be any suitable processor capable of executing instructions. For example, in various embodiments, processors 910 may be general-purpose or embedded processors implementing any of a variety of instruction set architectures (ISAs), such as the x86, PowerPC, SPARC, or MIPS ISAs, or any other suitable ISA. In multiprocessor systems, each of processors 910 may commonly, but not necessarily, implement the same ISA.
- System memory 920 may be configured to store program instructions, data, etc. accessible by processor 910. In various embodiments, system memory 920 may be implemented using any suitable memory technology, such as static random access memory (SRAM), synchronous dynamic RAM (SDRAM), nonvolatile/Flash-type memory, or any other type of memory. In the illustrated embodiment, program instructions included in memory 920 may be configured to implement some or all of an ANS, incorporating any of the functionality described above. Additionally, existing automotive component control data of memory 920 may include any of the information or data structures described above. In some embodiments, program instructions and/or data may be received, sent, or stored upon different types of computer-accessible media or on similar media separate from system memory 920 or computer system 900. While computer system 900 is described as implementing the functionality of functional blocks of previous Figures, any of the functionality described herein may be implemented via such a computer system.
- In one embodiment, I/O interface 930 may be configured to coordinate I/O traffic between processor 910, system memory 920, and any peripheral devices in the device, including network interface 940 or other peripheral interfaces, such as input/output devices 950. In some embodiments, I/O interface 930 may perform any necessary protocol, timing or other data transformations to convert data signals from one component (e.g., system memory 920) into a format suitable for use by another component (e.g., processor 910). In some embodiments, I/O interface 930 may include support for devices attached through various types of peripheral buses, such as a variant of the Peripheral Component Interconnect (PCI) bus standard or the Universal Serial Bus (USB) standard, for example. In some embodiments, the function of I/O interface 930 may be split into two or more separate components, such as a north bridge and a south bridge, for example. Also, in some embodiments some or all of the functionality of I/O interface 930, such as an interface to system memory 920, may be incorporated directly into processor 910.
- Network interface 940 may be configured to allow data to be exchanged between computer system 900 and other devices attached to a network 985 (e.g., carrier or agent devices) or between nodes of computer system 900. Network 985 may in various embodiments include one or more networks including but not limited to Local Area Networks (LANs) (e.g., an Ethernet or corporate network), Wide Area Networks (WANs) (e.g., the Internet), wireless data networks, some other electronic data network, or some combination thereof. In various embodiments, network interface 940 may support communication via wired or wireless general data networks, such as any suitable type of Ethernet network, for example; via telecommunications/telephony networks such as analog voice networks or digital fiber communications networks; via storage area networks such as Fibre Channel SANs; or via any other suitable type of network and/or protocol.
- Input/output devices may, in some embodiments, include one or more display terminals, keyboards, keypads, touchpads, scanning devices, voice or optical recognition devices, or any other devices suitable for entering or accessing data by one or more computer systems 900. Multiple input/output devices may be present in computer system 900 or may be distributed on various nodes of computer system 900. In some embodiments, similar input/output devices may be separate from computer system 900 and may interact with one or more nodes of computer system 900 through a wired or wireless connection, such as over network interface 940.
- Memory 920 may include program instructions, which may be processor-executable to implement any element or action described above. In one embodiment, the program instructions may implement the methods described above. In other embodiments, different elements and data may be included. Note that data may include any data or information described above.
- Those skilled in the art will appreciate that computer system 900 is merely illustrative and is not intended to limit the scope of embodiments. In particular, the computer system and devices may include any combination of hardware or software that can perform the indicated functions, including computers, network devices, Internet appliances, PDAs, wireless phones, pagers, etc. Computer system 900 may also be connected to other devices that are not illustrated, or instead may operate as a stand-alone system. In addition, the functionality provided by the illustrated components may in some embodiments be combined in fewer components or distributed in additional components. Similarly, in some embodiments, the functionality of some of the illustrated components may not be provided and/or other additional functionality may be available.
- Those skilled in the art will also appreciate that, while various items are illustrated as being stored in memory or on storage while being used, these items or portions of them may be transferred between memory and other storage devices for purposes of memory management and data integrity. Alternatively, in other embodiments some or all of the software components may execute in memory on another device and communicate with the illustrated computer system via inter-computer communication. Some or all of the system components or data structures may also be stored (e.g., as instructions or structured data) on a computer-accessible medium or a portable article to be read by an appropriate drive, various examples of which are described above. In some embodiments, instructions stored on a computer-accessible medium separate from computer system 900 may be transmitted to computer system 900 via transmission media or signals such as electrical, electromagnetic, or digital signals, conveyed via a communication medium such as a network and/or a wireless link. Various embodiments may further include receiving, sending or storing instructions and/or data implemented in accordance with the foregoing description upon a computer-accessible medium. Generally speaking, a computer-accessible medium may include a non-transitory, computer-readable storage medium or memory medium such as magnetic or optical media, e.g., disk or DVD/CD-ROM, volatile or non-volatile media such as RAM (e.g., SDRAM, DDR, RDRAM, SRAM, etc.), ROM, etc. In some embodiments, a computer-accessible medium may include transmission media or signals such as electrical, electromagnetic, or digital signals, conveyed via a communication medium such as a network and/or a wireless link.
- The methods described herein may be implemented in software, hardware, or a combination thereof, in different embodiments. In addition, the order of the blocks of the methods may be changed, and various elements may be added, reordered, combined, omitted, modified, etc. Various modifications and changes may be made as would be obvious to a person skilled in the art having the benefit of this disclosure. The various embodiments described herein are meant to be illustrative and not limiting. Many variations, modifications, additions, and improvements are possible. Accordingly, plural instances may be provided for components described herein as a single instance. Boundaries between various components, operations, and data stores are somewhat arbitrary, and particular operations are illustrated in the context of specific illustrative configurations. Other allocations of functionality are envisioned and may fall within the scope of claims that follow. Finally, structures and functionality presented as discrete components in the example configurations may be implemented as a combined structure or component. These and other variations, modifications, additions, and improvements may fall within the scope of embodiments as defined in the claims that follow.
Claims (20)
1. An apparatus, comprising:
an autonomous navigation system configured to be installed in a vehicle and autonomously navigate the vehicle through an environment in which the vehicle is located, wherein the autonomous navigation system is configured to:
identify a set of contextual cues associated with a dynamic element located in the environment, wherein each contextual cue indicates one or more particular features associated with the dynamic element, based on monitoring at least a portion of the environment;
associate the dynamic element with a particular set of predicted motions, based on a determination of a correlation between the identified set of contextual cues and a predetermined set of contextual cues which are associated with the particular set of predicted motions;
generate a predicted trajectory of the dynamic element through the environment based on the particular set of predicted motions associated with the dynamic element; and
generate a set of control commands which, when executed by one or more control elements installed in the vehicle, cause the vehicle to be navigated along a driving route that accounts for the predicted trajectory of the dynamic element.
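By way of illustration only (this is not claim language), the correlation between an identified cue set and predetermined cue sets recited in claim 1 might be sketched as a best-overlap lookup; the cue names, the cue library, and the matching rule are all assumptions.

```python
CUE_LIBRARY = {
    # predetermined contextual cues -> predicted motions (names assumed)
    frozenset({"pedestrian", "facing_roadway", "near_curb"}): ["step_into_lane"],
    frozenset({"vehicle", "turn_signal_left"}): ["lane_change_left"],
}

def predicted_motions(identified_cues):
    """Return the motions whose predetermined cues best overlap the input."""
    best, best_overlap = None, 0
    for cues, motions in CUE_LIBRARY.items():
        overlap = len(cues & identified_cues)
        if overlap > best_overlap:
            best, best_overlap = motions, overlap
    return best or []

print(predicted_motions({"pedestrian", "near_curb", "facing_roadway"}))
```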
2. The apparatus of claim 1, wherein the autonomous navigation system is configured to:
prior to identifying the set of contextual cues associated with the dynamic element located in the environment:
identify a set of contextual cues associated with another dynamic element located in the environment, wherein each contextual cue indicates one or more particular features associated with the other dynamic element, based on monitoring at least a portion of the environment;
monitor a particular set of motions executed by the other dynamic element through the environment, subsequent to identifying the set of contextual cues associated with the other dynamic element; and
based on the monitoring, generate an association of the set of contextual cues, as the set of predetermined contextual cues, with the particular set of motions executed by the other dynamic element, as the particular set of predicted motions;
wherein the association specifies that dynamic elements associated with the set of predetermined contextual cues are predicted to execute the particular set of predicted motions.
3. The apparatus of claim 2, wherein the autonomous navigation system is configured to:
monitor a particular set of actual motions executed by the dynamic element through the environment, subsequent to associating the dynamic element with the particular set of predicted motions; and
based on a determination that the particular set of actual motions executed by the dynamic element are distinct from the particular set of predicted motions, associate the predetermined set of contextual cues with the particular set of actual motions and disassociate the predetermined set of contextual cues from the particular set of predicted motions.
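Again for illustration only, the learning recited in claim 2 and the re-association recited in claim 3 might be sketched as updates to a cue-to-motions store; the mutable-dictionary representation and all names are assumptions.

```python
associations = {}  # predetermined cue set -> predicted motions

def learn(cues, observed_motions):
    """Claim 2 sketch: record that these cues were followed by these motions."""
    associations[frozenset(cues)] = list(observed_motions)

def reconcile(cues, predicted, actual):
    """Claim 3 sketch: if the actual motions differ, replace the association."""
    if list(actual) != list(predicted):
        associations[frozenset(cues)] = list(actual)

learn({"vehicle", "turn_signal_left"}, ["lane_change_left"])
reconcile({"vehicle", "turn_signal_left"},
          ["lane_change_left"], ["u_turn"])
print(associations)  # the cues now map to the actual motions ['u_turn']
```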
4. The apparatus of claim 1, wherein:
the set of contextual cues comprises a specification of a position and velocity of the dynamic element relative to one or more particular static elements detected in the environment.
5. The apparatus of claim 1, wherein:
the set of contextual cues comprises a specification of a position and velocity of the dynamic element relative to one or more other dynamic elements detected in the environment.
6. The apparatus of claim 1, comprising:
one or more sets of signal generators which are configured to generate and direct one or more targeted signals to one or more elements located in the environment; and
an autonomous navigation system configured to:
command at least one of the one or more sets of signal generators to generate a particular targeted signal, comprising a particular instance of content selected based on at least one sensor data representation of the dynamic element and a particular signal beam angle which intersects a limited portion of the environment in which the dynamic element is located, such that the particular instance of content is communicated to the dynamic element in the environment to the exclusion of a remainder portion of the environment.
7. The apparatus of claim 6, wherein the one or more sets of signal generators comprise one or more sets of speaker devices which are configured to generate one or more targeted signals to one or more elements located in the environment based on at least one of:
beamforming; or
ultrasonic modulation.
8. A method, comprising:
performing, by one or more computer systems installed in a vehicle located in a particular environment:
identifying a set of contextual cues associated with a dynamic element located in the environment, wherein each contextual cue indicates one or more particular features associated with the dynamic element, based on monitoring at least a portion of the environment;
associating the dynamic element with a particular set of predicted motions, based on a determination of a correlation between the identified set of contextual cues and a predetermined set of contextual cues which are associated with the particular set of predicted motions;
generating a predicted trajectory of the dynamic element through the environment based on the particular set of predicted motions associated with the dynamic element; and
generating a set of control commands which, when executed by one or more control elements installed in the vehicle, cause the vehicle to be navigated along a driving route that accounts for the predicted trajectory of the dynamic element.
9. The method of claim 8, comprising:
prior to identifying the set of contextual cues associated with the dynamic element located in the environment:
identifying a set of contextual cues associated with another dynamic element located in the environment, wherein each contextual cue indicates one or more particular features associated with the other dynamic element, based on monitoring at least a portion of the environment;
monitoring a particular set of motions executed by the other dynamic element through the environment, subsequent to identifying the set of contextual cues associated with the other dynamic element; and
based on the monitoring, generating an association of the set of contextual cues, as the set of predetermined contextual cues, with the particular set of motions executed by the other dynamic element, as the particular set of predicted motions;
wherein the association specifies that dynamic elements associated with the set of predetermined contextual cues are predicted to execute the particular set of predicted motions.
10. The method of claim 9, comprising:
monitoring a particular set of actual motions executed by the dynamic element through the environment, subsequent to associating the dynamic element with the particular set of predicted motions; and
based on a determination that the particular set of actual motions executed by the dynamic element are distinct from the particular set of predicted motions, associating the predetermined set of contextual cues with the particular set of actual motions and disassociating the predetermined set of contextual cues from the particular set of predicted motions.
11. The method of claim 8, wherein:
the set of contextual cues comprises a specification of a position and velocity of the dynamic element relative to one or more particular static elements detected in the environment.
12. The method of claim 8, wherein:
the set of contextual cues comprises a specification of a position and velocity of the dynamic element relative to one or more other dynamic elements detected in the environment.
13. The method of claim 8, comprising:
commanding at least one set of one or more signal generators installed in the vehicle to generate a particular targeted signal, comprising a particular instance of content selected based on at least one sensor data representation of the dynamic element and a particular signal beam angle which intersects a limited portion of the environment in which the dynamic element is located, such that the particular instance of content is communicated to the dynamic element in the environment to the exclusion of a remainder portion of the environment.
14. The method of claim 13, wherein the one or more sets of signal generators comprise one or more sets of speaker devices which are configured to generate one or more targeted signals to one or more elements located in the environment based on at least one of:
beamforming; or
ultrasonic modulation.
15. A non-transitory, computer-readable medium storing a program of instructions which, when executed by at least one computer system, causes the at least one computer system to:
identify a set of contextual cues associated with a dynamic element located in the environment, wherein each contextual cue indicates one or more particular features associated with the dynamic element, based on monitoring at least a portion of the environment;
associate the dynamic element with a particular set of predicted motions, based on a determination of a correlation between the identified set of contextual cues and a predetermined set of contextual cues which are associated with the particular set of predicted motions;
generate a predicted trajectory of the dynamic element through the environment based on the particular set of predicted motions associated with the dynamic element; and
generate a set of control commands which, when executed by one or more control elements installed in the vehicle, cause the vehicle to be navigated along a driving route that accounts for the predicted trajectory of the dynamic element.
16. The non-transitory, computer readable medium of claim 15, wherein the program of instructions, when executed by the at least one computer system, causes the at least one computer system to:
prior to identifying the set of contextual cues associated with the dynamic element located in the environment:
identify a set of contextual cues associated with another dynamic element located in the environment, wherein each contextual cue indicates one or more particular features associated with the other dynamic element, based on monitoring at least a portion of the environment;
monitor a particular set of motions executed by the other dynamic element through the environment, subsequent to identifying the set of contextual cues associated with the other dynamic element; and
based on the monitoring, generate an association of the set of contextual cues, as the set of predetermined contextual cues, with the particular set of motions executed by the other dynamic element, as the particular set of predicted motions;
wherein the association specifies that dynamic elements associated with the set of predetermined contextual cues are predicted to execute the particular set of predicted motions.
17. The non-transitory, computer readable medium of claim 15, wherein:
the set of contextual cues comprises a specification of a position and velocity of the dynamic element relative to one or more particular static elements detected in the environment.
18. The non-transitory, computer readable medium of claim 15, wherein:
the set of contextual cues comprises a specification of a position and velocity of the dynamic element relative to one or more other dynamic elements detected in the environment.
19. The non-transitory, computer readable medium of claim 15, wherein the program of instructions, when executed by the at least one computer system, causes the at least one computer system to:
command at least one set of one or more signal generators installed in the vehicle to generate a particular targeted signal, comprising a particular instance of content selected based on at least one sensor data representation of the dynamic element and a particular signal beam angle which intersects a limited portion of the environment in which the dynamic element is located, such that the particular instance of content is communicated to the dynamic element in the environment to the exclusion of a remainder portion of the environment.
20. The non-transitory, computer readable medium of claim 19, wherein the one or more sets of signal generators comprise one or more sets of speaker devices which are configured to generate one or more targeted signals to one or more elements located in the environment based on at least one of:
beamforming; or
ultrasonic modulation.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/758,330 US20180260635A1 (en) | 2015-09-08 | 2016-09-08 | Intention recognition |
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201562215672P | 2015-09-08 | 2015-09-08 | |
US15/758,330 US20180260635A1 (en) | 2015-09-08 | 2016-09-08 | Intention recognition |
PCT/US2016/050621 WO2017044525A1 (en) | 2015-09-08 | 2016-09-08 | Intention recognition |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/US2016/050621 A-371-Of-International WO2017044525A1 (en) | 2015-09-08 | 2016-09-08 | Intention recognition |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/899,242 Continuation US11423665B2 (en) | 2015-09-08 | 2020-06-11 | Intention recognition |
Publications (1)
Publication Number | Publication Date |
---|---|
US20180260635A1 (en) | 2018-09-13
Family
ID=56940429
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/758,330 Abandoned US20180260635A1 (en) | 2015-09-08 | 2016-09-08 | Intention recognition |
US16/899,242 Active 2037-01-21 US11423665B2 (en) | 2015-09-08 | 2020-06-11 | Intention recognition |
Family Applications After (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/899,242 Active 2037-01-21 US11423665B2 (en) | 2015-09-08 | 2020-06-11 | Intention recognition |
Country Status (3)
Country | Link |
---|---|
US (2) | US20180260635A1 (en) |
CN (2) | CN107924195B (en) |
WO (1) | WO2017044525A1 (en) |
Families Citing this family (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2019048332A1 (en) * | 2017-09-05 | 2019-03-14 | Starship Technologies Oü | Mobile robot having collision avoidance system for crossing a road from a pedestrian pathway |
JP6989418B2 (en) * | 2018-03-12 | 2022-01-05 | 矢崎総業株式会社 | In-vehicle system |
RU2756872C1 (en) * | 2018-05-31 | 2021-10-06 | Ниссан Норт Америка, Инк. | Structure of probabilistic object tracking and forecasting |
CN109034448B (en) * | 2018-06-14 | 2022-02-11 | 重庆邮电大学 | Trajectory prediction method based on vehicle trajectory semantic analysis and deep belief network |
DE102018118761A1 (en) * | 2018-08-02 | 2020-02-06 | Robert Bosch Gmbh | Method for at least partially automated driving of a motor vehicle |
US10824155B2 (en) | 2018-08-22 | 2020-11-03 | Ford Global Technologies, Llc | Predicting movement intent of objects |
US11409285B2 (en) * | 2018-12-27 | 2022-08-09 | Continental Automotive Systems, Inc. | Method for maneuver prediction of traffic participant |
CN115033676B (en) * | 2022-06-22 | 2024-04-26 | 支付宝(杭州)信息技术有限公司 | Intention recognition model training and user intention recognition method and device |
Family Cites Families (16)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
FR2869121B1 (en) * | 2004-04-16 | 2006-06-02 | Renault V I Sa | METHOD FOR CONTROLLING THE TRACK OF A VEHICLE |
WO2007102367A1 (en) * | 2006-02-28 | 2007-09-13 | Toyota Jidosha Kabushiki Kaisha | Object course prediction method, device, program, and automatic driving system |
JP4453046B2 (en) * | 2007-03-30 | 2010-04-21 | アイシン・エィ・ダブリュ株式会社 | Vehicle behavior learning apparatus and vehicle behavior learning program |
US8244408B2 (en) * | 2009-03-09 | 2012-08-14 | GM Global Technology Operations LLC | Method to assess risk associated with operating an autonomic vehicle control system |
CN101813492B (en) * | 2010-04-19 | 2012-11-14 | 清华大学 | Vehicle navigation system and method |
JP2012015900A (en) * | 2010-07-02 | 2012-01-19 | Panasonic Corp | Sound generation system, ultrasonic transmission device, and ultrasonic transmission method |
DE102011121948A1 (en) * | 2011-12-22 | 2013-06-27 | Gm Global Technology Operations, Llc | Perspective on actions of an autonomous driving system |
WO2013108406A1 (en) * | 2012-01-20 | 2013-07-25 | トヨタ自動車 株式会社 | Vehicle behavior prediction device and vehicle behavior prediction method, and driving assistance device |
US8457827B1 (en) | 2012-03-15 | 2013-06-04 | Google Inc. | Modifying behavior of autonomous vehicle based on predicted behavior of other vehicles |
US9182243B2 (en) | 2012-06-05 | 2015-11-10 | Apple Inc. | Navigation application |
DE102014201159A1 (en) * | 2014-01-23 | 2015-07-23 | Robert Bosch Gmbh | Method and device for classifying a behavior of a pedestrian when crossing a roadway of a vehicle and personal protection system of a vehicle |
US9248834B1 (en) * | 2014-10-02 | 2016-02-02 | Google Inc. | Predicting trajectories of objects based on contextual information |
US9429946B2 (en) * | 2014-12-25 | 2016-08-30 | Automotive Research & Testing Center | Driving control system and dynamic decision control method thereof |
CN107924195B (en) | 2015-09-08 | 2020-11-10 | 苹果公司 | Intent recognition |
US10072939B2 (en) * | 2016-03-24 | 2018-09-11 | Motorola Mobility Llc | Methods and systems for providing contextual navigation information |
CN109789880B (en) * | 2016-09-21 | 2022-03-11 | 苹果公司 | External communication of vehicle |
Cited By (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11423665B2 (en) | 2015-09-08 | 2022-08-23 | Apple Inc. | Intention recognition |
US10809729B2 (en) * | 2017-03-31 | 2020-10-20 | Panasonic Intellectual Property Management Co., Ltd. | Automatic driving control method, automatic driving control device using the same, and non-transitory storage medium |
US11227497B2 (en) | 2017-09-05 | 2022-01-18 | Starship Technologies Oü | Mobile robot having collision avoidance system for crossing a road from a pedestrian pathway |
US11941987B2 (en) | 2017-09-05 | 2024-03-26 | Starship Technologies Oü | Mobile robot having collision avoidance system for crossing a road from a pedestrian pathway |
US11921519B2 (en) | 2019-06-24 | 2024-03-05 | Arizona Board Of Regents On Behalf Of Arizona State University | Partition-based parametric active model discrimination with applications to driver intention estimation |
CN110928966A (en) * | 2019-10-16 | 2020-03-27 | 福建星网智慧软件有限公司 | Map track prediction method and system for scheduling system |
US11651609B2 (en) * | 2020-06-10 | 2023-05-16 | Here Global B.V. | Method, apparatus, and system for mapping based on a detected pedestrian type |
US11688184B2 (en) | 2020-06-17 | 2023-06-27 | Toyota Motor Engineering & Manufacturing North America, Inc. | Driving automation external communication location change |
US20220019228A1 (en) * | 2020-07-14 | 2022-01-20 | Honda Motor Co., Ltd. | Mobile object control method, mobile object control device, mobile object and storage medium |
US20220406076A1 (en) * | 2021-06-18 | 2022-12-22 | Honda Motor Co.,Ltd. | Warning control apparatus, moving object, warning control method, and computer-readable storage medium |
US20230242104A1 (en) * | 2022-01-31 | 2023-08-03 | Ford Global Technologies, Llc | Vehicle path verification |
US11851052B2 (en) * | 2022-01-31 | 2023-12-26 | Ford Global Technologies, Llc | Vehicle path verification |
Also Published As
Publication number | Publication date |
---|---|
CN112363507A (en) | 2021-02-12 |
CN112363507B (en) | 2024-08-09 |
US11423665B2 (en) | 2022-08-23 |
WO2017044525A1 (en) | 2017-03-16 |
CN107924195B (en) | 2020-11-10 |
US20200302194A1 (en) | 2020-09-24 |
CN107924195A (en) | 2018-04-17 |
Similar Documents
Publication | Title
---|---
US11423665B2 | Intention recognition
EP3526737B1 | Neural network system for autonomous vehicle control
US11568100B2 | Synthetic scenario simulator based on events
US11574089B2 | Synthetic scenario generator based on attributes
US10303257B2 | Communication between autonomous vehicle and external observers
US11307587B2 | Operating an autonomous vehicle according to road user reaction modeling with occlusions
US10387733B2 | Processing apparatus, processing system, and processing method
US10160378B2 | Light output system for a self-driving vehicle
US20200209857A1 | Multimodal control system for self driving vehicle
US10296001B2 | Radar multipath processing
US20200398743A1 | Method and apparatus for learning how to notify pedestrians
CN106463060B | Processing apparatus, processing system, and processing method
US10849543B2 | Focus-based tagging of sensor data
US11693414B2 | Non-solid object monitoring
US20210139048A1 | Tree policy planning for autonomous vehicle driving solutions
KR20210038852A | Method, apparatus, electronic device, computer readable storage medium and computer program for early-warning
CN109823339A | Vehicle traffic light intersection passing control method and control system
US20200074851A1 | Control device and control method
US10583828B1 | Position determination
JP7359097B2 | Vehicle management system, management method, and program
WO2020264276A1 | Synthetic scenario generator based on attributes
GB2528953A | An apparatus, method, computer program and user device for enabling control of a vehicle
CN114207685B | Autonomous vehicle interaction system
Legal Events
Code | Title | Description
---|---|---
STPP | Information on status: patent application and granting procedure in general | Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION
STPP | Information on status: patent application and granting procedure in general | Free format text: NON FINAL ACTION MAILED
STPP | Information on status: patent application and granting procedure in general | Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER
STCB | Information on status: application discontinuation | Free format text: ABANDONED -- FAILURE TO PAY ISSUE FEE