WO2013024348A2 - Systems and methods for intelligent navigation - Google Patents

Systems and methods for intelligent navigation

Info

Publication number
WO2013024348A2
Authority
WO
WIPO (PCT)
Prior art keywords
agent
systems
navigation
mesh
plane
Application number
PCT/IB2012/001797
Other languages
French (fr)
Inventor
Markus Wilhelm
Christian SCHAADT
Christian Vogelgesang
Original Assignee
Xaitment Gmbh
Application filed by Xaitment Gmbh filed Critical Xaitment Gmbh
Publication of WO2013024348A2 publication Critical patent/WO2013024348A2/en

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00 Computing arrangements based on biological models
    • G06N 3/004 Artificial life, i.e. computing arrangements simulating life
    • G06N 3/006 Artificial life, i.e. computing arrangements simulating life based on simulated virtual individual or collective life forms, e.g. social simulations or particle swarm optimisation [PSO]

Definitions

  • the methods and systems described herein relate generally to artificial intelligence in computer game or robot control design.
  • the methods and systems described herein relate to navigation, mapping, pathfinding, collision avoidance, and entity behavior.
  • Simulation programs and video games incorporate advanced graphics and highly detailed physics simulations.
  • a simulated environment may include one or more computer-controlled agents, or software modules controlling movement.
  • For example, in a simulation of a car race, a user may operate a first car, while additional cars are controlled by, or are themselves, one or more agents.
  • Similarly, in robotic control systems, an agent may control a robot in its interactions with the real world.
  • the present application is directed to systems and methods for improved navigation.
  • the system allows programmers or developers to enter specifications about the environment and agents, and automatically generates navigation meshes based on the specification and agents.
  • navigation meshes are generated enabling the user to easily make changes to the map, and to try different possibilities to improve gameplay. Once generated, the navigation meshes can be interpreted by a movement module to help agents intelligently navigate the game world.
  • a navigation mesh comprises a plurality of vertices or nodes and one or more edges connecting said vertices or nodes.
  • the navigation mesh may include one or more placeholders or keys to trigger post-processing of the navigation mesh.
  • Post-processing may include removing parts of a navigation mesh that cannot be reached by an entity to allow for dynamic respawning or placement within a map.
  • Post-processing may further include one or more static blockings to remove one or more parts of a navigation mesh within a defined bounding box.
  • post-processing may also include plane-based blocking to define directional edges within the navigation graph.
  • post-processing may include dynamic blocking.
  • dynamic blocking may comprise temporarily blocking parts of the mesh from being entered or limiting edge transiting. Accordingly, edges may be dynamically enabled or disabled during execution. These techniques may be useful for simulating doors, destructible objects, or other features that may be modified during runtime.
  • the present application is directed to systems and methods for controlling movement of computer-controlled entities or agents within a navigation mesh.
  • the system enables the control and movement of units in a virtual world representation defined by a navigation mesh.
  • the system allows for dynamic and static obstacle avoidance, including avoiding other agents.
  • Figure 1A is a block diagram illustrative of an embodiment of a networked environment useful for the systems and methods described in this document;
  • Figure 1B is a block diagram illustrative of a certain embodiment of a computing machine for practicing the methods and systems described herein;
  • Figure 2A is a diagram of an embodiment of a navigation mesh modified by a dynamic blocking box
  • Figure 2B is a diagram of the navigation mesh of Figure 2A, during application of an embodiment of a dynamic blocking algorithm
  • Figures 3A and 3B are diagrams of a navigation mesh with applied embodiments of a cutting algorithm without elimination of small triangles and with elimination of small triangles, respectively;
  • Figure 4 is a signal flow diagram of an embodiment of a movement engine executing position control and collision avoidance
  • Figure 5 is a flow chart of an embodiment of a position calculation routine
  • FIGS. 6A-6D illustrate an embodiment of collision detection and avoidance.
  • the networked environment 101 includes one or more client machines 102A-102N (generally referred to herein as “client machine(s) 102" or “client(s) 102") in communication with one or more servers 106A-106N (generally referred to herein as “server machine(s) 106" or “server(s) 106") over a network 104.
  • the client machine(s) 102 can, in some embodiments, be referred to as a single client machine 102 or a single group of client machines 102, while server(s) 106 may be referred to as a single server 106 or a single group of servers 106. Although four client machines 102 and four server machines 106 are depicted in FIG. 1A, any number of clients 102 may be in communication with any number of servers 106. In one embodiment a single client machine 102 communicates with more than one server 106, while in another embodiment a single server 106 communicates with more than one client machine 102. In yet another embodiment, a single client machine 102 communicates with a single server 106.
  • the computing environment 101 can include an appliance (not shown in FIG. 1A) installed between the server(s) 106 and client machine(s) 102.
  • This appliance can manage client/server connections, and in some cases can load balance connections made by client machines 102 to server machines 106.
  • Suitable appliances are manufactured by any one of the following companies: the Citrix Systems, Inc. Application Networking Group and Silver Peak Systems, Inc., both of Santa Clara, California; Riverbed Technology, Inc. of San Francisco, California; F5 Networks, Inc. of Seattle, Washington; or Juniper Networks, Inc. of Sunnyvale, California.
  • Clients 102 and servers 106 may be provided as a computing device 100, a specific embodiment of which is illustrated in Figure 1B. Included within the computing device 100 is a system bus 150 that communicates with the following components: a central processing unit 121; a main memory 122; storage memory 128; an input/output (I/O) controller 123; display devices 124A-124N; an installation device 116; and a network interface 118.
  • the storage memory 128 includes: an operating system, software routines, and a client agent 120.
  • the I/O controller 123, in some embodiments, is further connected to one or more input devices. As shown in Figure 1B, the I/O controller 123 is connected to a camera 125, a keyboard 126, a pointing device 127, and a microphone 129.
  • Embodiments of the computing machine 100 can include a central processing unit 121 characterized by any one of the following component configurations: logic circuits that respond to and process instructions fetched from the main memory unit 122; a microprocessor unit, such as those manufactured by Intel Corporation, Motorola Corporation, Transmeta Corporation of Santa Clara, California, International Business Machines, or Advanced Micro Devices; or any other combination of logic circuits.
  • the central processing unit 121 may include any combination of the following: a microprocessor, a microcontroller, a central processing unit with a single processing core, a central processing unit with two processing cores, or a central processing unit with more than one processing core.
  • Figure IB illustrates a computing device 100 that includes a single central processing unit 121
  • the computing device 100 can include one or more processing units 121.
  • the computing device 100 may store and execute firmware or other executable instructions that, when executed, direct the one or more processing units 121 to simultaneously execute instructions or to simultaneously execute instructions on a single piece of data.
  • the computing device 100 may store and execute firmware or other executable instructions that, when executed, direct the one or more processing units to each execute a section of a group of instructions. For example, each processing unit 121 may be instructed to execute a portion of a program or a particular module within a program.
  • the processing unit 121 can include one or more processing cores.
  • the processing unit 121 may have two cores, four cores, eight cores, etc.
  • the processing unit 121 may comprise one or more parallel processing cores.
  • the processing cores of the processing unit 121 may in some embodiments access available memory as a global address space, or in other embodiments, memory within the computing device 100 can be segmented and assigned to a particular core within the processing unit 121.
  • the one or more processing cores or processors in the computing device 100 can each access local memory.
  • memory within the computing device 100 can be shared amongst one or more processors or processing cores, while other memory can be accessed by particular processors or subsets of processors.
  • the multiple processing units can be included in a single integrated circuit (IC). These multiple processors, in some embodiments, can be linked together by an internal high speed bus, which may be referred to as an element interconnect bus.
  • the processors can execute a single instruction simultaneously on multiple pieces of data (SIMD), or in other embodiments can execute multiple instructions simultaneously on multiple pieces of data (MIMD).
  • the computing device 100 can include any number of SIMD and MIMD processors.
  • the computing device 100 can include a graphics processor or a graphics processing unit (not shown).
  • the graphics processing unit can include any combination of software and hardware, and can further input graphics data and graphics instructions, render a graphic from the inputted data and instructions, and output the rendered graphic.
  • the graphics processing unit can be included within the processing unit 121.
  • the computing device 100 can include one or more processing units 121, where at least one processing unit 121 is dedicated to processing and rendering graphics.
  • One embodiment of the computing device 100 provides support for any one of the following installation devices 116: a CD-ROM drive, a CD-R/RW drive, a DVD-ROM drive, tape drives of various formats, USB device, a bootable medium, a bootable CD, a bootable CD for GNU/Linux distribution such as KNOPPIX®, a hard-drive or any other device suitable for installing applications or software.
  • Applications can in some embodiments include a client agent 120, or any portion of a client agent 120.
  • the computing device 100 may further include a storage device 128 that can be either one or more hard disk drives, or one or more redundant arrays of independent disks; where the storage device is configured to store an operating system, software, programs, applications, or at least a portion of the client agent 120.
  • a further embodiment of the computing device 100 includes an installation device 116 that is used as the storage device 128.
  • Embodiments of the computing device 100 include any one of the following I/O devices 130A-130N: a camera 125; a keyboard 126; a pointing device 127; a microphone 129; mice; trackpads; an optical pen; trackballs; drawing tablets; video displays; speakers; inkjet, laser, or dye-sublimation printers; touch screens; or any other input/output device able to perform the methods and systems described herein.
  • An I/O controller 123 may in some embodiments connect to multiple I/O devices 130A-130N to control the one or more I/O devices.
  • Some embodiments of the I/O devices 130A-130N may be configured to provide storage or an installation medium 116, while others may provide a universal serial bus (USB) interface for receiving USB storage devices such as the USB Flash Drive line of devices manufactured by Twintech Industry, Inc.
  • Still other embodiments include an I/O device 130 that may be a bridge between the system bus 150 and an external communication bus, such as: a USB bus; an Apple Desktop Bus; an RS-232 serial connection; a SCSI bus; a FireWire bus; a FireWire 800 bus; an Ethernet bus; an AppleTalk bus; a Gigabit Ethernet bus; an Asynchronous Transfer Mode bus; a HIPPI bus; a Super HIPPI bus; a SerialPlus bus; a SCI/LAMP bus; a FibreChannel bus; or a Serial Attached small computer system interface bus.
  • the computing machine 100 can execute any operating system, while in other embodiments the computing machine 100 can execute any of the following operating systems: versions of the MICROSOFT WINDOWS operating systems such as WINDOWS 3.x, WINDOWS 95, WINDOWS 98, WINDOWS 2000, WINDOWS NT 3.51, WINDOWS NT 4.0, WINDOWS CE, WINDOWS XP, WINDOWS VISTA, and WINDOWS 7; the different releases of the Unix and Linux operating systems; any version of the MAC OS manufactured by Apple Computer; any embedded, real-time, open source, or proprietary operating system; any operating system for mobile computing devices; or any other operating system.
  • the computing machine 100 can execute multiple operating systems.
  • the computing machine 100 can execute PARALLELS or another virtualization platform that can execute or manage a virtual machine executing a first operating system, while the computing machine 100 executes a second operating system different from the first operating system.
  • the computing machine 100 can be embodied in any one of the following computing devices: a computing workstation; a desktop computer; a laptop or notebook computer; a server; a handheld computer; a mobile telephone; a portable telecommunication device; a media playing device; a gaming system; a mobile computing device; a netbook; a device of the IPOD family of devices manufactured by Apple Computer; any one of the PLAYSTATION family of devices manufactured by the Sony Corporation; any one of the Nintendo family of devices manufactured by Nintendo Co; any one of the XBOX family of devices manufactured by the Microsoft Corporation; or any other type and/or form of computing, telecommunications or media device that is capable of communication and that has sufficient processor power and memory capacity to perform the methods and systems described herein.
  • the computing machine 100 can be a mobile device such as any one of the following mobile devices: a JAVA-enabled cellular telephone or personal digital assistant (PDA), such as the i55sr, i58sr, i85s, i88s, i90c, i95cl, or the im1100, all of which are manufactured by Motorola Corp; the 6035 or the 7135, manufactured by Kyocera; the i300 or i330, manufactured by Samsung Electronics Co., Ltd; the TREO 180, 270, 600, 650, 680, 700p, 700w, or 750 smart phone manufactured by Palm, Inc; any computing device that has different processors, operating systems, and input devices consistent with the device; or any other mobile computing device capable of performing the methods and systems described herein.
  • the computing device 100 can be any one of the following mobile computing devices: any one series of Blackberry, or other handheld device manufactured by Research In Motion Limited; the iPhone manufactured by Apple Computer; Palm Pre; a Pocket PC; a Pocket PC Phone; or any other handheld mobile device.
  • the computing device 100 may be a smart phone or tablet computer, including products such as the iPhone or iPad manufactured by Apple, Inc. of Cupertino, CA; the BlackBerry devices manufactured by Research in Motion, Ltd. of Waterloo, Ontario, Canada; Windows Mobile devices manufactured by Microsoft Corp., of Redmond, WA; or the Xoom manufactured by Motorola, Inc.
  • the computing device 100 can be a virtual machine.
  • the virtual machine can be any virtual machine managed by a hypervisor.
  • the virtual machine can be managed by a hypervisor executing on a server 106 or a hypervisor executing on a client 102.
  • the computing device 100 can in some embodiments execute, operate or otherwise provide an application that can be any one of the following: software; an application or program; executable instructions; a virtual machine; a hypervisor; a web browser; a web-based client; a client-server application; an ActiveX control; a Java applet; software related to voice over internet protocol (VoIP) communications like a soft IP telephone; an application for streaming video and/or audio or receiving and playing streamed video and/or audio; an application for facilitating real-time-data communications; an HTTP client; an FTP client; or any other set of executable instructions.
  • Still other embodiments include a client device 102 that displays application output generated by an application remotely executing on a server 106 or other remotely located machine. In these embodiments, the client device 102 can display the application output in an application window, a browser, or other output window.
  • the computing device 100 may further include a network interface 118 to interface to a Local Area Network (LAN), Wide Area Network (WAN) or the Internet through a variety of connections including, but not limited to, standard telephone lines, LAN or WAN links (e.g., 802.11, T1, T3, 56 kb, X.25, SNA, DECnet), broadband connections (e.g., ISDN, Frame Relay, ATM, Gigabit Ethernet, Ethernet-over-SONET), wireless connections, or some combination of any or all of the above.
  • Connections can also be established using a variety of communication protocols (e.g., TCP/IP, IPX, SPX, NetBIOS, Ethernet, ARCNET, SONET, SDH, Fiber Distributed Data Interface (FDDI), RS232, RS485, IEEE 802.11, IEEE 802.11a, IEEE 802.11b, IEEE 802.11g, CDMA, GSM, WiMax and direct asynchronous connections).
  • the network 104 can comprise one or more sub-networks, and can be installed between any combination of the clients 102, servers 106, computing machines and appliances included within the computing environment 101.
  • the network 104 can be: a local-area network (LAN); a metropolitan area network (MAN); a wide area network (WAN); a primary network 104 comprised of multiple sub-networks 104 located between the client machines 102 and the servers 106; a primary public network 104 with a private sub-network 104; a primary private network 104 with a public sub-network 104; or a primary private network 104 with a private sub-network 104.
  • the network topology of the network 104 can differ within different embodiments. Possible network topologies include: a bus network topology; a star network topology; a ring network topology; a repeater-based network topology; or a tiered-star network topology.
  • Additional embodiments may include a network 104 of mobile telephone networks that use a protocol to communicate among mobile devices, where the protocol can be any one of the following: AMPS; TDMA; CDMA; GSM; GPRS; UMTS; or any other protocol able to transmit data among mobile devices.
  • the computing environment 101 can include more than one server 106A-106N such that the servers 106A-106N are logically grouped together into a server farm 106.
  • the server farm 106 can include servers 106 that are geographically dispersed and logically grouped together in a server farm 106, servers 106 that are located proximate to each other and logically grouped together in a server farm 106, or several virtual servers executing on physical servers. Geographically dispersed servers 106A-106N within a server farm 106 can, in some embodiments, communicate using a WAN, MAN, or LAN.
  • In some embodiments, the server farm 106 may be administered as a single entity, while in other embodiments the server farm 106 can include multiple server farms 106.
  • Navigation meshes may comprise a data structure based on geometry of a virtual or real environment.
  • a navigation mesh may comprise a graph with each node representing an area.
  • a navigation map may represent the floor plan of a building. The footprint of each room may be declared as a node of the navigation map, and the doors between the rooms may be considered as edges between these nodes.
  • the floor plan of the building may thus be defined as a graph of nodes and edges, which may be used by a movement module to calculate accessible areas from each current area.
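To make the graph representation concrete, the following is a minimal, non-authoritative Python sketch of a floor plan stored as nodes (rooms) and edges (doors), with reachable areas computed by breadth-first search. The room names and door list are illustrative, not taken from the patent.

```python
# A floor plan as a navigation graph: rooms are nodes, doors are edges.
from collections import deque

# Hypothetical floor plan: each pair is a door connecting two rooms.
doors = [("lobby", "hall"), ("hall", "kitchen"), ("hall", "office")]

graph = {}
for a, b in doors:
    graph.setdefault(a, set()).add(b)
    graph.setdefault(b, set()).add(a)

def accessible_rooms(start):
    """Return every room reachable from `start` through open doors."""
    seen, queue = {start}, deque([start])
    while queue:
        room = queue.popleft()
        for neighbor in graph.get(room, ()):
            if neighbor not in seen:
                seen.add(neighbor)
                queue.append(neighbor)
    return seen

print(accessible_rooms("lobby"))  # all four rooms are reachable
```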
  • a NavMesh may be based on triangular geometry, while in other embodiments, other geometric primitives may be utilized.
  • a NavMesh may comprise an outline of the traversable surfaces of the environment or virtual world.
  • the triangles or other primitives may be combined to form convex polygons to map the surface of the virtual landscape.
  • a polygon is considered convex if, from an arbitrary point within the object, a straight line may be drawn to any other point within the object without crossing an edge.
  • a convex polygon may represent an area within which an agent may freely travel, such as an empty room, without collision with any wall or other object.
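The convexity property described above can be checked mechanically. The sketch below is a generic 2D test, not code from the patent: a polygon is convex when every turn along its boundary has the same orientation, i.e. all cross products share one sign.

```python
def is_convex(polygon):
    """Check convexity of a 2D polygon given as (x, y) vertices in order."""
    n = len(polygon)
    sign = 0
    for i in range(n):
        ax, ay = polygon[i]
        bx, by = polygon[(i + 1) % n]
        cx, cy = polygon[(i + 2) % n]
        # z-component of the cross product of consecutive boundary edges
        cross = (bx - ax) * (cy - by) - (by - ay) * (cx - bx)
        if cross != 0:
            if sign == 0:
                sign = 1 if cross > 0 else -1
            elif (cross > 0) != (sign > 0):
                return False  # orientation flipped: reflex corner found
    return True

print(is_convex([(0, 0), (4, 0), (4, 3), (0, 3)]))  # True (rectangle)
print(is_convex([(0, 0), (4, 0), (1, 1), (0, 3)]))  # False (reflex corner)
```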
  • a NavMesh can be built in various ways. The simplest possibility is for a user to create the NavMesh manually. Another possibility is to derive the NavMesh from already existing meshes of the game environment.
  • a mesh may comprise a contiguous arrangement of triangles, wherein neighboring triangles have two common corner points. For each corner point, it is possible to generate a list of triangles which use said corner point. In most of the games available, there are no meshes from which a NavMesh can be derived. In such cases, a mesh may be generated manually.
  • a navigation mesh may include areas marked as non-traversable or non-walkable. For example, based on input geometry, areas may be flagged as non- walkable such that the navigation mesh includes a virtual wall or unclimbable obstacle within the mesh. As this changes the navigation mesh, in many embodiments, the non-traversable status of these areas may not be undone during runtime to remove the blocked area. Thus, these virtual walls or objects may be described as static blockings of parts of the mesh.
  • navigation meshes need not be static.
  • the navigation mesh may include one or more placeholders or keys to trigger post-processing of the navigation mesh.
  • Post-processing may include removing parts of a navigation mesh that cannot be reached by an entity to allow for dynamic respawning or placement within a map.
  • Post-processing may further include one or more static blockings to remove one or more parts of a navigation mesh within a defined bounding box.
  • post-processing may also include plane-based blocking to define directional edges within the navigation graph.
  • post-processing may include dynamic blocking. Rather than removing parts of a navigation mesh, dynamic blocking may comprise temporarily blocking parts of the mesh from being entered or limiting edge transiting. Accordingly, edges may be dynamically enabled or disabled during execution. These techniques may be useful for simulating doors, destructible objects, or other features that may be modified during runtime.
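As an illustration of dynamic blocking, the following hedged sketch models a door as a toggleable edge: the edge is disabled and re-enabled at runtime without rebuilding the mesh. The class and edge names are invented for the example.

```python
class NavGraph:
    def __init__(self):
        self.edges = {}    # edge id -> (node_a, node_b)
        self.enabled = {}  # edge id -> bool

    def add_edge(self, edge_id, a, b):
        self.edges[edge_id] = (a, b)
        self.enabled[edge_id] = True

    def set_blocked(self, edge_id, blocked):
        """Dynamic blocking: toggle an edge instead of removing it."""
        self.enabled[edge_id] = not blocked

    def neighbors(self, node):
        for edge_id, (a, b) in self.edges.items():
            if self.enabled[edge_id]:
                if a == node:
                    yield b
                elif b == node:
                    yield a

g = NavGraph()
g.add_edge("door_1", "hall", "vault")
print(list(g.neighbors("hall")))  # ['vault']
g.set_blocked("door_1", True)     # the door closes during gameplay
print(list(g.neighbors("hall")))  # []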
  • unit flow of computer-controlled agents may be directed, allowing, for example, a computer-generated crowd to generally proceed from one area to another regardless of the direction of individual units within the crowd.
  • These planes may act as one-way virtual walls, such that when a unit has proceeded from one region to another, it may not return.
  • Such techniques may also be useful in simulating fleeing agents by ensuring that a pathfinder does not have access to nodes that would direct the agent towards a pursuer.
  • dynamic blocking may be performed using a cutting mesh or plane that defines a region.
  • the planes may be bounded to prevent an infinite plane from modifying the entire mesh.
  • the plane may be bounded in the shape of a box, and may be referred to as a blocking box.
  • the plane may be a blocking plane, and may comprise a rectangle lying in a defined plane.
  • a cutting mesh may define an intersecting plane to a navigation mesh to cut the existing navigation mesh into one or more half spaces, defined as within the blocking box/plane or outside the blocking box/plane.
  • a bounding box may restrict the intersecting plane to limit cutting across the entire navigation mesh.
  • multiple orthogonal planes may be used to define three-dimensional blocked areas, such as a blocking box.
  • a box may require 6 half-space cuts to define the region.
  • cuts may be modified with lower limit boundaries or the cutting plane may be locally modified to eliminate the generation of very thin triangles.
  • Several sets of half-space cuts may be combined into a single mesh, and triangles may be extracted along the cut plane.
  • existing vertices of the navigation mesh may be classified in relation to a cutting plane or bounded cutting plane as inside the half space, outside it, or on the plane, and new edges and vertices may be introduced into the input mesh where necessary.
  • Referring to FIG. 2A, illustrated is an embodiment of an example input mesh 200 cut against six half spaces 202 forming a box (dashed edges).
  • existing vertices may be classified as outside vertices 204 and inside vertices 206.
  • Outside vertices 204 may comprise vertices outside blocking box 202.
  • Inside vertices 206 may comprise vertices inside blocking box 202.
  • vertices on planes 208 may also be classified.
  • the dashed lines 202 are the newly introduced edges which result from the six cutting planes (though only four are shown in the two-dimensional view of FIG. 2A).
  • the cutting mesh is used to apply several cuts with half spaces where each cut restricts which vertices are inside and which vertices are outside of the convex hull.
  • Referring to FIG. 2B, illustrated is the navigation mesh during the second, horizontal cut along plane 210.
  • Vertices on one side of the plane, which may have negative or positive signed coordinates relative to the plane depending on embodiment, may be designated as outside the plane, such as vertices 212.
  • Vertices on the inside 214 or on the plane 216 may be similarly classified.
  • vertices and edges inside or outside of the blocking box may be classified. In some embodiments, unnecessary cuts may be removed after all half space cuts have been applied.
  • vertices close to the cutting plane may, in some embodiments, be classified as on the plane.
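A plausible implementation of this three-way classification, with an epsilon tolerance for the on-plane case, might look like the following. The plane representation (unit normal plus offset) and the EPSILON value are assumptions made for the sketch.

```python
# A plane is given by a unit normal n and offset d: points p with
# dot(n, p) == d lie on the plane. Vertices within EPSILON of the plane
# are treated as "on the plane".
EPSILON = 1e-3

def classify_vertex(vertex, normal, d):
    """Return 'inside', 'outside', or 'on_plane' for one vertex."""
    dist = sum(v * n for v, n in zip(vertex, normal)) - d
    if abs(dist) <= EPSILON:
        return "on_plane"
    return "inside" if dist < 0 else "outside"

# Classify against the plane x = 1.0 (normal along +x).
plane_n, plane_d = (1.0, 0.0, 0.0), 1.0
for v in [(0.5, 0, 0), (1.0004, 2, 0), (3.0, 1, 1)]:
    print(v, classify_vertex(v, plane_n, plane_d))
# (0.5, 0, 0) inside; (1.0004, 2, 0) on_plane; (3.0, 1, 1) outside
```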
  • Referring to FIGs. 3A and 3B, illustrated are embodiments of a cutting algorithm without elimination of small triangles and with elimination of small triangles, respectively. Vertices 300 generated by the cut of plane 210 that are close to existing vertices of navigation mesh 302 may be removed or eliminated, preventing generation of small triangles by the cut.
  • the algorithm will reuse existing vertices within a user-specified range, introducing a small error to the cut but preventing small triangles that may cause errors in algorithms that use the data structure of the navigation mesh, such as path search or path smoothing.
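The vertex-reuse step could be sketched as a simple weld pass. The weld range below stands in for the user-specified range and is purely illustrative.

```python
# Snap a newly generated cut vertex to an existing mesh vertex when one
# lies within the weld range, trading a small positional error for the
# avoidance of degenerate thin triangles.
import math

def weld(new_vertex, existing_vertices, weld_range=0.05):
    """Return a nearby existing vertex if one is in range, else the new one."""
    best, best_dist = None, weld_range
    for v in existing_vertices:
        dist = math.dist(new_vertex, v)
        if dist <= best_dist:
            best, best_dist = v, dist
    return best if best is not None else new_vertex

mesh_vertices = [(0.0, 0.0), (1.0, 0.0), (1.0, 1.0)]
print(weld((0.98, 0.01), mesh_vertices))  # snapped to (1.0, 0.0)
print(weld((0.5, 0.5), mesh_vertices))    # kept as (0.5, 0.5)
```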
  • a blocking box may be generated through iteratively applying the cutting mesh algorithm through six individual planes to define the box shape.
  • an edge identifier or ID for each cut edge of the box may be stored in a blocking box path object, such that the edge connections of the navigation mesh may be toggled as the box is enabled or disabled dynamically.
  • a blocking plane may be generated through applying the cutting mesh algorithm with a single plane, but with a cut restriction of the bounding box of the rectangle. This prevents the half space cut from cutting the entire input mesh along its plane (since planes are otherwise unbounded).
  • Cutting edges may be of the approximate length of the blocking plane, and the edge may be stored with the blocking plane object, such that it can be toggled as the edge is dynamically enabled or disabled.
  • the blocking plane may include a direction orthogonal to the plane, such that edges on one side of the plane may be enabled while edges on the other side are disabled.
  • the normal of the blocking plane may be referred to as a first direction, defining a first face of the plane.
  • the opposite direction from the plane or back side may be referred to as a second face of the plane.
  • the edge connection on one face at a time may be enabled for path finding engines or traversal by an agent, such that the agent may move from nodes normal to the first face to nodes normal to the second face.
  • the directionality may be arbitrarily toggled during runtime. Accordingly, a one-way directionality across the plane may be defined for path generation.
  • This feature can be used to control the flow of computer-controlled entities or agents inside the simulation. For example, if a player advances to a certain location inside the simulation, blocking planes can be set to a one-way connection in the direction of the player. This prevents computer-controlled agents from running away from the player since they can no longer reach areas beyond those blockings.
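One possible shape for such a directional blocking plane, with the face names and modes invented for illustration, is shown below: the edge crossing the plane keeps two directed connections, and only the transition from the front face toward the back face is enabled, giving a one-way wall that can be toggled at runtime.

```python
class BlockingPlane:
    def __init__(self, edge_id):
        self.edge_id = edge_id
        self.mode = "open"  # "open", "one_way", or "blocked"

    def can_traverse(self, from_face, to_face):
        if self.mode == "open":
            return True
        if self.mode == "blocked":
            return False
        # one_way: only front-face -> back-face transitions are allowed
        return from_face == "front" and to_face == "back"

plane = BlockingPlane("door_edge_7")
plane.mode = "one_way"
print(plane.can_traverse("front", "back"))  # True: crowd flows forward
print(plane.can_traverse("back", "front"))  # False: no return path
```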
  • each navigation mesh node may comprise data fields or strings for allowing a user to store additional or custom data within the node.
  • This data may comprise annotations to nodes, and may be used for further processing.
  • the data may be read or processed when an agent or player arrives at the node.
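A minimal sketch of per-node custom data, with invented annotation keys, might be:

```python
# Each navigation mesh node carries a free-form annotation dictionary
# that game code can read when an agent arrives at the node.
node_annotations = {
    "node_42": {"surface": "ice", "footstep_sound": "crunch.wav"},
}

def on_agent_arrived(node_id):
    data = node_annotations.get(node_id, {})
    if data.get("surface") == "ice":
        print("reduce agent traction")  # game-specific reaction

on_agent_arrived("node_42")
```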
  • the systems discussed herein may also be used to control movement of computer- controlled entities or agents within a navigation mesh.
  • the system enables the control and movement of units in a virtual world representation defined by a navigation mesh.
  • the system allows for dynamic and static obstacle avoidance, including avoiding other agents.
  • the system may comprise a movement engine for controlling motion of agents and other computer-controlled entities.
  • A movement engine may comprise an application, service, daemon, routine, or other executable code for managing user defined movement entities, such as non-player characters, vehicles, autonomous entities, or other agents.
  • a movement engine may comprise one or more interfaces for managing movement entities within a simulation world.
  • a movement engine may execute a plurality of threads simultaneously. Accordingly, the movement engine may control movement of multiple entities in different areas or regions within the world. For example, in a game with multiple dungeon instances or levels or disjointed regions, a single movement engine may control movement of agents in the different areas simultaneously. In another embodiment, a single movement engine thread may control movement of the different agents according to a schedule.
  • a single movement engine may provide movement commands to multiple simulations simultaneously.
  • a plurality of clients may request movement commands for agents from a server or server farm executing a movement engine.
  • clients lacking processing power to execute path finding and smoothing for a large number of computer-controlled agents may still receive the benefit of realistic artificial intelligence.
  • This may allow processor-limited devices, such as smart phones, to execute realistic simulations.
  • a movement engine may execute one or more subroutines or threads.
  • the routines may include a Simulation World 400 routine for managing all move entities that are controlled by the movement engine and interact with each other.
  • the Simulation World 400 routine may control scheduling of movement control and collision detection for a plurality of agents.
  • the routines may include a MoveEntity 402 routine for managing velocity, position, and direction of a movement entity or agent scheduled by the Simulation World 400 routine.
  • the routines may include a CollisionInterface 404 routine for determining potential collisions in the surrounding area. Potential collisions may be due to other agents or movement entities, or due to dynamic objects including blocking objects, discussed above.
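A structural sketch of how these three routines could relate is shown below; the patent does not define these interfaces in code, so all method names and the sensor radius are assumptions.

```python
class CollisionInterface:
    """Determines potential collisions in the area surrounding an entity."""
    def colliders_near(self, position, radius):
        return []  # a real implementation would query agents and obstacles

class MoveEntity:
    """Holds velocity, position, and direction of one movement entity."""
    def __init__(self, position, velocity):
        self.position = position
        self.velocity = velocity
        self.state = "idle"  # "idle", "following_path", or "avoiding"

class SimulationWorld:
    """Manages all movement entities and schedules their updates."""
    def __init__(self, collision_interface):
        self.collision = collision_interface
        self.entities = []

    def update(self, dt):
        for entity in self.entities:
            nearby = self.collision.colliders_near(entity.position, 5.0)
            # ...compute the entity's next position here, braking and
            # avoiding collisions as described in the following items.

world = SimulationWorld(CollisionInterface())
world.entities.append(MoveEntity((0.0, 0.0), (1.0, 0.0)))
world.update(1 / 60)
```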
  • An embodiment of a scheduling routine is illustrated in FIG. 4.
  • the engine may retrieve position and velocity information for the agent.
  • the engine may further retrieve information regarding any obstacle in the region local to the agent.
  • the movement engine may iteratively loop through each agent to identify a path or new velocity/direction for the entity.
  • Each entity or agent may have a state of idle, following a path, or avoiding a potential collision.
  • the movement engine may move on to the next entity.
  • the movement engine may calculate a next position along the path or determine whether the entity should no longer be idle and calculate a path accordingly. If the movement engine determines that a next position has not been calculated, the movement engine may compute a braking distance for the entity. In many embodiments, the braking distance may be responsive to a current speed or velocity of the entity.
  • the movement engine may determine a new speed for the entity. For example, the entity may slow from a first speed to a second speed responsive to braking between a first position and second position. Responsive to the new speed, the movement engine may compute a new position along a path.
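Assuming a constant deceleration a, the braking distance from speed v is v^2 / (2a); a hedged sketch of this computation follows, with the deceleration and timestep as illustrative parameters.

```python
def braking_distance(speed, deceleration=4.0):
    """Distance needed to stop from `speed` at constant deceleration."""
    return speed * speed / (2.0 * deceleration)

def next_speed(speed, remaining_path, deceleration=4.0, dt=1 / 60):
    """Brake when the remaining path is shorter than the stopping distance."""
    if remaining_path <= braking_distance(speed, deceleration):
        return max(0.0, speed - deceleration * dt)
    return speed

print(braking_distance(6.0))  # 4.5 units needed to stop from 6 units/s
print(next_speed(6.0, 2.0))   # inside braking range: speed drops
```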
  • the movement engine may identify whether any dynamic colliders exist within a sensor distance of the entity.
  • an agent or entity may comprise one or more percepts to request or perceive states of the world.
  • a percept simulates a human's senses, such as sight or hearing. So that an agent can receive or perceive a percept, it needs to have a suitable sensor, which simulates a sensory organ, that is to say the ability to see or to hear. Every agent is therefore usually assigned at least one prescribed virtual sensor, and the percepts are filtered to suit the respective sensors of the agent. In other words, the information available to the agents is filtered according to what the respective agents can perceive. This allows very realistic and flexible multi-agent systems to be implemented.
  • a sensor may have a predetermined range. In one embodiment, a range may be adjusted to control how early an entity anticipates and avoids collisions.
  • the movement engine may identify if one or more dynamic colliders exist within range of the entity's sensor. If not, the movement engine may store the position on the path and/or send the new position, direction, and/or velocity to the entity. If one or more colliders exist, the movement engine may sample the static environment of the entity to determine potential paths. The movement engine may weight the influence of dynamic colliders according to soft and hard collisions, based on whether a path from the entity intersects another entity or whether the path is within a predetermined radius from the entity. The movement engine may choose a direction from the potential paths, responsive in some embodiments to current direction and velocity, and may send the new direction, velocity, and/or position to the entity.
  • the avoidance system queries the position of all entities within the sensor radius of the agent.
  • the entities or potential colliders are projected onto a slice, or ray, in a direction from the agent.
  • Slices that contain an entity are weighted by a "not wanted" value that is used to choose a direction for travel. After all other entities in the sensor radius have been added to the slices, the slice with the lowest "not wanted" value is chosen and the entity moves in the middle direction of this slice.
  • the system determines not just current positions for other entities, but also potential paths, by tracing rays on the navigation mesh for the entities. Accordingly, the system may extrapolate the movement of each entity and use the projected position to identify potential collisions.
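A simple linear extrapolation of a collider's projected position (the half-second lookahead is an assumed parameter, and a full implementation would additionally constrain the projection to the navigation mesh) could be:

```python
def projected_position(position, velocity, lookahead=0.5):
    """Project a collider forward along its velocity."""
    return tuple(p + v * lookahead for p, v in zip(position, velocity))

print(projected_position((2.0, 0.0), (1.0, 1.0)))  # (2.5, 0.5)
```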
  • an agent 600 may include a collision radius 602 defining the bounds of the agent 600.
  • the agent 600 may also have a soft radius 604, defining a boundary in which the agent would prefer not to cross other entities, but still can. For example, people may stand shoulder to shoulder or nose to nose without touching, but may be quite uncomfortable. Personal space thus defines a soft radius around each person.
  • the agent may further have a path direction 606, and a sensor radius 608.
  • sensor radius 608 may be limited to reduce processor requirements, while in other embodiments, sensor radius 608 may be varied based on environmental qualities. For example, in foggy environments, a sensor representing vision may have a reduced range, increasing the likelihood of near-collisions.
  • the movement engine may identify one or more colliders 610a, 610b. Each collider may be another agent or entity, or may be a static or dynamic boundary or object, such as a table or chair.
  • the movement engine may further identify a projected position of the corresponding collider 612a, 612b. In some embodiments, such as where the collider is a non-moving object, the projected position may be the same as the current position. In other embodiments, such as where the movement engine is iteratively calculating next positions for each agent, the projected position may comprise a previously-calculated next position for the agent.
  • the possible colliders 612a, 612b are projected into the slices or segments of the circle around the agent 600.
  • the circle may be divided into a number of slices responsive to the size of the collision radius 602, such that agents with smaller collision radii have more potential paths.
  • the radius of the collider may be expanded by the radius of the agent 614. This allows a more precise navigation than collision systems that only trigger if the center of the collider is impacted by the radius of the agent.
  • a slice is only affected by a collider if the middle line of the slice intersects with the collider.
  • As shown, an agent 600 may have multiple potential slices 616 that are free, or do not intersect with colliders.
  • the agent 600 may also have multiple slices 618 that pass through one or more soft radii of projected colliders 614a, 614b.
  • the agent further may have one or more slices 620 that intersect hard radii of projected colliders 614a, 614b.
  • Each slice may be associated with a value.
  • In some embodiments, the value may be stored as the length of a vector along the slice, as illustrated in FIG. 6C.
  • each vector length may be modified by a current direction and velocity of the agent, reducing the likelihood that the agent reverses direction drastically.
  • the movement engine may determine varying lengths for each vector, responsive to potential collisions and current velocity of the agent. In one embodiment, the movement engine may select a longest vector, or one of a plurality of equally long vectors, to determine a next position for the agent.
  • the movement engine may also compute the relative speed for each potential collider. If a potential collider and the agent have a relative speed below a predetermined threshold value (i.e., if they are moving in roughly the same direction at roughly the same speed), the movement engine may ignore or reduce the "not wanted" adjustment for vectors intersecting with the collider. Thus, if a first agent is in front of a second agent, but they are moving in the same direction at the same speed, the second agent will be unlikely to alter its path.
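Putting the slice mechanism together, the following sketch combines the "not wanted" weighting, soft/hard radii, agent-radius expansion, turn penalty, and relative-speed filter described above. All numeric weights, the slice count, and the threshold are illustrative assumptions, not values from the patent.

```python
import math

NUM_SLICES = 16
HARD_WEIGHT, SOFT_WEIGHT = 10.0, 3.0
RELATIVE_SPEED_THRESHOLD = 0.5

def choose_direction(agent_pos, agent_vel, agent_radius, colliders):
    """colliders: list of (position, velocity, hard_radius, soft_radius)."""
    not_wanted = [0.0] * NUM_SLICES
    heading = math.atan2(agent_vel[1], agent_vel[0])
    for pos, vel, hard_r, soft_r in colliders:
        rel_speed = math.hypot(vel[0] - agent_vel[0], vel[1] - agent_vel[1])
        if rel_speed < RELATIVE_SPEED_THRESHOLD:
            continue  # moving with us: unlikely to collide, ignore
        dx, dy = pos[0] - agent_pos[0], pos[1] - agent_pos[1]
        dist = max(math.hypot(dx, dy), 1e-6)
        angle = math.atan2(dy, dx)
        # Expand the collider by the agent's own radius (see FIG. 6B).
        hard_r += agent_radius
        soft_r += agent_radius
        half_width = math.atan2(soft_r, dist)
        for i in range(NUM_SLICES):
            mid = 2 * math.pi * i / NUM_SLICES
            diff = abs((mid - angle + math.pi) % (2 * math.pi) - math.pi)
            if diff <= half_width:
                hard = diff <= math.atan2(hard_r, dist)
                not_wanted[i] += HARD_WEIGHT if hard else SOFT_WEIGHT
    # Penalize sharp turns so the agent does not reverse drastically.
    for i in range(NUM_SLICES):
        mid = 2 * math.pi * i / NUM_SLICES
        turn = abs((mid - heading + math.pi) % (2 * math.pi) - math.pi)
        not_wanted[i] += turn
    best = min(range(NUM_SLICES), key=lambda i: not_wanted[i])
    mid = 2 * math.pi * best / NUM_SLICES
    return (math.cos(mid), math.sin(mid))  # middle direction of best slice

# An oncoming collider dead ahead: the agent veers roughly 45 degrees.
print(choose_direction((0.0, 0.0), (1.0, 0.0), 0.5,
                       [((3.0, 0.0), (-1.0, 0.0), 0.5, 1.0)]))
```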
  • The systems described above may provide multiple instances of any or each of those components, and these components may be provided on either a standalone machine or, in some embodiments, on multiple machines in a distributed system.
  • the systems and methods described above may be implemented as a method, apparatus or article of manufacture using programming and/or engineering techniques to produce software, firmware, hardware, or any combination thereof.
  • the systems and methods described above may be provided as one or more computer-readable programs embodied on or in one or more articles of manufacture.
  • The term "article of manufacture" as used herein is intended to encompass code or logic accessible from and embedded in one or more computer-readable devices, firmware, programmable logic, memory devices (e.g., EEPROMs, ROMs, PROMs, RAMs, SRAMs, etc.), hardware (e.g., an integrated circuit chip, a Field Programmable Gate Array (FPGA), an Application Specific Integrated Circuit (ASIC), etc.), electronic devices, or a computer-readable nonvolatile storage unit (e.g., CD-ROM, floppy disk, hard disk drive, etc.).
  • the article of manufacture may be accessible from a file server providing access to the computer-readable programs via a network transmission line, wireless transmission media, signals propagating through space, radio waves, infrared signals, etc.
  • the article of manufacture may be a flash memory card or a magnetic tape.
  • the article of manufacture includes hardware logic as well as software or programmable code embedded in a computer readable medium that is executed by a processor.
  • the computer-readable programs may be implemented in any programming language, such as LISP, PERL, C, C++, C#, PROLOG, or in any byte code language such as JAVA.
  • the software programs may be stored on or in one or more articles of manufacture as object code.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • General Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • Biophysics (AREA)
  • Computational Linguistics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Evolutionary Computation (AREA)
  • Artificial Intelligence (AREA)
  • Molecular Biology (AREA)
  • Computing Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Health & Medical Sciences (AREA)
  • Processing Or Creating Images (AREA)

Description

Systems and Methods for Intelligent Navigation
Related Applications
The present application claims priority to and the benefit of U.S. Provisional
Application No. 61/524,310, entitled "Systems and Methods for Intelligent Navigation," filed August 16, 2011, the entirety of which is hereby incorporated by reference.
Field of the Invention
The methods and systems described herein relate generally to artificial intelligence in computer game or robot control design. In particular, the methods and systems described herein relate to navigation, mapping, pathfinding, collision avoidance, and entity behavior.
Background of the Invention
Simulation programs and video games incorporate advanced graphics and highly detailed physics simulations. For realism, as well as entertainment purposes, a simulated environment may include one or more computer-controlled agents, or software modules controlling movement. For example, in a simulation of a car race, a user may operate a first car, while additional cars are controlled by or are one or more agents. Similarly, in robotic control systems, an agent may control the robot in its interactions in the real world.
Primitive systems involved agents following predetermined paths, set by a developer during design. While effective, such systems were neither realistic nor dynamic, and were unable to react to a user or player's interactions. Accordingly, programmers developed artificial intelligence systems capable of mapping an area with a mesh, with waypoints, vertices, or nodes identifying locations that an agent can traverse, connected by edges identifying whether a path exists between two nodes.
Summary of the Invention
The present application is directed to systems and methods for improved navigation. The system allows programmers or developers to enter specifications about the environment and agents, and automatically generates navigation meshes based on the specification and agents. In some embodiments, navigation meshes are generated enabling the user to easily make changes to the map, and to try different possibilities to improve gameplay. Once generated, the navigation meshes can be interpreted by a movement module to help agents intelligently navigate the game world.
In one embodiment, a navigation mesh comprises a plurality of vertices or nodes and one or more edges connecting said vertices or nodes. The navigation mesh may include one or more placeholders or keys to trigger post-processing of the navigation mesh. Post-processing may include removing parts of a navigation mesh that cannot be reached by an entity to allow for dynamic respawning or placement within a map. Post-processing may further include one or more static blockings to remove one or more parts of a navigation mesh within a defined bounding box. In some embodiments, post-processing may also include plane-based blocking to define directional edges within the navigation graph. In other embodiments, post-processing may include dynamic blocking. Rather than removing parts of a navigation mesh, dynamic blocking may comprise temporarily blocking parts of the mesh from being entered or limiting edge transiting. Accordingly, edges may be dynamically enabled or disabled during execution. These techniques may be useful for simulating doors, destructible objects, or other features that may be modified during runtime.
In another aspect, the present application is directed to systems and methods for controlling movement of computer-controlled entities or agents within a navigation mesh. In some embodiments, the system enables the control and movement of units in a virtual world representation defined by a navigation mesh. In other embodiments, the system allows for dynamic and static obstacle avoidance, including avoiding other agents.
The details of various embodiments of the invention are set forth in the accompanying drawings and the description below.
Brief Description of the Figures
The foregoing and other objects, aspects, features, and advantages of the invention will become more apparent and better understood by referring to the following description taken in conjunction with the accompanying drawings, in which:
Figure 1A is a block diagram illustrative of an embodiment of a networked environment useful for the systems and methods described in this document;
Figure 1B is a block diagram illustrative of a certain embodiment of a computing machine for practicing the methods and systems described herein;
Figure 2A is a diagram of an embodiment of a navigation mesh modified by a dynamic blocking box;
Figure 2B is a diagram of the navigation mesh of Figure 2A, during application of an embodiment of a dynamic blocking algorithm;
Figures 3A and 3B are diagrams of a navigation mesh with applied embodiments of a cutting algorithm without elimination of small triangles and with elimination of small triangles, respectively;
Figure 4 is a signal flow diagram of an embodiment of a movement engine executing position control and collision avoidance;
Figure 5 is a flow chart of an embodiment of a position calculation routine; and
Figures 6A-6D illustrate an embodiment of collision detection and avoidance.
The features and advantages of the present invention will become more apparent from the detailed description set forth below when taken in conjunction with the drawings, in which like reference characters identify corresponding elements throughout. In the drawings, like reference numbers generally indicate identical, functionally similar, and/or structurally similar elements.
Detailed Description of the Invention
Prior to discussing methods and systems for intelligent navigation, it may be helpful to discuss embodiments of computing systems useful for practicing these methods and systems. Referring first to Figure 1A, illustrated is one embodiment of a networked environment 101 in which a simulated environment can be provided. As shown in FIG. 1A, the networked environment 101 includes one or more client machines 102A-102N (generally referred to herein as "client machine(s) 102" or "client(s) 102") in communication with one or more servers 106A-106N (generally referred to herein as "server machine(s) 106" or "server(s) 106") over a network 104. The client machine(s) 102 can, in some embodiments, be referred to as a single client machine 102 or a single group of client machines 102, while server(s) 106 may be referred to as a single server 106 or a single group of servers 106. Although four client machines 102 and four server machines 106 are depicted in FIG. 1A, any number of clients 102 may be in communication with any number of servers 106. In one embodiment a single client machine 102 communicates with more than one server 106, while in another embodiment a single server 106 communicates with more than one client machine 102. In yet another embodiment, a single client machine 102 communicates with a single server 106. Further, although a single network 104 is shown connecting client machines 102 to server machines 106, it should be understood that multiple, separate networks may connect a subset of client machines 102 to a subset of server machines 106. In one embodiment, the computing environment 101 can include an appliance (not shown in FIG. 1A) installed between the server(s) 106 and client machine(s) 102. This appliance can manage client/server connections, and in some cases can load balance connections made by client machines 102 to server machines 106. Suitable appliances are manufactured by any one of the following companies: the Citrix Systems, Inc. Application Networking Group and Silver Peak Systems, Inc., both of Santa Clara, California; Riverbed Technology, Inc. of San Francisco, California; F5 Networks, Inc. of Seattle, Washington; or Juniper Networks, Inc. of Sunnyvale, California.
Clients 102 and servers 106 may be provided as a computing device 100, a specific embodiment of which is illustrated in Figure 1B. Included within the computing device 100 is a system bus 150 that communicates with the following components: a central processing unit 121; a main memory 122; storage memory 128; an input/output (I/O) controller 123; display devices 124A-124N; an installation device 116; and a network interface 118. In one embodiment, the storage memory 128 includes: an operating system, software routines, and a client agent 120. The I/O controller 123, in some embodiments, is further connected to one or more input devices. As shown in Figure 1B, the I/O controller 123 is connected to a camera 125, a keyboard 126, a pointing device 127, and a microphone 129.
Embodiments of the computing machine 100 can include a central processing unit 121 characterized by any one of the following component configurations: logic circuits that respond to and process instructions fetched from the main memory unit 122; a microprocessor unit, such as: those manufactured by Intel Corporation; those manufactured by Motorola
Corporation; those manufactured by Transmeta Corporation of Santa Clara, California; the RS/6000 processor such as those manufactured by International Business Machines; a processor such as those manufactured by Advanced Micro Devices; or any other combination of logic circuits. Still other embodiments of the central processing unit 121 may include any combination of the following: a microprocessor, a microcontroller, a central processing unit with a single processing core, a central processing unit with two processing cores, or a central processing unit with more than one processing core.
While Figure 1B illustrates a computing device 100 that includes a single central processing unit 121, in some embodiments the computing device 100 can include one or more processing units 121. In these embodiments, the computing device 100 may store and execute firmware or other executable instructions that, when executed, direct the one or more processing units 121 to simultaneously execute instructions or to simultaneously execute instructions on a single piece of data. In other embodiments, the computing device 100 may store and execute firmware or other executable instructions that, when executed, direct the one or more processing units to each execute a section of a group of instructions. For example, each processing unit 121 may be instructed to execute a portion of a program or a particular module within a program.
In some embodiments, the processing unit 121 can include one or more processing cores. For example, the processing unit 121 may have two cores, four cores, eight cores, etc. In one embodiment, the processing unit 121 may comprise one or more parallel processing cores. The processing cores of the processing unit 121 may in some embodiments access available memory as a global address space, or in other embodiments, memory within the computing device 100 can be segmented and assigned to a particular core within the processing unit 121. In one embodiment, the one or more processing cores or processors in the computing device 100 can each access local memory. In still another embodiment, memory within the computing device 100 can be shared amongst one or more processors or processing cores, while other memory can be accessed by particular processors or subsets of processors. In embodiments where the computing device 100 includes more than one processing unit, the multiple processing units can be included in a single integrated circuit (IC). These multiple processors, in some embodiments, can be linked together by an internal high speed bus, which may be referred to as an element interconnect bus. In embodiments where the computing device 100 includes one or more processing units 121, or a processing unit 121 including one or more processing cores, the processors can execute a single instruction simultaneously on multiple pieces of data (SIMD), or in other embodiments can execute multiple instructions simultaneously on multiple pieces of data (MIMD). In some embodiments, the computing device 100 can include any number of SIMD and MIMD processors.
The computing device 100, in some embodiments, can include a graphics processor or a graphics processing unit (not shown). The graphics processing unit can include any
combination of software and hardware, and can further input graphics data and graphics instructions, render a graphic from the inputted data and instructions, and output the rendered graphic. In some embodiments, the graphics processing unit can be included within the processing unit 121. In other embodiments, the computing device 100 can include one or more processing units 121, where at least one processing unit 121 is dedicated to processing and rendering graphics.
One embodiment of the computing device 100 provides support for any one of the following installation devices 116: a CD-ROM drive, a CD-R/RW drive, a DVD-ROM drive, tape drives of various formats, USB device, a bootable medium, a bootable CD, a bootable CD for GNU/Linux distribution such as KNOPPIX®, a hard-drive or any other device suitable for installing applications or software. Applications can in some embodiments include a client agent 120, or any portion of a client agent 120. The computing device 100 may further include a storage device 128 that can be either one or more hard disk drives, or one or more redundant arrays of independent disks; where the storage device is configured to store an operating system, software, programs, applications, or at least a portion of the client agent 120. A further embodiment of the computing device 100 includes an installation device 116 that is used as the storage device 128. Embodiments of the computing device 100 include any one of the following I/O devices 130A-130N: a camera 125; a keyboard 126; a pointing device 127; a microphone 129; mice; trackpads; an optical pen; trackballs; drawing tablets; video displays; speakers; inkjet printers; laser printers; dye-sublimation printers; touch screens; or any other input/output device able to perform the methods and systems described herein. An I/O controller 123 may in some embodiments connect to multiple I/O devices 130A-130N to control the one or more I/O devices. Some embodiments of the I/O devices 130A-130N may be configured to provide storage or an installation medium 116, while others may provide a universal serial bus (USB) interface for receiving USB storage devices such as the USB Flash Drive line of devices manufactured by Twintech Industry, Inc. Still other embodiments include an I/O device 130 that may be a bridge between the system bus 150 and an external
communication bus, such as: a USB bus; an Apple Desktop Bus; an RS-232 serial connection; a SCSI bus; a FireWire bus; a FireWire 800 bus; an Ethernet bus; an AppleTalk bus; a Gigabit Ethernet bus; an Asynchronous Transfer Mode bus; a HIPPI bus; a Super HIPPI bus; a SerialPlus bus; a SCI/LAMP bus; a FibreChannel bus; or a Serial Attached small computer system interface bus.
In some embodiments, the computing machine 100 can execute any operating system, while in other embodiments the computing machine 100 can execute any of the following operating systems: versions of the MICROSOFT WINDOWS operating systems such as
WINDOWS 3.x; WINDOWS 95; WINDOWS 98; WINDOWS 2000; WINDOWS NT 3.51; WINDOWS NT 4.0; WINDOWS CE; WINDOWS XP; WINDOWS VISTA; and WINDOWS 7; the different releases of the Unix and Linux operating systems; any version of the MAC OS manufactured by Apple Computer; OS/2, manufactured by International Business Machines; any embedded operating system; any real-time operating system; any open source operating system; any proprietary operating system; any operating systems for mobile computing devices; or any other operating system. In still another embodiment, the computing machine 100 can execute multiple operating systems. For example, the computing machine 100 can execute PARALLELS or another virtualization platform that can execute or manage a virtual machine executing a first operating system, while the computing machine 100 executes a second operating system different from the first operating system.
The computing machine 100 can be embodied in any one of the following computing devices: a computing workstation; a desktop computer; a laptop or notebook computer; a server; a handheld computer; a mobile telephone; a portable telecommunication device; a media playing device; a gaming system; a mobile computing device; a netbook; a device of the IPOD family of devices manufactured by Apple Computer; any one of the PLAYSTATION family of devices manufactured by the Sony Corporation; any one of the Nintendo family of devices manufactured by Nintendo Co; any one of the XBOX family of devices manufactured by the Microsoft Corporation; or any other type and/or form of computing, telecommunications or media device that is capable of communication and that has sufficient processor power and memory capacity to perform the methods and systems described herein.
In other embodiments the computing machine 100 can be a mobile device such as any one of the following mobile devices: a JAVA-enabled cellular telephone or personal digital assistant (PDA), such as the i55sr, i58sr, i85s, i88s, i90c, i95cl, or the im1100, all of which are manufactured by Motorola Corp; the 6035 or the 7135, manufactured by Kyocera; the i300 or i330, manufactured by Samsung Electronics Co., Ltd; the TREO 180, 270, 600, 650, 680, 700p, 700w, or 750 smart phone manufactured by Palm, Inc; any computing device that has different processors, operating systems, and input devices consistent with the device; or any other mobile computing device capable of performing the methods and systems described herein. In still other embodiments, the computing device 100 can be any one of the following mobile computing devices: any one series of Blackberry, or other handheld device manufactured by Research In Motion Limited; the iPhone manufactured by Apple Computer; Palm Pre; a Pocket PC; a Pocket PC Phone; or any other handheld mobile device. In yet still other embodiments, the computing device 100 may be a smart phone or tablet computer, including products such as the iPhone or iPad manufactured by Apple, Inc. of Cupertino, CA; the BlackBerry devices manufactured by Research in Motion, Ltd. of Waterloo, Ontario, Canada; Windows Mobile devices manufactured by Microsoft Corp., of Redmond, WA; the Xoom manufactured by
Motorola, Inc. of Libertyville, IL; devices capable of running the Android platform provided by Google, Inc. of Mountain View, CA; or any other type and form of portable computing device.
In still other embodiments, the computing device 100 can be a virtual machine. The virtual machine can be any virtual machine managed by a hypervisor developed by
XenSolutions, Citrix Systems, IBM, VMware, or any other hypervisor. In still other
embodiments, the virtual machine can be managed by a hypervisor executing on a server 106 or a hypervisor executing on a client 102.
In still other embodiments, the computing device 100 can execute, operate or otherwise provide an application that can be any one of the following: software; an application or program; executable instructions; a virtual machine; a hypervisor; a web browser; a web-based client; a client-server application; an ActiveX control; a Java applet; software related to voice over internet protocol (VoIP) communications like a soft IP telephone; an application for streaming video and/or audio or receiving and playing streamed video and/or audio; an application for facilitating real-time-data communications; an HTTP client; an FTP client; or any other set of executable instructions. Still other embodiments include a client device 102 that displays application output generated by an application remotely executing on a server 106 or other remotely located machine. In these embodiments, the client device 102 can display the application output in an application window, a browser, or other output window.
The computing device 100 may further include a network interface 118 to interface to a Local Area Network (LAN), Wide Area Network (WAN) or the Internet through a variety of connections including, but not limited to, standard telephone lines, LAN or WAN links (e.g., 802.11, T1, T3, 56kb, X.25, SNA, DECNET), broadband connections (e.g., ISDN, Frame Relay, ATM, Gigabit Ethernet, Ethernet-over-SONET), wireless connections, or some combination of any or all of the above. Connections can also be established using a variety of communication protocols (e.g., TCP/IP, IPX, SPX, NetBIOS, Ethernet, ARCNET, SONET, SDH, Fiber Distributed Data Interface (FDDI), RS232, RS485, IEEE 802.11, IEEE 802.11a, IEEE 802.11b, IEEE 802.11g, CDMA, GSM, WiMax and direct asynchronous connections). The network 104 can comprise one or more sub-networks, and can be installed between any combination of the clients 102, servers 106, computing machines and appliances included within the computing environment 101. In some embodiments, the network 104 can be: a local-area network (LAN); a metropolitan area network (MAN); a wide area network (WAN); a primary network 104 comprised of multiple sub-networks 104 located between the client machines 102 and the servers 106; a primary public network 104 with a private sub-network 104; a primary private network 104 with a public sub-network 104; or a primary private network 104 with a private sub-network 104. The network topology of the network 104 can differ within different embodiments; possible network topologies include: a bus network topology; a star network topology; a ring network topology; a repeater-based network topology; or a tiered-star network topology. Additional embodiments may include a network 104 of mobile telephone networks that use a protocol to communicate among mobile devices, where the protocol can be any one of the following: AMPS; TDMA; CDMA; GSM; GPRS; UMTS; or any other protocol able to transmit data among mobile devices.
The computing environment 101 can include more than one server 106A-106N such that the servers 106A-106N are logically grouped together into a server farm 106. The server farm 106 can include servers 106 that are geographically dispersed and logically grouped together in a server farm 106, servers 106 that are located proximate to each other and logically grouped together in a server farm 106, or several virtual servers executing on physical servers. Geographically dispersed servers 106A-106N within a server farm 106 can, in some
embodiments, communicate using a WAN, MAN, or LAN, where different geographic regions can be characterized as: different continents; different regions of a continent; different countries; different states; different cities; different campuses; different rooms; or any combination of the preceding geographical locations. In some embodiments the server farm 106 may be administered as a single entity, while in other embodiments the server farm 106 can include multiple server farms 106.
In some embodiments, the systems discussed herein may be used to generate navigation meshes from a map. A navigation mesh, frequently referred to as a NavMesh, may comprise a data structure based on the geometry of a virtual or real environment. In some embodiments, a navigation mesh may comprise a graph with each node representing an area. For example, in such embodiments, a navigation mesh may represent the floor plan of a building. The footprint of each room may be declared as a node of the mesh, and the doors between the rooms may be considered as edges between these nodes. The floor plan of the building may thus be defined as a graph of nodes and edges, which may be used by a movement module to calculate accessible areas from each current area.
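As a minimal illustrative sketch of such a room-and-door graph, assuming a dict-based adjacency structure and hypothetical room names (not part of the disclosed system), the accessible areas from a current area might be computed by a simple breadth-first traversal:

```python
from collections import deque

# Hypothetical room names; nodes are areas and edges are doors between them.
nav_graph = {
    "lobby":   {"hallway"},
    "hallway": {"lobby", "office", "storage"},
    "office":  {"hallway"},
    "storage": {"hallway"},
}

def accessible_areas(graph, start):
    """Return every area reachable from `start` via door edges (BFS)."""
    seen, queue = {start}, deque([start])
    while queue:
        room = queue.popleft()
        for neighbor in graph[room]:
            if neighbor not in seen:
                seen.add(neighbor)
                queue.append(neighbor)
    return seen

print(accessible_areas(nav_graph, "office"))  # all four areas are connected
```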
In some embodiments, a NavMesh may be based on triangular geometry, while in other embodiments, other geometric primitives may be utilized. In one embodiment, a NavMesh may comprise an outline of the traversable surfaces of the environment or virtual world. The triangles or other primitives may be combined to form convex polygons to map the surface of the virtual landscape. A polygon is considered convex if, from an arbitrary point within the object, a straight line may be drawn to any other point within the object without crossing an edge. As a result, a convex polygon may represent an area within which an agent may freely travel, such as an empty room, without collision with any wall or other object.
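An equivalent and commonly used test for a simple 2D polygon checks that consecutive edges always turn in the same direction. The following sketch, using illustrative coordinates, is one such check and is not drawn from the disclosure itself:

```python
def is_convex(polygon):
    """Convexity test for a simple 2D polygon given as ordered (x, y)
    vertices: all cross products of consecutive edges must share a sign."""
    n = len(polygon)
    sign = 0
    for i in range(n):
        ax, ay = polygon[i]
        bx, by = polygon[(i + 1) % n]
        cx, cy = polygon[(i + 2) % n]
        cross = (bx - ax) * (cy - by) - (by - ay) * (cx - bx)
        if cross != 0:
            if sign == 0:
                sign = 1 if cross > 0 else -1
            elif (cross > 0) != (sign > 0):
                return False
    return True

print(is_convex([(0, 0), (4, 0), (4, 3), (0, 3)]))  # True: a rectangle
print(is_convex([(0, 0), (4, 0), (1, 1), (0, 4)]))  # False: reflex vertex
```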
A NavMesh can be built in various ways. The simplest possibility is for a user to create the NavMesh manually. Another possibility is to derive the NavMesh from already existing meshes of the game environment. A mesh may comprise a contiguous arrangement of triangles, wherein neighboring triangles have two common corner points. For each corner point, it is possible to generate a list of triangles which use said corner point. In most of the games available, there are no meshes from which a NavMesh can be derived. In such cases, a mesh may be generated manually.
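The corner-point bookkeeping described above might be sketched as follows, assuming triangles are stored as triples of vertex indices (an assumption for illustration):

```python
from collections import defaultdict

triangles = [(0, 1, 2), (1, 2, 3), (2, 3, 4)]  # hypothetical mesh

def incident_triangles(tris):
    """For each corner point, collect the triangles that use it."""
    incidence = defaultdict(list)
    for t_index, tri in enumerate(tris):
        for vertex in tri:
            incidence[vertex].append(t_index)
    return incidence

inc = incident_triangles(triangles)
# Two triangles are neighbors when they share two corner points:
shared = set(triangles[0]) & set(triangles[1])
print(inc[2])       # vertex 2 is used by all three triangles
print(len(shared))  # triangles 0 and 1 share two corners -> neighbors
```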
In one embodiment, a navigation mesh may include areas marked as non-traversable or non-walkable. For example, based on input geometry, areas may be flagged as non- walkable such that the navigation mesh includes a virtual wall or unclimbable obstacle within the mesh. As this changes the navigation mesh, in many embodiments, the non-traversable status of these areas may not be undone during runtime to remove the blocked area. Thus, these virtual walls or objects may be described as static blockings of parts of the mesh.
According to the systems and methods discussed herein, however, navigation meshes need not be static. The navigation mesh may include one or more placeholders or keys to trigger post-processing of the navigation mesh. Post-processing may include removing parts of a navigation mesh that cannot be reached by an entity to allow for dynamic respawning or placement within a map. Post-processing may further include one or more static blockings to remove one or more parts of a navigation mesh within a defined bounding box. In some embodiments, post-processing may also include plane-based blocking to define directional edges within the navigation graph. In other embodiments, post-processing may include dynamic blocking. Rather than removing parts of a navigation mesh, dynamic blocking may comprise temporarily blocking parts of the mesh from being entered or limiting edge transiting. Accordingly, edges may be dynamically enabled or disabled during execution. These techniques may be useful for simulating doors, destructible objects, or other features that may be modified during runtime.
In one embodiment, by using directional dynamic blocking planes, unit flow of computer-controlled agents may be directed, allowing, for example, a computer-generated crowd to generally proceed from one area to another regardless of the direction of individual units within the crowd. These planes may act as one-way virtual walls, such that when a unit has proceeded from one region to another, it may not return. Such techniques may also be useful in simulating fleeing agents by ensuring that a pathfinder does not have access to nodes that would direct the agent towards a pursuer.
In one embodiment, dynamic blocking may be performed using a cutting mesh or plane that defines a region. The planes may be bounded to prevent an infinite plane from modifying the entire mesh. In one embodiment, the plane may be bounded in the shape of a box, and may be referred to as a blocking box. In another embodiment, the plane may be a blocking plane, and may comprise a bounded rectangle lying within the cutting plane.
In brief overview, a cutting mesh may define an intersecting plane to a navigation mesh to cut the existing navigation mesh into one or more half spaces, defined as within the blocking box/plane or outside the blocking box/plane. A bounding box may restrict the intersecting plane to limit cutting across the entire navigation mesh. In some embodiments, multiple orthogonal planes may be used to define three-dimensional blocked areas, such as a blocking box. For example, a box may require 6 half-space cuts to define the region. In one embodiment, cuts may be modified with lower limit boundaries, or the cutting plane may be locally modified, to eliminate the generation of very thin triangles. Several sets of half-space cuts may be combined into a single mesh, and triangles may be extracted along the cut plane.
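As a hedged sketch of the six half-space test for an axis-aligned blocking box, assuming planes are stored as (normal, offset) pairs (an illustrative representation, not necessarily that of the disclosed system), a point lies inside the box only when it passes all six tests:

```python
def box_half_spaces(lo, hi):
    """Six (normal, offset) planes whose intersection is the box [lo, hi]."""
    planes = []
    for axis in range(3):
        n_min = [0.0, 0.0, 0.0]
        n_min[axis] = -1.0                       # face at the box minimum
        planes.append((tuple(n_min), -lo[axis]))
        n_max = [0.0, 0.0, 0.0]
        n_max[axis] = 1.0                        # face at the box maximum
        planes.append((tuple(n_max), hi[axis]))
    return planes

def inside_box(point, planes):
    """Inside when dot(n, p) <= d holds for every bounding half space."""
    return all(sum(n[i] * point[i] for i in range(3)) <= d for n, d in planes)

planes = box_half_spaces((0.0, 0.0, 0.0), (2.0, 2.0, 2.0))
print(inside_box((1.0, 1.0, 1.0), planes))  # True: inside the blocking box
print(inside_box((3.0, 1.0, 1.0), planes))  # False: fails one half-space test
```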
In some embodiments, existing vertices of the navigation mesh may be classified relative to a cutting plane or bounded cutting plane as inside the half space, outside the half space, or on the plane, with new edges and vertices introduced into the input mesh where necessary. For example, referring briefly to FIG. 2A, illustrated is an embodiment of an example input mesh 200 cut against six half spaces 202 forming a box (dashed edge).
In one embodiment, existing vertices may be classified as outside vertices 204 and inside vertices 206. Outside vertices 204, of which several examples are labeled, may comprise vertices outside blocking box 202. Inside vertices 206, of which several examples are likewise labeled, may comprise vertices inside blocking box 202. Similarly, vertices on planes 208 may also be classified. The dashed lines 202 are the newly introduced edges resulting from the six cutting planes (though only four are shown in the two-dimensional view of FIG. 2A).
To create the blocking box of FIG. 2A, the cutting mesh is used to apply several cuts with half spaces, where each cut restricts which vertices are inside and which are outside of the convex hull. For example, referring now to FIG. 2B, illustrated is the state of the navigation mesh during the second, horizontal cut along plane 210. Vertices on one side of the plane, which may have negative or positive coordinates relative to the plane, depending on embodiment, may be designated as outside the plane, such as vertices 212. Vertices on the inside 214 or on the plane 216 may be similarly classified. As shown, through the use of multiple half-space cutting planes, vertices and edges inside or outside of the blocking box may be classified. In some embodiments, unnecessary cuts may be removed after all half-space cuts have been applied.
To prevent the generation of degenerate or tiny triangles, which waste storage and CPU time in the path finder, vertices close to the cutting plane may, in some embodiments, be classified as on the plane. For example, referring briefly to FIGs. 3A and 3B, illustrated are embodiments of the cutting algorithm without and with elimination of small triangles, respectively, showing the difference between the feature being disabled and enabled. Vertices 300 generated by the cut of plane 210 that are close to vertices of navigation mesh 302 may be removed or eliminated, preventing generation of small triangles by the cut. In some embodiments, the algorithm will reuse existing vertices within a user-specified range, thus introducing a small error to the cut, but preventing thin triangles that could otherwise cause errors in algorithms that use the data structure of the navigation mesh, such as path search or path smoothing.
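A minimal sketch of this tolerance-based classification, assuming a plane stored as a normal and offset and an epsilon chosen purely for illustration:

```python
OUTSIDE, INSIDE, ON_PLANE = 1, -1, 0

def classify_vertex(point, normal, offset, eps=1e-3):
    """Signed distance to the plane dot(n, p) = offset, snapped by eps."""
    dist = sum(n * p for n, p in zip(normal, point)) - offset
    if abs(dist) < eps:
        return ON_PLANE          # reuse this vertex; no sliver triangle
    return OUTSIDE if dist > 0 else INSIDE

# A vertex 0.0004 units above the plane z = 1 is treated as on the plane:
print(classify_vertex((0.5, 0.5, 1.0004), (0, 0, 1), 1.0))  # 0 (ON_PLANE)
```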
As discussed above, in some embodiments, a blocking box may be generated by iteratively applying the cutting mesh algorithm with six individual planes to define the box shape. In one embodiment, an edge identifier or ID for each cut edge of the box may be stored in a blocking box path object, such that the object can toggle the corresponding edge connections in the navigation mesh depending on whether its state is enabled or disabled.
Similarly, in some embodiments, a blocking plane may be generated by applying the cutting mesh algorithm with a single plane, but with a cut restriction of the bounding box of the rectangle. This prevents the half-space cut from cutting the entire input mesh along its plane (since planes are endless). Cut edges may be of the approximate length of the blocking plane, and each edge may be stored with the blocking plane object, such that it can be toggled as the plane is dynamically enabled or disabled.
In a further embodiment, the blocking plane may include a direction orthogonal to the plane, such that edges on one side of the plane may be enabled while edges on the other side are disabled. In one embodiment, the normal of the blocking plane may be referred to as a first direction, defining a first face of the plane. The opposite direction from the plane, or back side, may be referred to as a second face of the plane. In one embodiment, the edge connection on one face at a time may be enabled for path finding engines or traversal by an agent, such that the agent may move from nodes normal to the first face to nodes normal to the second face. In some embodiments, the directionality may be arbitrarily toggled during runtime. Accordingly, a one-way directionality across the plane may be defined for path generation. This feature can be used to control the flow of computer-controlled entities or agents inside the simulation. For example, if a player advances to a certain location inside the simulation, blocking planes can be set to a one-way connection in the direction of the player. This prevents computer-controlled agents from running away from the player, since they can no longer reach areas beyond those blockings.
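The one-way behavior might be sketched as follows; the class name, the sign convention for the enabled face, and the toggle interface are all illustrative assumptions rather than the disclosed implementation:

```python
class DirectionalBlockingPlane:
    """One-way wall: edges may be crossed only against the enabled face."""

    def __init__(self, normal):
        self.normal = normal    # defines the "first face" of the plane
        self.flipped = False    # toggled at runtime to reverse the flow

    def toggle(self):
        self.flipped = not self.flipped

    def allows(self, move_dir):
        """Allow a crossing only from the enabled face toward the other."""
        d = sum(n * m for n, m in zip(self.normal, move_dir))
        return d < 0.0 if not self.flipped else d > 0.0

gate = DirectionalBlockingPlane(normal=(1.0, 0.0, 0.0))
print(gate.allows((-1.0, 0.0, 0.0)))  # True: moving against the normal
print(gate.allows((1.0, 0.0, 0.0)))   # False: the one-way wall blocks return
gate.toggle()
print(gate.allows((1.0, 0.0, 0.0)))   # True after the direction is flipped
```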
In some embodiments, each navigation mesh node may comprise data fields or strings for allowing a user to store additional or custom data within the node. This data may comprise annotations to nodes, and may be used for further processing. For example, in one embodiment, the data may be read or processed when an agent or player arrives at the node.
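As an illustrative sketch only, assuming dict-based nodes and a hypothetical "on_arrive" annotation key:

```python
node = {
    "id": 42,
    "polygon": [(0, 0), (4, 0), (4, 4), (0, 4)],
    "annotations": {"on_arrive": "play_sound:creaky_floor"},
}

def on_agent_arrives(node):
    """Read and process any custom data when an agent reaches the node."""
    action = node["annotations"].get("on_arrive")
    if action:
        print("triggering:", action)

on_agent_arrives(node)  # triggering: play_sound:creaky_floor
```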
The systems discussed herein may also be used to control movement of computer- controlled entities or agents within a navigation mesh. In some embodiments, the system enables the control and movement of units in a virtual world representation defined by a navigation mesh. In other embodiments, the system allows for dynamic and static obstacle avoidance, including avoiding other agents.
In one embodiment, the system may comprise a movement engine for controlling motion of agents and other computer-controlled entities. The movement engine may comprise an application, service, daemon, routine, or other executable code for managing user-defined movement entities, such as non-player characters, vehicles, autonomous entities, or other agents. In one embodiment, a movement engine may comprise one or more interfaces for managing movement entities within a simulation world.
In some embodiments, a movement engine may execute a plurality of threads simultaneously. Accordingly, the movement engine may control movement of multiple entities in different areas or regions within the world. For example, in a game with multiple dungeon instances or levels or disjointed regions, a single movement engine may control movement of agents in the different areas simultaneously. In another embodiment, a single movement engine thread may control movement of the different agents according to a schedule.
In still another embodiment, a single movement engine may provide movement commands to multiple simulations simultaneously. For example, in one such embodiment, a plurality of clients may request movement commands for agents from a server or server farm executing a movement engine. Thus, clients lacking processing power to execute path finding and smoothing for a large number of computer-controlled agents may still receive the benefit of realistic artificial intelligence. This may allow processor-limited devices, such as smart phones, to execute realistic simulations.
Referring now to FIG. 4, illustrated is a flow diagram of an embodiment of a movement engine executing position control and collision avoidance. In brief overview, a movement engine may execute one or more subroutines or threads. The routines may include a Simulation World 400 routine for managing all move entities that are controlled by the movement engine and interact with each other. The Simulation World 400 routine may control scheduling of movement control and collision detection for a plurality of agents. In some embodiments, the routines may include a MoveEntity 402 routine for managing velocity, position and direction of a movement entity or agent scheduled by the
Simulation World 400. In other embodiments, the routines may include a CollisionInterface 404 routine for determining potential collisions in the surrounding area. Potential collisions may be due to other agents or movement entities, or due to dynamic objects, including the blocking objects discussed above.
An embodiment of a scheduling routine is illustrated in FIG. 4. As shown, in some embodiments, for each entity or agent controlled by the movement engine, the engine may retrieve position and velocity information for the agent. The engine may further retrieve information regarding any obstacle in the region local to the agent. Once position and velocity information for all entities or agents has been updated, the movement engine may iteratively loop through each agent to identify a path or new velocity/direction for the entity.
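This two-phase schedule (first snapshot all agents, then steer each one against the fixed snapshots) might be sketched as below; the agent fields, the stub sensor query, and the placeholder steering function are assumptions for illustration:

```python
from dataclasses import dataclass

@dataclass
class Agent:
    aid: int
    position: tuple
    velocity: tuple

def obstacles_near(position, obstacles, radius=5.0):
    """Stub sensor query: obstacles within `radius` of a position."""
    return [o for o in obstacles
            if sum((p - q) ** 2 for p, q in zip(position, o)) <= radius ** 2]

def steer(pos, vel, local_obstacles, dt):
    """Placeholder steering: keep the current velocity."""
    return vel

def simulation_step(agents, obstacles, dt):
    # Phase 1: snapshot every agent's state and its local obstacles.
    snapshots = {a.aid: (a.position, a.velocity,
                         obstacles_near(a.position, obstacles))
                 for a in agents}
    # Phase 2: with all snapshots fixed, compute each agent's next move.
    for a in agents:
        pos, vel, local = snapshots[a.aid]
        a.velocity = steer(pos, vel, local, dt)
        a.position = tuple(p + v * dt for p, v in zip(pos, a.velocity))

agents = [Agent(0, (0.0, 0.0), (1.0, 0.0)), Agent(1, (4.0, 0.0), (-1.0, 0.0))]
simulation_step(agents, obstacles=[(2.0, 0.0)], dt=0.1)
print(agents[0].position)  # (0.1, 0.0)
```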
Referring now to FIG. 5, illustrated is a flow chart of an embodiment of a position calculation routine. Each entity or agent may have a state of idle, following a path, or avoiding a potential collision. In brief overview, if an entity has a next position calculated (i.e., is following a path) or is idle, the movement engine may move on to the next entity. In some embodiments, the movement engine may calculate a next position along the path or determine whether the entity should no longer be idle and calculate a path accordingly. If the movement engine determines that a next position has not been calculated, the movement engine may compute a braking distance for the entity. In many embodiments, the braking distance may be responsive to a current speed or velocity of the entity. In one embodiment, responsive to braking, the movement engine may determine a new speed for the entity. For example, the entity may slow from a first speed to a second speed responsive to braking between a first position and second position. Responsive to the new speed, the movement engine may compute a new position along a path.
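Under constant deceleration, the braking computation reduces to standard kinematics; the following sketch assumes an illustrative deceleration constant, not a value from the disclosure:

```python
import math

def braking_distance(speed, deceleration=4.0):
    """Distance to stop from `speed` at constant deceleration: d = v^2/(2a)."""
    return speed * speed / (2.0 * deceleration)

def speed_after(speed, distance, deceleration=4.0):
    """Speed after braking over `distance`: v' = sqrt(v^2 - 2ad)."""
    return math.sqrt(max(0.0, speed * speed - 2.0 * deceleration * distance))

v = 6.0
print(braking_distance(v))  # 4.5 units needed to come to a full stop
print(speed_after(v, 1.0))  # the slower second speed after one unit traveled
```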
In one embodiment, the movement engine may identify whether any dynamic colliders exist within a sensor distance of the entity. In some embodiments, an agent or entity may comprise one or more percepts to request or perceive states of the world. In one embodiment, a percept simulates a human's senses, such as sight or hearing. So that an agent can receive or perceive a percept, it needs to have a suitable sensor, which simulates a sensory organ, that is to say the ability to see or to hear. Every agent is therefore usually assigned at least one prescribed virtual sensor, and the percepts are filtered to suit the respective sensors of the agent. In other words, the information available to the agents is filtered according to what the respective agents can perceive. This allows very realistic and flexible multi-agent systems to be implemented. In many embodiments, a sensor may have a predetermined range. In one embodiment, a range may be adjusted to control how early an entity anticipates and avoids collisions.
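A minimal sketch of such percept filtering, assuming hypothetical sensor kinds ("sight", "sound") and illustrative ranges:

```python
import math

def filter_percepts(agent_pos, sensors, percepts):
    """Keep percepts whose kind matches one of the agent's sensors and
    whose source lies within that sensor's range."""
    visible = []
    for kind, pos in percepts:
        rng = sensors.get(kind)          # e.g. {"sight": 20.0, "sound": 8.0}
        if rng is not None and math.dist(agent_pos, pos) <= rng:
            visible.append((kind, pos))
    return visible

sensors = {"sight": 20.0, "sound": 8.0}
percepts = [("sight", (15.0, 0.0)), ("sound", (10.0, 0.0))]
print(filter_percepts((0.0, 0.0), sensors, percepts))
# only the sight percept passes; the sound source is out of range
```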
The movement engine may identify if one or more dynamic colliders exist within range of the entity's sensor. If not, the movement engine may store the position on the path and/or send the new position, direction, and/or velocity to the entity. If one or more colliders exist, the movement engine may sample the static environment of the entity to determine potential paths. The movement engine may weight the influence of dynamic colliders according to soft and hard collisions, based on whether a path from the entity intersects another entity or whether the path is within a predetermined radius from the entity. The movement engine may choose a direction from the potential paths, responsive in some embodiments to current direction and velocity, and may send the new direction, velocity, and/or position to the entity.
Referring now to FIGs. 6A-6D, illustrated is a top view of an embodiment of collision detection and avoidance within a virtual world. In brief overview, in some embodiments, the avoidance system queries the position of every entity within the sensor radius of the agent. Each entity or potential collider is projected into a slice, or ray in a direction from the agent. Slices that contain an entity are weighted by a "not wanted" value that is used to choose a direction of travel. After all other entities in the sensor radius have been added to the slices, the slice with the lowest "not wanted" value is chosen, and the entity moves in the middle direction of this slice.
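One way to sketch this slice weighting, assuming an illustrative slice count and penalty value and colliders given as circles (assumptions for illustration, not the disclosed implementation):

```python
import math

def choose_direction(agent_pos, colliders, n_slices=16, penalty=1.0):
    """colliders: list of (x, y, radius). Returns the middle direction of
    the least-penalized slice as a unit vector."""
    scores = [0.0] * n_slices
    for cx, cy, r in colliders:
        dx, dy = cx - agent_pos[0], cy - agent_pos[1]
        dist = math.hypot(dx, dy)
        if dist == 0.0:
            continue
        half = math.asin(min(1.0, r / dist))  # angular half-width of collider
        center = math.atan2(dy, dx)
        for s in range(n_slices):
            mid = 2.0 * math.pi * (s + 0.5) / n_slices
            # Smallest angle between the slice midline and the collider center.
            diff = abs((mid - center + math.pi) % (2.0 * math.pi) - math.pi)
            if diff <= half:                  # midline intersects the collider
                scores[s] += penalty
    best = min(range(n_slices), key=lambda s: scores[s])
    mid = 2.0 * math.pi * (best + 0.5) / n_slices
    return (math.cos(mid), math.sin(mid))

print(choose_direction((0.0, 0.0), [(3.0, 0.0, 1.0)]))  # avoids the +x slice
```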
In a further embodiment, the system determines not just current positions for other entities, but also potential paths, by tracing rays on the navigation mesh for the entities. Accordingly, the system may extrapolate the movement of each entity and use the projected position to identify potential collisions.
For example, referring first to FIG. 6A, an agent 600 may include a collision radius 602 defining the bounds of the agent 600. The agent 600 may also have a soft radius 604, defining a boundary in which the agent would prefer not to cross other entities, but still can. For example, people may stand shoulder to shoulder or nose to nose without touching, but may be quite uncomfortable. Personal space thus defines a soft radius around each person. The agent may further have a path direction 606, and a sensor radius 608. In many embodiments, sensor radius 608 may be limited to reduce processor requirements, while in other embodiments, sensor radius 608 may be varied based on environmental qualities. For example, in foggy environments, a sensor representing vision may have a reduced range, increasing the likelihood of near-collisions.
The movement engine may identify one or more colliders 610a, 610b. Each collider may be another agent or entity, or may be a static or dynamic boundary or object, such as a table or chair. The movement engine may further identify a projected position of the corresponding collider 612a, 612b. In some embodiments, such as where the collider is a non-moving object, the projected position may be the same as the current position. In other embodiments, such as where the movement engine is iteratively calculating next positions for each agent, the projected position may comprise a previously-calculated next position for the agent.
The possible colliders 612a, 612b are projected into the slices or segments of the circle around the agent 600. Referring now to FIG. 6B, the circle may be divided into a number of slices responsive to the size of the collision radius 602, such that agents with smaller collision radii have more potential paths. To ensure that entities do not collide, the radius of the collider may be expanded by the radius of the agent 614. This allows a more precise navigation than collision systems that only trigger if the center of the collider is impacted by the radius of the agent. In some embodiments, a slice is only affected by a collider if the middle line of the slice intersects with the collider. As shown in FIG. 6B, an agent 600 may have multiple potential slices 616 that are free, or do not intersect with colliders. The agent 600 may also have multiple slices 618 that pass through one or more soft radii of projected colliders 614a, 614b. The agent further may have one or more slices 620 that intersect hard radii of projected colliders 614a, 614b.
Each slice may be associated with a value. In one embodiment, the value may be stored as the length of a vector along the slice. Referring now to FIG. 6C, vectors
representing each slice may be reduced in length by a "not wanted" value responsive to whether the vector intersects a hard or soft radius of a potential collider. To achieve the best result in narrow environments, in some embodiments, the computation of this value is identical to the "not wanted" value generated by the sampling of the static environment. In another embodiment, each vector length may be modified by a current direction and velocity of the agent, reducing the likelihood that the agent reverses direction drastically. As shown in FIG. 6C, the movement engine may determine varying lengths for each vector, responsive to potential collisions and current velocity of the agent. In one embodiment, the movement engine may select a longest vector, or one of a plurality of equally long vectors, to determine a next position for the agent.
Referring briefly to FIG. 6D, in a further embodiment, to produce a more realistic crowd movement for agents, the movement engine may also compute the relative speed for each potential collider. If a potential collider and the agent have a relative speed below a predetermined threshold value (i.e. if they are moving in roughly the same direction and speed), the movement engine may ignore or reduce the "not wanted" adjustment for vectors intersecting with the collider. Thus, if a first agent is in front of a second agent, but they are moving in the same direction at the same speed, the second agent will be unlikely to alter its path.
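This relative-speed filter might be sketched as follows, with the threshold and penalty values chosen purely for illustration:

```python
import math

def collision_penalty(agent_vel, collider_vel, base_penalty=1.0,
                      rel_speed_threshold=0.5):
    """No penalty when agent and collider move together; full penalty
    otherwise."""
    rel = [a - c for a, c in zip(agent_vel, collider_vel)]
    if math.hypot(*rel) < rel_speed_threshold:
        return 0.0          # moving together: ignore the "not wanted" value
    return base_penalty

print(collision_penalty((1.0, 0.0), (0.9, 0.1)))   # 0.0, same flow
print(collision_penalty((1.0, 0.0), (-1.0, 0.0)))  # 1.0, head-on approach
```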
It should be understood that the systems described above may provide multiple ones of any or each of those components and these components may be provided on either a standalone machine or, in some embodiments, on multiple machines in a distributed system. The systems and methods described above may be implemented as a method, apparatus or article of manufacture using programming and/or engineering techniques to produce software, firmware, hardware, or any combination thereof. In addition, the systems and methods described above may be provided as one or more computer-readable programs embodied on or in one or more articles of manufacture. The term "article of manufacture" as used herein is intended to encompass code or logic accessible from and embedded in one or more computer-readable devices, firmware, programmable logic, memory devices (e.g., EEPROMs, ROMs, PROMs, RAMs, SRAMs, etc.), hardware (e.g., integrated circuit chip, Field Programmable Gate Array (FPGA), Application Specific Integrated Circuit (ASIC), etc.), electronic devices, a computer readable nonvolatile storage unit (e.g., CD-ROM, floppy disk, hard disk drive, etc.). The article of manufacture may be accessible from a file server providing access to the computer-readable programs via a network transmission line, wireless transmission media, signals propagating through space, radio waves, infrared signals, etc. The article of manufacture may be a flash memory card or a magnetic tape. The article of manufacture includes hardware logic as well as software or programmable code embedded in a computer readable medium that is executed by a processor. In general, the computer-readable programs may be implemented in any programming language, such as LISP, PERL, C, C++, C#, PROLOG, or in any byte code language such as JAVA. The software programs may be stored on or in one or more articles of manufacture as object code.
While various embodiments of the methods and systems have been described, these embodiments are exemplary and in no way limit the scope of the described methods or systems. Those having skill in the relevant art can effect changes to form and details of the described methods and systems without departing from the broadest scope of the described methods and systems. Thus, the scope of the methods and systems described herein should not be limited by any of the exemplary embodiments and should be defined in accordance with the accompanying claims and their equivalents.

Claims

What is Claimed:
1. A device substantially as described herein with reference to and as illustrated by the accompanying drawings.
2. A method substantially as described herein with reference to and as illustrated by the accompanying drawings.