WO2019121795A1 - Method and apparatus managing entities in a physical space - Google Patents


Info

Publication number
WO2019121795A1
Authority
WO
WIPO (PCT)
Prior art keywords
mobile entity
reference point
physical space
indication
graphical representation
Prior art date
Application number
PCT/EP2018/085666
Other languages
French (fr)
Inventor
Jérémie GARCIA
Stéphane CONVERSY
Nicolas SAPORITO
Guilhem BUISAN
Original Assignee
Ecole Nationale De L'aviation Civile
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Ecole Nationale De L'aviation Civile filed Critical Ecole Nationale De L'aviation Civile
Publication of WO2019121795A1 publication Critical patent/WO2019121795A1/en

Classifications

    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G1/00Traffic control systems for road vehicles
    • G08G1/01Detecting movement of traffic to be counted or controlled
    • G08G1/0104Measuring and analyzing of parameters relative to traffic conditions
    • G08G1/0137Measuring and analyzing of parameters relative to traffic conditions for specific applications
    • G08G1/0145Measuring and analyzing of parameters relative to traffic conditions for specific applications for active traffic flow control
    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G1/00Traffic control systems for road vehicles
    • G08G1/01Detecting movement of traffic to be counted or controlled
    • G08G1/0104Measuring and analyzing of parameters relative to traffic conditions
    • G08G1/0108Measuring and analyzing of parameters relative to traffic conditions based on the source of data
    • G08G1/0112Measuring and analyzing of parameters relative to traffic conditions based on the source of data from the vehicle, e.g. floating car data [FCD]
    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G1/00Traffic control systems for road vehicles
    • G08G1/01Detecting movement of traffic to be counted or controlled
    • G08G1/0104Measuring and analyzing of parameters relative to traffic conditions
    • G08G1/0108Measuring and analyzing of parameters relative to traffic conditions based on the source of data
    • G08G1/0116Measuring and analyzing of parameters relative to traffic conditions based on the source of data from roadside infrastructure, e.g. beacons
    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G1/00Traffic control systems for road vehicles
    • G08G1/01Detecting movement of traffic to be counted or controlled
    • G08G1/0104Measuring and analyzing of parameters relative to traffic conditions
    • G08G1/0125Traffic data processing
    • G08G1/0133Traffic data processing for classifying traffic situation
    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G1/00Traffic control systems for road vehicles
    • G08G1/09Arrangements for giving variable traffic instructions
    • G08G1/0962Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
    • G08G1/0967Systems involving transmission of highway information, e.g. weather, speed limits
    • G08G1/096708Systems involving transmission of highway information, e.g. weather, speed limits where the received information might be used to generate an automatic action on the vehicle control
    • G08G1/096725Systems involving transmission of highway information, e.g. weather, speed limits where the received information might be used to generate an automatic action on the vehicle control where the received information generates an automatic action on the vehicle control
    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G1/00Traffic control systems for road vehicles
    • G08G1/09Arrangements for giving variable traffic instructions
    • G08G1/0962Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
    • G08G1/0967Systems involving transmission of highway information, e.g. weather, speed limits
    • G08G1/096766Systems involving transmission of highway information, e.g. weather, speed limits where the system is characterised by the origin of the information transmission
    • G08G1/096775Systems involving transmission of highway information, e.g. weather, speed limits where the system is characterised by the origin of the information transmission where the origin of the information is a central station
    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G5/00Traffic control systems for aircraft, e.g. air-traffic control [ATC]
    • G08G5/0017Arrangements for implementing traffic-related aircraft activities, e.g. arrangements for generating, displaying, acquiring or managing traffic information
    • G08G5/0026Arrangements for implementing traffic-related aircraft activities, e.g. arrangements for generating, displaying, acquiring or managing traffic information located on the ground
    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G1/00Traffic control systems for road vehicles
    • G08G1/09Arrangements for giving variable traffic instructions
    • G08G1/0962Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
    • G08G1/0968Systems involving transmission of navigation instructions to the vehicle
    • G08G1/0969Systems involving transmission of navigation instructions to the vehicle having a display in the form of a map
    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G1/00Traffic control systems for road vehicles
    • G08G1/16Anti-collision systems
    • G08G1/164Centralised systems, e.g. external to vehicles

Definitions

  • the present invention relates generally to the managing of entities in a physical space.
  • Various classes of oversight of mobile entities can be defined including pilots and drivers on one hand, and supervisors, air traffic controllers, vessel traffic controllers and the like who have a general responsibility for entities in a given area on the other.
  • the working conditions of these classes of individuals are affected by current technological trends, which furthermore drive significant convergence of their roles.
  • vehicles are increasingly autonomous, so that the role of the driver or pilot is increasingly supported by electronic tools such as navigation tools, and/or entrusted to a remote operator who can take over control of the vehicle via a telecommunications channel at critical instants.
  • supervision and guidance tasks may be increasingly supported by information technology, so that one individual can be expected to supervise an ever larger area, or increasingly, to supervise several different areas, or attribute only a portion of their attention to traffic considerations, with the relevant traffic information being relayed from the respective areas via telecommunications means.
  • a mobile entity manager for managing a mobile entity in a physical space.
  • the mobile entity manager comprises a graphics renderer adapted to generate a graphical representation of the physical space at a first scale, a representation engine adapted to define a reference point in the physical space, to determine a position of the mobile entity with respect to the reference point, and to determine a predicted time of arrival of the mobile entity within a region defined with respect to the reference point.
  • the graphics renderer is further adapted to modify the graphical representation to incorporate an indication of the position of the mobile entity on a path with respect to the region, wherein the indication is situated in a position in the physical space at a distance from the reference point, the distance being proportional to the predicted time of arrival.
  • a method of managing a mobile entity in a physical space comprising the steps of: defining a graphical representation of the physical space at a first scale, defining a reference point in the physical space, determining a position of the mobile entity with respect to the reference point, determining a predicted time of arrival of the mobile entity within a region defined with respect to the reference point, and modifying the graphical representation to incorporate an indication of the position of the mobile entity on a path with respect to the region, wherein the indication is situated in a position in the physical space at a distance from the reference point, the distance being proportional to the predicted time of arrival.
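The central relationship in the method above is that the indication's distance from the reference point is proportional to the predicted time of arrival. The following is an illustrative sketch of that relationship, not an implementation from the patent; the function names and the proportionality constant `K_UNITS_PER_SECOND` are assumptions.

```python
import math

# Display distance per second of predicted arrival time (assumed factor).
K_UNITS_PER_SECOND = 2.0

def predicted_time_of_arrival(distance_m, speed_m_s):
    """Predicted time (s) for the entity to reach the region."""
    if speed_m_s <= 0:
        return math.inf
    return distance_m / speed_m_s

def indication_distance(eta_s):
    """Distance of the indication from the reference point,
    proportional to the predicted time of arrival."""
    return K_UNITS_PER_SECOND * eta_s

# An entity 1 km away travelling at 75 km/h:
eta = predicted_time_of_arrival(1000.0, 75 / 3.6)
print(round(eta))                # 48 (seconds)
print(indication_distance(eta))  # 96.0 (display units)
```

With this mapping, a slower entity that is nearer in space but further in time is drawn further from the reference point, which is the ordering illustrated in figures 5 and 8.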
  • the method comprises the further steps of receiving a user input specifying a new position of the indication on the path, calculating one or more variations in the speed of the mobile entity which would modify the predicted time of arrival of the mobile entity to correspond to the new position of the indication, and issuing a communication to the mobile entity indicating the variations.
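The speed variation of this step can be obtained by inverting the proportional mapping: the new indication distance implies a target arrival time, from which a required speed follows. A minimal sketch, assuming a known proportionality factor `k` and a feasible speed range; all names are illustrative:

```python
def required_speed(distance_to_region_m, new_indication_distance, k,
                   v_min_m_s, v_max_m_s):
    """Speed that makes the predicted arrival time match the new
    indication position (distance = k * time, so time = distance / k).
    Returns None if the required speed is outside the feasible range."""
    target_eta_s = new_indication_distance / k
    v = distance_to_region_m / target_eta_s
    if v_min_m_s <= v <= v_max_m_s:
        return v
    return None  # no single-speed solution; a path change may be needed

# Entity 800 m from the region; the user drags its indication to
# 100 display units with k = 2, i.e. a target arrival time of 50 s:
v = required_speed(800.0, 100.0, 2.0, v_min_m_s=5.0, v_max_m_s=40.0)
print(v)  # 16.0 m/s, i.e. slow to roughly 58 km/h
```

The feasibility check stands in for the capacity model mentioned below (acceleration and deceleration limits, comfort, and so on).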
  • the method comprises the further steps of receiving a user input specifying a new position of the indication on the path, calculating a modification of the path which would modify the predicted time of arrival of the mobile entity to correspond to the new position of the indication, and issuing a communication to the mobile entity indicating the variations.
  • the indication is situated in a position in the graphical representation of the physical space with an orientation with respect to the reference point corresponding to the relative orientation of the entity to the reference point.
  • the method comprises the further steps of: determining a position of a further mobile entity with respect to the reference point, determining a predicted time of arrival of the further mobile entity within the region, and, modifying the graphical representation to incorporate an indication of the position of the further mobile entity with respect to the region, wherein the indication is situated on the path in a position in the physical space at a distance from the reference point, the distance being proportional to the predicted time of arrival of the further mobile entity.
  • the method comprises the further step of determining a timing for a convergence of the two entities on the basis of a speed of the first mobile entity and a speed of the further mobile entity.
  • the path corresponds to a path in the physical space that the mobile entity is expected to follow.
  • the factor of proportionality between the distance and the predicted time of arrival varies as a function of the position of the indication on the path, or varies as a function of the orientation.
  • the method comprises the further step of: modifying the graphical representation to incorporate a speed vector indicator graphic, wherein the speed vector indicator graphic has a dimension proportional to the speed of the mobile entity in the physical space.
  • the method comprises a further step of receiving a user input specifying a new position of the reference point, and repeating the steps of determining a position of the mobile entity, determining a predicted time of arrival of the mobile entity, and modifying the graphical representation on the basis of the new position of the reference point.
  • the method comprises the further step of receiving a user input specifying a new scale for the graphical representation, and repeating the steps of determining a position of the mobile entity, determining a predicted time of arrival of the mobile entity, and modifying the graphical representation on the basis of the new scale.
  • the graphical representation is three dimensional.
  • Figure 1 shows a method of managing the behaviour of a mobile entity in a physical space in accordance with an embodiment;
  • Figure 2 shows a method of managing the behaviour of a mobile entity in a physical space in accordance with a development of the embodiment of figure 1;
  • Figure 3 shows a method of managing the behaviour of a mobile entity in a physical space in accordance with a development of the embodiment of figure 1;
  • Figure 4 shows an example of a physical space susceptible to graphical representation in accordance with an embodiment;
  • Figure 5 shows an example of a graphical representation of the physical space of figure 4 generated in accordance with an embodiment;
  • Figure 6 shows a further example of a graphical representation of the physical space of figure 4 generated in accordance with an embodiment;
  • Figure 7 shows a further example of a physical space susceptible to graphical representation in accordance with an embodiment;
  • Figure 8 shows an example of a graphical representation of the physical space of figure 7 generated in accordance with an embodiment;
  • Figure 9 shows a mobile entity manager in accordance with an embodiment;
  • Figure 10 shows a generic computing system suitable for implementation of embodiments of the invention;
  • Figure 11 shows a smartphone device adaptable to constitute an embodiment;
  • Figure 12 shows an Air Traffic Control desk adaptable to constitute an embodiment.
  • Figure 1 shows a method of managing the behaviour of a mobile entity in a physical space in accordance with an embodiment.
  • the method starts at step 100 before proceeding to step 110, at which a graphical representation of the physical space is defined at a first scale.
  • the method next proceeds to step 120 at which a reference point in the physical space is defined.
  • This definition may comprise retrieving a position value from memory, receiving user input, for example as discussed in further detail below, or otherwise.
  • a position of the mobile entity is determined with respect to the reference point.
  • the position of the mobile entity may be obtained directly, for example by means of a radar, lidar or other such system, by triangulation of a signal emitted or reflected by the entity, or indirectly, for example by receiving positioning information, as obtained for example from a GNSS system, from the mobile entity itself, or inferred for example on a dead-reckoning basis from applying time and velocity data to a known starting point, or otherwise.
  • a predicted time of arrival of the mobile entity within a region defined with respect to the reference point is determined.
  • Determination of the predicted time of arrival may comprise determining a speed of the mobile entity, either directly for example by means of a radar, lidar or other such system, by inference from successive location determinations, or indirectly, for example by receiving speed information from the mobile entity itself.
  • the method then finally proceeds to step 150 at which the graphical representation is modified to incorporate an indication of the position of the mobile entity on a path with respect to the region, wherein the indication is situated in a position in the physical space at a distance from the reference point, the distance being proportional to the predicted time of arrival, before terminating at step 190.
  • Figure 2 shows a method of managing the behaviour of a mobile entity in a physical space in accordance with a development of the embodiment of figure 1.
  • the method proceeds through steps 110, 120, 130, 140 and 150 as described with reference to figure 1, before proceeding to step 260 at which a user input is received specifying a new position of the indication on the path.
  • the method then proceeds to calculate one or more variations in the speed of the mobile entity which would modify the predicted time of arrival of the mobile entity to correspond to the new position of the indication at step 270 before issuing a communication to the mobile entity indicating the variations at step 280 before terminating at step 190.
  • Such speed variations may be obtained on the basis of a model of the mobile entity's capacities in terms of acceleration and deceleration, possibly taking into account fuel efficiency considerations, passenger comfort, possible interactions with other entities, and the like, as will occur to the skilled person.
  • Figure 3 shows a method of managing the behaviour of a mobile entity in a physical space in accordance with a development of the embodiment of figure 1.
  • the method proceeds through steps 110, 120, 130, 140 and 150 as described with reference to figure 1, before proceeding to step 360 at which a position of a further mobile entity is determined with respect to the reference point.
  • the method next proceeds to step 370 at which a predicted time of arrival of the further mobile entity within the region is determined, and then proceeds to step 380 of modifying the graphical representation to incorporate an indication of the position of the further mobile entity with respect to the region, wherein the indication is situated on the path in a position in the physical space at a distance from the reference point, the distance being proportional to the predicted time of arrival of the further mobile entity.
  • Figure 4 shows an example of a physical space susceptible to graphical representation in accordance with an embodiment.
  • a section of road 400 having two lanes 411, 412 in a first direction 410 (right to left), and two lanes 421, 422 in a second direction 420 (left to right).
  • An entity shown as a car 431 is travelling in the first lane 411 in the first direction 410,
  • an entity shown as a car 432 is travelling in the second lane 412 in the first direction 410,
  • an entity shown as a car 433 is travelling in the first lane 421 in the second direction 420,
  • and an entity shown as a car 434 is travelling in the second lane 422 in the second direction 420.
  • the road section 400, or alternatively a road direction 410 or 420, or a lane 411, 412, 421 or 422, may correspond to the physical space 400 as discussed above.
  • Road section 400 has an entry slip road 423.
  • a reference point 440 is defined in the physical space. This reference point 440 may be defined by a user input, with reference to the position of an entity as discussed further below, or be in a defined relation to the physical space or an element thereof.
  • a position of a mobile entity is determined with respect to the reference point.
  • the positions of the two mobile entities 433 and 434 are determined with respect to the reference point 440, as indicated by the lines 443 and 444.
  • the position of any number of mobile entities may be determined in this way.
  • the position of every mobile entity, or every mobile entity satisfying certain selection criteria, in the physical space may be thus determined.
  • the position of every mobile entity, or every mobile entity satisfying certain selection criteria, in some other region, which may be larger than or smaller than the physical space may be thus determined.
  • the speed of mobile entity 433 is 75km/h
  • the speed of mobile entity 434 is 50km/h.
  • the distance from the reference point to the first mobile entity 433 is 1 km, while the distance from the reference point to the second mobile entity 434 is 0.8km.
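These values give the arrival times used in the discussion of figures 5 and 6; the arithmetic can be checked directly:

```python
# Predicted arrival times for the figure 4 values (pure arithmetic):
eta_433 = 1.0 / 75 * 3600    # 1 km at 75 km/h  -> 48 s
eta_434 = 0.8 / 50 * 3600    # 0.8 km at 50 km/h -> 57.6 s
print(round(eta_433), round(eta_434))  # 48 58
```

So although entity 433 is further away in space, it is nearer in time, and its indication is accordingly drawn closer to the reference point.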
  • Figure 5 shows an example of a graphical representation 500 of the physical space of figure 4 generated in accordance with an embodiment.
  • a representation of a section of road 500 having two lanes 511, 512 in a first direction 510 (right to left), and two lanes 521, 522 in a second direction 520 (left to right).
  • An indication of the position of the first mobile entity 533 is shown as travelling in the first lane 521 in the second direction 520
  • the second mobile entity 534 is shown as travelling in the second lane 522 in the second direction 520.
  • the road section 500 corresponds to the physical space 400 as discussed above, and represents the relevant physical features thereof.
  • While the indications of the mobile entities as shown resemble the entities themselves, this need not be the case; the resemblance may be carried to any desired extent.
  • the indications may be simple points, boxes containing an icon and/or text or a schematic representation of the class of mobile entity in question at one end of the scale, up to a photorealistic representation or live image of the mobile entity itself at the other end of the scale.
  • figure 5 presents an example of a graphical representation of the physical space at a first scale, defined in accordance with an embodiment.
  • the scale is not specified in figure 5, and it will be appreciated that any convenient scale may be chosen on the basis of the size of the physical space and the size of whatever display is to be used for the presentation of the graphical representation.
  • figure 5 shows a two dimensional representation of the physical space
  • other embodiments may equally present a three dimensional representation of the physical space.
  • the reference point 540 as defined in the physical space is presented in the graphical representation. It will be appreciated that in some embodiments, the position of the reference point need not be included in the graphical representation, or not at all times.
  • the reference point 540 may be defined by a user input, with reference to the position of an entity as discussed further below, or be in a defined relation to the physical space or an element thereof. Where the reference point is defined by user input, this may conveniently take place by manipulation of the graphical representation, for example by conventional user interface mechanisms. For example, the user might select the desired location in the physical representation by clicking with a mouse cursor, fingertip, or the like.
  • a predicted time of arrival of each mobile entity whose position is determined within a region 550 defined with respect to the reference point 540 is obtained.
  • the region and the reference point may coincide, or otherwise the region may be situated in any predefined spatial relationship to the reference point.
  • the graphical representation is modified to incorporate an indication of the position of the first mobile entity 533 on a path 560 with respect to the region 550, wherein the indication 533 is situated in a position in the representation at a distance from the reference point proportional to the predicted time of arrival, i.e. 48s.
  • the graphical representation is further modified to incorporate an indication of the position of the second mobile entity 534 on the path 560 with respect to the region 550, wherein the indication 534 is situated in a position in the representation at a distance from the reference point proportional to the predicted time of arrival, i.e. 58s.
  • a graphical representation in accordance with the embodiment of figure 5 shows the second mobile entity further from the reference point than the first mobile entity, on the basis that it is expected to arrive later than the first mobile entity, which, although presently further away in space, is moving faster, and thus closer in time.
  • the graphical representation may further be modified to incorporate an indication of the position of the further mobile entity with respect to the region, wherein the indication is situated on the path in a position in the physical space at a distance from the reference point, the distance being proportional to the predicted time of arrival of the further mobile entity. While figure 5 shows indications of the position of two physical entities, it will be appreciated that at any given time the number of indications may be any value from zero upwards, depending on the number of mobile entities in the physical space, or otherwise taken into account as discussed above.
  • the representation of figure 5 may support further variants. For example, on the basis of a speed of the first mobile entity and a speed of a further mobile entity a timing for a convergence of the two entities may be determined. This determination may then be used to indicate whether a particular mobile entity can overtake another within a specified time window (for example, before the arrival of an oncoming vehicle), or within a particular physical space (for example before an overtaking lane ends). Still further, the speed of one, or the other of the two or more entities may be adapted to achieve a desired result, for example indicating the required acceleration to achieve the desired manoeuvre in the available time or space, or in order for the two mobile entities to reach a common destination (corresponding to the reference point) at the same time.
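For two entities on a common path, the convergence timing described above reduces to a spatial gap divided by a closing speed. A hedged sketch with illustrative names, using the example values of figure 4:

```python
def convergence_time(d1_m, v1_m_s, d2_m, v2_m_s):
    """Time (s) at which a trailing, faster entity draws level with a
    leading, slower one on the same path; None if they never converge.
    Distances are measured as distance remaining to the reference point."""
    closing = v1_m_s - v2_m_s
    gap = d1_m - d2_m          # positive if entity 1 is further out
    if closing <= 0 or gap <= 0:
        return None
    return gap / closing

# The 75 km/h entity 1 km out catches the 50 km/h entity 0.8 km out:
t = convergence_time(1000.0, 75 / 3.6, 800.0, 50 / 3.6)
print(round(t, 1))  # 28.8 s
```

Comparing this time against the available window (the arrival of an oncoming vehicle, or the end of an overtaking lane) then indicates whether the manoeuvre is possible.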
  • the communication may comprise by way of example the transmission of a message to the mobile entity, the transmission of a message to the user, or to any other destination.
  • the communication may comprise multiple transmissions to different destinations.
  • the communication may comprise a modification of the graphical representation, for example through changing the colour or prominence of certain features, such as the path, the region or the entity or an indicator region that may be defined in the graphical region for this purpose.
  • a text message may be presented via the graphical representation, or otherwise.
  • the communication may comprise transmission via any suitable channel. It may be transmitted via any data network such as a WAN (e.g. the internet, GSM, UMTS), a LAN (e.g. a WiFi network or Ethernet), or a PAN (e.g. Bluetooth, ZigBee).
  • the communication may be of any format, and may include text, graphical or audio content, or a combination of these.
  • the communication may be formatted for a human addressee, for example in the form of a human readable text or audio message.
  • the signalling action may be formatted for a machine recipient, for example on the basis of an API, or any suitable technical format having regard to the context.
  • the signalling action may be transmitted merely for information or as a warning, or may contain instructions. Where the signalling action involves transmission to a machine recipient, these instructions may be directly operable by that machine.
  • the communication may directly control the speed of the mobile entity.
  • the communication might additionally or alternatively comprise the transmission of an audible chime to be played to an occupant of the entity to alert them to the change of status. As such, the communication may comprise sending a signal to an entity.
  • any embodiment may be implemented with additional reference points.
  • Any mobile entity may be represented in a position reflecting its time of arrival at a respective one of these reference points, or any mobile entity may be represented multiple times, with each representation position reflecting the time of arrival of the corresponding mobile entity at a respective one of these reference points, the reference points having different targets in the physical space.
  • in addition to the indication of the position of the mobile entity on a path with respect to the region, situated in a position in the physical space at a distance from the reference point proportional to the predicted time of arrival, a further indication for that same mobile entity may be provided in a position in the physical space corresponding to the actual physical position of that entity.
  • user input may be received specifying a new position of the reference point, whereupon the selection of mobile entities for representation, the determination of the position of those mobile entities, the determination of a predicted time of arrival of those mobile entities will be recalculated, and the graphical representation updated on the basis of the new position of the new reference point.
  • Figure 6 shows a further example of a graphical representation 500 of the physical space of figure 4 generated in accordance with an embodiment.
  • a representation of a section of road 500 substantially as described with reference to figure 5.
  • user input is received specifying a new position of the indication on the path.
  • this is represented by the position of the cursor 601, which as shown has "dragged" the indicator of the second mobile entity 534 to a new position 633 on the path 560.
  • the second mobile entity is now situated ahead of the first mobile entity 533.
  • a modification of the path which would bring the predicted time of arrival of the second mobile entity into correspondence with the new position of the indication may be calculated, and issued in a communication to the mobile entity indicating the variations. For example, a shorter or longer path may be imposed, having a length such that at a particular speed the mobile entity will arrive at a desired time.
  • the modified path may comprise the original path replaced in whole or in part with a new path, or may constitute a distortion of the original path, or may otherwise be selected from a predefined template, or automatically generated by a path-finding algorithm given defined path constraints. Meanders, loops, dog-legs or other such path extensions may be added to delay the time of arrival.
  • a variety of automated or semi-automated algorithms for defining a suitable path adjustment will occur to the skilled person. The exact manner and degree of freedom for such modifications will vary depending on the context of the implementation: where mobile entities are aircraft in flight, a wide range of variations in three dimensions may be envisaged, whilst road vehicles as shown in figure 8 might be more constrained.
  • the speed and/or path of the mobile entity may be adjusted, for example as described above, or any combination of the two approaches may be used.
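A simple instance of delaying arrival by lengthening the path: given the entity's current speed, the required extra path length is the desired delay multiplied by that speed. An illustrative sketch; the names are assumptions, not from the patent text:

```python
def extra_path_length(current_path_m, speed_m_s, desired_eta_s):
    """Additional path length (m) needed to delay arrival to
    desired_eta_s at constant speed; 0 if the entity would already
    arrive at or after that time."""
    current_eta_s = current_path_m / speed_m_s
    delay_s = desired_eta_s - current_eta_s
    return max(0.0, delay_s * speed_m_s)

# Entity 800 m out at 50 km/h (ETA ~58 s); delay arrival to 90 s:
extra = extra_path_length(800.0, 50 / 3.6, 90.0)
print(round(extra))  # 450 m of meander, loop or dog-leg to add
```

A real path-finding algorithm would distribute this extra length subject to the constraints of the physical space, but the required total follows from this arithmetic.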
  • Figure 7 shows a further example of a physical space susceptible to graphical representation in accordance with an embodiment.
  • a section of road 700 having two lanes 711, 712 in a first direction 710 (right to left), and two lanes 721, 722 in a second direction 720 (left to right).
  • An entity shown as a car 733 is shown as travelling in the first lane 721 in the second direction 720, and an entity shown as a car 734 is shown as travelling in the entry slip road 723.
  • the first mobile entity 733 has a first orientation φ1 with respect to the reference point 740 and the second mobile entity 734 has a second orientation φ2 with respect to the reference point 740.
  • a reference point 740 is defined in the physical space.
  • a position of a mobile entity is determined with respect to the reference point.
  • the positions of the two mobile entities 733 and 734 are determined with respect to the reference point 740, as indicated by the lines 743 and 744.
  • the position of any number of mobile entities may be determined in this way.
  • the position of every mobile entity, or every mobile entity satisfying certain selection criteria, in the physical space may be thus determined.
  • the position of every mobile entity, or every mobile entity satisfying certain selection criteria, in some other region, which may be larger than or smaller than the physical space may be thus determined.
  • the speed of mobile entity 733 is 75km/h
  • the speed of mobile entity 734 is 50km/h.
  • the distance from the reference point to the first mobile entity 733 is 1 km, while the distance from the reference point to the second mobile entity 734 is 0.8km.
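The orientation of each entity with respect to the reference point, as used in figure 7, can be computed from coordinates with a two-argument arctangent. A minimal sketch with illustrative coordinate conventions:

```python
import math

def orientation_deg(ref_xy, entity_xy):
    """Bearing of the entity from the reference point, in degrees,
    measured anticlockwise from the positive x axis."""
    dx = entity_xy[0] - ref_xy[0]
    dy = entity_xy[1] - ref_xy[1]
    return math.degrees(math.atan2(dy, dx))

# An entity 100 m east and 100 m north of the reference point:
print(orientation_deg((0.0, 0.0), (100.0, 100.0)))  # 45.0
```

Placing each indication at this orientation, at a distance proportional to its predicted arrival time, preserves the relative bearing of the entity while encoding time rather than space radially.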
  • Figure 8 shows an example of a graphical representation 800 of the physical space of figure 7 generated in accordance with an embodiment.
  • a representation of a section of road 800 having two lanes 811, 812 in a first direction 810 (right to left), and two lanes 821, 822 in a second direction 820 (left to right).
  • a representation of the first mobile entity 833 is shown as travelling in the first lane 821 in the second direction 820, and the second mobile entity 834 is shown as travelling in the entry slip road 823.
  • the road section 800 corresponds to the physical space 700 as discussed above, and represents the relevant physical features thereof.
  • figure 8 presents an example of a graphical representation of the physical space at a first scale, defined in accordance with an embodiment.
  • the scale is not specified in figure 8, and it will be appreciated that any convenient scale may be chosen on the basis of the size of the physical space and the size of whatever display is to be used for the presentation of the graphical representation.
  • a predicted time of arrival is determined for each mobile entity whose position is determined, within a region defined with respect to the reference point.
  • it may be determined that the first mobile entity 833 will arrive at the reference point in 1/75 = 0.0133 hrs, that is, 48 seconds.
  • similarly, the second mobile entity 834 will arrive in 0.8/50 = 0.016 hrs, that is, approximately 58 seconds.
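The arithmetic above can be sketched as follows; the function name `eta_seconds` is purely illustrative and not part of the described embodiment:

```python
def eta_seconds(distance_km: float, speed_kmh: float) -> float:
    """Predicted time of arrival at the reference point, in seconds."""
    return distance_km / speed_kmh * 3600.0

# First mobile entity 733: 1 km away at 75 km/h
print(round(eta_seconds(1.0, 75.0)))   # 48

# Second mobile entity 734: 0.8 km away at 50 km/h
print(round(eta_seconds(0.8, 50.0)))   # 58
```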
  • the graphical representation is modified to incorporate an indication of the position of the first mobile entity 833 on a path 863 with respect to the region 850, wherein the indication 833 is situated in a position in the representation at a distance from the reference point proportional to the predicted time of arrival, i.e. 48s.
  • the graphical representation is further modified to incorporate an indication of the position of the second mobile entity 834 on the path 864 with respect to the region 850, wherein the indication 834 is situated in a position in the representation at a distance from the reference point proportional to the predicted time of arrival, i.e. 58s.
  • the indication of the first mobile entity 833 is situated in a position in the representation of the physical space with an orientation θ1 with respect to the reference point 840 corresponding to the relative orientation of the first mobile entity to the reference point, φ1.
  • the indication of the second mobile entity 834 is situated in a position in the representation of the physical space with an orientation θ2 with respect to the reference point 840 corresponding to the relative orientation of the second mobile entity to the reference point, φ2.
  • each entity may be located on a respective path corresponding to the path in the physical space that the respective mobile entity is expected to follow.
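This placement rule can be sketched as a simple polar layout around the reference point; the function name and the pixels-per-second scale factor below are illustrative assumptions, not part of the described embodiment:

```python
import math

def indication_position(ref_xy, eta_s, bearing_rad, scale_px_per_s):
    """Screen position of an indication: distance from the reference point
    is proportional to the predicted time of arrival, while the orientation
    with respect to the reference point is preserved."""
    r = eta_s * scale_px_per_s
    x = ref_xy[0] + r * math.cos(bearing_rad)
    y = ref_xy[1] + r * math.sin(bearing_rad)
    return (x, y)

# Entity arriving in 48 s from due east (bearing 0), at 2 px per second:
print(indication_position((0.0, 0.0), 48.0, 0.0, 2.0))  # (96.0, 0.0)
```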
  • the factor by which the distance is proportional to the predicted time of arrival may vary as a function of the position of the indication on the path.
  • a first additional zone 802 is provided around the edge of the graphical representation
  • a second additional zone 803 is provided around the outer edge of the first additional zone 802.
  • Indications for mobile entities that are outside the physical area may be represented in these additional zones, such as elements 835 and 836 in figure 8. These may be presented as mobile entities out of the visible spatial range but visible in the temporal range.
  • the factor by which the distance is proportional to the predicted time of arrival may be different in each zone, and become progressively greater for each successively more remote zone from the reference point. In other embodiments, the factor by which the distance is proportional to the predicted time of arrival may rise continually, in accordance with any continuous function, with the zones serving to mark convenient reference time graduations in this succession.
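One possible reading of this zone-based scaling is sketched below, under the assumption that the factor is expressed as pixels per second of predicted arrival time, with each more remote zone representing more seconds per pixel; the zone boundaries and factors are invented for illustration only:

```python
def time_to_radius(eta_s, zones=((60, 2.0), (180, 1.0), (600, 0.25))):
    """Map a predicted time of arrival (seconds) to a radial distance,
    with a progressively smaller pixels-per-second factor in each
    successively more remote zone (i.e. remote zones compress more
    predicted time into less screen space). Times beyond the last
    zone boundary are clipped to the outer radius."""
    radius, prev_t = 0.0, 0.0
    for t_limit, px_per_s in zones:
        span = min(eta_s, t_limit) - prev_t
        if span <= 0:
            break
        radius += span * px_per_s
        prev_t = t_limit
    return radius

print(time_to_radius(30))   # 60.0  (inside the first zone)
print(time_to_radius(120))  # 180.0 (60 * 2.0 + 60 * 1.0)
```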
  • the zones are situated all around the periphery of the graphical representation, in some cases the zones need not occupy the entire periphery, for example in embodiments where mobile entities only arrive from certain directions. Still further, in cases where mobile entities only arrive along certain paths, the zones may be linear in nature, with mobile entities presented as a queue. Still further, in some cases, for example where such a linear presentation is adopted or where mobile entities arrive from a limited range of directions, the zones need not appear at the periphery of the graphical representation, but may be situated in any convenient location within the graphical representation.
  • the factor by which the distance is proportional to the predicted time of arrival may also vary as a function of orientation with regard to the reference point, so that entities arriving from one direction are subject to time scaling according to one factor, or one linear evolution, whilst entities arriving from another direction are subject to time scaling according to a further respective factor, or linear evolution.
  • a user input may be received specifying a new scale for the graphical representation or some part thereof, whereupon the selection of mobile entities for representation, the determination of the position of each mobile entity, and the determination of the predicted time of arrival of each mobile entity, can be recomputed and the graphical representation modified on the basis of the new scale accordingly.
  • the graphical representation may also be modified to incorporate a speed vector indicator graphic, having a dimension proportional to the speed of each respective mobile entity in the physical space.
  • the four mobile entities are each provided with a speed vector indicator in the form of an arrow whose length is proportional to the speed of the mobile entity, and whose orientation corresponds to the direction of movement thereof.
  • speed vector indicators might be presented only for mobile entities in the physical area, or in one or more of the additional zones.
  • the dimension of the speed vector indicator may be chosen to correspond to the time the entity will take to travel a specified distance at the known speed of the mobile entity, using the same factor by which the distance is proportional to the predicted time of arrival of the mobile entity in question.
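This choice of dimension might be sketched as follows; the reference distance and scale factor are purely illustrative values:

```python
def speed_vector_length(speed_kmh, ref_distance_km, px_per_s):
    """Arrow length corresponding to the time the entity needs to cover
    ref_distance_km at its current speed, drawn with the same
    pixels-per-second factor as the time-of-arrival layout."""
    time_s = ref_distance_km / speed_kmh * 3600.0
    return time_s * px_per_s

# 75 km/h entity, 0.1 km reference distance, 2 px per second:
print(speed_vector_length(75.0, 0.1, 2.0))  # 9.6
```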
  • Figure 9 shows a mobile entity manager in accordance with an embodiment.
  • a mobile entity manager 900 for managing a mobile entity 923 in a physical space 990.
  • the mobile entity manager 900 comprises a graphics renderer adapted to generate a graphical representation 911 of the physical space 990 at a first scale.
  • the mobile entity manager 900 further comprises a representation engine 950 adapted to define a reference point in the physical space 990, to determine a position of the mobile entity 923 with respect to the reference point, and to determine a predicted time of arrival of the mobile entity 923 within a region defined with respect to the reference point.
  • the graphics renderer 910 is further adapted to modify the graphical representation 911 to incorporate an indication of the position of the mobile entity 923 on a path with respect to the region, wherein the indication is situated in a position in the representation of the physical space at a distance from the reference point, the distance being proportional to the predicted time of arrival.
  • the representation of the physical space may be generated on the basis of a digital representation of the space 920.
  • the position of the mobile entities may be generated on the basis of entity data 960.
  • User input for example as described with respect to any of figure 1 to 8, may be provided to the representation engine via the graphical representation 911 from a user 940.
  • the movements and interactions of mobile physical entities such as vehicles, individuals and so on in a physical space can be monitored and controlled by providing a graphical representation of the physical space including an indication of a position of a mobile entity with respect to a region associated with a reference point in the physical space where the indication is situated at a distance from the reference point where the distance is proportional to the predicted time of arrival of the entity in the region.
  • mobile entities are shown in the order of expected arrival, rather than their instantaneous physical location, supporting more rapid assimilation of anticipated arrivals and traffic conditions.
  • Software embodiments include but are not limited to application, firmware, resident software, microcode, etc.
  • the invention can take the form of a computer program product accessible from a computer-usable or computer-readable medium providing program code for use by or in connection with a computer or an instruction execution system.
  • Software embodiments include software adapted to implement the steps discussed above with reference to figures 1 to 8.
  • a computer-usable or computer-readable medium can be any apparatus that can contain, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device.
  • the medium can be an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system (or apparatus or device) or a propagation medium.
  • the methods and processes described herein may be implemented in whole or part by a user device. These methods and processes may be implemented by computer-application programs or services, an application programming interface (API), a library, and/or other computer-program product, or any combination of such entities.
  • the user device may be a mobile device such as a smart phone or tablet, a drone, a computer or any other device with processing capability, such as a robot or other connected device, including IoT (Internet of Things) devices.
  • Figure 10 shows a generic computing system suitable for implementation of embodiments of the invention.
  • a system includes a logic device 1001 and a storage device 1002.
  • the system may optionally include a display subsystem 1011, input/output subsystem 1003, communication subsystem 1020, and/or other components not shown.
  • Logic device 1001 includes one or more physical devices configured to execute instructions.
  • the logic device 1001 may be configured to execute instructions that are part of one or more applications, services, programs, routines, libraries, objects, components, data structures, or other logical constructs.
  • Such instructions may be implemented to perform a task, implement a data type, transform the state of one or more components, achieve a technical effect, or otherwise arrive at a desired result.
  • the logic device 1001 may include one or more processors configured to execute software instructions. Additionally or alternatively, the logic device may include one or more hardware or firmware logic devices configured to execute hardware or firmware instructions. Processors of the logic device may be single-core or multi-core, and the instructions executed thereon may be configured for sequential, parallel, and/or distributed processing. Individual components of the logic device 1001 optionally may be distributed among two or more separate devices, which may be remotely located and/or configured for coordinated processing. Aspects of the logic device 1001 may be virtualized and executed by remotely accessible, networked computing devices configured in a cloud-computing configuration.
  • Storage device 1002 includes one or more physical devices configured to hold instructions executable by the logic device to implement the methods and processes described herein. When such methods and processes are implemented, the state of the storage device 1002 may be transformed, e.g., to hold different data.
  • Storage device 1002 may include removable and/or built-in devices. Storage device may be locally or remotely stored (in a cloud for instance). Storage device 1002 may comprise one or more types of storage device including optical memory (e.g., CD, DVD, HD-DVD, Blu-Ray Disc, etc.), semiconductor memory (e.g., FLASH, RAM, EPROM, EEPROM, etc.), and/or magnetic memory (e.g., hard-disk drive, floppy-disk drive, tape drive, MRAM, etc.), among others. Storage device may include volatile, non-volatile, dynamic, static, read/write, read-only, random-access, sequential- access, location-addressable, file-addressable, and/or content-addressable devices.
  • the system may comprise an interface 1003 adapted to support communications between the logic device 1001 and further system components.
  • additional system components may comprise removable and/or built-in extended storage devices.
  • Extended storage devices may comprise one or more types of storage device including optical memory 1032 (e.g., CD, DVD, HD-DVD, Blu-Ray Disc, etc.), semiconductor memory 1033 (e.g., RAM, EPROM, EEPROM, FLASH etc.), and/or magnetic memory 1031 (e.g., hard-disk drive, floppy- disk drive, tape drive, MRAM, etc.), among others.
  • Such extended storage device may include volatile, non-volatile, dynamic, static, read/write, read-only, random- access, sequential-access, location-addressable, file-addressable, and/or content- addressable devices.
  • storage device includes one or more physical devices, and excludes propagating signals per se.
  • aspects of the instructions described herein alternatively may be propagated by a communication medium (e.g., an electromagnetic signal, an optical signal, etc.), as opposed to being stored on a storage device.
  • logic device 1001 and storage device 1002 may be integrated together into one or more hardware-logic components.
  • Such hardware-logic components may include field-programmable gate arrays (FPGAs), program- and application-specific integrated circuits (PASICs/ASICs), program- and application-specific standard products (PSSPs/ASSPs), system-on-a-chip (SOC), and complex programmable logic devices (CPLDs), for example.
  • program may be used to describe an aspect of computing system implemented to perform a particular function.
  • a program may be instantiated via logic device executing machine-readable instructions held by storage device 1002. It will be understood that different modules may be instantiated from the same application, service, code block, object, library, routine, API, function, etc. Likewise, the same program may be instantiated by different applications, services, code blocks, objects, routines, APIs, functions, etc.
  • program may encompass individual or groups of executable files, data files, libraries, drivers, scripts, database records, etc.
  • system of figure 10 may be used to implement embodiments of the invention.
  • a program implementing the steps described with respect to figures 1 to 8, or the algorithms presented above may be stored in storage device 1002 and executed by logic device 1001.
  • Information reflecting or defining the physical space, the entities, or the regions may be stored in storage device 1002, 1031, 1032, 1033.
  • Information reflecting or defining the physical space, the entities, or the regions may be received via the communications interface 1020.
  • User input defining the regions may be received via the I/O interface 1003 and in particular the touchscreen display 1011, camera 1016, microphone 1015, mouse 1013, keyboard 1012 or otherwise.
  • the functions of any or all of the units 910 or 950 may similarly be implemented by a program performing the required functions, in communication with additional dedicated hardware units as necessary.
  • the display 1011 may display the graphical representation of the physical space, and/or the regions, and/or the entities. Accordingly the invention may be embodied in the form of a computer program.
  • a “service”, as used herein, is an application program executable across multiple user sessions.
  • a service may be available to one or more system components, programs, and/or other services.
  • a service may run on one or more server-computing devices.
  • display subsystem 1011 may be used to present a visual representation of data held by a storage device.
  • This visual representation may take the form of a graphical user interface (GUI).
  • Display subsystem 1011 may include one or more display devices utilizing virtually any type of technology for example as discussed above. Such display devices may be combined with logic device and/or storage device in a shared enclosure, or such display devices may be peripheral display devices. An audio output such as speaker 1014 may also be provided.
  • input subsystem may comprise or interface with one or more user-input devices such as a keyboard 1012, mouse 1013, touch screen 1011, or game controller (not shown).
  • the input subsystem may comprise or interface with selected natural user input (NUI) componentry.
  • Such componentry may be integrated or peripheral, and the transduction and/or processing of input actions may be handled on- or off-board.
  • Example NUI componentry may include a microphone 1015 for speech and/or voice recognition; an infrared, colour, stereoscopic, and/or depth camera 1016 for machine vision and/or gesture recognition; a head tracker, eye tracker, accelerometer, and/or gyroscope for motion detection and/or intent recognition; as well as electric-field sensing componentry for assessing brain activity.
  • the input/output interface 1003 may similarly interface with a loudspeaker 1014, vibromotor or any other transducer device as may occur to the skilled person.
  • the system may interface with a printer 1017.
  • communication subsystem 1020 may be configured to communicatively couple computing system with one or more other computing devices.
  • Communication subsystem may include wired and/or wireless communication devices compatible with one or more different communication protocols.
  • the communication subsystem may be configured for communication via a wireless telephone network 1074, or a wired or wireless local- or wide-area network.
  • the communication subsystem may allow computing system to send and/or receive messages to and/or from other devices via a network such as Internet 1075.
  • the communications subsystem may additionally support short range inductive communications with passive or active devices (NFC, RFID, UHF, etc.).
  • the traffic data may be received via the telephone network 1074 or Internet 1075.
  • the system of figure 10 is intended to reflect a broad range of different types of information handling system. It will be appreciated that many of the subsystems and features described with respect to figure 10 are not required for implementation of the invention, but are included to reflect possible systems in accordance with the present invention. It will be appreciated that system architectures vary widely, and the relationship between the different sub-systems of figure 10 is merely schematic, and is likely to vary in terms of layout and the distribution of roles in systems. It will be appreciated that, in practice, systems are likely to incorporate different subsets of the various features and subsystems described with respect to figure 10.
  • FIG. 11 shows a smartphone device adaptable to constitute an embodiment.
  • the smartphone device incorporates elements 1001, 1002, 1003, 1020, optional near field communications interface 1021, flash memory 1033 and elements 1014, 1015, 1016, and 1011 as described above. It is in communication with the telephone network 1074 and a server 1076 via the network 1075. Alternative communication mechanisms such as a dedicated network or Wi-Fi may also be used.
  • the features disclosed in this figure may also be included within a tablet device as well.
  • Figure 12 shows an Air Traffic control desk adaptable to constitute an embodiment.
  • the Air Traffic control desk comprises elements 1001, 1002, 1003, 1020, 1014, 1015, 1016, 1011, 1031, 1032, 1033 as described above.
  • the Air Traffic control desk is in communication with a drone 1001 via a communications satellite 1002 and a radio antenna 1003 coupled to the communications interface 1020.
  • the cockpit comprises a seat, and joysticks, either of which may constitute suitable locations for any user status sensors and/or vibration transducers as discussed above. Alternative communication mechanisms may also be used.
  • Further embodiments may be based on or include immersive environment devices such as the HTC Vive, Oculus Rift, etc., or other hybrid devices such as the HoloLens or Meta Vision 2.
  • the specific routines or methods described herein may represent one or more of any number of processing strategies. As such, various acts illustrated and/or described may be performed in the sequence illustrated and/or described, in other sequences, in parallel, or omitted. Likewise, the order of the above-described processes may be changed.

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Chemical & Material Sciences (AREA)
  • Analytical Chemistry (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Atmospheric Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Navigation (AREA)

Abstract

The movements and interactions of mobile physical entities such as vehicles, individuals and so on in a physical space can be monitored and controlled by providing a graphical representation of the physical space including an indication of a position of a mobile entity with respect to a region associated with a reference point in the physical space where the indication is situated at a distance from the reference point where the distance is proportional to the predicted time of arrival of the entity in the region. On this basis, mobile entities are shown in the order of expected arrival, rather than their instantaneous physical location, supporting more rapid assimilation of anticipated arrivals and traffic conditions.

Description

METHOD AND APPARATUS MANAGING ENTITIES IN A PHYSICAL SPACE
Field of the invention
The present invention relates generally to the managing of entities in a physical space.
Background of the invention
Various classes of oversight of mobile entities can be defined including pilots and drivers on one hand, and supervisors, air traffic controllers, vessel traffic controllers and the like who have a general responsibility for entities in a given area on the other. The working conditions of these classes of individuals are affected by current technological trends, and furthermore by a significant convergence of their roles. On one hand, vehicles are increasingly autonomous, so that the role of the driver or pilot is increasingly supported by electronic tools such as navigation tools, and/or entrusted to a remote operator who can take over control of the vehicle via a telecommunications channel at critical instants. Meanwhile, supervision and guidance tasks may be increasingly supported by information technology, so that one individual can be expected to supervise an ever larger area, or increasingly, to supervise several different areas, or attribute only a portion of their attention to traffic considerations, with the relevant traffic information being relayed from the respective areas via telecommunications means.
It is desirable to integrate automation into a supervision interface so as to reduce the overall work load of such individuals without reducing their situational awareness or the availability of direct control where appropriate.
Summary of the invention
In accordance with the present invention in a first aspect there is provided a mobile entity manager for managing a mobile entity in a physical space. The mobile entity manager comprises a graphics renderer adapted to generate a graphical representation of the physical space at a first scale, a representation engine adapted to define a reference point in the physical space, to determine a position of the mobile entity with respect to the reference point, and to determine a predicted time of arrival of the mobile entity within a region defined with respect to the reference point. The graphics renderer is further adapted to modify the graphical representation to incorporate an indication of the position of the mobile entity on a path with respect to the region, wherein the indication is situated in a position in the representation of the physical space at a distance from the reference point, the distance being proportional to the predicted time of arrival.
In accordance with the present invention in a second aspect there is provided a method of managing a mobile entity in a physical space, the method comprising the steps of: defining a graphical representation of the physical space at a first scale, defining a reference point in the physical space, determining a position of the mobile entity with respect to the reference point, determining a predicted time of arrival of the mobile entity within a region defined with respect to the reference point, and modifying the graphical representation to incorporate an indication of the position of the mobile entity on a path with respect to the region, wherein the indication is situated in a position in the representation of the physical space at a distance from the reference point, the distance being proportional to the predicted time of arrival. In accordance with a development of the second aspect, the method comprises the further steps of receiving a user input specifying a new position of the indication on the path, calculating one or more variations in the speed of the mobile entity which would modify the predicted time of arrival of the mobile entity to correspond to the new position of the indication, and issuing a communication to the mobile entity indicating the variations.
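The speed variation implied by such a user input can be sketched as below; this is a minimal illustration of the inverse calculation, with an invented function name, not the claimed implementation:

```python
def required_speed(distance_km, new_eta_s):
    """Speed (km/h) that would make the entity's predicted time of arrival
    match the new position of the indication chosen by the user."""
    return distance_km / (new_eta_s / 3600.0)

# Entity 1 km from the reference point, dragged to the 60-second mark:
print(round(required_speed(1.0, 60.0)))  # 60 (km/h)
```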
In accordance with a development of the second aspect, the method comprises the further steps of receiving a user input specifying a new position of the indication on the path, calculating a modification of the path which would modify the predicted time of arrival of the mobile entity to correspond to the new position of the indication, and issuing a communication to the mobile entity indicating the modification. In accordance with a development of the second aspect, at the step of modifying the graphical representation to incorporate an indication of the position of the mobile entity with respect to the region, the indication is situated in a position in the graphical representation of the physical space with an orientation with respect to the reference point corresponding to the relative orientation of the entity to the reference point.
In accordance with a development of the second aspect, the method comprises the further steps of: determining a position of a further mobile entity with respect to the reference point, determining a predicted time of arrival of the further mobile entity within the region, and, modifying the graphical representation to incorporate an indication of the position of the further mobile entity with respect to the region, wherein the indication is situated on the path in a position in the representation of the physical space at a distance from the reference point, the distance being proportional to the predicted time of arrival of the further mobile entity.
In accordance with a development of the second aspect, the method comprises the further step of determining a timing for a convergence of the two entities on the basis of a speed of the first mobile entity and a speed of the further mobile entity.
In accordance with a development of the second aspect, the path corresponds to a path in the physical space that the mobile entity is expected to follow.
In accordance with a development of the second aspect, the factor by which the distance is proportional to the predicted time of arrival varies as a function of the position of the indication thereon, or wherein the factor by which the distance is proportional to the predicted time of arrival varies as a function of the orientation. In accordance with a development of the second aspect, the method comprises the further step of: modifying the graphical representation to incorporate a speed vector indicator graphic, wherein the speed vector indicator graphic has a dimension proportional to the speed of the mobile entity in the physical space. In accordance with a development of the second aspect, the method comprises a further step of receiving a user input specifying a new position of the reference point, and repeating the steps of determining a position of the mobile entity, determining a predicted time of arrival of the mobile entity, and modifying the graphical representation on the basis of the new position of the reference point.
In accordance with a development of the second aspect, the method comprises the further step of receiving a user input specifying a new scale for the graphical representation, and repeating the steps of determining a position of the mobile entity, determining a predicted time of arrival of the mobile entity, and modifying the graphical representation on the basis of the new scale.
In accordance with a development of the second aspect, the graphical representation is three dimensional.
In accordance with the present invention in a third aspect there is provided computer program comprising instructions adapted to implement the steps of the second aspect.
Brief Description of the Drawings
The above and other advantages of the present invention will now be described with reference to the accompanying drawings, for illustration purposes only, in which: Figure 1 shows a method of managing the behaviour of a mobile entity in a physical space in accordance with an embodiment;
Figure 2 shows a method of managing the behaviour of a mobile entity in a physical space in accordance with a development of the embodiment of figure 1 ;
Figure 3 shows a method of managing the behaviour of a mobile entity in a physical space in accordance with a development of the embodiment of figure 1 ;
Figure 4 shows an example of a physical space susceptible to graphical representation in accordance with an embodiment;
Figure 5 shows an example of a graphical representation of the physical space of figure 4 generated in accordance with an embodiment; Figure 6 shows a further example of a graphical representation of the physical space of figure 4 generated in accordance with an embodiment;
Figure 7 shows a further example of a physical space susceptible to graphical representation in accordance with an embodiment; Figure 8 shows an example of a graphical representation of the physical space of figure 7 generated in accordance with an embodiment;
Figure 9 shows a mobile entity manager in accordance with an embodiment;
Figure 10 shows a generic computing system suitable for implementation of embodiments of the invention;
Figure 11 shows a smartphone device adaptable to constitute an embodiment; and Figure 12 shows an Air Traffic control desk adaptable to constitute an embodiment.
Detailed description
Figure 1 shows a method of managing the behaviour of a mobile entity in a physical space in accordance with an embodiment.
As shown in figure 1, the method starts at step 100 before proceeding to step 110, at which a graphical representation of the physical space is defined at a first scale. The method next proceeds to step 120, at which a reference point in the physical space is defined. This definition may comprise retrieving a position value from memory, receiving user input, for example as discussed in further detail below, or otherwise. At step 130 a position of the mobile entity is determined with respect to the reference point. The position of the mobile entity may be obtained directly, for example by means of a radar, lidar or other such system, or by triangulation of a signal emitted or reflected by the entity, or indirectly, for example by receiving positioning information (as obtained for example from a GNSS system) from the mobile entity itself, or inferred on a dead-reckoning basis by applying time and velocity data to a known starting point, or otherwise. At step 140 a predicted time of arrival of the mobile entity within a region defined with respect to the reference point is determined. Determination of the predicted time of arrival may comprise determining a speed of the mobile entity, either directly, for example by means of a radar, lidar or other such system, or by inference from successive location determinations, or indirectly, for example by receiving speed information from the mobile entity itself. The method then finally proceeds to step 150, at which the graphical representation is modified to incorporate an indication of the position of the mobile entity on a path with respect to the region, wherein the indication is situated in a position in the physical space at a distance from the reference point, the distance being proportional to the predicted time of arrival, before terminating at step 190.
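By way of a purely illustrative sketch (not part of the claimed method), the computations of steps 130 to 150 may be expressed as follows, assuming straight-line travel at constant speed; the class and function names and the display scale factor are hypothetical:

```python
from dataclasses import dataclass

@dataclass
class MobileEntity:
    position_km: float   # distance travelled along its path (km)
    speed_kmh: float     # current speed (km/h)

def eta_seconds(entity: MobileEntity, reference_km: float) -> float:
    """Step 140: predicted time of arrival at the reference point, in seconds."""
    return abs(reference_km - entity.position_km) / entity.speed_kmh * 3600.0

def indication_offset(entity: MobileEntity, reference_km: float,
                      scale: float = 1.0) -> float:
    """Step 150: display distance of the indication from the reference point,
    proportional to the predicted time of arrival."""
    return eta_seconds(entity, reference_km) * scale
```

With the figure 4 values (1 km at 75 km/h), `eta_seconds` yields the 48 seconds used in the examples below.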
Examples of embodiments in accordance with this method are described below with reference to figures 2 to 8. Figure 2 shows a method of managing the behaviour of a mobile entity in a physical space in accordance with a development of the embodiment of figure 1.
As shown in figure 2, the method proceeds through steps 110, 120, 130, 140 and 150 as described with reference to figure 1, before proceeding to step 260, at which a user input is received specifying a new position of the indication on the path. The method then proceeds to calculate, at step 270, one or more variations in the speed of the mobile entity which would modify the predicted time of arrival of the mobile entity to correspond to the new position of the indication, before issuing a communication to the mobile entity indicating the variations at step 280 and terminating at step 190. Such speed variations may be obtained on the basis of a model of the mobile entity's capacities in terms of acceleration and deceleration, possibly taking into account fuel efficiency considerations, passenger comfort, possible interactions with other entities, and the like, as will occur to the skilled person.
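The speed variation of step 270 may be sketched as follows, assuming constant-speed travel over a known remaining distance; the clamping bounds stand in for the entity's performance model and are illustrative only:

```python
def required_speed_kmh(distance_km: float, target_eta_s: float,
                       v_min: float = 30.0, v_max: float = 130.0) -> float:
    """Speed at which the entity would cover distance_km in target_eta_s
    seconds, clamped to an assumed feasible envelope (v_min/v_max are
    hypothetical stand-ins for the entity's performance model)."""
    v = distance_km / (target_eta_s / 3600.0)
    return max(v_min, min(v_max, v))
```

For example, moving an entity 0.8 km away so that it arrives in 48 s would correspond to a commanded speed of 60 km/h under these assumptions.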
Possible implementations of this approach and certain implications thereof are discussed further below, for example with reference to figures 4 to 8.
Figure 3 shows a method of managing the behaviour of a mobile entity in a physical space in accordance with a development of the embodiment of figure 1.
As shown in figure 3, the method proceeds through steps 110, 120, 130, 140 and 150 as described with reference to figure 1, before proceeding to step 360, at which a position of a further mobile entity is determined with respect to the reference point. The method next proceeds to step 370, at which a predicted time of arrival of the further mobile entity within the region is determined, and then proceeds to step 380 of modifying the graphical representation to incorporate an indication of the position of the further mobile entity with respect to the region, wherein the indication is situated on the path in a position in the physical space at a distance from the reference point, the distance being proportional to the predicted time of arrival of the further mobile entity.
Possible implementations of this approach and certain implications thereof are discussed further below, for example with reference to figures 4 to 8.
It will be appreciated that certain variants of the methods of figures 1, 2 and 3 are possible. In particular, it may be desirable to redefine the basic graphical representation of the physical space by looping back to step 110 as desired. Similarly, on determining that no interaction has occurred, the process might not necessarily look for new user input of new regions by looping back to step 120 on every iteration, and might loop back to steps 130 or 140 in some iterations. Still further, the method may look for certain inputs in parallel; for example, some or all of the changes to the underlying physical space, entities, regions and interactions between them may be monitored in parallel.
It will be appreciated that the steps of figures 1, 2 and 3 may be looped so as to continually update the graphical representation as the mobile entities progress, and as new mobile entities appear or disappear. Further variants will now be discussed with respect to figures 4 to 8.
Figure 4 shows an example of a physical space susceptible to graphical representation in accordance with an embodiment. As shown in figure 4 there is provided a section of road 400, having two lanes 411, 412 in a first direction 410 (right to left), and two lanes 421, 422 in a second direction 420 (left to right). An entity shown as a car 431 is shown as travelling in the first lane 411 in the first direction 410, an entity shown as a car 432 is shown as travelling in the second lane 412 in the first direction 410, an entity shown as a car 433 is shown as travelling in the first lane 421 in the second direction 420, and an entity shown as a car 434 is shown as travelling in the second lane 422 in the second direction 420. The road section 400, or alternatively a road direction 410 or 420, or a lane 411, 412, 421 or 422, may correspond to the physical space 400 as discussed above. Road section 400 has an entry slip road 423.
A reference point 440 is defined in the physical space. This reference point 440 may be defined by a user input, with reference to the position of an entity as discussed further below, or be in a defined relation to the physical space or an element thereof.
In accordance with the embodiment, a position of a mobile entity is determined with respect to the reference point. As shown, the positions of the two mobile entities 433 and 434 are determined with respect to the reference point 440, as indicated by the lines 443 and 444. It will be appreciated that the position of any number of mobile entities may be determined in this way. In particular, the position of every mobile entity, or of every mobile entity satisfying certain selection criteria, in the physical space may be thus determined. Furthermore, the position of every mobile entity, or of every mobile entity satisfying certain selection criteria, in some other region, which may be larger than or smaller than the physical space, may be thus determined. For the purposes of the present example, the speed of mobile entity 433 is 75 km/h, while the speed of mobile entity 434 is 50 km/h.
For the purposes of the present example, the distance from the reference point to the first mobile entity 433 is 1 km, while the distance from the reference point to the second mobile entity 434 is 0.8km.
Figure 5 shows an example of a graphical representation 500 of the physical space of figure 4 generated in accordance with an embodiment. As shown in figure 5 there is presented a representation of a section of road 500, having two lanes 511, 512 in a first direction 510 (right to left), and two lanes 521, 522 in a second direction 520 (left to right). An indication of the position of the first mobile entity 533 is shown as travelling in the first lane 521 in the second direction 520, and the second mobile entity 534 is shown as travelling in the second lane 522 in the second direction 520. As such, the road section 500 corresponds to the physical space 400 as discussed above, and represents the relevant physical features thereof.
It will be appreciated that while the indications of the mobile entities resemble the entities themselves, this need not be the case, and may be the case to any desired extent. For example, the indications may be simple points, boxes containing an icon and/or text or a schematic representation of the class of mobile entity in question at one end of the scale, up to a photorealistic representation or live image of the mobile entity itself at the other end of the scale.
As such, figure 5 presents an example of a graphical representation of the physical space at a first scale, defined in accordance with an embodiment. The scale is not specified in figure 5, and it will be appreciated that any convenient scale may be chosen on the basis of the size of the physical space and the size of whatever display is to be used for the presentation of the graphical representation.
While for the sake of simplicity figure 5 shows a two dimensional representation of the physical space, other embodiments may equally present a three dimensional representation of the physical space.
Furthermore, the reference point 540 as defined in the physical space is presented in the graphical representation. It will be appreciated that in some embodiments, the position of the reference point need not be included in the graphical representation, or not at all times. The reference point 540 may be defined by a user input, with reference to the position of an entity as discussed further below, or be in a defined relation to the physical space or an element thereof. Where the reference point is defined by user input, this may conveniently take place by manipulation of the graphical representation, for example by conventional user interface mechanisms. For example, the user might select the desired location in the physical representation by clicking with a mouse cursor, fingertip, or the like.
In accordance with the embodiment a predicted time of arrival of each mobile entity whose position is determined within a region 550 defined with respect to the reference point 540 is obtained. The region and the reference point may coincide, or otherwise the region may be situated in any predefined spatial relationship to the reference point.
On the basis of the speed and distance information presented above, it may be determined that the first mobile entity 533 will arrive at the reference point in 1/75 = 0.01333hrs, that is, 48 seconds.
On the basis of the speed and distance information presented above, it may be determined that the second mobile entity 534 will arrive at the reference point in 0.8/50 = 0.016 hrs, that is, 58 seconds. As shown in figure 5, the graphical representation is modified to incorporate an indication of the position of the first mobile entity 533 on a path 560 with respect to the region 550, wherein the indication 533 is situated in a position in the representation at a distance from the reference point proportional to the predicted time of arrival, i.e. 48 s.
Meanwhile the graphical representation is further modified to incorporate an indication of the position of the second mobile entity 534 on the path 560 with respect to the region 550, wherein the indication 534 is situated in a position in the representation at a distance from the reference point proportional to the predicted time of arrival, i.e. 58 s.
On this basis, whilst a direct representation of the real physical positions of the mobile entities with respect to the reference point would show the first mobile entity further from the reference point than the second entity, a graphical representation in accordance with the embodiment of figure 5 shows the second mobile entity further from the reference point than the first mobile entity, on the basis of the fact that it is expected to arrive later than the first mobile entity, which, although presently further away in space, is moving faster, and is thus closer in time.
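The reordering described above can be illustrated with the figure 4 values (1 km at 75 km/h versus 0.8 km at 50 km/h): the entity nearest in space is not the entity nearest in time. A minimal sketch, with illustrative names:

```python
# (distance to reference in km, speed in km/h) for the two entities of figure 4
entities = {"433": (1.0, 75.0), "434": (0.8, 50.0)}

def eta_s(distance_km: float, speed_kmh: float) -> float:
    """Predicted time of arrival at the reference point, in seconds."""
    return distance_km / speed_kmh * 3600.0

# Spatial ordering puts 434 first; temporal ordering puts 433 first.
nearest_in_space = min(entities, key=lambda k: entities[k][0])
nearest_in_time = min(entities, key=lambda k: eta_s(*entities[k]))
```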
As such where the position of a further mobile entity such as the second mobile entity 534 is determined with respect to the reference point, and a predicted time of arrival of the further mobile entity within the region, the graphical representation may further be modified to incorporate an indication of the position of the further mobile entity with respect to the region, wherein the indication is situated on the path in a position in the physical space at a distance from the reference point, the distance being proportional to the predicted time of arrival of the further mobile entity. While figure 5 shows indications of the position of two physical entities, it will be appreciated that at any given time the number of indications may be any value from zero upwards, depending on the number of mobile entities in the physical space, or otherwise taken into account as discussed above.
The representation of figure 5 may support further variants. For example, on the basis of a speed of the first mobile entity and a speed of a further mobile entity a timing for a convergence of the two entities may be determined. This determination may then be used to indicate whether a particular mobile entity can overtake another within a specified time window (for example, before the arrival of an oncoming vehicle), or within a particular physical space (for example before an overtaking lane ends). Still further, the speed of one, or the other of the two or more entities may be adapted to achieve a desired result, for example indicating the required acceleration to achieve the desired manoeuvre in the available time or space, or in order for the two mobile entities to reach a common destination (corresponding to the reference point) at the same time.
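A convergence timing of the kind described may, under the simplifying assumption of constant speeds on a common path, be sketched as follows (function names are illustrative):

```python
def convergence_time_s(gap_km: float, v_lead_kmh: float, v_follow_kmh: float):
    """Seconds until a faster follower closes gap_km on the vehicle ahead;
    None if the follower never converges."""
    closing_kmh = v_follow_kmh - v_lead_kmh
    if closing_kmh <= 0:
        return None
    return gap_km / closing_kmh * 3600.0

def can_overtake(gap_km, v_lead_kmh, v_follow_kmh, window_s):
    """True if convergence happens within the available time window
    (e.g. before the arrival of an oncoming vehicle)."""
    t = convergence_time_s(gap_km, v_lead_kmh, v_follow_kmh)
    return t is not None and t <= window_s
```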
This determination may constitute the basis of a communication. The communication may comprise, by way of example, the transmission of a message to the mobile entity, the transmission of a message to the user, or to any other destination. The communication may comprise multiple transmissions to different destinations. The communication may comprise a modification of the graphical representation, for example through changing the colour or prominence of certain features, such as the path, the region or the entity, or an indicator region that may be defined in the graphical representation for this purpose. A text message may be presented via the graphical representation, or otherwise. The communication may comprise transmission via any suitable channel. It may be transmitted via any data network such as a WAN (e.g. the internet, GSM, UMTS), a LAN (e.g. a Wi-Fi network or Ethernet), or a PAN (e.g. Bluetooth, ZigBee). The communication may be of any format, and may include text, graphical or audio content, or a combination of these. The communication may be formatted for a human addressee, for example in the form of a human-readable text or audio message. The signalling action may be formatted for a machine recipient, for example on the basis of an API, or any suitable technical format having regard to the context. The signalling action may be transmitted merely for information or as a warning, or may contain instructions. Where the signalling action involves transmission to a machine recipient, these instructions may be directly operable by that machine. In particular, the communication may directly control the speed of the mobile entity. The communication might additionally or alternatively comprise the transmission of an audible chime to be played to an occupant of the entity to alert them to the change of status. As such, the communication may comprise sending a signal to an entity.
It will be appreciated that countless further additional or alternative components of the communication may be envisaged as a function of the context.
Still further, it will be appreciated that while the embodiments herein have been described with respect to a single reference point, any embodiment may be implemented with additional reference points. Any mobile entity may be represented in a position reflecting its time of arrival at a respective one of these reference points, or any mobile entity may be represented multiple times, with each representation position reflecting the time of arrival of the corresponding mobile entity at a respective one of these reference points, the reference points having different targets in the physical space.
Still further, in any embodiment, in addition to the indication of the position of the mobile entity on a path with respect to the region, situated in a position in the physical space at a distance from the reference point proportional to the predicted time of arrival, a further indication for that same mobile entity may be provided in a position in the physical space corresponding to the actual physical position of that entity.
Still further, user input may be received specifying a new position of the reference point, whereupon the selection of mobile entities for representation, the determination of the position of those mobile entities, and the determination of a predicted time of arrival of those mobile entities will be recalculated, and the graphical representation updated on the basis of the new position of the reference point.
Figure 6 shows a further example of a graphical representation 500 of the physical space of figure 4 generated in accordance with an embodiment. As shown in figure 6 there is presented a representation of a section of road 500, substantially as described with reference to figure 5. In accordance with certain embodiments, user input is received specifying a new position of the indication on the path. In figure 6 this is represented by the position of the cursor 601, which as shown has "dragged" the indicator of the second mobile entity 534 to a new position 633 on the path 560. Specifically, the second mobile entity is now situated ahead of the first mobile entity 533. On this basis, one or more variations in the speed of the second mobile entity 434 which would modify the predicted time of arrival of the second mobile entity to correspond to the new position of the indication may be calculated and issued in a communication to the mobile entity indicating the variations. Such a communication may be used in any of the ways, and take any of the forms, of communications as discussed above with reference to figure 5. Alternatively, or additionally, the path itself may be modified so as to modify the predicted time of arrival of the second mobile entity to correspond to the new position of the indication. For example, a shorter or longer path may be imposed, having a length such that at a particular speed the mobile entity will arrive at a desired time. The modified path may comprise the original path replaced in whole or in part with a new path, or may constitute a distortion of the original path, or may otherwise be selected from a predefined template, or automatically generated by a path-finding algorithm given defined path constraints. Meanders, loops, dog legs or other such path extensions may be added to delay the time of arrival. A variety of automated or semi-automated algorithms for defining a suitable path adjustment will occur to the skilled person. The exact manner and degree of freedom for such modifications will vary depending on the context of the implementation: where mobile entities are aircraft in flight, a wide range of variations in three dimensions may be envisaged, whilst road vehicles as shown in figure 8 might be more constrained. The speed and/or path of the mobile entity may be adjusted for example as described above, or by any combination of the two approaches.
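Under the simplifying assumption of constant speed, the path-length adjustment described above reduces to elementary kinematics; the following sketch (with hypothetical names) computes the path length, and hence the extension, needed to achieve a desired time of arrival:

```python
def path_length_for_eta_km(speed_kmh: float, desired_eta_s: float) -> float:
    """Length of a (possibly extended) path such that, at constant speed,
    the entity arrives after desired_eta_s seconds."""
    return speed_kmh * desired_eta_s / 3600.0

def extension_needed_km(current_path_km: float, speed_kmh: float,
                        desired_eta_s: float) -> float:
    """Extra length (meander, loop, dog leg) to add to delay arrival;
    a negative value means the path would need shortening instead."""
    return path_length_for_eta_km(speed_kmh, desired_eta_s) - current_path_km
```

For instance, delaying a 50 km/h entity on a 0.8 km path to a 72 s arrival would require roughly 0.2 km of additional path under these assumptions.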
Figure 7 shows a further example of a physical space susceptible to graphical representation in accordance with an embodiment. As shown in figure 7 there is provided a section of road 700, having two lanes 711, 712 in a first direction 710 (right to left), and two lanes 721, 722 in a second direction 720 (left to right). An entity shown as a car 733 is shown as travelling in the first lane 721 in the second direction 720, and an entity shown as a car 734 is shown as travelling in the entry slip road 723. As shown, in view of the different paths of the first and second mobile entities, the first mobile entity 733 has a first orientation φ1 with respect to the reference point 740 and the second mobile entity 734 has a second orientation φ2 with respect to the reference point 740.
A reference point 740 is defined in the physical space. In accordance with the embodiment, a position of a mobile entity is determined with respect to the reference point. As shown, the positions of the two mobile entities 733 and 734 are determined with respect to the reference point 740, as indicated by the lines 743 and 744. It will be appreciated that the position of any number of mobile entities may be determined in this way. In particular, the position of every mobile entity, or of every mobile entity satisfying certain selection criteria, in the physical space may be thus determined. Furthermore, the position of every mobile entity, or of every mobile entity satisfying certain selection criteria, in some other region, which may be larger than or smaller than the physical space, may be thus determined. For the purposes of the present example, the speed of mobile entity 733 is 75 km/h, while the speed of mobile entity 734 is 50 km/h.
For the purposes of the present example, the distance from the reference point to the first mobile entity 733 is 1 km, while the distance from the reference point to the second mobile entity 734 is 0.8km.
Figure 8 shows an example of a graphical representation 800 of the physical space of figure 7 generated in accordance with an embodiment. As shown in figure 8 there is presented a representation of a section of road 800, having two lanes 811, 812 in a first direction 810 (right to left), and two lanes 821, 822 in a second direction 820 (left to right). A representation of the first mobile entity 833 is shown as travelling in the first lane 821 in the second direction 820, and the second mobile entity 834 is shown as travelling in the entry slip road 823. As such, the road section 800 corresponds to the physical space 700 as discussed above, and represents the relevant physical features thereof.
As such, figure 8 presents an example of a graphical representation of the physical space at a first scale, defined in accordance with an embodiment. The scale is not specified in figure 8, and it will be appreciated that any convenient scale may be chosen on the basis of the size of the physical space and the size of whatever display is to be used for the presentation of the graphical representation. In accordance with the embodiment, a predicted time of arrival of each mobile entity whose position is determined within a region defined with respect to the reference point is obtained. On the basis of the speed and distance information presented above, it may be determined that the first mobile entity 833 will arrive at the reference point in 1/75 = 0.01333 hrs, that is, 48 seconds.
On the basis of the speed and distance information presented above, it may be determined that the second mobile entity 834 will arrive at the reference point in 0.8/50 = 0.016 hrs, that is, 58 seconds.
As shown in figure 8, the graphical representation is modified to incorporate an indication of the position of the first mobile entity 833 on a path 863 with respect to the region 850, wherein the indication 833 is situated in a position in the representation at a distance from the reference point proportional to the predicted time of arrival, i.e. 48 s.
Meanwhile the graphical representation is further modified to incorporate an indication of the position of the second mobile entity 834 on the path 864 with respect to the region 850, wherein the indication 834 is situated in a position in the representation at a distance from the reference point proportional to the predicted time of arrival, i.e. 58 s.
As shown, the indication of the first mobile entity 833 is situated in a position in the representation of the physical space with an orientation θ1 with respect to the reference point 840 corresponding to the relative orientation φ1 of the first mobile entity to the reference point.
Similarly, the indication of the second mobile entity 834 is situated in a position in the representation of the physical space with an orientation θ2 with respect to the reference point 840 corresponding to the relative orientation φ2 of the second mobile entity to the reference point.
It will be appreciated that even where entities arrive with different orientations as shown in figure 7, it is still possible to represent the respective indicators on a single path, such as shown in figure 5. This approach may be appropriate where the direction of arrival is unimportant, or where many possible directions of arrival are possible such that attempting to accurately represent them would introduce an undesirable degree of variation in the graphical representation. Still further, the indication of each entity may be located on a respective path corresponding to the path in the physical space that the respective mobile entity is expected to follow.
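The orientation-preserving placement of figures 7 and 8 may be sketched as a simple polar construction about the reference point, with the radial distance proportional to the predicted time of arrival; the function name and scale factor are illustrative:

```python
import math

def place_indication(ref_xy, bearing_rad, eta_s, scale=1.0):
    """Place an entity's indication at a display distance proportional to
    its predicted time of arrival, along its bearing relative to the
    reference point, so that differing arrival directions are preserved."""
    r = eta_s * scale
    return (ref_xy[0] + r * math.cos(bearing_rad),
            ref_xy[1] + r * math.sin(bearing_rad))
```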
In accordance with certain embodiments, and as shown by way of example in figure 8, the factor by which the distance is proportional to the predicted time of arrival may vary as a function of the position of the indication in the graphical representation.
Specifically, as shown in figure 8 there are provided additional zones at the periphery of the graphical representation 800. In particular, a first additional zone 802 is provided around the edge of the graphical representation, and a second additional zone 803 is provided around the outer edge of the first additional zone 802.
Indications for mobile entities that are outside the physical area may be represented in these additional zones, such as elements 835 and 836 in figure 8. These may be presented as mobile entities out of the visible spatial range but visible in the temporal range. In certain embodiments, the factor by which the distance is proportional to the predicted time of arrival may be different in each zone, and become progressively greater for each successively more remote zone from the reference point. In other embodiments, the factor by which the distance is proportional to the predicted time of arrival may rise continually, in accordance with any continuous function, with the zones serving to mark convenient reference time graduations in this succession. While as shown in figure 8 the zones are situated all around the periphery of the graphical representation, in some cases the zones need not occupy the entire periphery, for example in embodiments where mobile entities only arrive from certain directions. Still further, in cases where mobile entities only arrive along certain paths, the zones may be linear in nature, with mobile entities presented as a queue. Still further, in some cases, for example where such a linear presentation is adopted or where mobile entities arrive from a limited range of directions, the zones need not appear at the periphery of the graphical representation, but may be situated in any convenient location within the graphical representation.
Still further, the factor by which the distance is proportional to the predicted time of arrival may vary as a function of orientation with regard to the reference point, so that entities arriving from one direction are subject to time scaling according to one factor, or one linear evolution, whilst entities arriving from another direction are subject to time scaling according to a further respective factor, or linear evolution.
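One possible realisation of such zone-dependent scaling is a piecewise-linear mapping from predicted time of arrival to display distance; the zone boundaries and factors below are purely illustrative, with more remote zones compressing time more strongly:

```python
def display_distance(eta_s: float) -> float:
    """Map predicted time of arrival (s) to display distance, using a
    different proportionality factor per zone: the inner area, a first
    additional zone, and a second additional zone (cf. 802, 803)."""
    zones = [(60.0, 2.0), (300.0, 0.5), (float("inf"), 0.1)]  # (limit_s, factor)
    d, prev = 0.0, 0.0
    for limit, factor in zones:
        span = min(eta_s, limit) - prev
        if span <= 0:
            break
        d += span * factor  # each zone contributes its own scaled segment
        prev = limit
    return d
```

The mapping is monotonic, so the ordering of entities by predicted arrival is preserved across zone boundaries.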
Still further, a user input may be received specifying a new scale for the graphical representation or some part thereof, whereupon the selection of mobile entities for representation, the determination of the position of each mobile entity, and the determination of the predicted time of arrival of each mobile entity can be recomputed and the graphical representation modified on the basis of the new scale accordingly. In accordance with certain embodiments, and as shown by way of example in figure 8, the graphical representation may also be modified to incorporate a speed vector indicator graphic, having a dimension proportional to the speed of each respective mobile entity in the physical space.
Accordingly, as shown in figure 8 the four mobile entities are each provided with a speed vector indicator in the form of an arrow whose length is proportional to the speed of the mobile entity, and whose orientation corresponds to the direction of movement thereof. It will be appreciated that not all mobile entities need be associated with a speed vector indicator, and that for example speed vector indicators might be presented only for mobile entities in the physical area, or in one or more of the additional zones. Inherently there will exist a relationship between the scale of the speed vector indicator and the factor by which the distance is proportional to the predicted time of arrival of the mobile entity in question. In some embodiments, the dimension of the speed vector indicator may be chosen to correspond to the time the entity will take to travel a specified distance at the known speed of the mobile entity, using the same factor by which the distance is proportional to the predicted time of arrival of the mobile entity in question.
It will be appreciated that while figures 2 to 8 present certain combinations of features together, any other combination of features may be envisaged. For example, the drag feature of figure 6 might be integrated with the zone feature of figure 8, without necessarily adopting the other features of these embodiments, and so on.
Figure 9 shows a mobile entity manager in accordance with an embodiment. As shown there is provided a mobile entity manager 900 for managing a mobile entity 923 in a physical space 990. As shown, the mobile entity manager 900 comprises a graphics renderer 910 adapted to generate a graphical representation 911 of the physical space 990 at a first scale. The mobile entity manager 900 further comprises a representation engine 950 adapted to define a reference point in the physical space 990, to determine a position of the mobile entity 923 with respect to the reference point, and to determine a predicted time of arrival of the mobile entity 923 within a region defined with respect to the reference point. The graphics renderer 910 is further adapted to modify the graphical representation 911 to incorporate an indication of the position of the mobile entity 923 on a path with respect to the region, wherein the indication is situated in a position in the representation of the physical space at a distance from the reference point, the distance being proportional to the predicted time of arrival.
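A minimal sketch of this division of labour, with rendering reduced to producing an ordered list of draw commands and all names hypothetical, might read:

```python
class MobileEntityManager:
    """Sketch of the figure-9 split: a representation engine computing
    time-proportional positions, and a renderer consuming them (here the
    renderer simply returns draw commands ordered by predicted arrival)."""

    def __init__(self, reference_km: float, scale: float = 1.0):
        self.reference_km = reference_km  # reference point along the path
        self.scale = scale                # display units per second of ETA

    def eta_s(self, position_km: float, speed_kmh: float) -> float:
        """Predicted time of arrival at the reference point, in seconds."""
        return abs(self.reference_km - position_km) / speed_kmh * 3600.0

    def render(self, entities: dict) -> list:
        """entities: id -> (position_km, speed_kmh); returns (id, offset)
        indications sorted by predicted arrival, i.e. by display distance."""
        draw = [(eid, self.eta_s(p, v) * self.scale)
                for eid, (p, v) in entities.items()]
        return sorted(draw, key=lambda item: item[1])
```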
The representation of the physical space may be generated on the basis of a digital representation of the space 920. The position of the mobile entities may be generated on the basis of entity data 960. User input, for example as described with respect to any of figure 1 to 8, may be provided to the representation engine via the graphical representation 911 from a user 940.
Further adaptations to the system of figure 9 may be envisaged to implement any of the features of any of the foregoing embodiments.
According to certain embodiments, the movements and interactions of mobile physical entities such as vehicles, individuals and so on in a physical space can be monitored and controlled by providing a graphical representation of the physical space including an indication of a position of a mobile entity with respect to a region associated with a reference point in the physical space, where the indication is situated at a distance from the reference point, the distance being proportional to the predicted time of arrival of the entity in the region. On this basis, mobile entities are shown in the order of expected arrival, rather than at their instantaneous physical location, supporting more rapid assimilation of anticipated arrivals and traffic conditions. Software embodiments include but are not limited to application, firmware, resident software, microcode, etc. The invention can take the form of a computer program product accessible from a computer-usable or computer-readable medium providing program code for use by or in connection with a computer or an instruction execution system. Software embodiments include software adapted to implement the steps discussed above with reference to figures 1 to 8. A computer-usable or computer-readable medium can be any apparatus that can contain, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device. The medium can be an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system (or apparatus or device) or a propagation medium.
In some embodiments, the methods and processes described herein may be implemented in whole or part by a user device. These methods and processes may be implemented by computer-application programs or services, an application programming interface (API), a library, and/or other computer-program product, or any combination of such entities.
The user device may be a mobile device such as a smart phone or tablet, a drone, a computer or any other device with processing capability, such as a robot or other connected device, including IoT (Internet of Things) devices.
Figure 10 shows a generic computing system suitable for implementation of embodiments of the invention.
As shown in figure 10, a system includes a logic device 1001 and a storage device 1002. The system may optionally include a display subsystem 1011, input/output subsystem 1003, communication subsystem 1020, and/or other components not shown.
Logic device 1001 includes one or more physical devices configured to execute instructions. For example, the logic device 1001 may be configured to execute instructions that are part of one or more applications, services, programs, routines, libraries, objects, components, data structures, or other logical constructs. Such instructions may be implemented to perform a task, implement a data type, transform the state of one or more components, achieve a technical effect, or otherwise arrive at a desired result.
The logic device 1001 may include one or more processors configured to execute software instructions. Additionally or alternatively, the logic device may include one or more hardware or firmware logic devices configured to execute hardware or firmware instructions. Processors of the logic device may be single-core or multi-core, and the instructions executed thereon may be configured for sequential, parallel, and/or distributed processing. Individual components of the logic device 1001 optionally may be distributed among two or more separate devices, which may be remotely located and/or configured for coordinated processing. Aspects of the logic device 1001 may be virtualized and executed by remotely accessible, networked computing devices configured in a cloud-computing configuration.
Storage device 1002 includes one or more physical devices configured to hold instructions executable by the logic device to implement the methods and processes described herein. When such methods and processes are implemented, the state of storage device 1002 may be transformed, e.g., to hold different data.
Storage device 1002 may include removable and/or built-in devices. Storage device may be locally or remotely stored (in a cloud for instance). Storage device 1002 may comprise one or more types of storage device including optical memory (e.g., CD, DVD, HD-DVD, Blu-Ray Disc, etc.), semiconductor memory (e.g., FLASH, RAM, EPROM, EEPROM, etc.), and/or magnetic memory (e.g., hard-disk drive, floppy-disk drive, tape drive, MRAM, etc.), among others. Storage device may include volatile, non-volatile, dynamic, static, read/write, read-only, random-access, sequential-access, location-addressable, file-addressable, and/or content-addressable devices. In certain arrangements, the system may comprise an interface 1003 adapted to support communications between the logic device 1001 and further system components. For example, additional system components may comprise removable and/or built-in extended storage devices. Extended storage devices may comprise one or more types of storage device including optical memory 1032 (e.g., CD, DVD, HD-DVD, Blu-Ray Disc, etc.), semiconductor memory 1033 (e.g., RAM, EPROM, EEPROM, FLASH etc.), and/or magnetic memory 1031 (e.g., hard-disk drive, floppy-disk drive, tape drive, MRAM, etc.), among others. Such extended storage device may include volatile, non-volatile, dynamic, static, read/write, read-only, random-access, sequential-access, location-addressable, file-addressable, and/or content-addressable devices.
It will be appreciated that storage device includes one or more physical devices, and excludes propagating signals per se. However, aspects of the instructions described herein alternatively may be propagated by a communication medium (e.g., an electromagnetic signal, an optical signal, etc.), as opposed to being stored on a storage device.
Aspects of logic device 1001 and storage device 1002 may be integrated together into one or more hardware-logic components. Such hardware-logic components may include field-programmable gate arrays (FPGAs), program- and application-specific integrated circuits (PASIC/ASICs), program- and application-specific standard products (PSSP/ASSPs), system-on-a-chip (SOC), and complex programmable logic devices (CPLDs), for example.
The term “program” may be used to describe an aspect of a computing system implemented to perform a particular function. In some cases, a program may be instantiated via the logic device executing machine-readable instructions held by storage device 1002. It will be understood that different programs may be instantiated from the same application, service, code block, object, library, routine, API, function, etc. Likewise, the same program may be instantiated by different applications, services, code blocks, objects, routines, APIs, functions, etc. The term “program” may encompass individual or groups of executable files, data files, libraries, drivers, scripts, database records, etc.
In particular, the system of figure 10 may be used to implement embodiments of the invention.
For example, a program implementing the steps described with respect to figures 1 to 8, or the algorithms presented above, may be stored in storage device 1002 and executed by logic device 1001. Information reflecting or defining the physical space, the entities, or the regions may be stored in storage device 1002, 1031, 1032, 1033. Information reflecting or defining the physical space, the entities, or the regions may be received via the communications interface 1020. User input defining the regions may be received via the I/O interface 1003 and in particular the touchscreen display 1011, camera 1016, microphone 1015, mouse 1013, keyboard 1012 or otherwise. The functions of any or all of the units 910 or 950 may similarly be implemented by a program performing the required functions, in communication with additional dedicated hardware units as necessary. The display 1011 may display the graphical representation of the physical space, and/or the regions, and/or the entities. Accordingly the invention may be embodied in the form of a computer program.
It will be appreciated that a “service”, as used herein, is an application program executable across multiple user sessions. A service may be available to one or more system components, programs, and/or other services. In some implementations, a service may run on one or more server-computing devices.
When included, display subsystem 1011 may be used to present a visual representation of data held by a storage device. This visual representation may take the form of a graphical user interface (GUI). As the herein described methods and processes change the data held by the storage device 1002, and thus transform the state of the storage device 1002, the state of display subsystem 1011 may likewise be transformed to visually represent changes in the underlying data. Display subsystem 1011 may include one or more display devices utilizing virtually any type of technology for example as discussed above. Such display devices may be combined with logic device and/or storage device in a shared enclosure, or such display devices may be peripheral display devices. An audio output such as speaker 1014 may also be provided.
When included, input subsystem may comprise or interface with one or more user-input devices such as a keyboard 1012, mouse 1013, touch screen 1011, or game controller (not shown). In some embodiments, the input subsystem may comprise or interface with selected natural user input (NUI) componentry. Such componentry may be integrated or peripheral, and the transduction and/or processing of input actions may be handled on- or off-board. Example NUI componentry may include a microphone 1015 for speech and/or voice recognition; an infrared, colour, stereoscopic, and/or depth camera 1016 for machine vision and/or gesture recognition; a head tracker, eye tracker, accelerometer, and/or gyroscope for motion detection and/or intent recognition; as well as electric-field sensing componentry for assessing brain activity. The input/output interface 1003 may similarly interface with a loudspeaker 1014, vibromotor or any other transducer device as may occur to the skilled person. For example, the system may interface with a printer 1017.
When included, communication subsystem 1020 may be configured to communicatively couple the computing system with one or more other computing devices. For example, the communication subsystem may communicatively couple the computing device to a remote service hosted, for example, on a remote server 1076 via a network of any size including for example a personal area network, local area network, wide area network, or the Internet. Communication subsystem may include wired and/or wireless communication devices compatible with one or more different communication protocols. As non-limiting examples, the communication subsystem may be configured for communication via a wireless telephone network 1074, or a wired or wireless local- or wide-area network. In some embodiments, the communication subsystem may allow the computing system to send and/or receive messages to and/or from other devices via a network such as the Internet 1075. The communications subsystem may additionally support short range inductive communications with passive or active devices (NFC, RFID, UHF, etc.). In certain variants of the embodiments described above, the traffic data may be received via the telephone network 1074 or Internet 1075.
The system of figure 10 is intended to reflect a broad range of different types of information handling system. It will be appreciated that many of the subsystems and features described with respect to figure 10 are not required for implementation of the invention, but are included to reflect possible systems in accordance with the present invention. It will be appreciated that system architectures vary widely, and the relationship between the different sub-systems of figure 10 is merely schematic, and is likely to vary in terms of layout and the distribution of roles in systems. It will be appreciated that, in practice, systems are likely to incorporate different subsets of the various features and subsystems described with respect to figure 10.
Examples of devices comprising at least some elements of the system described with reference to figure 10 and suitable for implementing embodiments of the invention include cellular telephone handsets including smart phones, and vehicle navigation systems. Figure 11 shows a smartphone device adaptable to constitute an embodiment. As shown in figure 11, the smartphone device incorporates elements 1001, 1002, 1003, 1020, an optional near field communications interface 1021, flash memory 1033 and elements 1014, 1015, 1016, and 1011 as described above. It is in communication with the telephone network 1074 and a server 1076 via the network 1075. Alternative communication mechanisms such as a dedicated network or Wi-Fi may also be used. The features disclosed in this figure may also be included within a tablet device as well.
Figure 12 shows an Air Traffic control desk adaptable to constitute an embodiment. As shown in figure 12, the Air Traffic control desk comprises elements 1001, 1002, 1003, 1020, 1014, 1015, 1016, 1011, 1031, 1032, 1033 as described above. As shown, it is in communication with a drone 1001 via a communications satellite 1002 and a radio antenna 1003 coupled to the communications interface 1020. As shown, the desk comprises a seat, and joysticks, either of which may constitute suitable locations for any user status sensors and/or vibration transducers as discussed above. Alternative communication mechanisms may also be used.
Further embodiments may be based on or include immersive environment devices such as the HTC Vive, Oculus Rift, etc., or other hybrid devices such as the HoloLens or Meta Vision 2. It will be understood that the configurations and/or approaches described herein are exemplary in nature, and that these specific embodiments or examples are not to be considered in a limiting sense, because numerous variations are possible. The specific routines or methods described herein may represent one or more of any number of processing strategies. As such, various acts illustrated and/or described may be performed in the sequence illustrated and/or described, in other sequences, in parallel, or omitted. Likewise, the order of the above-described processes may be changed.
The subject matter of the present disclosure includes all novel and non-obvious combinations and sub-combinations of the various processes, systems and configurations, and other features, functions, acts, and/or properties disclosed herein, as well as any and all equivalents thereof.

Claims

1. A mobile entity manager for managing a mobile entity in a physical space, said mobile entity manager comprising:
a graphics renderer adapted to generate a two or three dimensional graphical representation of said physical space at a first spatial scale,
a representation engine adapted to define a reference point in said physical space, to determine a position of said mobile entity with respect to said reference point, and to determine a predicted time of arrival of said mobile entity within a region defined with respect to said reference point,
said graphics renderer being further adapted to modify said graphical representation to incorporate an indication of the position of said mobile entity on a path with respect to said region, wherein said indication is situated in a position in said physical space at a distance from said reference point, said distance being proportional to said predicted time of arrival.
2. A method of managing a mobile entity in a physical space, said method comprising the steps of:
defining a two or three dimensional graphical representation of said physical space at a first spatial scale,
defining a reference point in said physical space,
determining a position of said mobile entity with respect to said reference point, determining a predicted time of arrival of said mobile entity within a region defined with respect to said reference point,
modifying said graphical representation to incorporate an indication of the position of said mobile entity on a path with respect to said region, wherein said indication is situated in a position in said physical space at a distance from said reference point, said distance being proportional to said predicted time of arrival.
3. The method of claim 2 comprising the further steps of receiving a user input specifying a new position of said indication on said path, calculating one or more variations in the speed of said mobile entity which would modify the predicted time of arrival of said mobile entity to correspond to said new position of said indication, and issuing a communication to said mobile entity indicating said variations.
4. The method of claim 2 or 3 comprising the further steps of receiving a user input specifying a new position of said indication on said path, and calculating a modification of said path which would modify the predicted time of arrival of said mobile entity to correspond to said new position of said indication, and issuing a communication to said mobile entity indicating said variations.
5. The method of any of claims 2 to 4 wherein at said step of modifying said graphical representation to incorporate an indication of the position of said mobile entity with respect to said region, said indication is situated in a position in said graphical representation of said physical space with an orientation with respect to said reference point corresponding to the relative orientation of said entity to said reference point.
6. The method of any of claims 2 to 5 comprising the further steps of:
determining a position of a further said mobile entity with respect to said reference point,
determining a predicted time of arrival of said further mobile entity within said region, and modifying said graphical representation to incorporate an indication of the position of said further mobile entity with respect to said region, wherein said indication is situated on said path in a position in said graphical representation of said physical space at a distance from said reference point, said distance being proportional to said predicted time of arrival of said further mobile entity.
7. The method of claim 6 comprising a further step of determining on the basis of a speed of said first mobile entity and a speed of said further mobile entity a timing for a meeting of the two entities.
8. The method of any of claims 2 to 7 wherein said path corresponds to a path in said physical space that said mobile entity is expected to follow.
9. The method of any of claims 2 to 8 wherein a factor by which said distance is proportional to said predicted time of arrival varies as a function of the position of said indication on said path, or wherein the factor by which said distance is proportional to said predicted time of arrival varies as a function of said orientation.
10. The method of any of claims 2 to 9 comprising the further step of:
modifying said graphical representation to incorporate a speed vector indicator graphic, wherein said speed vector indicator graphic has a dimension proportional to the speed of said mobile entity in said physical space.
11. The method of any of claims 2 to 10 comprising a further step of receiving a user input specifying a new position of said reference point, and repeating said steps of: determining a position of said mobile entity, determining a predicted time of arrival of said mobile entity, and modifying said graphical representation on the basis of said new position of said reference point.
12. The method of any of claims 2 to 11 comprising a further step of receiving a user input specifying a new scale for said graphical representation, and repeating said steps of:
determining a position of said mobile entity, determining a predicted time of arrival of said mobile entity, and modifying said graphical representation on the basis of said new scale.
13. The method of any of claims 2 to 12 in which said graphical representation is three dimensional.
14. A computer program comprising instructions adapted to implement the steps of any of claims 2 to 13.
PCT/EP2018/085666 2017-12-22 2018-12-18 Method and apparatus managing entities in a physical space WO2019121795A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
EP17306918.8 2017-12-22
EP17306918.8A EP3503065A1 (en) 2017-12-22 2017-12-22 Method and apparatus managing entities in a physical space

Publications (1)

Publication Number Publication Date
WO2019121795A1 true WO2019121795A1 (en) 2019-06-27

Family

ID=61691601

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2018/085666 WO2019121795A1 (en) 2017-12-22 2018-12-18 Method and apparatus managing entities in a physical space

Country Status (2)

Country Link
EP (1) EP3503065A1 (en)
WO (1) WO2019121795A1 (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030200024A1 (en) * 2002-04-23 2003-10-23 Poreda Stanley J. Multiple approach time domain spacing aid display system and related techniques
EP1926072A1 (en) * 2005-09-13 2008-05-28 Pioneer Corporation Drive assistance device, drive assistance method, drive assistance program, and recording medium
US20140088820A1 (en) * 2012-09-24 2014-03-27 Caterpillar Inc. Location Services in Mining Vehicle Operations


Also Published As

Publication number Publication date
EP3503065A1 (en) 2019-06-26


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18822057

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18822057

Country of ref document: EP

Kind code of ref document: A1