EP3503065A1 - Method and apparatus for managing entities in a physical space - Google Patents

Method and apparatus for managing entities in a physical space

Info

Publication number
EP3503065A1
EP3503065A1 (application EP17306918.8A)
Authority
EP
European Patent Office
Prior art keywords
mobile entity
reference point
physical space
indication
graphical representation
Prior art date
Legal status
Withdrawn
Application number
EP17306918.8A
Other languages
English (en)
French (fr)
Inventor
Jérémie GARCIA
Stéphane CONVERSY
Nicolas SAPORITO
Guilhem BUISAN
Current Assignee
Ecole Nationale de l'Aviation Civile ENAC
Original Assignee
Ecole Nationale de l'Aviation Civile ENAC
Priority date
Filing date
Publication date
Application filed by Ecole Nationale de l'Aviation Civile ENAC filed Critical Ecole Nationale de l'Aviation Civile ENAC
Priority to EP17306918.8A
Priority to PCT/EP2018/085666
Publication of EP3503065A1
Legal status: Withdrawn

Classifications

    • G (PHYSICS) > G08 (SIGNALLING) > G08G (TRAFFIC CONTROL SYSTEMS)
    • G08G 1/0145: Measuring and analysing of parameters relative to traffic conditions, for specific applications, for active traffic flow control
    • G08G 1/0112: Measuring and analysing of parameters relative to traffic conditions, based on data from the vehicle, e.g. floating car data [FCD]
    • G08G 1/0116: Measuring and analysing of parameters relative to traffic conditions, based on data from roadside infrastructure, e.g. beacons
    • G08G 1/0133: Traffic data processing, for classifying traffic situation
    • G08G 1/096725: Systems involving transmission of highway information, e.g. weather or speed limits, where the received information generates an automatic action on the vehicle control
    • G08G 1/096775: Systems involving transmission of highway information, where the origin of the information is a central station
    • G08G 5/0026: Arrangements for implementing traffic-related aircraft activities, e.g. generating, displaying, acquiring or managing traffic information, located on the ground
    • G08G 1/0969: Systems involving transmission of navigation instructions to the vehicle, having a display in the form of a map
    • G08G 1/164: Anti-collision systems; centralised systems, e.g. external to vehicles

Definitions

  • the present invention relates generally to the managing of entities in a physical space.
  • Various classes of oversight of mobile entities can be defined including pilots and drivers on one hand, and supervisors, air traffic controllers, vessel traffic controllers and the like who have a general responsibility for entities in a given area on the other.
  • the working conditions of these classes of individuals are affected by current technological trends, which are furthermore driving a significant convergence of their roles.
  • vehicles are increasingly autonomous, so that the role of the driver or pilot is increasingly supported by electronic tools such as navigation tools, and/or entrusted to a remote operator who can take over control of the vehicle via a telecommunications channel at critical instants.
  • supervision and guidance tasks may be increasingly supported by information technology, so that one individual can be expected to supervise an ever larger area, or increasingly, to supervise several different areas, or attribute only a portion of their attention to traffic considerations, with the relevant traffic information being relayed from the respective areas via telecommunications means.
  • a mobile entity manager for managing a mobile entity in a physical space.
  • the mobile entity manager comprises a graphics renderer adapted to generate a graphical representation of the physical space at a first scale, a representation engine adapted to define a reference point in the physical space, to determine a position of the mobile entity with respect to the reference point, and to determine a predicted time of arrival of the mobile entity within a region defined with respect to the reference point.
  • the graphics renderer is further adapted to modify the graphical representation to incorporate an indication of the position of the mobile entity on a path with respect to the region, wherein the indication is situated in a position in the physical space at a distance from the reference point, the distance being proportional to the predicted time of arrival.
  • a method of managing a mobile entity in a physical space comprising the steps of:
  • the method comprises the further steps of receiving a user input specifying a new position of the indication on the path, calculating one or more variations in the speed of the mobile entity which would modify the predicted time of arrival of the mobile entity to correspond to the new position of the indication, and issuing a communication to the mobile entity indicating the variations.
  • the method comprises the further steps of receiving a user input specifying a new position of the indication on the path, calculating a modification of the path which would modify the predicted time of arrival of the mobile entity to correspond to the new position of the indication, and issuing a communication to the mobile entity indicating the modification.
  • the indication is situated in a position in the graphical representation of the physical space with an orientation with respect to the reference point corresponding to the relative orientation of the entity to the reference point.
  • the method comprises the further steps of:
  • the method comprises the further step of determining a timing for a convergence of the two entities on the basis of a speed of the first mobile entity and a speed of the further mobile entity.
  • the path corresponds to a path in the physical space that the mobile entity is expected to follow.
  • the factor by which the distance is proportional to the predicted time of arrival varies as a function of the position of the indication thereon, or wherein the factor by which the distance is proportional to the predicted time of arrival varies as a function of the orientation.
  • the method comprises the further step of:
  • the method comprises a further step of receiving a user input specifying a new position of the reference point, and repeating the steps of determining a position of the mobile entity, determining a predicted time of arrival of the mobile entity, and modifying the graphical representation on the basis of the new position of the reference point.
  • the method comprises the further step of receiving a user input specifying a new scale for the graphical representation, and repeating the steps of determining a position of the mobile entity, determining a predicted time of arrival of the mobile entity, and modifying the graphical representation on the basis of the new scale.
  • the graphical representation is three dimensional.
  • Figure 1 shows a method of managing the behaviour of a mobile entity in a physical space in accordance with an embodiment.
  • the method starts at step 100 before proceeding to step 110, at which a graphical representation of the physical space is defined at a first scale.
  • the method next proceeds to step 120 at which a reference point in the physical space is defined.
  • This definition may comprise retrieving a position value from memory, receiving user input, for example as discussed in further detail below, or otherwise.
  • a position of the mobile entity is determined with respect to the reference point.
  • the position of the mobile entity may be obtained directly, for example by means of a radar, lidar or other such system, by triangulation of a signal emitted or reflected by the entity, or indirectly, for example by receiving positioning information, as obtained for example from a GNSS system, from the mobile entity itself, or inferred for example on a dead-reckoning basis from applying time and velocity data to a known starting point, or otherwise.
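  • As a minimal sketch of the dead-reckoning option only (the function name is hypothetical; constant speed and heading are simplifying assumptions, not a requirement of the embodiment), time and velocity data may be applied to a known starting point as follows:

        import math

        def dead_reckoning(start_xy, speed_mps, heading_rad, elapsed_s):
            # Advance the known starting point by the distance covered
            # at the reported speed over the elapsed time.
            d = speed_mps * elapsed_s
            return (start_xy[0] + d * math.cos(heading_rad),
                    start_xy[1] + d * math.sin(heading_rad))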
  • a predicted time of arrival of the mobile entity within a region defined with respect to the reference point is determined.
  • Determination of the predicted time of arrival may comprise determining a speed of the mobile entity, either directly, for example by means of a radar, lidar or other such system, or by inference from successive location determinations, or indirectly, for example by receiving speed information from the mobile entity itself.
  • the method then finally proceeds to step 150 at which the graphical representation is modified to incorporate an indication of the position of the mobile entity on a path with respect to the region, wherein the indication is situated in a position in the physical space at a distance from the reference point, the distance being proportional to the predicted time of arrival before terminating at step 190.
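  • As a minimal sketch of steps 140 and 150 (Python; the function names, the constant-speed model and the factor of 2.0 display units per second are illustrative assumptions, not prescribed by the embodiment):

        def eta_seconds(distance_km, speed_kmh):
            # Predicted time of arrival at the region, assuming the
            # entity covers the remaining distance at constant speed.
            return 3600.0 * distance_km / speed_kmh

        def display_distance(eta_s, factor=2.0):
            # The indication is drawn at a distance from the reference
            # point proportional to the predicted time of arrival.
            return factor * eta_s

        print(display_distance(eta_seconds(1.0, 75.0)))   # 96.0 units for a 48 s arrival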
  • Figure 2 shows a method of managing the behaviour of a mobile entity in a physical space in accordance with a development of the embodiment of figure 1.
  • the method proceeds through steps 110, 120, 130, 140 and 150 as described with reference to figure 1, before proceeding to step 260 at which a user input is received specifying a new position of the indication on the path.
  • the method then proceeds to calculate one or more variations in the speed of the mobile entity which would modify the predicted time of arrival of the mobile entity to correspond to the new position of the indication at step 270 before issuing a communication to the mobile entity indicating the variations at step 280 before terminating at step 190.
  • Such speed variations may be obtained on the basis of a model of the mobile entity's capacities in terms of acceleration and deceleration, possibly taking into account fuel efficiency considerations, passenger comfort, possible interactions with other entities, and the like, as will occur to the skilled person.
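  • A minimal sketch of the speed-variation calculation of step 270 follows (hypothetical names; v_min_kmh and v_max_kmh stand in for a model of the entity's acceleration and deceleration capacities):

        def speed_variation_kmh(current_kmh, remaining_km, desired_eta_s,
                                v_min_kmh, v_max_kmh):
            # Speed at which the remaining distance is covered in exactly
            # the desired time, checked against the entity's capacities.
            target = 3600.0 * remaining_km / desired_eta_s
            if not v_min_kmh <= target <= v_max_kmh:
                raise ValueError("requested arrival time is not achievable")
            return target - current_kmh   # positive: speed up; negative: slow down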
  • Figure 3 shows a method of managing the behaviour of a mobile entity in a physical space in accordance with a development of the embodiment of figure 1.
  • the method proceeds through steps 110, 120, 130, 140 and 150 as described with reference to figure 1, before proceeding to step 360 at which a position of a further mobile entity is determined with respect to the reference point.
  • the method next proceeds to step 370 at which a predicted time of arrival of the further mobile entity within the region is determined, and then proceeds to step 380 of modifying the graphical representation to incorporate an indication of the position of the further mobile entity with respect to the region, wherein the indication is situated on the path in a position in the physical space at a distance from the reference point, the distance being proportional to the predicted time of arrival of the further mobile entity.
  • Figure 4 shows an example of a physical space susceptible to graphical representation in accordance with an embodiment.
  • a section of road 400 having two lanes 411, 412 in a first direction 410 (right to left), and two lanes 421, 422 in a second direction 420 (left to right).
  • An entity shown as a car 431 is travelling in the first lane 411 in the first direction 410,
  • an entity shown as a car 432 is travelling in the second lane 412 in the first direction 410,
  • an entity shown as a car 433 is travelling in the first lane 421 in the second direction 420,
  • and an entity shown as a car 434 is travelling in the second lane 422 in the second direction 420.
  • the road section 400 or alternatively a road direction 410 or 420, or a lane 411, 412, 421 or 422 may correspond to the physical space 400 as discussed above.
  • Road section 400 has an entry slip road 423.
  • a reference point 440 is defined in the physical space. This reference point 440 may be defined by a user input, with reference to the position of an entity as discussed further below, or be in a defined relation to the physical space or an element thereof.
  • a position of a mobile entity is determined with respect to the reference point.
  • the positions of the two mobile entities 433 and 434 are determined with respect to the reference point 440, as indicated by the lines 443 and 444.
  • the position of any number of mobile entities may be determined in this way.
  • the position of every mobile entity, or every mobile entity satisfying certain selection criteria, in the physical space may be thus determined.
  • the position of every mobile entity, or every mobile entity satisfying certain selection criteria, in some other region, which may be larger than or smaller than the physical space may be thus determined.
  • the speed of mobile entity 433 is 75km/h, while the speed of mobile entity 434 is 50km/h.
  • the distance from the reference point to the first mobile entity 433 is 1km, while the distance from the reference point to the second mobile entity 434 is 0.8km.
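  • As a simple check (Python, illustrative only), these figures yield the predicted times of arrival used in the discussion of figure 5 below:

        eta_433 = 3600 * 1.0 / 75.0    # 48.0 s: further away in space, but faster
        eta_434 = 3600 * 0.8 / 50.0    # 57.6 s, rounded to 58 s below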
  • Figure 5 shows an example of a graphical representation 500 of the physical space of figure 4 generated in accordance with an embodiment.
  • a representation of a section of road 500 having two lanes 511, 512 in a first direction 510 (right to left), and two lanes 521, 522 in a second direction 520 (left to right).
  • An indication 533 of the position of the first mobile entity is shown travelling in the first lane 521 in the second direction 520,
  • and an indication 534 of the position of the second mobile entity is shown travelling in the second lane 522 in the second direction 520.
  • the road section 500 corresponds to the physical space 400 as discussed above, and represents the relevant physical features thereof.
  • while in this example the indications of the mobile entities resemble the entities themselves, this need not be the case; the resemblance may be as close or as loose as desired.
  • the indications may be simple points, boxes containing an icon and/or text or a schematic representation of the class of mobile entity in question at one end of the scale, up to a photorealistic representation or live image of the mobile entity itself at the other end of the scale.
  • figure 5 presents an example of a graphical representation of the physical space at a first scale, defined in accordance with an embodiment.
  • the scale is not specified in figure 5, and it will be appreciated that any convenient scale may be chosen on the basis of the size of the physical space and the size of whatever display is to be used for the presentation of the graphical representation.
  • figure 5 shows a two dimensional representation of the physical space
  • other embodiments may equally present a three dimensional representation of the physical space.
  • the reference point 540 as defined in the physical space is presented in the graphical representation. It will be appreciated that in some embodiments, the position of the reference point need not be included in the graphical representation, or not at all times.
  • the reference point 540 may be defined by a user input, with reference to the position of an entity as discussed further below, or be in a defined relation to the physical space or an element thereof. Where the reference point is defined by user input, this may conveniently take place by manipulation of the graphical representation, for example by conventional user interface mechanisms. For example, the user might select the desired location in the graphical representation by clicking with a mouse cursor, finger tip, or the like.
  • a predicted time of arrival of each mobile entity whose position is determined within a region 550 defined with respect to the reference point 540 is obtained.
  • the region and the reference point may coincide, or otherwise the region may be situated in any predefined spatial relationship to the reference point.
  • the graphical representation is modified to incorporate an indication of the position of the first mobile entity 533 on a path 560 with respect to the region 550, wherein the indication 533 is situated in a position in the representation at a distance from the reference point proportional to the predicted time of arrival, i.e. 48 s.
  • the graphical representation is further modified to incorporate an indication of the position of the second mobile entity 534 on the path 560 with respect to the region 550, wherein the indication 534 is situated in a position in the representation at a distance from the reference point proportional to the predicted time of arrival, i.e. 58 s.
  • a graphical representation in accordance with the embodiment of figure 5 thus shows the second mobile entity further from the reference point than the first mobile entity, on the basis that it is expected to arrive later than the first mobile entity, which, although presently further away in space, is moving faster, and is thus closer in time.
  • the graphical representation may further be modified to incorporate an indication of the position of the further mobile entity with respect to the region, wherein the indication is situated on the path in a position in the physical space at a distance from the reference point, the distance being proportional to the predicted time of arrival of the further mobile entity.
  • figure 5 shows indications of the position of two physical entities, it will be appreciated that at any given time the number of indications may be any value from zero upwards, depending on the number of mobile entities in the physical space, or otherwise taken into account as discussed above.
  • the representation of figure 5 may support further variants.
  • a timing for a convergence of the two entities may be determined. This determination may then be used to indicate whether a particular mobile entity can overtake another within a specified time window (for example, before the arrival of an oncoming vehicle), or within a particular physical space (for example, before an overtaking lane ends). Still further, the speed of one or the other of the two or more entities may be adapted to achieve a desired result, for example indicating the required acceleration to achieve the desired manoeuvre in the available time or space, or in order for the two mobile entities to reach a common destination (corresponding to the reference point) at the same time.
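  • A minimal sketch of such a convergence-timing determination (hypothetical names; constant speeds along a shared straight path are simplifying assumptions):

        import math

        def convergence_time_s(gap_km, following_kmh, leading_kmh):
            # Time for the faster following entity to close the gap to
            # the leading entity; infinite if the gap is not closing.
            closing_kmh = following_kmh - leading_kmh
            if closing_kmh <= 0:
                return math.inf
            return 3600.0 * gap_km / closing_kmh

        def can_overtake(gap_km, following_kmh, leading_kmh, window_s):
            # True if convergence occurs within the specified time window,
            # e.g. before the arrival of an oncoming vehicle.
            return convergence_time_s(gap_km, following_kmh, leading_kmh) <= window_s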
  • the communication may comprise by way of example the transmission of a message to the mobile entity, the transmission of a message to the user, or to any other destination.
  • the communication may comprise multiple transmissions to different destinations.
  • the communication may comprise a modification of the graphical representation, for example through changing the colour or prominence of certain features, such as the path, the region or the entity, or an indicator region that may be defined in the graphical representation for this purpose.
  • a text message may be presented via the graphical representation, or otherwise.
  • the communication may comprise transmission via any suitable channel. It may be transmitted via any data network such as a WAN (e.g. the internet, GSM, UMTS), a LAN (e.g. a Wi-Fi network or Ethernet), or a PAN (e.g. Bluetooth, ZigBee).
  • the communication may be of any format, and may include text, graphical or audio content, or a combination of these.
  • the communication may be formatted for a human addressee, for example in the form of a human readable text or audio message.
  • the signalling action may be formatted for a machine recipient, for example on the basis of an API, or any suitable technical format having regard to the context.
  • the signalling action may be transmitted merely for information or as a warning, or may contain instructions. Where the signalling action involves transmission to a machine recipient, these instructions may be directly operable by that machine.
  • the communication may directly control the speed of the mobile entity.
  • the communication might additionally or alternatively comprise the transmission of an audible chime to be played to an occupant of the entity to alert them to the change of status.
  • the communication may comprise sending a signal to an entity.
  • any embodiment may be implemented with additional reference points.
  • Any mobile entity may be represented in a position reflecting its time of arrival at a respective one of these reference points, or any mobile entity may be represented multiple times, with each representation position reflecting the time of arrival of the corresponding mobile entity at a respective one of these reference points, the reference points corresponding to different targets in the physical space.
  • in addition to the indication of the position of the mobile entity on a path with respect to the region, situated at a distance from the reference point proportional to the predicted time of arrival, a further indication for that same mobile entity may be provided in a position in the graphical representation corresponding to the actual physical position of that entity.
  • user input may be received specifying a new position of the reference point, whereupon the selection of mobile entities for representation, the determination of the position of those mobile entities, and the determination of a predicted time of arrival of those mobile entities will be recalculated, and the graphical representation updated on the basis of the new reference point position.
  • Figure 6 shows a further example of a graphical representation 500 of the physical space of figure 4 generated in accordance with an embodiment.
  • a representation of a section of road 500 substantially as described with reference to figure 5.
  • user input is received specifying a new position of the indication on the path.
  • this is represented by the position of the cursor 601, which as shown has "dragged" the indicator of the second mobile entity 534 to a new position 633 on the path 560.
  • the indication of the second mobile entity is now situated ahead of that of the first mobile entity 533.
  • one or more variations in the speed of the second mobile entity 434 which would modify the predicted time of arrival of the second mobile entity to correspond to the new position of the indication may be calculated and issued in a communication to the mobile entity indicating the variations.
  • a communication may be used in any of the ways, and take any of the forms, of communications as discussed above with reference to figure 5.
  • alternatively, a modification of the path which would bring the predicted time of arrival of the second mobile entity into correspondence with the new position of the indication may be calculated, and issued in a communication to the mobile entity indicating the modification.
  • a shorter or longer path may be imposed, having a length such that at a particular speed the mobile entity will arrive at a desired time.
  • the modified path may comprise the original path replaced in whole or in part with a new path, or may constitute a distortion of the original path, or may otherwise be selected from a predefined template, or automatically generated by a path-finding algorithm given defined path constraints. Meanders, loops, dog legs or other such path extensions may be added to delay the time of arrival.
  • the speed and/or path of the mobile entity may be adjusted, for example as described above, or any combination of the two approaches may be used.
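  • As a minimal sketch of the path-based alternative (hypothetical names; constant speed assumed), the imposed path length may be derived from the desired arrival time, with any surplus over the direct path supplied by meanders, loops or dog legs:

        def required_path_length_km(speed_kmh, desired_eta_s):
            # Path length yielding the desired arrival time at the given speed.
            return speed_kmh * desired_eta_s / 3600.0

        def extension_km(direct_km, speed_kmh, desired_eta_s):
            # Extra distance (meanders, loops, dog legs) to add to the
            # direct path in order to delay the time of arrival.
            return max(0.0, required_path_length_km(speed_kmh, desired_eta_s) - direct_km)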
  • Figure 7 shows a further example of a physical space susceptible to graphical representation in accordance with an embodiment.
  • a section of road 700 having two lanes 711, 712 in a first direction 710 (right to left), and two lanes 721, 722 in a second direction 720 (left to right).
  • An entity shown as a car 733 is travelling in the first lane 721 in the second direction 720, and an entity shown as a car 734 is travelling on the entry slip road 723.
  • the first mobile entity 733 has a first orientation θ1 with respect to the reference point 740 and the second mobile entity 734 has a second orientation θ2 with respect to the reference point 740.
  • a reference point 740 is defined in the physical space.
  • a position of a mobile entity is determined with respect to the reference point.
  • the positions of the two mobile entities 733 and 734 are determined with respect to the reference point 740, as indicated by the lines 743 and 744.
  • the position of any number of mobile entities may be determined in this way.
  • the position of every mobile entity, or every mobile entity satisfying certain selection criteria, in the physical space may be thus determined.
  • the position of every mobile entity, or every mobile entity satisfying certain selection criteria, in some other region, which may be larger than or smaller than the physical space may be thus determined.
  • the speed of mobile entity 733 is 75km/h, while the speed of mobile entity 734 is 50km/h.
  • the distance from the reference point to the first mobile entity 733 is 1km, while the distance from the reference point to the second mobile entity 734 is 0.8km.
  • Figure 8 shows an example of a graphical representation 800 of the physical space of figure 7 generated in accordance with an embodiment.
  • a representation of a section of road 800 having two lanes 811, 812 in a first direction 810 (right to left), and two lanes 821, 822 in a second direction 820 (left to right).
  • a representation of the first mobile entity 833 is shown as travelling in the first lane 821 in the second direction 820, and the second mobile entity 834 is shown as travelling in the entry slip road 823.
  • the road section 800 corresponds to the physical space 700 as discussed above, and represents the relevant physical features thereof.
  • figure 8 presents an example of a graphical representation of the physical space at a first scale, defined in accordance with an embodiment.
  • the scale is not specified in figure 8, and it will be appreciated that any convenient scale may be chosen on the basis of the size of the physical space and the size of whatever display is to be used for the presentation of the graphical representation.
  • a predicted time of arrival of each mobile entity whose position is determined within a region defined with respect to the reference point is obtained.
  • the graphical representation is modified to incorporate an indication of the position of the first mobile entity 833 on a path 863 with respect to the region 850, wherein the indication 833 is situated in a position in the representation at a distance from the reference point proportional to the predicted time of arrival, i.e. 48 s.
  • the graphical representation is further modified to incorporate an indication of the position of the second mobile entity 834 on the path 864 with respect to the region 850, wherein the indication 834 is situated in a position in the representation at a distance from the reference point proportional to the predicted time of arrival, i.e. 58 s.
  • the indication of the first mobile entity 833 is situated in a position in the representation of the physical space with an orientation θ1 with respect to the reference point 840 corresponding to the relative orientation θ1 of the first mobile entity to the reference point.
  • the indication of the second mobile entity 834 is situated in a position in the representation of the physical space with an orientation θ2 with respect to the reference point 840 corresponding to the relative orientation θ2 of the second mobile entity to the reference point.
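  • A minimal sketch of this orientation-preserving placement (Python; the names and the planar polar model are illustrative assumptions):

        import math

        def indication_position(ref_xy, eta_s, theta_rad, factor=2.0):
            # Place the indication along the entity's bearing from the
            # reference point, at a distance proportional to the
            # predicted time of arrival.
            r = factor * eta_s
            return (ref_xy[0] + r * math.cos(theta_rad),
                    ref_xy[1] + r * math.sin(theta_rad))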
  • each entity may be located on a respective path corresponding to the path in the physical space that the respective mobile entity is expected to follow.
  • the factor by which the distance is proportional to the predicted time of arrival may vary as a function of the position of the indication thereon.
  • a first additional zone 802 is provided around the edge of the graphical representation
  • a second additional zone 803 is provided around the outer edge of the first additional zone 802.
  • Indications for mobile entities that are outside the physical area may be represented in these additional zones, such as elements 835 and 836 in figure 8 . These may be presented as mobile entities out of the visible spatial range but visible in the temporal range.
  • the factor by which the distance is proportional to the predicted time of arrival may be different in each zone, and become progressively greater for each successively more remote zone from the reference point. In other embodiments, the factor by which the distance is proportional to the predicted time of arrival may rise continually, in accordance with any continuous function, with the zones serving to mark convenient reference time graduations in this succession.
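  • A minimal sketch of such a piecewise mapping (all zone boundaries and factors below are illustrative assumptions, not taken from the figures: the main area covering arrivals up to 60 s, zone 802 up to 120 s, zone 803 beyond that):

        def zone_display_distance(eta_s):
            # Progressively compressed time-to-distance scaling for the
            # main area, the first zone and the second zone.
            if eta_s <= 60.0:
                return 2.0 * eta_s
            if eta_s <= 120.0:
                return 2.0 * 60.0 + 1.0 * (eta_s - 60.0)
            return 2.0 * 60.0 + 1.0 * 60.0 + 0.5 * (eta_s - 120.0)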
  • the zones are situated all around the periphery of the graphical representation, in some cases the zones need not occupy the entire periphery, for example in embodiments where mobile entities only arrive from certain directions. Still further, in cases where mobile entities only arrive along certain paths, the zones may be linear in nature, with mobile entities presented as a queue. Still further, in some cases, for example where such a linear presentation is adopted or where mobile entities arrive from a limited range of directions, the zones need not appear at the periphery of the graphical representation, but may be situated in any convenient location within the graphical representation.
  • the factor by which the distance is proportional to the predicted time of arrival may also vary as a function of orientation with regard to the reference point, so that entities arriving from one direction are subject to time scaling according to one factor, or one linear evolution, whilst entities arriving from another direction are subject to time scaling according to a further respective factor, or linear evolution.
  • a user input may be received specifying a new scale for the graphical representation or some part thereof, whereupon the selection of mobile entities for representation, the determination of the position of each mobile entity, and the determination of the predicted time of arrival of each mobile entity, can be recomputed and the graphical representation modified on the basis of the new scale accordingly.
  • the graphical representation may also be modified to incorporate a speed vector indicator graphic, having a dimension proportional to the speed of each respective mobile entity in the physical space.
  • the four mobile entities are each provided with a speed vector indicator in the form of an arrow whose length is proportional to the speed of the mobile entity, and whose orientation corresponds to the direction of movement thereof.
  • speed vector indicators might be presented only for mobile entities in the physical area, or in one or more of the additional zones.
  • the dimension of the speed vector indicator may be chosen to correspond to the time the entity will take to travel a specified distance at the known speed of the mobile entity, using the same factor by which the distance is proportional to the predicted time of arrival of the mobile entity in question.
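  • A sketch of this choice of dimension (the function name and the 1 km reference distance are illustrative assumptions):

        def speed_vector_length(speed_kmh, factor=2.0, reference_km=1.0):
            # Arrow length showing the time the entity needs to cover the
            # reference distance, on the same time-to-distance scale as
            # the arrival indications.
            time_s = 3600.0 * reference_km / speed_kmh
            return factor * time_s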
  • Figure 9 shows a mobile entity manager in accordance with an embodiment.
  • a mobile entity manager 900 for managing a mobile entity 923 in a physical space 990.
  • the mobile entity manager 900 comprises a graphics renderer 910 adapted to generate a graphical representation 911 of the physical space 990 at a first scale.
  • the mobile entity manager 900 further comprises a representation engine 950 adapted to define a reference point in the physical space 990, to determine a position of the mobile entity 923 with respect to the reference point, and to determine a predicted time of arrival of the mobile entity 923 within a region defined with respect to the reference point.
  • the graphics renderer 910 is further adapted to modify the graphical representation 911 to incorporate an indication of the position of the mobile entity 923 on a path with respect to the region, wherein the indication is situated in a position in the representation of the physical space at a distance from the reference point, the distance being proportional to the predicted time of arrival.
  • the representation of the physical space may be generated on the basis of a digital representation of the space 920.
  • the position of the mobile entities may be generated on the basis of entity data 960.
  • User input, for example as described with respect to any of figures 1 to 8, may be provided to the representation engine via the graphical representation 911 from a user 940.
  • Software embodiments include but are not limited to application, firmware, resident software, microcode, etc.
  • the invention can take the form of a computer program product accessible from a computer-usable or computer-readable medium providing program code for use by or in connection with a computer or an instruction execution system.
  • Software embodiments include software adapted to implement the steps discussed above with reference to figures 1 to 8 .
  • a computer-usable or computer-readable medium can be any apparatus that can contain, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device.
  • the medium can be an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system (or apparatus or device) or a propagation medium.
  • the methods and processes described herein may be implemented in whole or part by a user device. These methods and processes may be implemented by computer-application programs or services, an application-programming interface (API), a library, and/or other computer-program product, or any combination of such entities.
  • the user device may be a mobile device such as a smart phone or tablet, a drone, a computer or any other device with processing capability, such as a robot or other connected device, including IoT (Internet of Things) devices.
  • Figure 10 shows a generic computing system suitable for implementation of embodiments of the invention.
  • a system includes a logic device 1001 and a storage device 1002.
  • the system may optionally include a display subsystem 1011, input/output subsystem 1003, communication subsystem 1020, and/or other components not shown.
  • Logic device 1001 includes one or more physical devices configured to execute instructions.
  • the logic device 1001 may be configured to execute instructions that are part of one or more applications, services, programs, routines, libraries, objects, components, data structures, or other logical constructs.
  • Such instructions may be implemented to perform a task, implement a data type, transform the state of one or more components, achieve a technical effect, or otherwise arrive at a desired result.
  • the logic device 1001 may include one or more processors configured to execute software instructions. Additionally or alternatively, the logic device may include one or more hardware or firmware logic devices configured to execute hardware or firmware instructions. Processors of the logic device may be single-core or multi-core, and the instructions executed thereon may be configured for sequential, parallel, and/or distributed processing. Individual components of the logic device 1001 optionally may be distributed among two or more separate devices, which may be remotely located and/or configured for coordinated processing. Aspects of the logic device 1001 may be virtualized and executed by remotely accessible, networked computing devices configured in a cloud-computing configuration.
  • Storage device 1002 includes one or more physical devices configured to hold instructions executable by the logic device to implement the methods and processes described herein. When such methods and processes are implemented, the state of the storage device 1002 may be transformed, e.g. to hold different data.
  • Storage device 1002 may include removable and/or built-in devices. Storage device may be locally or remotely stored (in a cloud for instance). Storage device 1002 may comprise one or more types of storage device including optical memory (e.g., CD, DVD, HD-DVD, Blu-Ray Disc, etc.), semiconductor memory (e.g., FLASH, RAM, EPROM, EEPROM, etc.), and/or magnetic memory (e.g., hard-disk drive, floppy-disk drive, tape drive, MRAM, etc.), among others. Storage device may include volatile, non-volatile, dynamic, static, read/write, read-only, random-access, sequential-access, location-addressable, file-addressable, and/or content-addressable devices.
  • the system may comprise an interface 1003 adapted to support communications between the logic device 1001 and further system components.
  • additional system components may comprise removable and/or built-in extended storage devices.
  • Extended storage devices may comprise one or more types of storage device including optical memory 1032 (e.g., CD, DVD, HD-DVD, Blu-Ray Disc, etc.), semiconductor memory 1033 (e.g., RAM, EPROM, EEPROM, FLASH etc.), and/or magnetic memory 1031 (e.g., hard-disk drive, floppy-disk drive, tape drive, MRAM, etc.), among others.
  • Such extended storage device may include volatile, non-volatile, dynamic, static, read/write, read-only, random-access, sequential-access, location-addressable, file-addressable, and/or content-addressable devices.
  • logic device 1001 and storage device 1002 may be integrated together into one or more hardware-logic components.
  • Such hardware-logic components may include field-programmable gate arrays (FPGAs), program- and application-specific integrated circuits (PASIC/ASICs), program- and application-specific standard products (PSSP/ASSPs), system-on-a-chip (SOC), and complex programmable logic devices (CPLDs), for example.
  • system of figure 10 may be used to implement embodiments of the invention.
  • a program implementing the steps described with respect to figures 1 to 8, or the algorithms presented above, may be stored in storage device 1002 and executed by logic device 1001.
  • Information reflecting or defining the physical space, the entities, or the regions may be stored in storage device 1002, 1031, 1032, 1033.
  • Information reflecting or defining the physical space, the entities, or the regions may be received via the communications interface 1020.
  • User input defining the regions may be received via the I/O interface 1003 and in particular the touchscreen display 1011, camera 1016, microphone 1015, mouse 1013, keyboard 1012 or otherwise.
  • the functions of any or all of the units 910 or 950 may similarly be implemented by a program performing the required functions, in communication with additional dedicated hardware units as necessary.
  • the display 1011 may display the graphical representation of the physical space, and/or the regions, and/or the entities. Accordingly the invention may be embodied in the form of a computer program.
  • a “service”, as used herein, is an application program executable across multiple user sessions.
  • a service may be available to one or more system components, programs, and/or other services.
  • a service may run on one or more server-computing devices.
  • display subsystem 1011 may be used to present a visual representation of data held by a storage device.
  • This visual representation may take the form of a graphical user interface (GUI).
  • Display subsystem 1011 may include one or more display devices utilizing virtually any type of technology for example as discussed above. Such display devices may be combined with logic device and/or storage device in a shared enclosure, or such display devices may be peripheral display devices. An audio output such as speaker 1014 may also be provided.
  • input subsystem may comprise or interface with one or more user-input devices such as a keyboard 1012, mouse 1013, touch screen 1011, or game controller (not shown).
  • the input subsystem may comprise or interface with selected natural user input (NUI) componentry.
  • Such componentry may be integrated or peripheral, and the transduction and/or processing of input actions may be handled on- or off-board.
  • Example NUI componentry may include a microphone 1015 for speech and/or voice recognition; an infrared, colour, stereoscopic, and/or depth camera 1016 for machine vision and/or gesture recognition; a head tracker, eye tracker, accelerometer, and/or gyroscope for motion detection and/or intent recognition; as well as electric-field sensing componentry for assessing brain activity.
  • the input/output interface 1003 may similarly interface with a loudspeaker 1014, vibromotor or any other transducer device as may occur to the skilled person.
  • the system may interface with a printer 1017.
  • communication subsystem 1020 may be configured to communicatively couple computing system with one or more other computing devices.
  • Communication subsystem may include wired and/or wireless communication devices compatible with one or more different communication protocols.
  • the communication subsystem may be configured for communication via a wireless telephone network 1074, or a wired or wireless local- or wide-area network.
  • the communication subsystem may allow computing system to send and/or receive messages to and/or from other devices via a network such as Internet 1075.
  • the communications subsystem may additionally support short range inductive communications with passive or active devices (NFC, RFID, UHF, etc).
  • the traffic data may be received via the telephone network 1074 or Internet 1075.
  • the system of figure 10 is intended to reflect a broad range of different types of information handling system. It will be appreciated that many of the subsystems and features described with respect to figure 10 are not required for implementation of the invention, but are included to reflect possible systems in accordance with the present invention. It will be appreciated that system architectures vary widely, and the relationship between the different sub-systems of figure 10 is merely schematic, and is likely to vary in terms of layout and the distribution of roles in systems. It will be appreciated that, in practice, systems are likely to incorporate different subsets of the various features and subsystems described with respect to figure 10.
  • Examples of devices comprising at least some elements of the system described with reference to figure 10 and suitable for implementing embodiments of the invention include cellular telephone handsets including smart phones, and vehicle navigation systems.
  • FIG 11 shows a smartphone device adaptable to constitute an embodiment.
  • the smartphone device incorporates elements 1001, 1002, 1003, 1020, optional near field communications interface 1021, flash memory 1033 and elements 1014, 1015, 1016, and 1011 as described above. It is in communication with the telephone network 1074 and a server 1076 via the network 1075. Alternative communication mechanisms such as a dedicated network or Wi-Fi may also be used. The features disclosed in this figure may also be included within a tablet device.
  • FIG 12 shows an Air Traffic control desk adaptable to constitute an embodiment.
  • the Air Traffic control desk comprises elements 1001, 1002, 1003, 1020, 1014, 1015, 1016, 1011, 1031, 1032, 1033 as described above.
  • the desk is in communication with a drone via a communications satellite and a radio antenna coupled to the communications interface 1020.
  • the desk comprises a seat and joysticks, either of which may constitute suitable locations for any user status sensors and/or vibration transducers as discussed above. Alternative communication mechanisms may also be used.
  • embodiments may also be implemented with immersive environment devices such as the HTC Vive, Oculus Rift, etc., or other hybrid devices such as the HoloLens or Meta Vision 2.

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Chemical & Material Sciences (AREA)
  • Analytical Chemistry (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Atmospheric Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Navigation (AREA)
EP17306918.8A 2017-12-22 2017-12-22 Method and apparatus for managing entities in a physical space Withdrawn EP3503065A1 (de)

Priority Applications (2)

Application Number Priority Date Filing Date Title
EP17306918.8A EP3503065A1 (de) 2017-12-22 2017-12-22 Method and apparatus for managing entities in a physical space
PCT/EP2018/085666 WO2019121795A1 (en) 2017-12-22 2018-12-18 Method and apparatus managing entities in a physical space

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
EP17306918.8A EP3503065A1 (de) 2017-12-22 2017-12-22 Method and apparatus for managing entities in a physical space

Publications (1)

Publication Number Publication Date
EP3503065A1 true EP3503065A1 (de) 2019-06-26

Family

ID=61691601

Family Applications (1)

Application Number Title Priority Date Filing Date
EP17306918.8A Withdrawn EP3503065A1 (de) 2017-12-22 2017-12-22 Verfahren und vorrichtung zur verwaltung von entitäten in einem physischen raum

Country Status (2)

Country Link
EP (1) EP3503065A1 (de)
WO (1) WO2019121795A1 (de)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030200024A1 (en) * 2002-04-23 2003-10-23 Poreda Stanley J. Multiple approach time domain spacing aid display system and related techniques
EP1926072A1 (de) * 2005-09-13 2008-05-28 Pioneer Corporation Driving assistance device, driving assistance method, driving assistance program, and recording medium
US20140088820A1 (en) * 2012-09-24 2014-03-27 Caterpillar Inc. Location Services in Mining Vehicle Operations

Also Published As

Publication number Publication date
WO2019121795A1 (en) 2019-06-27

Similar Documents

Publication Publication Date Title
KR102211299B1 (ko) System and method for accelerating curve projection
EP3324332B1 (de) Method and system for predicting vehicle traffic behavior for autonomous vehicles when making driving decisions
EP3335006B1 (de) Error-corrected planning methods for the operation of autonomous vehicles
CN108099918B (zh) Method for determining command delay of an autonomous vehicle
JP7459224B2 (ja) Agent trajectory prediction using anchor trajectories
CN107664994B (zh) System and method for autonomous driving merge management
US20220309920A1 Controlling vehicle-infrastructure cooperated autonomous driving
EP3344479B1 (de) Method for forwarding vehicle position points for autonomous vehicles
CN114830138A (zh) Training a trajectory scoring neural network to accurately assign scores
US11897503B2 Method and apparatus for detecting unexpected control state in autonomous driving system
JP2023024276A (ja) Behavior planning for autonomous vehicles in yield scenarios
US11026049B2 Communication between autonomous vehicles and operations personnel
EP3503065A1 (de) Method and apparatus for managing entities in a physical space
US10613815B2 Apparatus for positioning an interactive-display device within an interior of an autonomous vehicle
US9530317B2 Movement-measurement-processing system, movement-measurement-processing method, and movement-measurement-processing program
EP3489866A1 (de) Method and apparatus for managing entities in a space
EP3561793A1 (de) Method and apparatus for monitoring a space
EP4047583A2 (de) Method and apparatus for controlling vehicle-infrastructure cooperated autonomous driving, electronic device, and vehicle
US11798411B2 Systems and methods for transforming high-definition geographical map data into messages for vehicle communications
EP4156144A2 (de) Method and apparatus for generating an operation log of an autonomous vehicle
JP2023016628A (ja) Processing system, processing device, processing method, and processing program
CN114604241A (zh) Vehicle driving risk assessment method and apparatus, electronic device, and edge computing device
CN115675458A (zh) Guide line generation method and apparatus, electronic device, and medium
CN115900724A (zh) Path planning method and apparatus
CN114780869A (zh) Pickup point recommendation method and apparatus, electronic device, and medium

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION HAS BEEN PUBLISHED

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

AX Request for extension of the european patent

Extension state: BA ME

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

17P Request for examination filed

Effective date: 20191217

RBV Designated contracting states (corrected)

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN

18D Application deemed to be withdrawn

Effective date: 20210701