WO2018115897A1 - Localisation of mobile device using image and non-image sensor data in server processing - Google Patents
- Publication number
- WO2018115897A1 (PCT/GB2017/053874)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- data
- localisation
- global
- location
- server system
- Prior art date
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S5/00—Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations
- G01S5/16—Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations using electromagnetic waves other than radio waves
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/005—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 with correlation of navigation data from several sources, e.g. map or contour matching
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/38—Electronic maps specially adapted for navigation; Updating thereof
- G01C21/3804—Creation or updating of map data
- G01C21/3833—Creation or updating of map data characterised by the source of data
- G01C21/3848—Data obtained from both position sensors and additional sensors
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S19/00—Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
- G01S19/38—Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system
- G01S19/39—Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system the satellite radio beacon positioning system transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
- G01S19/42—Determining position
- G01S19/48—Determining position by combining or switching between position solutions derived from the satellite radio beacon positioning system and position solutions derived from a further system
- G01S19/485—Determining position by combining or switching between position solutions derived from the satellite radio beacon positioning system and position solutions derived from a further system whereby the further system is an optical system or imaging system
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S5/00—Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations
- G01S5/0009—Transmission of position information to remote stations
- G01S5/0018—Transmission from mobile station to base station
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
- G06T7/0004—Industrial image inspection
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/204—Image signal generators using stereoscopic image cameras
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
Definitions
- The present invention relates to a method of localisation for devices. More particularly, the present invention relates to the use of both local and remote resources to provide substantially real-time localisation at a device.
- the capability for devices to determine with a high degree of precision where the device is located within a three-dimensional space, for example with precision within a centimetre, is likely to be pivotal for many robotics, augmented, and virtual reality applications.
- this level of precision can allow a robot to operate safely and efficiently in its environment or can allow the display of augmented content at the appropriate place accurately.
- aspects and/or embodiments seek to provide a distributed localisation and/or mapping system capable of delivering substantially high-accuracy real-time localisation at a device.
- a method of determining a location of a device having one or more sensors comprising the steps of: sending a localisation request to a server system, the localisation request comprising at least a portion of data from the one or more sensors; receiving localisation data from the server system in response to the localisation request; and determining a location of the device from the received localisation data.
- the method includes the further step of estimating a location of the device based on data from the one or more sensors and wherein the step of determining a location of the device includes determining the location of the device using the estimated location.
- Determining the location of a device using localisation data from a server system, using data from one or more sensors, can allow a device to localise using a large amount of map data and significant processing power (at a server system) without needing the device to have significant processing power, data storage or bandwidth.
- the step of determining a location of the device further comprises using at least a portion of the data from the one or more sensors along with the received localisation data.
- Using the sensor data can allow the device to determine or estimate what has changed between the time of sending the localisation request and receiving the localisation data, thus allowing an estimate of the movement since the sending of the localisation request and therefore more accurately determine the current position of the device.
- the odometry system can use sensor data to determine how much further the device has moved since sending the localisation request.
- the localisation request further comprises an estimated location.
- the step of determining a location of the device includes any of: determining the location of the device at the point in time that the localisation request was sent to the server system; or determining the current location of the device taking into account the point in time that the localisation request was sent to the server system.
- Determining the location of the device can involve determining the location of the device at the point at which it transmits the localisation data, or the point at which the localisation data is captured, or can involve determining the current position (or future position) based on for example movement data, speed data or previous localisation requests.
- said one or more sensors comprises at least one visual data sensor.
- said at least one visual data sensor comprises any or a combination of: an image camera; a video camera; a monocular camera; a depth camera; a stereo image camera; a high dynamic range camera, a light detection and ranging sensor; a radio detection and ranging sensor.
- said at least a portion of the data from the one or more sensors comprises visual data from the at least one visual data sensor.
- Using visual data such as from an image camera; a video camera; a monocular camera; a depth camera; a stereo image camera; a high dynamic range camera, a light detection and ranging sensor; or a radio detection and ranging sensor can allow localisation of a device by a server system based on known visual data stored by the server system.
- data from said one or more sensors is combined with data from an inertial measurement unit and/or a satellite positioning system.
- Using inertial measurement can allow a device to estimate its location based on movement relative to one or more previous known locations.
- Using data from a satellite positioning system (GPS) helps to narrow the location search.
- the step of estimating a location further comprises using previous localisation data.
- previous localisation data comprises any of: one or more previous estimates of location; one or more previous localisation requests; one or more of previous said localisation data received from the server system in response to previous said localisation requests; and previous determinations of location.
- the step of estimating a location is performed using any of: a local pose; or a local frame of reference.
- the localisation data received from the server system in response to said localisation request uses any of: a global pose; or a global frame of reference.
- a global to local pose transform or relationship; or a global frame of reference to local frame of reference transform or relationship is determined.
- Previous localisation data can be used by a device to more accurately determine or estimate its location.
- By establishing a correlation, transform, or relationship between the local frame of reference of the estimated and determined locations and the localisation data from the server system, a device can sometimes more accurately estimate or determine its position.
- the device comprises any of a mobile phone, a robot, augmented reality or virtual reality headset, or navigation device, a self-driving vehicle, a drone or other autonomous vehicle.
- the step of determining the location of the device is accurate to within several centimetres, optionally to within a centimetre.
- the method is persistent.
- determining the location of the device can be performed inside buildings or in dense urban areas.
- the sensor data comprises data on environmental conditions such as weather conditions.
- the server system comprises global map data; optionally wherein the global map data comprises any of: one or more sets of local map data and/or one or more sets of global map data and/or one or more global maps of interconnected local maps.
- the server system can store a large amount of map data in a global master map in order to provide relevant localisation data from the global master map data to devices when requested to do so.
- one or more sets of local map data and/or one or more sets of global map data can be stored within the global master map data.
- the server system can be any of: a single server, a distributed server system, a cloud system, a physical system or a virtual system.
- the global map can be distributed between a plurality of server systems.
- the global map can be hosted in a peer-to-peer arrangement on any of: one or more devices acting as a server system, or a mix of one or more server systems and one or more devices.
- the server system can be configured in a number of different ways in order to balance usage between computer hardware efficiently and with lower latency, or even between a mix of client devices and/or server systems.
- the device and server system communicate over any of: a mobile data network, or wireless data network.
- the devices can typically operate over a mobile data network, but can operate using other data networks.
- the device has an odometry system.
- the odometry system processes sensor data to determine relative position differences between successive sensor data.
- the position estimates have between three and six degrees of freedom.
- the odometry system performs full simultaneous location and mapping and/or loop closure and/or graph optimisation.
- the odometry system outputs the estimated position.
- the estimated position is relative to a local origin.
- An odometry system can use the inputs from the sensors on the device to estimate the location of the device, and can use the relative position differences determined from successive data from the sensors to determine relative movement of the device and therefore movement in the local frame of reference of the device in order to estimate the position of the device.
- the method can work to provide localisation in a number of different dimensions, including in two and three dimensions (but also in other numbers of dimensions, including, for example, one dimension).
- the previous localisation data is stored with a timestamp.
- Timestamp data can be provided by storing for example GPS data.
- the location of the device is determined in a global co-ordinate frame.
- the localisation request is in a local co-ordinate frame.
- the method further comprises the step of determining a transform between the global co-ordinate frame and the local co-ordinate frame.
- the localisation data is in a global co-ordinate frame.
- By determining a location in a global co-ordinate frame, the device can accurately localise within a global frame of reference. By operating in a local co-ordinate frame, the device can co-ordinate its sensor data and/or odometry data and/or estimated position. By determining a local-to-global transform, the device can determine its global position from its local position and/or from localisation data and/or from sensor data.
- a method of determining localisation data at a server system comprising the steps of: receiving a localisation request from one or more client systems, the localisation request comprising at least a portion of data from one or more sensors; determining relevant localisation nodes from a global map and aggregating this data as localisation data; sending the localisation data to the relevant one or more client systems in response to the localisation request.
- the localisation request includes an estimated location of the respective one or more client systems.
- Determining localisation data for a client system or device using localisation requests from a client system, using data from one or more sensors, can allow a client system to determine its location using a large amount of map data and significant processing power of a server system without needing the client system to have significant processing power, data storage or bandwidth.
- other data can be determined and/or extracted and/or aggregated from the global map to be sent to the client systems as localisation data.
- no aggregation needs to be performed.
- the localisation data is operable to determine a location of the device at a point in time that the localisation request was sent to the server system; or the localisation data is operable to determine the current location of the device taking into account the point in time that the localisation request was sent to the server system.
- Determining the location of the device can involve determining the location of the device at the point at which it transmits the localisation data, or the point at which the localisation data is captured, or can involve determining the current position (or future position) based on for example movement data, speed data or previous localisation requests.
- a further step is performed of providing feedback data to one or more client systems to send more or less of said portion of data.
- Providing feedback data to the client system(s) can prevent the client system(s) from sending too much data to the server system when not required for localisation or updating the global master map at the server, or can trigger the client system(s) to send more data in order to allow refinement or building of the global master map at the server.
- a further step is performed of updating the global map with at least a portion of the said at least a portion of data from one or more sensors received from the one or more client systems.
- the global map or global master map can be updated with new or updated information gathered by the sensors of the client device(s).
- said one or more sensors comprises at least one visual data sensor.
- said at least one visual data sensor comprises any or a combination of: an image camera; a video camera; a monocular camera; a depth camera; a stereo image camera; a high dynamic range camera, a light detection and ranging sensor; a radio detection and ranging sensor.
- said at least a portion of the data from the one or more sensors comprises visual data from the at least one visual data sensor.
- Using visual data such as from an image camera; a video camera; a monocular camera; a depth camera; a stereo image camera; a high dynamic range camera, a light detection and ranging sensor; or a radio detection and ranging sensor can allow localisation of a device by a server system based on known visual data stored by the server system.
- data from said one or more sensors is combined with data from an inertial measurement unit and/or a satellite positioning system.
- Using inertial measurement can allow a device to estimate its location based on movement relative to one or more previous known locations.
- Using data from a satellite positioning system (GPS) helps to narrow the location search.
- the step of estimating a location further comprises using previous localisation data.
- previous localisation data comprises any of: one or more previous estimates of location; one or more previous localisation requests; one or more of previous said localisation data received from the server system in response to previous said localisation requests; and previous determinations of location.
- the step of estimating a location is performed using any of: a local pose; or a local frame of reference.
- the localisation data received from the server system in response to said localisation request uses any of: a global pose; or a global frame of reference.
- a global to local pose transform or relationship; or a global frame of reference to local frame of reference transform or relationship is determined.
- Previous localisation data can be used by a device to more accurately determine or estimate its location.
- By establishing a correlation, transform, or relationship between the local frame of reference of the estimated and determined locations and the localisation data from the server system, a device can sometimes more accurately estimate or determine its position.
- the device comprises any of a mobile phone, a robot, augmented reality or virtual reality headset, or navigation device, a self-driving vehicle, a drone or other autonomous vehicle.
- the step of determining the location of the device is accurate to within several centimetres, optionally to within a centimetre.
- the method is persistent.
- determining the location of the device can be performed inside buildings or in dense urban areas.
- the sensor data comprises data on environmental conditions such as weather conditions.
- the method of this aspect can be very accurate, and can be operational continuously to allow devices to determine their location accurately, and can be used inside buildings or in dense urban areas or in different environmental or weather conditions to determine the location of a device.
- the server system comprises global map data; optionally wherein the global map data comprises any of: one or more sets of local map data and/or one or more sets of global map data and/or one or more global maps of interconnected local maps.
- the server system can store a large amount of map data in a global master map in order to provide relevant localisation data from the global master map data to devices when requested to do so.
- one or more sets of local map data and/or one or more sets of global map data can be stored within the global master map data.
- the server system can be any of: a single server, a distributed server system, a cloud system, a physical system or a virtual system.
- the global map can be distributed between a plurality of server systems.
- the global map can be hosted in a peer-to-peer arrangement on any of: one or more devices acting as a server system, or a mix of one or more server systems and one or more devices.
- the server system can be configured in a number of different ways in order to balance usage between computer hardware efficiently and with lower latency, or even between a mix of client devices and/or server systems.
- the device and server system communicate over any of: a mobile data network, or wireless data network.
- the devices can typically operate over a mobile data network, but can operate using other data networks.
- the device has an odometry system.
- the odometry system processes sensor data to determine relative position differences between successive sensor data.
- the position estimates have between three and six degrees of freedom.
- the odometry system performs full simultaneous location and mapping and/or loop closure and/or graph optimisation.
- the odometry system outputs the estimated position.
- the estimated position is relative to a local origin.
- An odometry system can use the inputs from the sensors on the device to estimate the location of the device, and can use the relative position differences determined from successive data from the sensors to determine relative movement of the device and therefore movement in the local frame of reference of the device in order to estimate the position of the device.
- the previous localisation data is stored with a timestamp.
- Timestamp data can be provided by storing for example GPS data.
- the location of the device is determined in a global co-ordinate frame.
- the localisation request is in a local co-ordinate frame.
- the method further comprises the step of determining a transform between the global co-ordinate frame and the local co-ordinate frame.
- the localisation data is in a global co-ordinate frame.
- By determining a location in a global co-ordinate frame, the device can accurately localise within a global frame of reference. By operating in a local co-ordinate frame, the device can co-ordinate its sensor data and/or odometry data and/or estimated position. By determining a local-to-global transform, the device can determine its global position from its local position and/or from localisation data and/or from sensor data.
- a method of updating a global map at a server system comprising the steps of: receiving sensor data from one or more client systems; and adding the sensor data to the global map.
- estimated location data is received along with sensor data.
- the step of adding the sensor data to the global map comprises any of: creating a new node in the global map; updating or amending an existing node in the global map; or deleting a node in the global map.
- the sensor data comprises any of: environmental data; or weather data; satellite positioning data; inertial measurement data; image data; or video data.
- a further step is performed of optimising the global map.
- the step of optimising comprises performing global loop-closures to link positions of different localisation nodes.
- Updating the global map or global master map using sensor data from client systems can allow the global map to adapt over time and continuously improve, including by growing larger in size and richer in data for, for example, different environmental conditions.
- a further step is performed of sending an instruction to the one or more client systems to provide more or less sensor data to the server system.
- Providing an instruction to the client system(s) can prevent the client system(s) from sending too much data to the server system when not required for localisation or updating the global map at the server, or can trigger the client system(s) to send more data in order to refine or build the global map at the server.
- updating the global map is performed at intervals, optionally at time intervals or regular time intervals, or optionally when a predetermined threshold of sensor data has been received.
- a system comprising at least one server system or apparatus operable to perform the method of the first aspect and at least one client system or apparatus operable to perform the method of the second aspect.
- The terms server system and client system, throughout the specification, can respectively be replaced with server apparatus and client apparatus.
- a computer program product for providing the method or system of the above aspects.
- Figure 1 illustrates an overview of the system according to an embodiment
- Figure 2 illustrates a flowchart showing the client system operation according to an embodiment
- Figure 3 illustrates a flowchart showing the localiser operation according to an embodiment
- Figure 4 illustrates a flowchart showing the map updater operation according to an embodiment.
- the system 1 comprises one server system 2 communicating with at least one client system 3.
- the server system 2 of this embodiment will now be described in more detail below.
- the server system 2 is running on and implemented using cloud infrastructure, but in other embodiments the server system 2 may have a variety of physical and/or virtual configurations. In other embodiments, for example, there may be one or more servers and/or server systems and, where there are more than one servers and/or server systems, these may be configured to act as a single server or server system or as multiple independent servers or server systems and may or may not be in direct communication with each other.
- a global master map 22 is maintained at the server system 2, the global master map 22 having or using a global frame, i.e. co-ordinates providing a global frame of reference.
- Each client system 3 has or uses its own local frame, i.e. local frame of reference.
- the global master map may comprise multiple local and/or global maps, each with their own respective local and/or global reference frame.
- the client system 3 and/or server system 2 may not be able to relate the local reference frames of multiple client systems 3 to each other so these may need to exist as separate local and/or global maps within the global master map.
- If a client system 3 is operating indoors, for example in a factory, and has no need to leave the factory, then it will not usually be possible, for example using GPS co-ordinates, to relate its map to outdoor local maps or to other local and/or global map(s) relevant to other client systems 3.
- a global master map 22 is stored on the server system 2, and the global master map 22 can be shared with client devices 3 in communication with the server system 2.
- the global master map 22 can be continuously or periodically updated, for example with data provided from the client devices 3.
- the global master map 22 can be refined based on data provided, or calculations or operations performed on relevant data.
- the global master map 22 can be used for localisation by the server system 2 and/or one or more of the client devices 3.
- all or a portion of the global master map 22 may be stored in a distributed fashion across a plurality of server instances or cloud infrastructure.
- the system of some, or all, of these embodiments can be used in distributed large-scale scenarios with low-cost client hardware, such as mobile phones, and/or with any, or all, of: augmented reality headsets, self-driving cars, drones, and other robots.
- The server system 2 and client system 3 are in communication with each other, typically through a bandwidth-restricted communication channel; in this embodiment the communications channel 4 is a mobile 'phone cellular data network.
- In other embodiments, other wireless data networks may be used instead of or in addition to a mobile 'phone cellular data network.
- The role of the server system 2 in this embodiment is to maintain and update a consistent global master map 22, and to respond to global localisation requests 39 from client devices 3 using the global master map 22 data stored on the server system 2.
- As the communications are typically made over a bandwidth-restricted communication channel, it is anticipated that the global localisation requests 39 from each client device occur with low frequency in order to minimise bandwidth usage and/or utilise the available bandwidth in an efficient manner.
- a localisation response 26 is sent by the server system 2 to the client system 3 that sent the localisation request 39.
- the client system 3 of this embodiment will now be described in more detail.
- the client system 3 comprises a global pose record 30, which is in communication with the server system 2 and receives localisation responses 26 from the server system 2.
- the global pose record 30 is in communication with the global pose estimator 32.
- the global pose estimator 32 is also in communication with the local pose record 36 and the odometry system 34.
- the global pose estimator 32 outputs the estimated global position of the client system 3.
- The odometry system 34 and the local pose record 36 are in communication with the sensors, for example including sensor 40 (which can be an image sensor or video sensor, depth camera sensor or LIDAR sensor), and optionally an inertial measurement unit (IMU) 42 and satellite positioning system (such as GPS, for example) 44, the data from which can be combined or blended 38.
- the global pose estimator 32 outputs the estimated global position of the client system 3 and this can be combined with the combined data 38 to be sent as a localisation request to the server system 2.
- the one or more client systems 3 of this embodiment run on client hardware equipped with at least one sensor 40, for example devices such as a mobile 'phone, augmented reality headset, drone, or autonomous car driving system.
- The at least one sensor 40 of the client hardware can, for example, be any one or a combination of: visible spectrum cameras, depth cameras, LIDAR sensors, an IMU 42 and/or a satellite positioning system 44.
- The role of each client system 3 is to process client system data acquired from at least one of the sensors 40, IMU 42 and satellite positioning system 44, communicate with the server system 2, and maintain a real-time global position estimate of the client system 3.
- the client system 3 comprises an odometry system 34 which processes the client combined data and translates this data into relative position differences. Based on the sensor used (or in combination with IMU 42 and/or satellite positioning system 44) this can be implemented in several ways. For example, in the case of a camera sensor, the camera movement can be determined by matching and triangulating observed image features or pixel intensities in between successive measurements to provide visual odometry.
- Mesh-matching methods such as iterative closest point optimisation can be used to achieve similar pose estimates in active sensors such as depth cameras or LIDAR sensors.
- this system can be solely based on or aided by an inertial measurement unit (IMU).
- several such measurements coming from different sensors and modalities can be integrated into one pose estimate using methods such as Kalman Filtering to compensate for individual sensor drawbacks and to achieve higher robustness and accuracy.
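As a highly simplified illustration of integrating measurements from different sensors into one estimate, the following Python sketch applies a one-dimensional Kalman-style update to combine an odometry prediction with a satellite-positioning fix; the variances and the function name are invented for the example and are not taken from the specification.

```python
# Toy 1-D fusion of an odometry prediction with an absolute position fix,
# weighting each by its variance (a single Kalman-filter update step).
def kalman_update(pred_mean, pred_var, meas_mean, meas_var):
    gain = pred_var / (pred_var + meas_var)
    mean = pred_mean + gain * (meas_mean - pred_mean)
    var = (1.0 - gain) * pred_var
    return mean, var

# Prediction from odometry: accurate short-term but drifts, so variance grows.
pred_mean, pred_var = 12.3, 4.0
# Absolute measurement, e.g. from satellite positioning: noisy but drift-free.
meas_mean, meas_var = 11.0, 9.0

fused_mean, fused_var = kalman_update(pred_mean, pred_var, meas_mean, meas_var)
print(fused_mean, fused_var)  # the fused estimate lies between the two inputs
```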
- the position estimates take the form of a pose with six degrees-of-freedom (three-dimensional rotation and three-dimensional translation), but in the case of embodiments relating to, for example, planar automotive scenarios this can be reasonably reduced to three degrees of freedom (rotation and two-dimensional translation). In some embodiments, only one degree of freedom can be used.
- odometry typically accumulates an error over time if based purely on local sensor data and estimates - a problem known as "drift".
- odometry can be extended to a full client simultaneous location and mapping (SLAM) system utilizing loop closure and graph optimisation procedures. Implementations of such systems will depend on the type of sensor used, such as, for example monocular or stereo camera, depth camera, or laser sensors.
- the output of the odometry system 34 can provide a substantially high-quality estimate (or estimates) of device position in relation to some arbitrary local origin (the arbitrary local origin is typically the position of the device where the system started or initiated).
- the arbitrary local origin is typically the position of the device where the system started or initiated.
- To obtain a global localisation, i.e. a substantially high-accuracy position in a global map, the client system 3 regularly performs "global localisation requests" 39 to the server system 2.
- a summary of recent sensor inputs in a form of, for example, image or video data, depth maps, features, relationship to previous localisation requests etc. is aggregated to create a localisation request 39.
- In some cases, data will only be available from one, typically high-frequency, sensor, such as an IMU, and so only this data is transmitted in the localisation request 39.
- In other cases, data may be available from a plurality of sensors, for example an image from a visual sensor along with IMU data, which can be transmitted together in the localisation request 39.
- This localisation request 39 is usually of a much smaller data size, and is made at a much lower frequency, than the equivalent raw sensor data; however, given sufficient bandwidth, the raw sensor data can optionally be streamed directly to the server system as a continuous localisation request (and similarly the localisation response from the server can then be intermittent or continuous).
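One way such a summary could be made much smaller than the raw sensor data is to transmit sparse image features rather than full frames. The sketch below assumes OpenCV's ORB detector and an invented message layout; it is illustrative only and not a requirement of the described method.

```python
import numpy as np
import cv2  # assumes opencv-python is installed

def build_localisation_request(image: np.ndarray, imu_sample: dict) -> dict:
    """Summarise a camera frame and an IMU sample into a compact request."""
    gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
    orb = cv2.ORB_create(nfeatures=500)
    keypoints, descriptors = orb.detectAndCompute(gray, None)
    return {
        "keypoints": [(kp.pt[0], kp.pt[1]) for kp in keypoints],
        "descriptors": descriptors.tobytes() if descriptors is not None else b"",
        "imu": imu_sample,  # e.g. the latest accelerometer/gyro reading
    }

if __name__ == "__main__":
    frame = np.random.randint(0, 255, (480, 640, 3), dtype=np.uint8)
    request = build_localisation_request(frame, {"accel": [0.0, 0.0, 9.81]})
    print(len(request["keypoints"]), "features summarised")
```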
- This localisation request 39 is then sent to a localiser module or (sub-)process 20 at the server system 2. Simultaneously the current estimate of the device position in the local coordinate frame produced by the odometry is added to the client "local pose record" database 36.
- The localiser 20 at the server system 2 responds to the localisation request 39 from the client device 3 with an estimate of the "global pose" of the device 3 at the time of the issued localisation request 39, sent by the server system 2 as a localisation response 26.
- This localisation response 26, when received by the client system 3, is then stored in the "Global Pose Record" database 30 on the client system 3.
- The relative and global poses of these requests are retrieved from the Local and Global Pose Records 36, 30 and compared to provide the estimate of the local origin pose in the global map of the client system 3.
- This estimate is then combined with subsequent high-frequency device pose/location estimates in the local coordinate frame from the odometry system 34 to provide a high-frequency device pose, or location, in the global coordinate frame.
- This can be achieved by translating high-frequency device pose/location estimates by the estimated local to the global coordinate frame transform.
- further optimisation can be performed to combine local position estimates with one or more responses based on properties of local odometry and global localisation system (for example, GPS).
- the information stored in the local and/or global pose records is a list of local and/or global positions of the device. Each position is associated with a particular time and unique timestamp or ID. In some embodiments, as an alternative or addition, each position can be associated with GPS data which can include timestamp data. The aforementioned time might be that of a particular sensor measurement, and the timestamp or ID can be used to cross-reference the local and global record. Relating the local and global pose of one or multiple device poses together with the current local device pose gives the current global device pose. In other embodiments, additional or alternative information can be stored in the local or global pose records.
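A possible (purely illustrative) layout for the local and global pose records, using a shared timestamp/ID to cross-reference entries between the two records, is sketched below; the structure and names are assumptions rather than the specification's implementation.

```python
from dataclasses import dataclass

@dataclass
class PoseRecordEntry:
    stamp_id: int   # unique timestamp or ID of the sensor measurement
    pose: list      # device pose in this record's co-ordinate frame

local_pose_record = {}   # stamp_id -> pose in the local frame (from odometry)
global_pose_record = {}  # stamp_id -> pose in the global frame (from responses)

def add_local(stamp_id, pose):
    local_pose_record[stamp_id] = PoseRecordEntry(stamp_id, pose)

def add_global(stamp_id, pose):
    global_pose_record[stamp_id] = PoseRecordEntry(stamp_id, pose)

def matched_pairs():
    """Return (local, global) pose pairs cross-referenced by timestamp/ID."""
    shared = local_pose_record.keys() & global_pose_record.keys()
    return [(local_pose_record[i].pose, global_pose_record[i].pose)
            for i in sorted(shared)]

add_local(1001, [0.0, 0.0, 0.0])
add_global(1001, [52.0, -1.3, 0.7])  # e.g. from a localisation response
print(matched_pairs())
```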
- the principle of operation 100 of the client system 3 is set out in the flowchart of Figure 2, which will now be described in further detail.
- The client system 3 needs to determine a local-to-global frame transform.
- the client system 3 carries out the step of fetching the positions of past globally localised images in the global frame from the global pose record 30.
- the client system 3 carries out the step of fetching the positions of past globally localised images in the local frame from the local pose record 36.
- the positions are compared in order to compute a local-to-global frame transform.
- The client system 3 in step 108 obtains or receives sensor data, for example image or video data from a camera. Then, in step 110, the client system 3 computes the position of the current image from the camera in the local frame. Next, in step 112, and using the previously- or parallel-computed local-to-global frame transform determined in step 106, the client system 3 computes the position of the current image from the camera in the global frame. Then, in step 114, the client system outputs the position of the current image in the global frame.
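A simplified planar version of steps 102 to 114 might estimate the local-to-global frame transform from a single matched pose pair and apply it to the current local pose, as in the sketch below; a real system would use many pairs and a proper alignment method, and the helper names here are invented.

```python
import numpy as np

def se2(x, y, theta):
    """Homogeneous 2D transform for a planar pose (x, y, heading)."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, x], [s, c, y], [0.0, 0.0, 1.0]])

# Steps 102/104: position of a past globally localised image in both frames.
pose_global = se2(100.0, 50.0, np.pi / 4)  # from the global pose record
pose_local = se2(2.0, 1.0, 0.0)            # from the local pose record

# Step 106: local-to-global frame transform (T_global = T_l2g @ T_local).
local_to_global = pose_global @ np.linalg.inv(pose_local)

# Steps 108/110: current image position in the local frame (from odometry).
current_local = se2(3.0, 1.5, 0.1)

# Steps 112/114: current position in the global frame.
current_global = local_to_global @ current_local
print(np.round(current_global, 3))
```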
- the server system 2 comprises three modules, systems or sub-systems: the global master map 22, localiser 20 and map updater 24.
- the global master map 22 is in communication with the localiser 20, and the localiser 20 can retrieve data from the global master map 22 as needed to perform the functions of the localiser 20.
- the map updater 24 is also in communication with the global master map 22 and can both input to and retrieve data from the global master map 22.
- the map updater 24 also receives data received by the server system 2 from client systems 3, specifically localisation requests 39.
- the map updater 24 is also in communication with the localiser 20 and also receives the localisation responses 26 sent by the localiser 20 in response to each localisation request 39.
- the global master map 22 comprises a collection of localisation nodes. Each localisation node summarises a particular sensory experience at a particular place (such as a picture, position of visual features, depth map or three-dimensional point cloud of the environment, for example) and, optionally, metadata (such as a combination of different weather conditions and/or lighting conditions, for example) required for the purpose of localiser 20 operation. Each localisation node has an estimate of its pose assigned to it in the global co-ordinate frame.
- the global master map 22 might contain links or cross-references/relationships between localisation nodes and their positions within the global master map 22.
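A possible in-memory representation of such a localisation node, with its sensory summary, optional metadata and global pose estimate, is sketched below; the field names are illustrative assumptions, not terms from the specification.

```python
from dataclasses import dataclass, field
from typing import Dict, List, Optional

@dataclass
class LocalisationNode:
    node_id: int
    global_pose: List[float]             # pose estimate in the global frame
    signature: bytes                     # e.g. compressed visual signature
    point_cloud: Optional[bytes] = None  # depth map / 3D points, if stored
    metadata: Dict[str, str] = field(default_factory=dict)  # weather, lighting...
    links: List[int] = field(default_factory=list)          # related node IDs

# The global master map as a simple collection of such nodes.
global_master_map: Dict[int, LocalisationNode] = {}

global_master_map[1] = LocalisationNode(
    node_id=1,
    global_pose=[51.5, -0.1, 0.0],
    signature=b"\x01\x02",
    metadata={"weather": "overcast", "lighting": "daylight"},
)
print(global_master_map[1])
```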
- the localiser 20 receives one or more localisation requests 39 from one or more client systems 3.
- the localiser 20 searches the global master map 22 for relevant localisation nodes capturing similar past sensory experiences from the localisation request 39, termed querying the localisation map for relevant nodes and their positions.
- the search can be performed by extraction of visual signatures, for example those based on image features or a neural-network-based encoding to search for nodes with a similar signature by using data acquired from the one or more sensors 40. Additionally, the search can be accelerated based on a location estimate, for instance as given by device satellite (e.g. GPS) position or the history of its previous localisation requests to consider only nodes in a nearby area. Various statistical methods such as tf-idf can be further used to exploit statistical properties and relevance of the individual localisation nodes for the purpose of localisation.
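As a rough sketch of how the localiser might narrow the search with a coarse location prior before ranking candidates by visual-signature similarity, consider the following; a plain nearest-descriptor ranking stands in here for whatever signature matching or tf-idf weighting is actually used, and the radius and field names are invented.

```python
import numpy as np

def find_candidate_nodes(nodes, query_signature, gps_estimate, radius_m=100.0, k=3):
    """Filter nodes by a rough GPS prior, then rank by signature similarity."""
    nearby = [n for n in nodes
              if np.linalg.norm(np.array(n["position"]) - np.array(gps_estimate)) < radius_m]
    # Smaller descriptor distance = more similar past sensory experience.
    nearby.sort(key=lambda n: np.linalg.norm(np.array(n["signature"]) - np.array(query_signature)))
    return nearby[:k]

nodes = [
    {"id": 1, "position": [0.0, 0.0], "signature": [0.1, 0.9, 0.3]},
    {"id": 2, "position": [20.0, 5.0], "signature": [0.8, 0.1, 0.2]},
    {"id": 3, "position": [900.0, 0.0], "signature": [0.1, 0.9, 0.3]},  # too far away
]
print(find_candidate_nodes(nodes, query_signature=[0.8, 0.2, 0.2], gps_estimate=[15.0, 0.0]))
```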
- In step 206, based on the data stored in the localisation node and the localisation request 39, the localisation process then performs relative pose estimation between the global position of the localisation node and the localisation request 39 to determine the global pose of the client device 3 at the time of issuing the request (i.e. to localise against what are determined to be the relevant nodes of the localisation map).
- This information is then reported back to the client system in the form of localisation response 26, where the information is aggregated in step 208 from the localisation results from step 206 and then reported in step 210.
- the localisation response can contain part of the master map to be transmitted to the client device to serve localisation requests locally on the client device.
- The map updater 24 is notified of the localisation request 39 and response 26, and queues the incoming data so that map refinement can be performed at regular intervals, updating the global master map 22 with data from one or more localisation requests received at the server system; this will now be described with reference to the flow chart in Figure 4. While in some embodiments the global master map 22 can be updated every time a new localisation request 39 is received, batch optimisation can allow the process of updating the global master map 22 to be performed more efficiently at intervals.
- In step 302, when the map updater 24 receives notification of localisation requests 39 and responses 26, the map updater 24 extracts the information from the localisation request 39 and response 26.
- The information extracted from the localisation request 39 and response 26 is used to update an existing localisation node, or to create a new localisation node, in the global master map 22 in step 304.
- the updated or created new localisation node is included in the global master map 22 in step 306, which might result in an extension of the global master map 22.
- links can optionally be added to the relevant nodes in the localisation map in step 308.
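A schematic version of steps 302 to 308, in which information extracted from a request/response pair either updates an existing node or creates and links a new one, could look like the following; the map structure and function are assumptions for illustration.

```python
global_master_map = {}  # node_id -> node data
next_node_id = 1

def update_map(request, response, match_id=None):
    """Add information from a localisation request/response to the global map."""
    global next_node_id
    if match_id in global_master_map:
        # Step 304 (update): refresh an existing node with the new observation.
        global_master_map[match_id]["signature"] = request["signature"]
        return match_id
    # Steps 304/306 (create/extend): add a new node at the globally localised pose.
    node_id = next_node_id
    next_node_id += 1
    global_master_map[node_id] = {
        "global_pose": response["global_pose"],
        "signature": request["signature"],
        "links": [match_id] if match_id is not None else [],  # step 308
    }
    return node_id

new_id = update_map({"signature": b"\x09"}, {"global_pose": [3.0, 4.0, 0.2]})
print(global_master_map[new_id])
```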
- In step 310, the map updater 24 performs optimisation to refine this information, for example using either pose-graph optimisation or bundle adjustment.
- the principle of this optimisation is to perform global map loop-closures to link the positions of different localisation nodes originating from the same place.
- These links can contain a relative six-degrees-of-freedom (3D position and 3D rotation) difference between the positions of the nodes.
- the relative position can include an additional difference in scale for scale-free systems such as monocular cameras giving 7 degrees of freedom (3D position, 3D rotation and a seventh dimension relating to the relative scale difference).
- the poses of the nodes are then optimised to minimise the total cost of the graph. This cost is calculated using the constraints imposed upon the graph by the relative poses estimated between nodes.
- this optimisation can include re-computing the position of localisation features or other stored information used for location.
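To make the graph-cost minimisation concrete, the toy sketch below optimises one-dimensional node positions against relative-position constraints (including a loop closure) using SciPy's least-squares solver; a real system would optimise full six- or seven-degree-of-freedom poses with a dedicated pose-graph or bundle-adjustment library.

```python
import numpy as np
from scipy.optimize import least_squares

# Constraints between nodes: (i, j, measured relative position of j w.r.t. i).
constraints = [(0, 1, 1.0), (1, 2, 1.1), (2, 0, -2.0)]  # the last closes a loop

def residuals(positions):
    # One residual per constraint: predicted relative position minus measurement.
    return [positions[j] - positions[i] - rel for i, j, rel in constraints]

initial = np.array([0.0, 1.0, 2.3])        # drifted initial node estimates
result = least_squares(residuals, initial)
print(np.round(result.x - result.x[0], 3))  # optimised positions, anchored at node 0
```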
- one or more client systems can be requested by the server system to modify the frequency and amount of information included in the localisation requests from those one or more client systems (i.e. a sub-set of the client systems may be requested to modify the information transmitted to the server system, to increase or decrease the amount of data transmitted to the server system).
- This can serve to optimise bandwidth usage, reflecting the varying need for more data in different regions of the global master map 22, based on location and environmental conditions. For example, if the map does not contain enough relevant information for the location region and/or certain environmental conditions (e.g. particular weather or lighting conditions), the client system can be requested to compensate by sending more data and at a higher rate.
- Conversely, where the map already contains sufficient relevant information, the localisation requests and map updates need not be performed as often.
- the bandwidth usage is expected to decrease over time as the global master map 22 gathers more data.
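A crude illustration of this feedback loop, in which the server asks a client to raise or lower its reporting rate depending on how well the relevant map region and conditions are already covered, is given below; the thresholds and message fields are invented for the example.

```python
def data_rate_instruction(node_count_nearby: int, conditions_seen: set,
                          current_conditions: str) -> dict:
    """Decide whether a client should send more or less data for its region."""
    if node_count_nearby < 10 or current_conditions not in conditions_seen:
        # Sparse map or unseen conditions: ask the client for more, richer data.
        return {"send_rate_hz": 2.0, "include_raw_images": True}
    # Well-covered region: requests and map updates can be less frequent.
    return {"send_rate_hz": 0.2, "include_raw_images": False}

print(data_rate_instruction(3, {"sunny"}, "rain"))
print(data_rate_instruction(50, {"sunny", "rain"}, "rain"))
```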
- the system can be configured to selectively disable either map updating or localisation such that the system only then performs the other function.
- The performance of the entire system 1 can be determined by the plurality of localisation nodes in the global master map 22, the quality of the position estimates of those nodes in the global co-ordinate frame, the quality of localisation towards these nodes by the localiser module 20, and the quality of the odometry system 34 in the client system 3.
- the quality of global master map 22 can increase with the amount of data collected by the client systems 3 and integrated as localisation nodes into the global master map 22 on the server system 2. Therefore, the system 1 can perform better over time as more data at different conditions is collected and observed.
- some or all of the client systems do not estimate their position and/or do not transmit an estimated position along with sensor data in localisation requests to the server. Typically, this would be the case for devices that have either a temporarily disabled or malfunctioning odometry system, or limited functionality or hardware, or are producing varyingly inaccurate estimates of position.
- In some embodiments, the sensor(s) 40 relate only to visual data sensors such as: an image camera; a video camera; a monocular camera; a depth camera; a stereo image camera; a high dynamic range camera; a LIDAR sensor; and a radio detection and ranging sensor.
- sensor data may be combined with data from an IMU unit 42 and/or a satellite positioning system (GPS) 44.
- any feature in one aspect of the invention may be applied to other aspects of the invention, in any appropriate combination.
- method aspects may be applied to system aspects, and vice versa.
- any, some and/or all features in one aspect can be applied to any, some and/or all features in any other aspect, in any appropriate combination.
Abstract
The present invention relates to a method of localisation for devices. More particularly, it relates to the use of both local and remote resources to provide substantially real-time localisation at a device. According to an aspect, there is provided a method of determining a location of a device having one or more sensors, comprising the steps of: sending a localisation request to a server system, the localisation request comprising at least a portion of data from the one or more sensors; receiving localisation data from the server system in response to the localisation request; and determining a location of the device from the received localisation data. Optionally, the method includes the further step of estimating a location of the device based on data from the one or more sensors, and wherein the step of determining a location of the device includes determining the location of the device using the estimated location.
Description
LOCALISATION OF MOBILE DEVICE USING IMAGE AND NON-IMAGE SENSOR DATA IN SERVER PROCESSING
Field
The present invention relates to a method of localisation for devices. More particularly, the present invention relates to the use of both local and remote resources to provide substantially real-time localisation at a device.
Background
The capability for devices to determine with a high degree of precision where the device is located within a three-dimensional space, for example with precision within a centimetre, is likely to be pivotal for many robotics, augmented, and virtual reality applications.
For example, this level of precision can allow a robot to operate safely and efficiently in its environment or can allow the display of augmented content at the appropriate place accurately.
This level of precision cannot be achieved with current global satellite positioning technologies for various reasons, including for example the challenging environmental conditions experienced in at least some, if not most, locations (atmospheric effects, the reflection of waves in urban environments, sky visibility etc.). In particular, satellite positioning within buildings and dense urban environments is typically very imprecise.
Moreover, for localisation functionality with the above-mentioned level of precision to be useful in robotics and augmented reality applications, it must be sufficiently robust. For it to be sufficiently robust such localisation should work persistently, in all weather conditions, in changing environments, both indoors and outdoors, at city-level scale, and in real-time.
Classical SLAM (simultaneous localisation and mapping) solutions are not sufficiently robust, as they are typically not appropriate for large-scale use or for the degree of environmental change observed in the real world: most implementations are designed for experiments in a certain set of environmental conditions and within a certain size of map, to avoid requiring a large amount of storage and processing power.
Summary of Invention
Aspects and/or embodiments seek to provide a distributed localisation and/or mapping system capable of delivering substantially high-accuracy real-time localisation at a device.
According to a first aspect, there is provided a method of determining a location of a device having one or more sensors, comprising the steps of: sending a localisation request to a server system, the localisation request comprising at least a portion of data from the one or more sensors; receiving localisation data from the server system in response to the localisation request; and determining a location of the device from the received localisation data.
Optionally, the method includes the further step of estimating a location of the device based on data from the one or more sensors and wherein the step of determining a location of the device includes determining the location of the device using the estimated location.
Determining the location of a device using localisation data from a server system, using data from one or more sensors, can allow a device to localise using a large amount of map data and significant processing power (at a server system) without needing the device to have significant processing power, data storage or bandwidth.
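As a rough illustration of the request/response flow described above, the following Python sketch shows a client packaging a portion of its sensor data into a localisation request and determining a location from the returned localisation data; the types and function names (LocalisationRequest, send_request, and so on) are hypothetical and not taken from the specification.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class LocalisationRequest:
    sensor_data: bytes  # e.g. a compressed image or feature summary
    estimated_location: Optional[List[float]] = None  # optional local estimate

@dataclass
class LocalisationResponse:
    global_pose: List[float]  # e.g. [x, y, z, qx, qy, qz, qw] in the global frame

def send_request(request: LocalisationRequest) -> LocalisationResponse:
    """Stand-in for the network call to the server system; a real client
    would transmit the request over a mobile or wireless data network."""
    return LocalisationResponse(global_pose=[1.0, 2.0, 0.0, 0.0, 0.0, 0.0, 1.0])

def determine_location(sensor_data: bytes,
                       local_estimate: Optional[List[float]] = None) -> List[float]:
    # Send only a portion of the sensor data, optionally with a local estimate.
    response = send_request(LocalisationRequest(sensor_data, local_estimate))
    # The device's location is determined from the received localisation data.
    return response.global_pose

if __name__ == "__main__":
    print(determine_location(b"fake-image-bytes"))
```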
Optionally the step of determining a location of the device further comprises using at least a portion of the data from the one or more sensors along with the received localisation data.
Using the sensor data can allow the device to determine or estimate what has changed between the time of sending the localisation request and receiving the localisation data, thus allowing an estimate of the movement since the sending of the localisation request and therefore more accurately determine the current position of the device. For example, the odometry system can use sensor data to determine how much further the device has moved since sending the localisation request.
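As a simple illustration, if the localisation response gives the device's position at the time the request was sent, the displacement measured by the odometry system while waiting for the response can be added on to estimate the current position; the purely translational toy example below assumes NumPy and invented values.

```python
import numpy as np

# Position at the moment the localisation request was sent (from the response).
position_at_request = np.array([10.0, 5.0, 0.0])

# Displacement measured by the odometry system since the request was sent.
movement_since_request = np.array([0.4, 0.1, 0.0])

# Estimate of the current position of the device.
current_position = position_at_request + movement_since_request
print(current_position)
```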
Optionally, the localisation request further comprises an estimated location.
Optionally, the step of determining a location of the device includes any of: determining the location of the device at the point in time that the localisation request was sent to the server system; or determining the current location of the device taking into account the point in time that the localisation request was sent to the server system.
Determining the location of the device can involve determining the location of the device at the point at which it transmits the localisation data, or the point at which the localisation data is captured, or can involve determining the current position (or future position) based on for example movement data, speed data or previous localisation requests.
Optionally, said one or more sensors comprises at least one visual data sensor. Optionally, said at least one visual data sensor comprises any or a combination of: an image camera; a video camera; a monocular camera; a depth camera; a stereo image camera; a high dynamic range camera, a light detection and ranging sensor; a radio detection and ranging sensor.
Optionally, said at least a portion of the data from the one or more sensors comprises visual data from the at least one visual data sensor.
Using visual data such as from an image camera; a video camera; a monocular camera; a depth camera; a stereo image camera; a high dynamic range camera, a light detection and ranging sensor; or a radio detection and ranging sensor can allow localisation of a device by a server system based on known visual data stored by the server system.
Optionally, data from said one or more sensors is combined with data from an inertial measurement unit and/or a satellite positioning system.
Using inertial measurement can allow a device to estimate its location based on movement relative to one or more previous known locations. Using data from a satellite positioning system (GPS) helps to narrow the location search.
Optionally, the step of estimating a location further comprises using previous localisation data.
Optionally, previous localisation data comprises any of: one or more previous estimates of location; one or more previous localisation requests; one or more of previous said localisation data received from the server system in response to previous said localisation requests; and previous determinations of location. Optionally, the step of estimating a location is performed using any of: a local pose; or a local frame of reference. Optionally, the localisation data received from the server system in response to said localisation request uses any of: a global pose; or a global frame of reference. Optionally, a global to local pose transform or relationship; or a global frame of reference to local frame of reference transform or relationship is determined.
Previous localisation data can be used by a device to more accurately determine or estimate its location. By establishing a correlation, or transform, or relationship between the local frame of reference of the estimated locations and determined locations and the localisation data from the server system, a device can sometimes more accurately estimate or determine its position.
Optionally, the device comprises any of: a mobile phone; a robot; an augmented reality or virtual reality headset; a navigation device; a self-driving vehicle; a drone; or another autonomous vehicle.
Many devices can be used with this aspect in order to determine their location using the method of this aspect.
Optionally, the step of determining the location of the device is accurate to within several centimetres, optionally to within a centimetre. Optionally, the method is persistent. Optionally, determining the location of the device can be performed inside buildings or in dense urban areas. Optionally, the sensor data comprises data on environmental conditions such as weather conditions.
The method of this aspect can be very accurate, can be operational continuously to allow devices to determine their location accurately, and can be used inside buildings, in dense urban areas or in different environmental or weather conditions to determine the location of a device.

Optionally, the server system comprises global map data; optionally wherein the global map data comprises any of: one or more sets of local map data and/or one or more sets of global map data and/or one or more global maps of interconnected local maps.
The server system can store a large amount of map data in a global master map in order to provide relevant localisation data from the global master map data to devices when requested to do so. Optionally, one or more sets of local map data and/or one or more sets of global map data can be stored within the global master map data.
Optionally, the server system can be any of: a single server, a distributed server system, a cloud system, a physical system or a virtual system. Optionally, the global map can be distributed between a plurality of server systems. Optionally, the global map can be hosted in a peer-to-peer arrangement on any of: one or more devices acting as a server system, or a mix of one or more server systems and one or more devices.
The server system can be configured in a number of different ways in order to balance usage across computer hardware efficiently and with lower latency, or even across a mix of client devices and/or server systems.

Optionally, the device and server system communicate over any of: a mobile data network, or wireless data network.
The devices can typically operate over a mobile data network, but can operate using other data networks.
Optionally, the device has an odometry system. Optionally, the odometry system processes sensor data to determine relative position differences between successive sensor data. Optionally, the position estimates have between three and six degrees of freedom. Optionally, the odometry system performs full simultaneous location and mapping and/or loop closure
and/or graph optimisation. Optionally, the odometry system outputs the estimated position. Optionally, the estimated position is relative to a local origin.
An odometry system can use the inputs from the sensors on the device to estimate the location of the device, and can use the relative position differences determined from successive sensor data to determine relative movement of the device, and therefore movement in the local frame of reference of the device, in order to estimate the position of the device. The method can provide localisation in a number of different dimensions, including in two and three dimensions (but also in other numbers of dimensions, including, for example, one dimension).
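For instance, in the planar three-degrees-of-freedom case, an odometry system of this kind might accumulate successive relative position differences into an estimate relative to the local origin along the following lines (an illustrative sketch; the class and field names are assumptions):

```python
import math
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class Delta2D:
    dx: float       # forward/lateral motion since the previous sensor frame
    dy: float
    dtheta: float   # change in heading since the previous sensor frame

class Odometry2D:
    """Accumulates relative position differences between successive sensor frames
    into a pose (x, y, theta) expressed relative to the local origin."""

    def __init__(self) -> None:
        self.x = self.y = self.theta = 0.0
        self.history: List[Tuple[float, float, float]] = []  # past local-frame estimates

    def update(self, d: Delta2D) -> Tuple[float, float, float]:
        c, s = math.cos(self.theta), math.sin(self.theta)
        self.x += c * d.dx - s * d.dy
        self.y += s * d.dx + c * d.dy
        self.theta += d.dtheta
        self.history.append((self.x, self.y, self.theta))
        return self.x, self.y, self.theta
```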
Optionally, the previous localisation data is stored with a timestamp.
By providing a timestamp on localisation data, data from different sources and stored in different data stores can be compared in order to estimate and/or determine the position of the device. Timestamp data can be provided by storing, for example, GPS data.
Optionally, the location of the device is determined in a global co-ordinate frame. Optionally, the localisation request is in a local co-ordinate frame. Optionally, the method further comprises the step of determining a transform between the global co-ordinate frame and the local co-ordinate frame. Optionally, the localisation data is in a global co-ordinate frame.
By determining a location in a global co-ordinate frame, the device can accurately localise within a global frame of reference. By operating in a local co-ordinate frame, the device can co-ordinate its sensor data and/or odometry data and/or estimated position. By determining a local-to-global transform, the device can determine its global position from its local position and/or from localisation data and/or from sensor data.
According to a second aspect, there is provided a method of determining localisation data at a server system comprising the steps of: receiving a localisation request from one or more client systems, the localisation request comprising at least a portion of data from one or more sensors; determining relevant localisation nodes from a global map and aggregating this data as localisation data; sending the localisation data to the relevant one or more client systems in response to the localisation request. Optionally, the localisation request includes an estimated location of the respective one or more client systems.
Determining localisation data for a client system or device using localisation requests from a client system, using data from one or more sensors, can allow a client system to determine its location using a large amount of map data and significant processing power of a server system
without needing the client system to have significant processing power, data storage or bandwidth. As an alternative to determining relevant localisation nodes, other data can be determined and/or extracted and/or aggregated from the global map to be sent to the client systems as localisation data. Optionally, no aggregation needs to be performed. Optionally, either the localisation data is operable to determine a location of the device at a point in time that the localisation request was sent to the server system; or the localisation data is operable to determine the current location of the device taking into account the point in time that the localisation request was sent to the server system.
Determining the location of the device can involve determining the location of the device at the point at which it transmits the localisation data, or the point at which the localisation data is captured, or can involve determining the current position (or future position) based on for example movement data, speed data or previous localisation requests.
Optionally, a further step is performed of providing feedback data to one or more client systems to send more data or less said portion of data. Providing feedback data to the client system(s) can prevent the client system(s) from sending too much data to the server system when not required for localisation or updating the global master map at the server, or can trigger the client system(s) to send more data in order to allow refinement or building of the global master map at the server.
Optionally, a further step is performed of updating the global map with at least a portion of the said at least a portion of data from one or more sensors received from the one or more client systems.
The global map or global master map can be updated with new or updated information gathered by the sensors of the client device(s).
Optionally, said one or more sensors comprises at least one visual data sensor. Optionally, said at least one visual data sensor comprises any or a combination of: an image camera; a video camera; a monocular camera; a depth camera; a stereo image camera; a high dynamic range camera, a light detection and ranging sensor; a radio detection and ranging sensor. Optionally, said at least a portion of the data from the one or more sensors comprises visual data from the at least one visual data sensor. Using visual data such as from an image camera; a video camera; a monocular camera; a depth camera; a stereo image camera; a high dynamic range camera, a light detection and ranging sensor; or a radio detection and ranging sensor can allow localisation of a device by
a server system based on known visual data stored by the server system.
Optionally, data from said one or more sensors is combined with data from an inertial measurement unit and/or a satellite positioning system.
Using inertial measurement can allow a device to estimate its location based on movement relative to one or more previous known locations. Using data from a satellite positioning system (GPS) helps to narrow the location search.
Optionally, the step of estimating a location further comprises using previous localisation data. Optionally, previous localisation data comprises any of: one or more previous estimates of location; one or more previous localisation requests; one or more of previous said localisation data received from the server system in response to previous said localisation requests; and previous determinations of location. Optionally, the step of estimating a location is performed using any of: a local pose; or a local frame of reference. Optionally, the localisation data received from the server system in response to said localisation request uses any of: a global pose; or a global frame of reference. Optionally, a global to local pose transform or relationship; or a global frame of reference to local frame of reference transform or relationship is determined.
Previous localisation data can be used by a device to more accurately determine or estimate its location. By establishing a correlation, or transform, or relationship between the local frame of reference of the estimated locations and determined locations and the localisation data from the server system, a device can sometimes more accurately estimate or determine its position.
Optionally, the device comprises any of: a mobile phone; a robot; an augmented reality or virtual reality headset; a navigation device; a self-driving vehicle; a drone; or another autonomous vehicle.
Many devices can be used with this aspect in order to determine their location using the method of this aspect.
Optionally, the step of determining the location of the device is accurate to within several centimetres, optionally to within a centimetre. Optionally, the method is persistent. Optionally, determining the location of the device can be performed inside buildings or in dense urban areas. Optionally, the sensor data comprises data on environmental conditions such as weather conditions.
The method of this aspect can be very accurate, can be operational continuously to allow devices to determine their location accurately, and can be used inside buildings, in dense urban areas or in different environmental or weather conditions to determine the location of a device.
Optionally, the server system comprises global map data; optionally wherein the global map data comprises any of: one or more sets of local map data and/or one or more sets of global map data and/or one or more global maps of interconnected local maps.
The server system can store a large amount of map data in a global master map in order to provide relevant localisation data from the global master map data to devices when requested to do so. Optionally, one or more sets of local map data and/or one or more sets of global map data can be stored within the global master map data.
Optionally, the server system can be any of: a single server, a distributed server system, a cloud system, a physical system or a virtual system. Optionally, the global map can be distributed between a plurality of server systems. Optionally, the global map can be hosted in a peer-to-peer arrangement on any of: one or more devices acting as a server system, or a mix of one or more server systems and one or more devices.
The server system can be configured in a number of different ways in order to balance usage across computer hardware efficiently and with lower latency, or even across a mix of client devices and/or server systems.
Optionally, the device and server system communicate over any of: a mobile data network, or wireless data network.
The devices can typically operate over a mobile data network, but can operate using other data networks.
Optionally, the device has an odometry system. Optionally, the odometry system processes sensor data to determine relative position differences between successive sensor data. Optionally, the position estimates have between three and six degrees of freedom. Optionally, the odometry system performs full simultaneous location and mapping and/or loop closure and/or graph optimisation. Optionally, the odometry system outputs the estimated position. Optionally, the estimated position is relative to a local origin.
An odometry system can use the inputs from the sensors on the device to estimate the location of the device, and can use the relative position differences determined from successive data from the sensors to determine relative movement of the device and therefore movement in the local frame of reference of the device in order to estimate the position of the device.
Optionally, the previous localisation data is stored with a timestamp.
By providing a timestamp on localisation data, data from different sources and stored in different data stores can be compared in order to estimate and/or determine the position of the device. Timestamp data can be provided by storing, for example, GPS data.

Optionally, the location of the device is determined in a global co-ordinate frame. Optionally, the localisation request is in a local co-ordinate frame. Optionally, the method further comprises the step of determining a transform between the global co-ordinate frame and the local co-ordinate frame. Optionally, the localisation data is in a global co-ordinate frame.
By determining a location in a global co-ordinate frame, the device can accurately localise within a global frame of reference. By operating in a local co-ordinate frame, the device can co-ordinate its sensor data and/or odometry data and/or estimated position. By determining a local-to-global transform, the device can determine its global position from its local position and/or from localisation data and/or from sensor data.
According to a third aspect, there is provided a method of updating a global map at a server system comprising the steps of: receiving sensor data from one or more client systems; and adding the sensor data to the global map. Optionally, estimated location data is received along with sensor data. Optionally, the step of adding the sensor data to the global map comprises any of: creating a new node in the global map; updating or amending an existing node in the global map; or deleting a node in the global map. Optionally, the sensor data comprises any of: environmental data; weather data; satellite positioning data; inertial measurement data; image data; or video data. Optionally, a further step is performed of optimising the global map. Optionally, the step of optimising comprises performing global loop-closures to link positions of different localisation nodes.
Updating the global map or global master map using sensor data from client systems can allow the global map to adapt over time and continuously improve, including by growing larger in size and richer in data for, for example, different environmental conditions.
Optionally, a further step is performed of sending an instruction to the one or more client systems to provide more or less sensor data to the server system.
Providing an instruction to the client system(s) can prevent the client system(s) from sending too much data to the server system when not required for localisation or updating the global map at the server, or can trigger the client system(s) to send more data in order to refine or build the global map at the server.
Optionally, updating the global map is performed at intervals, optionally at time intervals or regular time intervals, or optionally when a predetermined threshold of sensor data has been received.
It can be more efficient to perform updating of the global map or global master map at intervals, for example at regular time intervals, or at points where a certain threshold amount of data has been received, or where data relating to a certain number of nodes or new nodes has been collected.
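As an illustrative sketch only, an interval- or threshold-triggered update of this kind might be queued as follows (the names and thresholds are assumptions, not part of the method as claimed):

```python
import time
from typing import Callable, List

class BatchMapUpdater:
    """Queues incoming sensor/localisation data and triggers a global map update only
    when a time interval has elapsed or enough data has accumulated."""

    def __init__(self, apply_updates: Callable[[List[dict]], None],
                 interval_s: float = 600.0, max_queued: int = 100) -> None:
        self.apply_updates = apply_updates
        self.interval_s = interval_s
        self.max_queued = max_queued
        self.queue: List[dict] = []
        self.last_update = time.monotonic()

    def enqueue(self, item: dict) -> None:
        self.queue.append(item)
        interval_elapsed = (time.monotonic() - self.last_update) >= self.interval_s
        if interval_elapsed or len(self.queue) >= self.max_queued:
            self.apply_updates(self.queue)   # batch refinement of the global map
            self.queue = []
            self.last_update = time.monotonic()
```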
According to a fourth aspect, there is provided a system comprising at least one server system or apparatus operable to perform the method of the first aspect and at least one client system or apparatus operable to perform the method of the second aspect.
The terms server system and client system, throughout the specification, can respectively be replaced with server apparatus and client apparatus.
According to a fifth aspect, there is provided a computer program product for providing the method or system of the above aspects.
Brief Description of Drawings
Embodiments will now be described, by way of example only and with reference to the accompanying drawings having like-reference numerals, in which:
Figure 1 illustrates an overview of the system according to an embodiment;
Figure 2 illustrates a flowchart showing the client system operation according to an embodiment;
Figure 3 illustrates a flowchart showing the localiser operation according to an embodiment; and
Figure 4 illustrates a flowchart showing the map updater operation according to an embodiment.
Specific Description
An example embodiment will now be described with reference to Figure 1.
In the embodiment of Figure 1, the system 1 comprises one server system 2 communicating with at least one client system 3.
The server system 2 of this embodiment will now be described in more detail below.
In this embodiment, the server system 2 is running on and implemented using cloud infrastructure, but in other embodiments the server system 2 may have a variety of physical and/or virtual configurations. In other embodiments, for example, there may be one or more servers and/or server systems and, where there is more than one server and/or server system, these may be configured to act as a single server or server system or as multiple independent servers or server systems, and may or may not be in direct communication with each other.
A global master map 22 is maintained at the server system 2, the global master map 22 having or using a global frame, i.e. co-ordinates providing a global frame of reference. Each client system 3 has or uses its own local frame, i.e. local frame of reference. In some embodiments, however, the global master map may comprise multiple local and/or global maps, each with their own respective local and/or global reference frame. The client system 3 and/or server system 2 may not be able to relate the local reference frames of multiple client systems 3 to each other, so these may need to exist as separate local and/or global maps within the global master map. For example, where a client system 3 is operating indoors, for example in a factory, and has no need to leave the factory, it will not usually be possible (for example using GPS co-ordinates) to relate its local map to outdoor local maps or to other local and/or global map(s) relevant to other client systems 3.
In this embodiment, a global master map 22 is stored on the server system 2, and the global master map 22 can be shared with client devices 3 in communication with the server system 2. In some embodiments, the global master map 22 can be continuously or periodically updated, for example with data provided from the client devices 3. In some embodiments, the global master map 22 can be refined based on data provided, or calculations or operations performed on relevant data. In some embodiments, the global master map 22 can be used for localisation by the server system 2 and/or one or more of the client devices 3. In some embodiments, all or a portion of the global master map 22 may be stored in a distributed fashion across a plurality of server instances or cloud infrastructure.
The system of some, or all, of these embodiments can be used in distributed large-scale scenarios with low-cost client hardware, such as mobile phones, and/or with any, or all, of: augmented reality headsets, self-driving cars, drones, and other robots.
The server system 2 and client system 3 are in communication with each other, typically through a bandwidth-restricted communication channel, and in this embodiment for example
the communications channel 4 is a mobile 'phone cellular data network. In other embodiments, other wireless data networks may be used instead or in addition to a mobile 'phone cellular data network.
The role of the server system 2 in this embodiment is to maintain and update a consistent global master map 22, and to respond to global localisation requests 39 from client devices 3 using the global master map 22 data stored on the server system 2. In this embodiment, where the communications are typically made over a bandwidth-restricted communication channel, it is anticipated that the global localisation requests 39 from each client device occur with low frequency in order to minimise bandwidth usage and/or utilise the bandwidth available in an efficient manner. A localisation response 26 is sent by the server system 2 to the client system 3 that sent the localisation request 39.
The client system 3 of this embodiment will now be described in more detail.
The client system 3 comprises a global pose record 30, which is in communication with the server system 2 and receives localisation responses 26 from the server system 2. The global pose record 30 is in communication with the global pose estimator 32. The global pose estimator 32 is also in communication with the local pose record 36 and the odometry system 34. The odometry system 34 and the local pose record 36 are in communication with the sensors, for example including sensor 40 (which can be an image sensor or video sensor, depth camera sensor or LIDAR sensor), and optionally an inertial measurement unit (IMU) 42 and a satellite positioning system (such as GPS, for example) 44, the data from which can be combined or blended 38. The global pose estimator 32 outputs the estimated global position of the client system 3, and this can be combined with the combined data 38 to be sent as a localisation request to the server system 2.
The one or more client systems 3 of this embodiment run on client hardware equipped with at least one sensor 40, for example devices such as a mobile 'phone, augmented reality headset, drone, or autonomous car driving system. The sensors of the client hardware can, for example, be any one or a combination of: visible spectrum cameras, depth cameras, LIDAR sensors, an IMU 42 and/or a satellite positioning system 44.
The role of each client system 3 is to process client system data acquired from at least one of the sensors 40, IMU 42 and satellite positioning system 44, communicate with the server system 2, and maintain a real-time global position estimate of the client system 3.
The client system 3 comprises an odometry system 34 which processes the client combined data and translates this data into relative position differences. Depending on the sensor used (alone or in combination with the IMU 42 and/or satellite positioning system 44), this can be implemented in several ways. For example, in the case of a camera sensor, the camera movement can be determined by matching and triangulating observed image features or pixel intensities between successive measurements to provide visual odometry. Mesh-matching methods such as iterative closest point optimisation can be used to achieve similar pose estimates with active sensors such as depth cameras or LIDAR sensors. Alternatively, this system can be solely based on or aided by an inertial measurement unit (IMU). Furthermore, several such measurements coming from different sensors and modalities can be integrated into one pose estimate using methods such as Kalman filtering to compensate for individual sensor drawbacks and to achieve higher robustness and accuracy.
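Purely by way of example, the integration of several measurements into one estimate can be illustrated in its simplest one-dimensional static form by inverse-variance weighting (the special case underlying a Kalman update); the figures below are invented for illustration and are not part of the embodiment:

```python
from typing import Iterable, Tuple

def fuse_estimates(estimates: Iterable[Tuple[float, float]]) -> Tuple[float, float]:
    """Fuse several (value, variance) estimates of the same quantity, e.g. the forward
    displacement between two frames, by inverse-variance weighting."""
    information = 0.0   # sum of 1 / variance
    weighted = 0.0      # sum of value / variance
    for value, variance in estimates:
        information += 1.0 / variance
        weighted += value / variance
    return weighted / information, 1.0 / information

# Example: visual odometry reports 1.02 m (variance 0.04); the IMU reports 0.95 m
# (variance 0.01); the fused estimate is pulled towards the more certain measurement.
fused_value, fused_variance = fuse_estimates([(1.02, 0.04), (0.95, 0.01)])
```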
From the client combined data and/or relative position differences, measurements are then accumulated to provide relative real-time position estimates. In the most general case the position estimates take the form of a pose with six degrees-of-freedom (three-dimensional rotation and three-dimensional translation), but in the case of embodiments relating to, for example, planar automotive scenarios this can be reasonably reduced to three degrees of freedom (rotation and two-dimensional translation). In some embodiments, only one degree of freedom can be used.
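The two pose parameterisations mentioned above could, for example, be represented as follows (illustrative field names only):

```python
from dataclasses import dataclass

@dataclass
class Pose6DoF:
    """General case: three-dimensional translation plus three-dimensional rotation."""
    x: float
    y: float
    z: float
    roll: float
    pitch: float
    yaw: float

@dataclass
class Pose3DoF:
    """Planar (e.g. automotive) case: two-dimensional translation plus one rotation."""
    x: float
    y: float
    yaw: float
```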
A property of odometry is that it typically accumulates an error over time if based purely on local sensor data and estimates - a problem known as "drift". Optionally, to mitigate this effect, odometry can be extended to a full client simultaneous location and mapping (SLAM) system utilizing loop closure and graph optimisation procedures. Implementations of such systems will depend on the type of sensor used, such as, for example monocular or stereo camera, depth camera, or laser sensors.
The output of the odometry system 34 can provide a substantially high-quality estimate (or estimates) of device position in relation to some arbitrary local origin (the arbitrary local origin is typically the position of the device where the system started or initiated). To achieve global localisation (i.e. a substantially high-accuracy position in a global map), the pose of the local origin in the global coordinate frame must be estimated. To achieve this, the client system 3 regularly performs "global localisation requests" 39 to the server system 2. A summary of recent sensor inputs in the form of, for example, image or video data, depth maps, features, relationships to previous localisation requests etc. is aggregated to create a localisation request 39. Sometimes, for a particular frame, data will only be available from one, typically high-frequency, sensor, such as an IMU, and so only this data is transmitted in the localisation request 39. For a typical frame, data may be available from a plurality of sensors, for example an image from a visual sensor along with IMU data, which can be transmitted in the localisation request 39. As the bandwidth between the client system 3 and the server system 2 is limited, this localisation request 39 is usually of a much smaller data size and performed at much lower frequency than the equivalent raw sensor data, but given sufficient bandwidth the raw sensor data can optionally be streamed directly to the server system as a continuous localisation request (and similarly the localisation response from the server can then be intermittent or continuous).
This localisation request 39 is then sent to a localiser module or (sub-)process 20 at the server system 2. Simultaneously, the current estimate of the device position in the local coordinate frame produced by the odometry is added to the client "local pose record" database 36. The localiser 20 at the server system 2 responds to the localisation request 39 from the client device 3 with an estimate of the "global pose" of the device 3 at the time of the issued localisation request 39, sent by the server system 2 as a localisation response 26. This localisation response 26, when received by the client system 3, is then stored in the "Global Pose Record" database 30 on the client system 3. Provided that at least one localisation request 39 was successfully responded to, the local and global poses of these requests are retrieved from the local and global pose records 36, 30 and compared to provide an estimate of the local origin pose in the global map of the client system 3. This estimate is then combined with subsequent high-frequency device pose/location estimates in the local coordinate frame from the odometry system 34 to provide a high-frequency device pose, or location, in the global coordinate frame. This can be achieved by translating high-frequency device pose/location estimates by the estimated local-to-global coordinate frame transform. Alternatively, further optimisation can be performed to combine local position estimates with one or more responses based on properties of the local odometry and global localisation systems (for example, GPS).
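An illustrative sketch of this comparison, again in the planar case and with hypothetical names, is given below: one answered request yields the pose of the local origin in the global frame, which then maps every subsequent high-frequency local estimate into the global frame.

```python
import math
from typing import Tuple

Pose = Tuple[float, float, float]  # (x, y, theta) in some frame

def compose(a: Pose, b: Pose) -> Pose:
    c, s = math.cos(a[2]), math.sin(a[2])
    return (a[0] + c * b[0] - s * b[1], a[1] + s * b[0] + c * b[1], a[2] + b[2])

def invert(p: Pose) -> Pose:
    c, s = math.cos(p[2]), math.sin(p[2])
    return (-(c * p[0] + s * p[1]), s * p[0] - c * p[1], -p[2])

def local_origin_in_global(local_at_request: Pose, global_at_request: Pose) -> Pose:
    """From one answered localisation request: T_global<-local = T_global * inverse(T_local)."""
    return compose(global_at_request, invert(local_at_request))

def to_global(origin_transform: Pose, local_pose: Pose) -> Pose:
    """Translate a high-frequency local odometry pose into the global co-ordinate frame."""
    return compose(origin_transform, local_pose)
```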
In some embodiments, the information stored in the local and/or global pose records is a list of local and/or global positions of the device. Each position is associated with a particular time and unique timestamp or ID. In some embodiments, as an alternative or addition, each position can be associated with GPS data which can include timestamp data. The aforementioned time might be that of a particular sensor measurement, and the timestamp or ID can be used to cross-reference the local and global record. Relating the local and global pose of one or multiple device poses together with the current local device pose gives the current global device pose. In other embodiments, additional or alternative information can be stored in the local or global pose records.
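One possible, non-limiting way to hold and cross-reference the two records is sketched below; a shared request ID (or timestamp) keys both the local and the global record:

```python
from dataclasses import dataclass
from typing import Dict, Iterator, Tuple

@dataclass
class RecordEntry:
    timestamp: float                    # time of the underlying sensor measurement
    pose: Tuple[float, float, float]    # position in the record's own frame

class PoseRecord:
    """Positions of the device keyed by a unique request ID, as kept in the
    local pose record and the global pose record."""

    def __init__(self) -> None:
        self.entries: Dict[int, RecordEntry] = {}

    def add(self, request_id: int, timestamp: float,
            pose: Tuple[float, float, float]) -> None:
        self.entries[request_id] = RecordEntry(timestamp, pose)

def matched_pairs(local: PoseRecord, global_: PoseRecord) -> Iterator[Tuple[RecordEntry, RecordEntry]]:
    """Cross-reference the two records: yield the (local, global) entries of every
    localisation request that has been successfully answered."""
    for request_id, local_entry in local.entries.items():
        if request_id in global_.entries:
            yield local_entry, global_.entries[request_id]
```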
The principle of operation 100 of the client system 3 is set out in the flowchart of Figure 2, which will now be described in further detail.
First, or in parallel with normal operation, the client system 3 needs to determine a local-to-global frame transform. In step 102, the client system 3 carries out the step of fetching the positions of past globally localised images in the global frame from the global pose record 30. Then, in step 104, the client system 3 carries out the step of fetching the positions of past globally localised images in the local frame from the local pose record 36. Next, in step 106, the positions are compared in order to compute a local-to-global frame transform.
For normal operation, the client system 3 in step 108 obtains or receives sensor data, for example image or video data from a camera. Then, in step 110, the client system 3 computes the position of the current image from the camera in the local frame. Next, in step 112, and using the previously- or parallel-computed local-to-global frame transform determined in step 106, the client system 3 computes the position of the current image from the camera in the global frame. Then, in step 114, the client system outputs the position of the current image in the global frame.
The server system 2 of this embodiment will now be described in further detail with reference to Figure 1.
The server system 2 comprises three modules, systems or sub-systems: the global master map 22, localiser 20 and map updater 24.
The global master map 22 is in communication with the localiser 20, and the localiser 20 can retrieve data from the global master map 22 as needed to perform the functions of the localiser 20. The map updater 24 is also in communication with the global master map 22 and can both input to and retrieve data from the global master map 22. The map updater 24 also receives data received by the server system 2 from client systems 3, specifically localisation requests 39. The map updater 24 is also in communication with the localiser 20 and also receives the localisation responses 26 sent by the localiser 20 in response to each localisation request 39.
The global master map 22 comprises a collection of localisation nodes. Each localisation node summarises a particular sensory experience at a particular place (such as a picture, position of visual features, depth map or three-dimensional point cloud of the environment, for example) and, optionally, metadata (such as a combination of different weather conditions and/or lighting conditions, for example) required for the purpose of localiser 20 operation. Each localisation node has an estimate of its pose assigned to it in the global co-ordinate frame. In addition, for the purpose of map updater 24 functionality, the global master map 22 might
contain links or cross-references/relationships between localisation nodes and their positions within the global master map 22.
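For illustration, a localisation node and the global master map could be represented with a structure along the following lines (the field names and types are illustrative assumptions, not a prescribed schema):

```python
from dataclasses import dataclass, field
from typing import Dict, List, Tuple

@dataclass
class LocalisationNode:
    node_id: int
    global_pose: Tuple[float, ...]                  # estimated pose in the global co-ordinate frame
    signature: List[float]                          # compact summary of the visual experience (features, encoding, ...)
    metadata: Dict[str, str] = field(default_factory=dict)   # e.g. {"weather": "rain", "lighting": "night"}
    links: List[int] = field(default_factory=list)            # cross-references to related nodes

@dataclass
class GlobalMasterMap:
    nodes: Dict[int, LocalisationNode] = field(default_factory=dict)

    def add_or_update(self, node: LocalisationNode) -> None:
        self.nodes[node.node_id] = node
```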
With reference to Figure 3, the localisation process occurring at the server system 2 will now be described.
In step 202, the localiser 20 receives one or more localisation requests 39 from one or more client systems 3. When a localisation request 39 is received, in step 204 the localiser 20 searches the global master map 22 for relevant localisation nodes capturing past sensory experiences similar to those in the localisation request 39 (termed querying the localisation map for relevant nodes and their positions).
Several different methods can be used together or separately to find relevant localisation nodes. The search can be performed by extraction of visual signatures, for example those based on image features or a neural-network-based encoding to search for nodes with a similar signature by using data acquired from the one or more sensors 40. Additionally, the search can be accelerated based on a location estimate, for instance as given by device satellite (e.g. GPS) position or the history of its previous localisation requests to consider only nodes in a nearby area. Various statistical methods such as tf-idf can be further used to exploit statistical properties and relevance of the individual localisation nodes for the purpose of localisation.
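The combination of a coarse location pre-filter with a signature search might be sketched as follows; the cosine-similarity comparison and the fixed search radius are illustrative choices rather than requirements, and the map object is assumed to hold nodes of the kind sketched above:

```python
import math
from typing import List, Optional, Tuple

def cosine_similarity(a: List[float], b: List[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

def query_relevant_nodes(global_map, query_signature: List[float],
                         approx_xy: Optional[Tuple[float, float]] = None,
                         radius_m: float = 200.0, top_k: int = 5):
    """Restrict the search to nodes near a coarse location estimate (GPS or previous
    requests), then rank the remaining nodes by visual-signature similarity."""
    candidates = []
    for node in global_map.nodes.values():
        if approx_xy is not None:
            dx = node.global_pose[0] - approx_xy[0]
            dy = node.global_pose[1] - approx_xy[1]
            if math.hypot(dx, dy) > radius_m:
                continue  # outside the nearby area suggested by the location estimate
        candidates.append((cosine_similarity(node.signature, query_signature), node))
    candidates.sort(key=lambda item: item[0], reverse=True)
    return [node for _, node in candidates[:top_k]]
```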
In step 206, based on the data stored in the localisation node and localisation request 39, the localisation process then performs relative pose estimation between the global position of the localisation node and the localisation request 39 to determine the global pose of the client device 3 at the time of issuing the request (i.e. to localise against what are determined to be the relevant nodes of the localisation map). This information is then reported back to the client system in the form of localisation response 26, where the information is aggregated in step 208 from the localisation results from step 206 and then reported in step 210. Optionally, the localisation response can contain part of the master map to be transmitted to the client device to serve localisation requests locally on the client device.
Simultaneously, the map updater 24 is notified with the localisation request 39 and response 26, in order to queue the incoming data and perform map refinement at regular intervals to update the global master map 22 with data from one or more localisation requests received at the server system, which will now be described with reference to the flow chart in Figure 4. While in some embodiments the global master map 22 can be updated every time a new
localisation request 39 is received, batch optimisation can allow for the process of updating the global master map 22 to be performed more efficiently at intervals.
In step 302, when the map updater 24 receives notification of localisation requests 39 and responses 26, the map updater 24 extracts the information from the localisation request 39 and response 26. The information extracted from the localisation request 39 and response 26 is used to update or create a new localisation node in the global master map 22 in step 304. The updated or newly created localisation node is included in the global master map 22 in step 306, which might result in an extension of the global master map 22. Further, links can optionally be added to the relevant nodes in the localisation map in step 308.
Finally, as the new information might result in more accurate pose estimates of already contained localisation nodes, in step 310 the map updater 24 performs optimisation to refine this information, for example using either pose-graph optimisation or bundle adjustment.
The principle of this optimisation is to perform global map loop-closures to link the positions of different localisation nodes originating from the same place. These links can contain a relative six-degrees-of-freedom (3D position and 3D rotation) difference between the positions of the nodes. Alternatively, the relative position can include an additional difference in scale for scale-free systems such as monocular cameras, giving seven degrees of freedom (3D position, 3D rotation and a seventh dimension relating to the relative scale difference). The poses of the nodes are then optimised to minimise the total cost of the graph. This cost is calculated using the constraints imposed upon the graph by the relative poses estimated between nodes. Alternatively, this optimisation can include re-computing the position of localisation features or other stored information used for location.
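The cost being minimised can be illustrated, for the planar case with uniform weighting, by the sketch below; a full implementation would use six or seven degrees of freedom, information matrices and a dedicated solver (pose-graph optimisation or bundle adjustment), so this is indicative only:

```python
import math
from dataclasses import dataclass
from typing import Dict, List, Tuple

Pose = Tuple[float, float, float]  # (x, y, theta)

@dataclass
class Constraint:
    i: int        # source node
    j: int        # target node
    dx: float     # measured pose of node j in the frame of node i (odometry or loop closure)
    dy: float
    dtheta: float

def relative_pose(a: Pose, b: Pose) -> Pose:
    c, s = math.cos(a[2]), math.sin(a[2])
    dx, dy = b[0] - a[0], b[1] - a[1]
    return (c * dx + s * dy, -s * dx + c * dy, b[2] - a[2])

def graph_cost(poses: Dict[int, Pose], constraints: List[Constraint]) -> float:
    """Total squared error between the relative poses implied by the current node
    positions and the relative poses measured between those nodes; the optimiser
    adjusts the node poses to minimise this value."""
    total = 0.0
    for con in constraints:
        px, py, pt = relative_pose(poses[con.i], poses[con.j])
        angle_err = math.atan2(math.sin(pt - con.dtheta), math.cos(pt - con.dtheta))
        total += (px - con.dx) ** 2 + (py - con.dy) ** 2 + angle_err ** 2
    return total
```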
In some embodiments, based on the output of the localiser, performance, and amount of relevant map localisation nodes, one or more client systems can be requested by the server system to modify the frequency and amount of information included in the localisation requests from those one or more client systems (i.e. a sub-set of the client systems may be requested to modify the information transmitted to the server system, to increase or decrease the amount of data transmitted to the server system). This can serve to optimise bandwidth usage, reflecting the varying need for more data at different regions of the global master map 22, based on location and environmental conditions. For example, if the map does not contain enough relevant information in the location region and/or certain environmental conditions (e.g. it is raining and the global master map 22 only has data for fair weather conditions), the client system can be requested to compensate by sending more data and at a higher rate. Conversely, in a well-mapped area the localisation requests and map updates need not be
performed as often. As a result, the bandwidth usage is expected to decrease over time as the global master map 22 gathers more data. In some embodiments, the system can be configured to selectively disable either map updating or localisation such that the system only then performs the other function.
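A trivial sketch of such a feedback decision is given below; the threshold and the notion of "conditions covered" are illustrative stand-ins for whatever relevance measure the localiser exposes:

```python
def client_feedback(n_relevant_nodes: int, conditions_covered: bool,
                    min_nodes: int = 10) -> str:
    """Decide whether a client should be asked for more or less data, based on how well
    the region (and its current environmental conditions) is represented in the map."""
    if n_relevant_nodes < min_nodes or not conditions_covered:
        return "send_more"   # sparse map or unseen conditions: request richer, more frequent data
    return "send_less"       # well-mapped region: reduce request size/frequency to save bandwidth
```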
The performance of the entire system 1 can be determined by the plurality of localisation nodes in the global master map 22, the quality of the position estimates of those nodes in the global coordinate frame, the quality of localisation towards these nodes by the localiser module 20, and the quality of the odometry system 34 in the client system 3. The quality of the global master map 22 can increase with the amount of data collected by the client systems 3 and integrated as localisation nodes into the global master map 22 on the server system 2. Therefore, the system 1 can perform better over time as more data at different conditions is collected and observed.
In some embodiments, some or all of the client systems do not estimate their position and/or do not transmit an estimated position along with sensor data in localisation requests to the server. Typically, this would be the case for devices that have either a temporarily disabled or malfunctioning odometry system, or limited functionality or hardware, or are producing varyingly inaccurate estimates of position.
In some embodiments, the sensor(s) 40 relate only to visual data sensors such as: an image camera; a video camera; a monocular camera; a depth camera; a stereo image camera; a high dynamic range camera; a LIDAR sensor; or a radio detection and ranging sensor. Optionally, sensor data may be combined with data from an IMU 42 and/or a satellite positioning system (GPS) 44.
Any system feature as described herein may also be provided as a method feature, and vice versa. As used herein, means plus function features may be expressed alternatively in terms of their corresponding structure.
Any feature in one aspect of the invention may be applied to other aspects of the invention, in any appropriate combination. In particular, method aspects may be applied to system aspects, and vice versa. Furthermore, any, some and/or all features in one aspect can be applied to any, some and/or all features in any other aspect, in any appropriate combination.
It should also be appreciated that particular combinations of the various features described and defined in any aspects of the invention can be implemented and/or supplied and/or used independently.
Claims
1. A method of determining a location of a device having one or more sensors comprising the steps of: sending a localisation request to a server system, the localisation request comprising at least a portion of data from the one or more sensors; receiving localisation data from the server system in response to the localisation request; and determining a location of the device from the received localisation data.
2. The method of claim 1, further comprising the step of estimating a location of the device based on data from the one or more sensors and wherein the step of determining a location of the device includes determining the location of the device using the estimated location.
3. The method of any preceding claim, wherein the step of determining a location of the device further comprises using at least a portion of the data from the one or more sensors along with the received localisation data.
4. The method of any preceding claim, wherein the localisation request further comprises an estimated location.
5. The method of any preceding claim, wherein the step of determining a location of the device includes any of: determining the location of the device at the point in time that the localisation request was sent to the server system; or determining the current location of the device taking into account the point in time that the localisation request was sent to the server system.
6. The method of any preceding claim, wherein said one or more sensors comprises at least one visual data sensor.
7. The method of claim 6, wherein said at least one visual data sensor comprises any or a combination of: an image camera; a video camera; a monocular camera; a depth camera; a stereo image camera; a high dynamic range camera, a light detection and ranging sensor; a radio detection and ranging sensor.
8. The method of any of claims 6 or 7, wherein said at least a portion of the data from the one or more sensors comprises visual data from the at least one visual data sensor.
9. The method of claim 8 wherein data from said one or more sensors is combined with data from an inertial measurement unit and/or a satellite positioning system.
10. The method of any preceding claim wherein the step of estimating a location further comprises using previous localisation data.
11. The method of claim 10, wherein previous localisation data comprises any of: one or more previous estimates of location; one or more previous localisation requests; one or more of previous said localisation data received from the server system in response to previous said localisation requests; and previous determinations of location.
12. The method of any preceding claim wherein the step of estimating a location is performed using any of: a local pose; or a local frame of reference.
13. The method of any preceding claim wherein the localisation data received from the server system in response to said localisation request uses any of: a global pose; a global frame of reference; or a portion of the master map.
14. The method of claim 13 when dependent on claim 12, further comprising the step of determining a global to local pose transform or relationship; or a global frame of reference to local frame of reference transform or relationship.
15. The method of any preceding claim wherein the device comprises any of a mobile phone, a robot, augmented reality or virtual reality headset, or navigation device, a self-driving vehicle, a drone or other autonomous vehicle.
16. The method of any preceding claim wherein the step of determining the location of the device is accurate to within several centimetres, optionally to within a centimetre.
17. The method of any preceding claim wherein the method is persistent.
18. The method of any preceding claim wherein determining the location of the device can be performed inside buildings or in dense urban areas.
19. The method of any preceding claim wherein the sensor data comprises data on environmental conditions such as weather conditions.
20. The method of any preceding claim wherein the server system comprises global map data; optionally wherein the global map data comprises any of: one or more sets of local map data and/or one or more sets of global map data and/or one or more global maps of interconnected local maps.
21. The method of any preceding claim wherein the server system can be any of: a single server, a distributed server system, a cloud system, a physical system or a virtual system.
22. The method of claim 20 wherein the global map can be distributed between a plurality of server systems.
23. The method of claims 20 to 22 wherein the global map can be hosted in a peer-to-peer arrangement on any of: one or more devices acting as a server system, or a mix of one or more server systems and one or more devices.
24. The method of any previous claim wherein the device and server system communicate over any of: a mobile data network, or wireless data network.
25. The method of any previous claim wherein the device has an odometry system.
26. The method of claim 25, wherein the odometry system processes sensor data to determine relative position differences between successive sensor data.
27. The method of claim 26, wherein the position estimates have between three and six degrees of freedom.
28. The method of either claim 25 or claim 26, wherein the odometry system performs full simultaneous location and mapping and/or loop closure and/or graph optimisation.
29. The method of any of claims 25 to 28 wherein the odometry system outputs the estimated position.
30. The method of any preceding claim wherein the estimated position is relative to a local origin.
31. The method of any of claims 10, 11, or 12 to 30 (when dependent on claims 9 or 10), wherein the previous localisation data is stored with a timestamp.
32. The method of any preceding claim, wherein the location of the device is determined in a global co-ordinate frame.
33. The method of any preceding claim, wherein the localisation request is in a local coordinate frame.
34. The method of claim 33 when dependent on claim 32, further comprising the step of
determining a transform between the global co-ordinate frame and the local co-ordinate frame.
35. The method of any preceding claim, wherein the localisation data is in a global coordinate frame.
36. A method of determining localisation data at a server system comprising the steps of: receiving a localisation request from one or more client systems, the localisation request comprising at least a portion of data from one or more sensors; determining relevant localisation nodes from a global map and aggregating this data as localisation data; and sending the localisation data to the relevant one or more client systems in response to the localisation request.
37. The method of claim 36 wherein the localisation request further comprises an estimated location of the respective one or more client systems.
38. The method of claim 36 wherein either the localisation data is operable to determine a location of the device at a point in time that the localisation request was sent to the server system; or the localisation data is operable to determine the current location of the device taking into account the point in time that the localisation request was sent to the server system.
39. The method of any of claims 36 to 38 further comprising the step of providing feedback data to one or more client systems to send more data or less said portion of data.
40. The method of any of claims 36 to 39, further comprising the step of updating the global map with at least a portion of the said at least a portion of data from one or more sensors received from the one or more client systems.
41. The method of any of claims 36 to 40, wherein said one or more sensors comprises at least one visual data sensor.
42. The method of claim 41, wherein said at least one visual data sensor comprises any or a combination of: an image camera; a video camera; a monocular camera; a depth camera; a stereo image camera; a high dynamic range camera, a light detection and ranging sensor; a radio detection and ranging sensor.
43. The method of any of claims 41 or 42, wherein said at least a portion of the data from the one or more sensors comprises visual data from the at least one visual data sensor.
44. The method of claim 43 wherein data from said one or more sensors is combined with data from an inertial measurement unit and/or a satellite positioning system.
45. The method of any of claims 36 to 43 wherein the step of estimating a location further comprises using previous localisation data.
46. The method of claim 45, wherein previous localisation data comprises any of: one or more previous estimates of location; one or more previous localisation requests; one or more of previous said localisation data received from the server system in response to previous said localisation requests; and previous determinations of location.
47. The method of any of claims 36 to 46 wherein the step of estimating a location is performed using any of: a local pose; or a local frame of reference.
48. The method of any of claims 36 to 47 wherein the localisation data received from the server system in response to said localisation request uses any of: a global pose; a global frame of reference; or a portion of the master map.
49. The method of claim 48 when dependent on claim 45, further comprising the step of determining a global to local pose transform or relationship; or a global frame of reference to local frame of reference transform or relationship.
50. The method of any of claims 36 to 49 wherein the device comprises any of a mobile phone, a robot, augmented reality or virtual reality headset, or navigation device, a self-driving vehicle, a drone or other autonomous vehicle.
51. The method of any of claims 36 to 50 wherein the step of determining the location of the device is accurate to within several centimetres, optionally to within a centimetre.
52. The method of any of claims 36 to 51 wherein the method is persistent.
53. The method of any of claims 36 to 52 wherein determining the location of the device can be performed inside buildings or in dense urban areas.
54. The method of any of claims 36 to 53 wherein the sensor data comprises data on environmental conditions such as weather conditions.
55. The method of any of claims 36 to 54 wherein the server system comprises global map
data; optionally wherein the global map data comprises any of: one or more sets of local map data and/or one or more sets of global map data and/or one or more global maps of interconnected local maps.
56. The method of any of claims 36 to 55 wherein the server system can be any of: a single server, a distributed server system, a cloud system, a physical system or a virtual system.
57. The method of claim 55 wherein the global map can be distributed between a plurality of server systems.
58. The method of claims 55 to 57 wherein the global map can be hosted in a peer-to-peer arrangement on any of: one or more devices acting as a server system, or a mix of one or more server systems and one or more devices.
59. The method of any of claims 36 to 58 wherein the device and server system communicate over any of: a mobile data network, or wireless data network.
60. The method of any of claims 36 to 59 wherein the device has an odometry system.
61. The method of claim 60, wherein the odometry system processes sensor data to determine relative position differences between successive sensor data.
62. The method of claim 61, wherein the position estimates have between three and six degrees of freedom.
63. The method of either claim 60 or claim 61, wherein the odometry system performs full simultaneous location and mapping and/or loop closure and/or graph optimisation.
64. The method of any of claims 60 to 63 wherein the odometry system outputs the estimated position.
65. The method of any of claims 36 to 64 wherein the estimated position is relative to a local origin.
66. The method of any of claims 45, 46 or 47 to 65 (when dependent on claims 43 or 44) wherein the previous localisation data is stored with a timestamp.
67. The method of any of claims 36 to 66, wherein the location of the device is determined in a global co-ordinate frame.
68. The method of any of claims 36 to 66, wherein the localisation request is in a local coordinate frame.
69. The method of claim 68 when dependent on claim 67, further comprising the step of determining a transform between the global co-ordinate frame and the local co-ordinate frame.
70. The method of any of claims 36 to 69, wherein the localisation data is in a global coordinate frame.
71. A method of updating a global map at a server system comprising the steps of: receiving sensor data from one or more client systems; and adding the sensor data to the global map.
72. The method of claim 71, further comprising the step of receiving estimated location data from one or more client systems.
73. The method of claim 71 or 72, wherein the step of adding the sensor data to the global map comprises any of: creating a new node in the global map; updating or amending an existing node in the global map; or deleting a node in the global map.
74. The method of any of claims 71 or 73 wherein the sensor data comprises any of: environmental data; or weather data; satellite positioning data; inertial measurement data; image data; or video data.
75. The method of any of claims 71 to 74 further comprising the step of optimising the global map.
76. The method of claim 75 wherein the step of optimising comprises performing global loop- closures to link positions of different localisation nodes.
77. The method of any of claims 71 to 76 further comprising the step of sending an instruction to the one or more client systems to provide more or less sensor data to the server system.
78. The method of any of claims 71 to 77 wherein updating the global map is performed at intervals, optionally at time intervals or regular time intervals, or optionally when a predetermined threshold of sensor data has been received.
79. An apparatus configured to provide the method of any preceding claim.
80. A system comprising at least one server system operable to perform the method of any of claims 1 to 35 and at least one client system operable to perform the method of any of claims 36 to 78.
81. A computer program product for providing the method or system of any preceding claim.
82. An apparatus substantially as hereinbefore described in relation to the Figures 1 to 4.
83. A method substantially as hereinbefore described in relation to the Figures 1 to 4.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP17822768.2A EP3559692A1 (en) | 2016-12-21 | 2017-12-21 | Localisation of mobile device using image and non-image sensor data in server processing |
US16/472,767 US11761766B2 (en) | 2016-12-21 | 2017-12-21 | Localisation of mobile device using image and non-image sensor data in server processing |
US18/231,038 US20240118083A1 (en) | 2016-12-21 | 2023-08-07 | Localisation of mobile device using image and non-image sensor data in server processing |
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201662437246P | 2016-12-21 | 2016-12-21 | |
GBGB1621903.2A GB201621903D0 (en) | 2016-12-21 | 2016-12-21 | Localisation |
US62/437,246 | 2016-12-21 | ||
GB1621903.2 | 2016-12-21 |
Related Child Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/472,767 A-371-Of-International US11761766B2 (en) | 2016-12-21 | 2017-12-21 | Localisation of mobile device using image and non-image sensor data in server processing |
US18/231,038 Continuation US20240118083A1 (en) | 2016-12-21 | 2023-08-07 | Localisation of mobile device using image and non-image sensor data in server processing |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2018115897A1 true WO2018115897A1 (en) | 2018-06-28 |
Family
ID=58284276
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/GB2017/053874 WO2018115897A1 (en) | 2016-12-21 | 2017-12-21 | Localisation of mobile device using image and non-image sensor data in server processing |
Country Status (4)
Country | Link |
---|---|
US (2) | US11761766B2 (en) |
EP (1) | EP3559692A1 (en) |
GB (1) | GB201621903D0 (en) |
WO (1) | WO2018115897A1 (en) |
Families Citing this family (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10048753B1 (en) * | 2017-04-20 | 2018-08-14 | Robert C. Brooks | Perspective or gaze based visual identification and location system |
DE102018125397A1 (en) * | 2018-10-15 | 2020-04-16 | Visualix GmbH | Method and device for determining an area map |
US20230019181A1 (en) * | 2019-12-20 | 2023-01-19 | Interdigital Ce Patent Holdings | Device and method for device localization |
US11972576B2 (en) * | 2020-06-30 | 2024-04-30 | Lyft, Inc. | Generating and fusing reconstructions using overlapping map segments |
US11821994B2 (en) * | 2021-06-29 | 2023-11-21 | New Eagle, Llc | Localization of autonomous vehicles using camera, GPS, and IMU |
CN113790728B (en) * | 2021-09-29 | 2024-07-16 | 佛山市南海区广工大数控装备协同创新研究院 | Loose coupling multi-sensor fusion positioning algorithm based on visual odometer |
Family Cites Families (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP3260649B2 (en) | 1997-01-31 | 2002-02-25 | 松下電器産業株式会社 | Mobile station location search method and mobile station location search system |
US6377210B1 (en) | 2000-02-25 | 2002-04-23 | Grey Island Systems, Inc. | Automatic mobile object locator apparatus and method |
CN1575422A (en) | 2001-11-08 | 2005-02-02 | 罗瑟姆公司 | Services based on position location using broadcast digital television signals |
JP4652886B2 (en) | 2005-04-28 | 2011-03-16 | 株式会社エヌ・ティ・ティ・ドコモ | Position estimation apparatus and position estimation method |
US7529236B2 (en) | 2005-08-15 | 2009-05-05 | Technocom Corporation | Embedded wireless location validation benchmarking systems and methods |
US7689522B2 (en) | 2006-01-12 | 2010-03-30 | Ronan Sorensen | Method and system of organizing information based on human thought processes |
KR100941142B1 (en) | 2009-01-06 | 2010-02-09 | 주식회사 텔에이스 | System and method for detecting location using data communication network |
US8855929B2 (en) * | 2010-01-18 | 2014-10-07 | Qualcomm Incorporated | Using object to align and calibrate inertial navigation system |
US20110285591A1 (en) | 2010-05-19 | 2011-11-24 | Palm, Inc. | Correlating contextual data and position data to improve location based services |
EP4332612A3 (en) * | 2015-03-07 | 2024-09-04 | Verity AG | Distributed localization systems and methods and self-localizing apparatus |
US10051434B2 (en) * | 2016-03-24 | 2018-08-14 | Qualcomm Incorporated | Selective crowdsourcing for multi-level positioning |
- 2016
  - 2016-12-21 GB GBGB1621903.2A patent/GB201621903D0/en not_active Ceased
- 2017
  - 2017-12-21 WO PCT/GB2017/053874 patent/WO2018115897A1/en unknown
  - 2017-12-21 US US16/472,767 patent/US11761766B2/en active Active
  - 2017-12-21 EP EP17822768.2A patent/EP3559692A1/en not_active Withdrawn
- 2023
  - 2023-08-07 US US18/231,038 patent/US20240118083A1/en active Pending
Patent Citations (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2004213191A (en) * | 2002-12-27 | 2004-07-29 | Denso Wave Inc | Map information provision system and portable terminal therefor |
US20080137912A1 (en) * | 2006-12-08 | 2008-06-12 | Electronics And Telecommunications Research Institute | Apparatus and method for recognizing position using camera |
US20110306323A1 (en) * | 2010-06-10 | 2011-12-15 | Qualcomm Incorporated | Acquisition of navigation assistance information for a mobile station |
US8510039B1 (en) * | 2010-10-05 | 2013-08-13 | The Boeing Company | Methods and apparatus for three-dimensional localization and mapping |
US20140217168A1 (en) * | 2011-08-26 | 2014-08-07 | Qualcomm Incorporated | Identifier generation for visual beacon |
WO2015013418A2 (en) * | 2013-07-23 | 2015-01-29 | The Regents Of The University Of California | Method for processing feature measurements in vision-aided inertial navigation |
WO2015014018A1 (en) * | 2013-08-01 | 2015-02-05 | Mao Weiqing | Indoor positioning and navigation method for mobile terminal based on image recognition technology |
WO2015048434A1 (en) * | 2013-09-27 | 2015-04-02 | Qualcomm Incorporated | Hybrid photo navigation and mapping |
US20150092061A1 (en) * | 2013-09-30 | 2015-04-02 | Qualcomm Incorporated | Location based brand detection |
US20150350846A1 (en) * | 2014-05-27 | 2015-12-03 | Qualcomm Incorporated | Methods and apparatus for position estimation |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2021046699A1 (en) * | 2019-09-10 | 2021-03-18 | Beijing Voyager Technology Co., Ltd. | Systems and methods for positioning |
US11940279B2 (en) | 2019-09-10 | 2024-03-26 | Beijing Voyager Technology Co., Ltd. | Systems and methods for positioning |
US11776151B2 (en) | 2019-11-08 | 2023-10-03 | Huawei Technologies Co., Ltd. | Method for displaying virtual object and electronic device |
Also Published As
Publication number | Publication date |
---|---|
GB201621903D0 (en) | 2017-02-01 |
US20200132461A1 (en) | 2020-04-30 |
US11761766B2 (en) | 2023-09-19 |
US20240118083A1 (en) | 2024-04-11 |
EP3559692A1 (en) | 2019-10-30 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20240118083A1 (en) | Localisation of mobile device using image and non-image sensor data in server processing | |
US11503428B2 (en) | Systems and methods for co-localization of multiple devices | |
US12111178B2 (en) | Distributed device mapping | |
US10540804B2 (en) | Selecting time-distributed panoramic images for display | |
EP3624063B1 (en) | Electronic device localization based on imagery | |
US11380012B2 (en) | Method and apparatus for visual positioning based on mobile edge computing | |
CN113811920A (en) | Distributed pose estimation | |
US20220157032A1 (en) | Multi-modality localization of users | |
US20230375365A1 (en) | Collecting telemetry data for 3d map updates | |
WO2024057779A1 (en) | Information processing device, program, and information processing system | |
CN117635697A (en) | Pose determination method, pose determination device, pose determination equipment, storage medium and program product | |
KR20210014952A (en) | Method and system for estimating location of aerial vehicle | |
CN117730239A (en) | Apparatus and method for navigation |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 17822768; Country of ref document: EP; Kind code of ref document: A1 |
| NENP | Non-entry into the national phase | Ref country code: DE |
| ENP | Entry into the national phase | Ref document number: 2017822768; Country of ref document: EP; Effective date: 20190722 |