WO2024033801A1 - A method and system for mapping a real-world environment - Google Patents


Info

Publication number
WO2024033801A1
Authority
WO
WIPO (PCT)
Prior art keywords
real
world environment
mobile robot
robot platform
occupancy grid
Prior art date
Application number
PCT/IB2023/057997
Other languages
French (fr)
Inventor
Martin Tosas Bautista
Original Assignee
Dyson Technology Limited
Application filed by Dyson Technology Limited
Publication of WO2024033801A1

Classifications

    • G06T 7/70: Image analysis; determining position or orientation of objects or cameras
    • G01C 21/3807, G01C 21/383: Electronic maps for navigation; creation or updating of map data characterised by the type of data; indoor data
    • G01C 21/20: Navigation; instruments for performing navigational calculations
    • G05D 1/242: Arrangements for determining position or orientation using means based on the reflection of waves generated by the vehicle
    • G05D 1/246: Arrangements for determining position or orientation using environment maps, e.g. simultaneous localisation and mapping [SLAM]
    • G05D 1/2464: SLAM using an occupancy grid
    • G05D 1/43: Control of position or course in two dimensions
    • G06T 17/05: Three-dimensional [3D] modelling; geographic models
    • G06T 7/50: Image analysis; depth or shape recovery
    • G06V 20/56: Scenes; context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G05D 2105/87: Specific applications of the controlled vehicles; information gathering for exploration, e.g. mapping of an area
    • G05D 2109/10: Types of controlled vehicles; land vehicles
    • G06T 2207/30252: Indexing scheme for image analysis; vehicle exterior; vicinity of vehicle

Definitions

  • the present invention relates to a method and system for mapping a real-world environment. More particularly, but not exclusively, the method and system relate to mapping a real-world environment using a mobile robot platform.
  • determining the location and characteristics of objects within said environment enables the robotic device to detect and avoid those objects.
  • Sensor systems detect the objects and store the locations and characteristics within the real-world environment such that, when navigating, the robotic device can determine where it can move unimpeded.
  • Processing the data from the sensor system is a computationally complex and resource-intensive process, and therefore mapping the entirety of the real-world environment can take a long time. This is further compounded by objects within the real-world environment obstructing the sensor system of the robotic device such that determining a representation of the entirety of the real-world environment is further complicated.
  • a method of mapping a real-world environment using a mobile robot platform comprising receiving, from the mobile robot platform, at least one indication of an update to at least a portion of an occupancy grid, the occupancy grid comprising a representation of the real-world environment; generating a distance map representative of at least the portion of the occupancy grid; generating at least one contour based on the distance map and comprising a plurality of waypoints spaced along the contour, each representing a geographical location in the real-world environment; navigating a locomotion-enabled component of the mobile robot platform to at least one of the waypoints; and updating the portion of the occupancy grid based on an input received from the one or more sensors that are mounted on the locomotion-enabled component of the mobile robot platform.
  • This enables mapping of a real-world environment to be undertaken whilst exploring said environment, and in particular enables the capture and update of mapping information represented in the occupancy grid for areas which are initially obscured by objects within the environment. Furthermore, by determining contours within the real-world environment, the mapping can be undertaken efficiently by focussing on areas within the real-world environment which are visible within the field of view of the sensors associated with the mobile robot platform.
  • the distance map may comprise a plurality of areas, each categorised as at least one of: occupied space, representative of an area in the real-world environment comprising at least a part of an object; known empty space, representative of an area in the real-world environment comprising no objects; and unknown space, representative of an area in the real-world environment which has not been mapped by the mobile robot platform.
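  • The three categories above can be sketched as cell states in a toy occupancy grid. This is an illustrative sketch only; the names `Cell` and `count_cells` and the grid layout are assumptions, not taken from the patent:

```python
from enum import Enum

class Cell(Enum):
    OCCUPIED = 0  # area containing at least part of an object
    EMPTY = 1     # known empty space, mapped and free of objects
    UNKNOWN = 2   # area not yet mapped by the mobile robot platform

# A 4x4 occupancy grid: a wall along the left column, known empty
# space in the middle, and an unexplored region on the right.
O, E, U = Cell.OCCUPIED, Cell.EMPTY, Cell.UNKNOWN
grid = [
    [O, E, E, U],
    [O, E, E, U],
    [O, E, U, U],
    [O, E, U, U],
]

def count_cells(grid, state):
    """Count how many cells in the grid carry a given state."""
    return sum(row.count(state) for row in grid)
```

  In practice each cell would correspond to a fixed-size patch of the real-world environment, and the robot's sensor data would drive the transitions from UNKNOWN to EMPTY or OCCUPIED.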
  • the method comprises selecting an appropriate distance, wherein the appropriate distance is representative of a visibility characteristic of the one or more sensors of the mobile robot platform; and refining the distance map based on the appropriate distance.
  • the distance map is refined based on the characteristics of the sensors of the mobile robot platform, thereby ensuring that the distance map accurately represents the portions of the real-world environment visible by the one or more sensors.
  • the step of refining the distance map comprises removing, from the distance map, areas categorised as unknown space. This ensures that the distance map only comprises areas which have been previously mapped, thereby ensuring that the locomotion-enabled component of the mobile robot platform is navigated to geographical locations categorised as known empty space.
  • the visibility characteristic of one or more sensors may comprise determining a minimum distance such that at least one object within the real-world environment is in the field of view of the one or more sensors.
  • a nearest waypoint of the plurality of waypoints may be selected, wherein the nearest waypoint is a waypoint geographically closest to the location of the mobile robot platform in the real-world environment. This enables the mobile robot platform to navigate to the geographical location of the start of the closest contour in the distance map, ensuring that the real-world environment is mapped in the most efficient manner.
  • in some examples, the nearest waypoint is selected such that it is not in a blacklist map.
  • a representation of the at least one object in storage associated with the mobile robot platform may be used to interact with the objects at a future time.
  • the representation of the at least one object may further comprise an indication of a geographical location of the at least one object in the real-world environment.
  • the mobile robot system may determine a desired location of the object, and at a future time reposition and/or reorient the object in accordance with the desired location.
  • characteristics of the mobile robot platform may be stored and associated with the characteristics of the representation of the at least one object.
  • the mobile robot platform may return to the location of the object at a future time, and position itself in such a way to efficiently facilitate interaction with the object, such as adjusting the height of a manipulator of the robot platform that is arranged to interact with the object.
  • a system for mapping a real-world environment comprising at least one sensor to capture information associated with the real-world environment; storage to store at least an occupancy grid representative of the real-world environment; at least one processor arranged to update the occupancy grid representative of the real-world environment based on information captured by the at least one sensor; generate a distance map representative of at least a portion of the occupancy grid; determine at least one contour in the distance map comprising a plurality of waypoints spaced along the contour, each waypoint being representative of a geographic location in the real-world environment; determine a route comprising a plurality of the waypoints; and a locomotion-enabled component to navigate to at least one of the waypoints of the route determined by the motion planning component.
  • This enables mapping of a real-world environment to be undertaken whilst exploring said environment, and in particular enables the capture and update of mapping information represented in the occupancy grid for areas which are initially obscured by objects within the environment. Furthermore, by determining contours within the real-world environment, the mapping can be undertaken in an efficient manner, by focussing on areas within the environment which may initially be obscured from the field of view of the sensors associated with the mobile robot platform.
  • the system may comprise a simultaneous localization and mapping module for determining the location of at least the locomotion-enabled component within the real-world environment. This enables the mobile robot platform to determine its current location within the real-world environment accurately using the data gathered from one or more sensors.
  • the at least one sensor for capturing information associated with the real-world environment comprises at least one of a camera unit; a time of flight sensor unit; an array distance sensor unit; and an inertial measuring unit.
  • a non-transitory computer-readable storage medium comprising a set of computer-readable instructions stored thereon which, when executed by at least one processor, are arranged to control a mobile robot platform to map a real-world environment, wherein the instructions, when executed, cause the processor to: receive, from the mobile robot platform, at least one indication of an update to at least a portion of an occupancy grid, the occupancy grid comprising a representation of the real-world environment; generate a distance map representative of at least the portion of the occupancy grid; generate at least one contour based on the distance map, comprising a plurality of waypoints spaced along the contour, each representing a geographical location in the real-world environment; navigate a locomotion-enabled component of the mobile robot platform to at least one of the waypoints; and update the portion of the occupancy grid based on an input received from one or more sensors of the locomotion-enabled component of the mobile robot platform.
  • This enables mapping of a real-world environment to be undertaken whilst exploring said environment, and in particular enables the capture and update of mapping information represented in the occupancy grid for areas which are initially obscured by objects within the environment. Furthermore, by determining contours within the real-world environment, the mapping can be undertaken efficiently, by focussing on areas within the environment which may initially be obscured from the field of view of the sensors associated with the mobile robot platform.
  • Figure 1 is a flowchart illustrating a method of mapping a real-world environment using a mobile robot platform, according to an example;
  • Figure 2 is a schematic diagram of an occupancy grid used for mapping a real-world environment using a mobile robot platform, according to an example;
  • Figure 3 is a schematic diagram of a distance map generated by the mobile robot platform, according to an example;
  • Figure 4A is a schematic diagram of a refined distance map indicating a plurality of contours, according to an example;
  • Figure 4B is a schematic diagram of the refined distance map of Figure 4A where the plurality of contours have associated waypoints; and
  • Figure 5 is a block diagram of a system for mapping a real-world environment, according to an example.
  • FIG. 1 is a flowchart showing a method 100 in accordance with an example.
  • the method 100 maps a real-world environment using a given mobile robot platform, such as the mobile robot platform described below with reference to Figure 5.
  • the method 100 will also be described with reference to representations 200, 300, 400 of an exemplary real-world environment shown in Figures 2 - 4B.
  • an indication to update at least a portion of an occupancy grid representing the real-world environment is received by a mobile robot platform 220.
  • the indication may be a message sent from a user device, indicating that the mobile robot platform 220 is to update the occupancy grid. In other examples, the indication may be a notification that the mobile robot platform 220 is to update the occupancy grid, such as at a predetermined time and/or according to a given schedule.
  • the occupancy grid is a representation of a given real-world environment, such as a room in a dwelling or other building.
  • An example of an occupancy grid 200 is shown in Figure 2.
  • the occupancy grid 200 contains areas of known occupied space 210 representing the edges of objects, surfaces, and other obstructions within the real-world environment, such as walls.
  • the occupancy grid 200 may comprise other information, such as data related to the position of the mobile robot platform 220. Whilst four areas of known occupied space 210 are labelled in the occupancy grid 200 shown in Figure 2, in reality, there may be any number of positions contained within the occupancy grid 200 representing, for example, the locations of walls, surfaces, objects and obstructions within the real-world environment. Taken together, these multiple positions in the occupancy grid 200 signify the edges of objects, surfaces and/or obstructions within the real-world environment indicated by the distance map. These positions are then used to generate the contours which are representative of paths in the real-world environment, in which it is safe for the mobile robotic platform 220 to move without impinging on the positions. Other information may also be associated with the mobile robot platform 220 and stored in storage. Examples of such information include the positioning of servos, motors, hydraulics and/or pneumatic components of the mobile robot platform 220. This information may be used to position one or more armatures or other moveable components.
  • areas within the real-world environment may not be visible to one or more sensors associated with the mobile robot platform that are used to capture data representing the real-world environment. For example, in some areas, the view of one or more sensors may be obscured by one or more objects, such as object 260. Examples of such sensors will be described below with reference to Figure 5. Based on the data captured, areas within the occupancy grid 200 may be categorised as one of: known occupied space 210, known empty space 230, or unknown space 240.
  • the indication to update at least a portion of the occupancy grid may be received by the mobile robot platform via a wired or wireless network connection, including but not limited to 802.11, Ethernet, Bluetooth®, or nearfield communications (‘NFC’) protocol.
  • the portion of the occupancy grid may represent a predefined area within the real-world environment, such as a room in a dwelling, may represent a group of pre-defined areas, such as a whole apartment within an apartment block, or may represent a given area within another area, such as part of a larger room, as indicated by area 250 in the occupancy grid 200 of Figure 2.
  • a user of the mobile robotic platform 220 can use a network-connected computing device, such as a mobile telephone, laptop computer, desktop computer, or wearable device to indicate the portions of the occupancy grid to update.
  • indications may be received periodically, or at predetermined times.
  • the indications may be received from one or more sensors in the real-world environment which detect changes, and/or may be received from a remote device such as a user’s smartphone. These indications may be used to notify the mobile robotic platform that a portion of the occupancy grid is to be updated, thereby ensuring an accurate representation of the real-world environment is maintained.
  • a distance map is generated.
  • a distance map 300 for this given real-world environment is shown in Figure 3.
  • the distance map 300 represents the portion of the occupancy grid 200 for which the indication was received, in this example, the entirety of the real-world environment, and represents the maximum distance in the real-world environment where the mobile robot platform 220 can move. By moving within this maximum distance, the mobile robot platform 220 is able to capture data regarding the object(s), surface(s), and/or obstruction(s) represented in the occupancy grid 200 clearly.
  • the mobile robot platform 220 is able to position itself such that it is close enough to the object, surface and/or obstruction for the one or more sensors to be able to capture in-focus data regarding said object, surface and/or obstruction.
  • Positions such as positions 310, 320, 330, are represented on the distance map 300 as areas surrounding the known occupied space 210 in the occupancy grid 200.
  • the positions 310, 320, 330 may be based on characteristics of the mobile robot platform 220 itself, for example, a clearance required for any locomotion components such as a wheel assembly, or in some examples, may be based on visibility characteristics of one or more sensors associated with the mobile robot platform 220.
  • a given sensor associated with the mobile robot platform 220 may have a set focal length.
  • the distance map 300 may represent areas within the occupancy grid 200 where the mobile robot platform 220 may move to a position within the real-world environment. This enables the mobile robot platform 220 to accurately capture in-focus data of whatever object, obstruction, and/or surface is at a given location.
  • multiple positions 310, 320, 330 may be generated for each sensor, such that each of the multiple sensors is able to obtain accurate in-focus data regarding the known occupied space 210.
  • the distance map 300 may be refined, such that only positions within the real-world environment where the mobile robot platform 220 can capture accurate in-focus data are represented.
  • the distance map 300 may be refined such that it represents areas of the occupancy grid that are known empty space 230 or unknown space 240. In yet further examples, the distance map may be refined such that it only includes areas which are known empty space 230. This may involve removing, from the distance map 300, areas categorised as unknown space 240 and/or known occupied space 210. This ensures that the distance map 300 includes areas which the mobile robot platform 220 is able to traverse. However, it will be appreciated that this need not be undertaken, and that other means of ensuring the mobile robot platform 220 does not enter unknown space 240 when mapping the real-world environment may be used.
  • Examples of this include the use of a location positioning system in combination with geofencing, which may be provided by a simultaneous localization and mapping (SLAM) method performed by a SLAM module, as described below.
  • the movement of the mobile robot platform 220 itself may be used to update the occupancy grid 200 as it traverses the real-world environment.
  • the distance map 300 comprises positions 310, 320, 330 which represent a distance from known occupied space 210 identified within the occupancy grid.
  • the positions 310, 320, 330 may be based on a number of characteristics, and each represents a possible location of the mobile robot platform 220 in the real-world environment. This ensures that accurate data can be captured relating to objects, surfaces and/or obstructions in the real-world environment. As such, some of the positions 310, 320, 330 may fall within unknown space 240.
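  • One common way to realise a distance map of this kind is a breadth-first distance transform seeded at the known occupied cells, so that every other cell receives its grid distance to the nearest object edge. The patent does not prescribe a particular algorithm; the sketch below assumes a 4-connected grid of unit cells, with `1` marking known occupied space:

```python
from collections import deque

def distance_map(grid):
    """Breadth-first distance transform: each cell receives its
    4-connected grid distance to the nearest occupied cell."""
    rows, cols = len(grid), len(grid[0])
    INF = float("inf")
    dist = [[INF] * cols for _ in range(rows)]
    queue = deque()
    # Seed the search with every occupied cell at distance 0.
    for r in range(rows):
        for c in range(cols):
            if grid[r][c] == 1:  # 1 marks known occupied space
                dist[r][c] = 0
                queue.append((r, c))
    # Expand outwards one ring of cells at a time.
    while queue:
        r, c = queue.popleft()
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols and dist[nr][nc] == INF:
                dist[nr][nc] = dist[r][c] + 1
                queue.append((nr, nc))
    return dist
```

  Positions such as 310, 320, 330 could then be read off as the band of cells whose distance value equals the appropriate distance for the robot's sensors or clearance.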
  • one or more contours such as contours 410, 420, 430, are generated based on the positions 310, 320, 330 as shown in Figure 4A.
  • the contours 410, 420, 430 represent portions of the positions 310, 320, 330 that fall within known empty space 230. Accordingly, it is possible to move the mobile robot platform 220 to that location within the real-world environment without interacting with any of the objects, surfaces and/or obstructions.
  • Associated with each contour 410, 420, 430 is a start waypoint 410A, 420A, 430A.
  • the start waypoint represents a starting waypoint of the contour 410, 420, 430 closest to the geographical location of the mobile robot platform 220 in the real-world environment.
  • other methods for selecting a start waypoint 410A, 420A, 430A may be used.
  • the start waypoint 410A, 420A, 430A may be selected in accordance with other characteristics, such as determining whether the selected waypoint is on a blacklist of waypoints, such as the blacklist of waypoints described below, or based on the length of the path the mobile robot platform 220 must traverse in order to reach the start waypoint 410A, 420A, 430A. It will be appreciated that the methodology for selecting a start waypoint may be based on a combination of methods, such as the above mentioned closest methodology and the blacklist methodology.
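  • A minimal sketch of such a combined selection strategy follows, assuming waypoints are (x, y) tuples and the blacklist is a set; `select_start_waypoint` is a hypothetical helper name, not taken from the patent:

```python
import math

def select_start_waypoint(robot_pos, start_waypoints, blacklist):
    """Pick the candidate start waypoint geographically closest to the
    robot, skipping any waypoint already on the blacklist."""
    candidates = [w for w in start_waypoints if w not in blacklist]
    if not candidates:
        return None  # every start waypoint has already been visited
    # Euclidean distance from the robot's current position.
    return min(candidates, key=lambda w: math.dist(robot_pos, w))
```

  Returning `None` when all start waypoints are blacklisted gives the caller a natural signal that the mapping pass is complete.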
  • the contour 410, 420, 430 may be separated into a plurality of waypoints between the start waypoint 410A, 420A, 430A, and an ending waypoint 410Z, 420Z, 430Z representing the furthest point along the contour 410, 420, 430.
  • This therefore represents a continuous route from the start waypoint 410A, 420A, 430A to the respective end waypoint 410Z, 420Z, 430Z.
  • An example of the contours 410, 420, 430 being separated into a plurality of waypoints, each with a start waypoint 410A, 420A, 430A and an end waypoint 410Z, 420Z, 430Z, is shown in Figure 4B.
  • One such example may be to divide the contour 410, 420, 430 evenly, such that each waypoint is equidistant from another. This waypoint information may then be stored in association with the occupancy grid to enable easy and quick subsequent access during operations requiring the mobile robot platform 220 to transit the waypoints.
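  • The even division described above can be sketched as an arc-length walk along the contour polyline. This is an assumed implementation (the patent does not specify one); `equidistant_waypoints` and the fixed `spacing` parameter are illustrative:

```python
import math

def equidistant_waypoints(contour, spacing):
    """Walk along a polyline contour and emit waypoints spaced a fixed
    arc-length apart, from the start point to the end point."""
    waypoints = [contour[0]]
    carried = 0.0  # distance walked since the last emitted waypoint
    for (x0, y0), (x1, y1) in zip(contour, contour[1:]):
        seg = math.dist((x0, y0), (x1, y1))
        # Emit as many waypoints as fit on this segment.
        while carried + seg >= spacing:
            t = (spacing - carried) / seg
            x0, y0 = x0 + t * (x1 - x0), y0 + t * (y1 - y0)
            waypoints.append((x0, y0))
            seg -= spacing - carried
            carried = 0.0
        carried += seg
    if waypoints[-1] != contour[-1]:
        waypoints.append(contour[-1])  # always include the end waypoint
    return waypoints
```

  Storing the resulting list alongside the occupancy grid, as the text suggests, lets later operations replay the route without recomputing the contour.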
  • the waypoint information is used to navigate the mobile robot platform 220 to a waypoint in one of the contours 410, 420, 430. In some examples, this may be the start waypoint 410A, 420A, 430A associated with the contour 410, 420, 430. Navigating the mobile robot platform 220 to the waypoint may include initiating a locomotion-enabled component associated with the mobile robot platform 220. The locomotion-enabled component enables the mobile robot platform 220 to physically move around the real-world environment in accordance with the occupancy grid 200 and the contours 410, 420, 430.
  • the occupancy grid 200 is updated whilst navigating along the contours 410, 420, 430. This may occur whilst navigating along a contour, for example at a waypoint and/or between waypoints.
  • This enables data to be captured using one or more sensors associated with the mobile robot platform 220, such that a more accurate representation of the real-world environment can be obtained.
  • the mobile robot platform 220 may traverse the entire contour 410, 420, 430, and perform the updating process when the end waypoint 410Z, 420Z, 430Z is reached, and/or may perform the updating process whilst navigating between the waypoints.
  • the process can be repeated, such that new contours are determined enabling further exploration of the real-world environment, and further increasing the accuracy of the map and occupancy grid 200.
  • each waypoint visited may be added to a blacklist and the updated blacklist is then stored in storage.
  • By recording the visited waypoints in this way, it can be tracked which waypoints have and have not been visited, and for which updated information has been obtained.
  • This enables a starting waypoint to be selected from the waypoints which are not contained within the blacklist, and for which updated information has not already been obtained.
  • this enables the mapping process to be stopped and started as required whilst maintaining an understanding of the current progress of the mapping process.
  • the data obtained by the one or more sensors may be analysed to identify objects within the real-world environment.
  • the mobile robot platform can obtain further information about the real-world environment and thereby generate a more accurate mapping of the locations of objects, surfaces and/or other obstructions.
  • information associated with the identity and/or representation of the objects may be stored as part of, or separately from, the occupancy grid, enabling subsequent access and analysis to be undertaken.
  • the identity and/or representation of the object may be associated with its geographical location in the real-world environment.
  • the mobile robot platform 220 may need to adjust the position of one or more armatures or other moveable components associated with it in order to traverse a contour, and/or to perform an action at a given location.
  • the position of the armature or other moveable component may also be tracked and stored alongside the identity of the object, the representation of the object and/or a given location in the occupancy grid 200. It will be appreciated that other characteristics of the mobile robot platform 220 may also be stored, not just the position of the armature and/or other moveable components.
  • Figure 5 shows a schematic representation of a system 500, such as the mobile robot platform 220 described above with reference to Figures 1 through 4B.
  • the components 510, 520, 530, 540, 550 of the system 500 may be interconnected by a bus, or in some examples, may be separate such that data is transmitted to and from each component via a network.
  • the system 500 comprises at least one sensor 510A, 510Z for capturing information associated with the real-world environment.
  • the one or more sensors 510A, 510Z may include a camera unit for capturing frames of image data representing the real-world environment.
  • the camera unit may be a visual camera unit configured to capture data in the visible light frequencies. Alternatively, and/or additionally, the camera unit may be configured to capture image data in the infra-red frequencies. It will be appreciated that other types of camera unit may be used.
  • the camera unit may comprise multiple individual cameras each configured differently, such as with different lens configurations, and may be mounted in such a way as to be a 360-degree camera.
  • the camera unit may be arranged to rotate such that it scans the real-world environment, thereby increasing its field of view. Again, it will be appreciated that other configurations may be possible.
  • the at least one sensor 510A, 510Z may comprise a time of flight sensor unit or array distance sensor unit configured to measure the distance to/from the sensor unit to objects, surfaces and/or obstacles in the real-world environment.
  • An example of such time of flight or array distance sensors includes laser imaging, detection, and ranging (LIDAR). Other time of flight and/or array distance sensors may also be used.
  • the one or more sensors 510A, 510Z may also include an inertial measuring unit for measuring the movement of the mobile robot platform 220 around the real-world environment.
  • the one or more sensors 510A, 510Z provide the captured data to a processor 530 for processing.
  • the processor 530 is arranged to use the captured data to update the occupancy grid 200 accordingly, such that the occupancy grid 200 represents the real-world environment within the field-of-view of the one or more sensors 510A, 510Z.
  • the system 500 also comprises storage 520 which may include any type of storage medium such as a solid-state drive (SSD) or other semiconductor-based RAM; a ROM, for example, a CD ROM or a semiconductor ROM; a magnetic recording medium, for example, a floppy disk or hard disk; optical memory devices in general; etc.
  • the storage 520 is configured to store at least an occupancy grid representing the real-world environment, such as the occupancy grid 200 described above.
  • the storage 520 may be configured to store characteristics associated with the mobile robot platform, such as the position of armatures and other moveable components, in addition to the identities and characteristics of objects within the real-world environment.
  • the processor 530 is configured to perform at least the method 100 described above with reference to Figures 1 - 4B, and is configured to receive the occupancy grid data from storage 520 and data representing the real-world environment from the one or more sensors 510A, 510Z.
  • the processor 530 comprises at least an updating module 532 for updating the occupancy grid based on the data obtained by the one or more sensors 510A, 510Z, as described above in relation to step 150 of method 100. Updating the occupancy grid comprises analysing the data obtained by the one or more sensors 510A, 510Z and categorising portions of the occupancy grid as known empty space, unknown space, and known occupied space.
  • a distance map generating module 534 associated with the processor 530 is configured to generate the distance map, such as distance map 300, based on the known occupied space represented in the occupancy grid in accordance with step 120 of method 100 described above.
  • a contour determination module 536 is used to generate a plurality of contours based on the generated distance map, where each contour comprises a plurality of waypoints as described above in accordance with step 130 of method 100.
  • the processor 530 also comprises a motion planning component 538 for determining a route comprising the plurality of waypoints of a given contour.
  • the motion planning component 538 analyses the contours generated by the contour determination module 536, and maps a route for a mobile robot platform, such as mobile robot platform 220 described above to take.
  • the occupancy grid comprises other information, such as data relating to the positioning of armatures and/or other moveable components of the mobile robot platform 220
  • the motion planning component 538 may be arranged to indicate this positioning data in accordance with a given waypoint on the determined route.
  • the locomotion-enabled component 540 may be a wheel assembly, propellor assembly, or other controllable means for moving a mobile robot platform around the real-world environment. This enables the one or more sensors 510A, 510Z to capture data relating to areas of the real-world environment that were previously outside the field-of-view of the one or more sensors 510A, 510Z.
  • the system 500 may comprise a simultaneous localization and mapping (SLAM) module 550 for locating the system 500 in the real-world environment.
  • the SLAM module 550 may comprise several additional sensors and/or components such as a local positioning sensor. This may be used in combination with other sensors such as the inertial measuring sensor described above, and/or a satellite radio-navigation system. Examples of such satellite radio-navigation systems include the Global Positioning System, Galileo, or GLONASS. These sensors, either individually or together, are capable of tracking the location of the system 500 in the real-world environment as the locomotion-enabled component 540 moves the system around the real-world environment. It will be appreciated that the simultaneous localization and mapping module may comprise other components for performing these functions.
  • At least some aspects of the examples described herein with reference to Figures 1 - 5 comprise computer processes performed in processing systems or processors.
  • the disclosure also extends to computer programs, particularly computer programs on or in an apparatus, adapted for putting the disclosure into practice.
  • the program may be in the form of non-transitory source code, object code, a code intermediate source and object code such as in partially compiled form, or in any other non-transitory form suitable for use in the implementation of processes according to the disclosure.
  • the apparatus may be any entity or device capable of carrying the program.
  • the apparatus may comprise a storage medium, such as a solid-state drive (SSD) or other semiconductor-based RAM; a ROM, for example, a CD ROM or a semiconductor ROM; a magnetic recording medium, for example, a floppy disk or hard disk; optical memory devices in general; etc.
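The route construction performed by the motion planning component 538 can be sketched as follows. This is a minimal illustration under assumptions not stated in the document: the function name `plan_route`, the Euclidean nearer-end heuristic, and the `pose_hints` mechanism for attaching armature positioning data are all hypothetical.

```python
import math

def plan_route(waypoints, robot_pos, pose_hints=None):
    """Sketch of a motion planning component: traverse a contour's
    waypoints starting from whichever end is nearer the robot, and
    attach any stored positioning data (e.g. armature poses) to the
    matching waypoints. Names and heuristic are illustrative only."""
    # Start from the nearer end of the contour.
    if math.dist(robot_pos, waypoints[-1]) < math.dist(robot_pos, waypoints[0]):
        waypoints = list(reversed(waypoints))
    pose_hints = pose_hints or {}
    # Pair each waypoint with its positioning data, if any.
    return [(wp, pose_hints.get(wp)) for wp in waypoints]
```

For example, `plan_route([(0, 0), (1, 0), (2, 0)], robot_pos=(3, 0))` would traverse the contour in reverse order, since the robot is nearer the `(2, 0)` end.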


Abstract

A method and system for mapping a real-world environment using a mobile robot platform. An indication of an update to at least a portion of an occupancy grid, which comprises a representation of the real-world environment, is received. A distance map representative of at least the portion of the occupancy grid is generated, and a contour comprising a plurality of waypoints spaced along it is generated based on the distance map. A locomotion-enabled component of the mobile robot platform is navigated to at least one of the waypoints, and the portion of the occupancy grid is updated based on an input received from one or more sensors mounted on the locomotion-enabled component of the mobile robot platform.

Description

A METHOD AND SYSTEM FOR MAPPING A REAL-WORLD ENVIRONMENT
Field of the Invention
The present invention relates to a method and system for mapping a real-world environment. More particularly, but not exclusively, the method and system relate to mapping a real-world environment using a mobile robot platform.
Background of the Invention
When navigating a robotic device in a real-world environment, determining the location and characteristics of objects within said environment enables the robotic device to detect and avoid those objects. Sensor systems detect the objects and store their locations and characteristics within the real-world environment such that, when navigating, the robotic device can determine where it can move unimpeded. Processing the data from the sensor system is a computationally complex and resource-intensive process, and therefore mapping the entirety of the real-world environment can take a long time. This is further compounded by objects within the real-world environment obstructing the sensor system of the robotic device, such that determining a representation of the entirety of the real-world environment is further complicated.
Summary of the Invention
According to a first aspect of the present invention, there is provided a method of mapping a real-world environment using a mobile robot platform, the method comprising receiving, from the mobile robot platform, at least one indication of an update to at least a portion of an occupancy grid, the occupancy grid comprising a representation of the real-world environment; generating a distance map representative of at least the portion of the occupancy grid; generating at least one contour based on the distance map and comprising a plurality of waypoints spaced along the contour, each representing a geographical location in the real-world environment; navigating a locomotion-enabled component of the mobile robot platform to at least one of the waypoints; and updating the portion of the occupancy grid based on an input received from one or more sensors that are mounted on the locomotion-enabled component of the mobile robot platform. This enables the mapping of a real-world environment to be undertaken whilst exploring said environment, and in particular enables the capture and update of mapping information represented in the occupancy grid for areas which are initially obscured by objects within the environment. Furthermore, by determining contours within the real-world environment, the mapping can be undertaken efficiently by focussing on areas within the real-world environment which are visible within the field of view of the sensors associated with the mobile robot platform.
The distance map may comprise a plurality of areas, each categorised as at least one of occupied space representative of an area in the real-world environment comprising at least a part of an object, known empty space, representative of an area in the real-world environment comprising no objects, and unknown space representative of an area in the real-world environment which has not been mapped by the mobile robot platform. By categorising areas in the distance map, object avoidance can be implemented, and the mapping focussed on portions which have yet to be mapped by the mobile robot platform, thereby increasing the efficiency of the method and the speed by which an accurate map of the entirety of the real-world environment can be generated.
Preferably, the method comprises selecting an appropriate distance, wherein the appropriate distance is representative of a visibility characteristic of the one or more sensors of the mobile robot platform; and refining the distance map based on the appropriate distance. This enables the distance map to be refined based on the characteristics of the sensors of the mobile robot platform, thereby ensuring that the distance map accurately represents the portions of the real-world environment visible by the one or more sensors. Optionally, the step of refining the distance map comprises removing, from the distance map, areas categorised as unknown space. This ensures that the distance map only comprises areas which have been previously mapped, thereby ensuring that the locomotion-enabled component of the mobile robot platform is navigated to geographical locations categorised as known empty space.
The visibility characteristic of one or more sensors may comprise determining a minimum distance such that at least one object within the real-world environment is in the field of view of the one or more sensors. By refining the distance map based on a minimum distance associated with the field-of-view of the sensors, it ensures that the distance map is limited to the areas where the sensor is capable of obtaining data, such that only areas within the field-of-view of the sensor at a given time are used when determining the contour and, subsequently, the areas the mobile robot platform moves within.
A nearest waypoint of the plurality of waypoints may be selected, wherein the nearest waypoint is a waypoint geographically closest to the location of the mobile robot platform in the real-world environment. This enables the mobile robot platform to navigate to the geographical location of the start of the closest contour in the distance map, ensuring that the real-world environment is mapped in the most efficient manner.
Preferably, the nearest waypoint is not in a blacklist map. The blacklist map may represent portions in the occupancy grid already visited by the mobile robot platform. A visited portion may be added to the blacklist map, the visited portion being the portion of the occupancy grid that has been updated. This prevents the mobile robot platform from revisiting areas in the environment which have already been mapped and, as such, where the occupancy grid has already been updated. This ensures that a map of the entirety of the real-world environment is captured in the most efficient and timely manner. Updating the portion of the occupancy grid may comprise identifying at least one object in the field-of-view of the one or more sensors. This enables the identification of objects which may obstruct the movement of the mobile robot platform when navigating around the real-world environment. Preferably, a representation of the at least one object is stored in storage associated with the mobile robot platform. By identifying objects, and subsequently storing a representation of an object in the real-world environment, the mobile robot system may use this data to interact with the objects at a future time.
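The nearest-waypoint selection with a blacklist described above can be sketched as follows. This is a minimal, hypothetical illustration: the function name, the use of Euclidean distance, and the representation of the blacklist as a set of waypoints are assumptions rather than details from the document.

```python
import math

def select_start_waypoint(robot_pos, start_waypoints, blacklist):
    """Pick the start waypoint geographically closest to the robot,
    skipping any waypoint already in the blacklist. The distance
    metric (Euclidean) is an assumption of this sketch."""
    candidates = [w for w in start_waypoints if w not in blacklist]
    if not candidates:
        return None  # every contour has already been visited
    return min(candidates, key=lambda w: math.dist(robot_pos, w))
```

With the robot at `(0, 0)`, start waypoints `[(1, 0), (3, 0)]` and `(1, 0)` blacklisted, the function skips the nearer waypoint and returns `(3, 0)`.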
The representation of the at least one object may further comprise an indication of a geographical location of the at least one object in the real-world environment. By determining the geographical location of the object in the real-world environment, the mobile robot system may determine a desired location of the object, and at a future time reposition and/or reorient the object in accordance with the desired location.
Optionally, characteristics of the mobile robot platform may be stored and associated with the characteristics of the representation of the at least one object. By associating characteristics of the mobile robot platform with the object representation, the mobile robot platform may return to the location of the object at a future time, and position itself in such a way to efficiently facilitate interaction with the object, such as adjusting the height of a manipulator of the robot platform that is arranged to interact with the object.
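One way to hold such an object representation together with the associated robot characteristics is sketched below. The record structure and every field name are hypothetical; the document does not specify a storage schema.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class ObjectRecord:
    """Hypothetical stored representation of an identified object: its
    identity, its geographical location, an optional desired location,
    and a robot characteristic (e.g. a manipulator height) associated
    with interacting with it."""
    identity: str
    location: Tuple[float, float]                     # where it was observed
    desired_location: Optional[Tuple[float, float]] = None
    manipulator_height_mm: Optional[float] = None     # robot characteristic

# Example record for an object observed at (2.0, 3.5).
record = ObjectRecord("chair", (2.0, 3.5), manipulator_height_mm=450.0)
```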
According to a second aspect of the present invention, there is provided a system for mapping a real-world environment, the system comprising at least one sensor to capture information associated with the real-world environment; storage to store at least an occupancy grid representative of the real-world environment; at least one processor arranged to update the occupancy grid representative of the real-world environment based on information captured by the at least one sensor; generate a distance map representative of at least a portion of the occupancy grid; determine at least one contour in the distance map comprising a plurality of waypoints spaced along the contour, each waypoint being representative of a geographic location in the real-world environment; determine a route comprising a plurality of the waypoints; and a locomotion-enabled component to navigate to at least one of the waypoints of the determined route. This enables the mapping of a real-world environment to be undertaken whilst exploring said environment, and in particular enables the capture and update of mapping information represented in the occupancy grid for areas which are initially obscured by objects within the environment. Furthermore, by determining contours within the real-world environment, the mapping can be undertaken in an efficient manner, by focussing on areas within the environment which may initially be obscured from the field of view of the sensors associated with the mobile robot platform.
The system may comprise a simultaneous localization and mapping module for determining the location of at least the locomotion-enabled component within the real-world environment. This enables the mobile robot platform to determine its current location within the real-world environment accurately using the data gathered from one or more sensors.
Preferably, the at least one sensor for capturing information associated with the real-world environment comprises at least one of a camera unit; a time of flight sensor unit; an array distance sensor unit; and an inertial measuring unit. Using different sensors enables differing information to be gathered and processed, and therefore can be used to generate a more accurate and comprehensive representation of the real-world environment.
According to a third aspect of the present invention, there is provided a non-transitory computer-readable storage medium comprising a set of computer-readable instructions stored thereon which, when executed by at least one processor, are arranged to control a mobile robot platform to map a real-world environment, wherein the instructions, when executed, cause the processor to receive, from the mobile robot platform, at least one indication of an update to at least a portion of an occupancy grid, the occupancy grid comprising a representation of the real-world environment; generate a distance map representative of at least the portion of the occupancy grid; generate at least one contour based on the distance map, and comprising a plurality of waypoints spaced along the contour, each representing a geographical location in the real-world environment; navigate a locomotion-enabled component of the mobile robot platform to at least one of the waypoints; and update the portion of the occupancy grid based on an input received from one or more sensors of the locomotion-enabled component of the mobile robot platform. This enables the mapping of a real-world environment to be undertaken whilst exploring said environment, and in particular enables the capture and update of mapping information represented in the occupancy grid for areas which are initially obscured by objects within the environment. Furthermore, by determining contours within the real-world environment, the mapping can be undertaken efficiently, by focussing on areas within the environment which may initially be obscured from the field of view of the sensors associated with the mobile robot platform.
Further features and advantages of the invention will become apparent from the following description of examples of the invention, given by way of example only, which is made with reference to the accompanying drawings. Optional features of aspects of the present invention may be equally applied to other aspects of the present invention, where appropriate.
Brief Description of the Drawings
Figure 1 is a flowchart illustrating a method of mapping a real-world environment using a mobile robot platform, according to an example;
Figure 2 is a schematic diagram of an occupancy grid used for mapping a real-world environment using a mobile robot platform, according to an example;
Figure 3 is a schematic diagram of a distance map generated by the mobile robot platform according to an example;
Figure 4A is a schematic diagram of a refined distance map indicating a plurality of contours according to an example;
Figure 4B is a schematic diagram of a refined distance map of Figure 4A where the plurality of contours have associated waypoints; and
Figure 5 is a block diagram of a system for mapping a real-world environment according to an example.
Detailed Description
Details of methods and systems according to examples will become apparent from the following description with reference to the figures. In this description, for the purposes of explanation, numerous specific details of certain examples are set forth. Reference in the specification to ‘an example’ or similar language means that a feature, structure, or characteristic described in connection with the example is included in at least that one example but not necessarily in other examples. It should be further noted that certain examples are illustrated schematically with certain features omitted and/or necessarily simplified for the ease of explanation and understanding of the concepts underlying the examples.
Accurately mapping a real-world environment using a robot can be a time-consuming and processor-intensive operation, which is dependent on a number of factors, including the complexity of the real-world environment, the presence of objects within the real-world environment, and the capability of the hardware used to map the real-world environment. With advances in robotic systems, the mapping of such real-world environments can be optimised, and the accuracy of the generated maps increased. Figure 1 is a flowchart showing a method 100 in accordance with an example. The method 100 maps a real-world environment using a given mobile robot platform, such as the mobile robot platform described below with reference to Figure 5. The method 100 will also be described with reference to representations 200, 300, 400 of an exemplary real-world environment shown in Figures 2 - 4B. Whilst a given real-world environment is shown in Figures 2 - 4B, it will be appreciated that the method 100 may be applied to any given real-world environment, and different representations generated. Although the method 100 is described with reference to a contour exploration methodology, it will be appreciated that other methodologies, such as frontier exploration, may be used in other examples.
At step 110 of method 100, an indication to update at least a portion of an occupancy grid representing the real-world environment is received by a mobile robot platform 220. In some examples, the indication may be a message sent from a user device, indicating that the mobile robot platform 220 is to update the occupancy grid; in other examples, the indication may be a notification that the mobile robot platform 220 is to update the occupancy grid, such as at a predetermined time and/or according to a given schedule. The occupancy grid is a representation of a given real-world environment, such as a room in a dwelling or other building. An example of an occupancy grid 200 is shown in Figure 2. The occupancy grid 200 contains areas of known occupied space 210 representing the edges of objects, surfaces, and other obstructions within the real-world environment, such as walls. In some examples, the occupancy grid 200 may comprise other information, such as data related to the position of the mobile robot platform 220. Whilst four areas of known occupied space 210 are labelled in the occupancy grid 200 shown in Figure 2, in reality there may be any number of positions contained within the occupancy grid 200 representing, for example, the locations of walls, surfaces, objects and obstructions within the real-world environment. Taken together, these multiple positions in the occupancy grid 200 signify the edges of objects, surfaces and/or obstructions within the real-world environment indicated by the distance map. These positions are then used to generate the contours, which are representative of paths in the real-world environment in which it is safe for the mobile robotic platform 220 to move without impinging on the positions. Other information may also be associated with the mobile robot platform 220 and stored in storage. Examples of such information include the positioning of servos, motors, hydraulics and/or pneumatic components of the mobile robot platform 220.
This information may be used to position one or more armatures or other moveable components.
Given the nature of the real-world environment, and the fact that the mobile robot platform 220 is in a given location, areas within the real-world environment may not be visible by one or more sensors associated with the mobile robot platform, that are used to capture data representing the real-world environment. For example, in some areas, the view of one or more sensors may be obscured by one or more objects, such as object 260. Examples of such sensors will be described below with reference to Figure 5. Based on the data captured, areas within the occupancy grid 200 may be categorised as one of:
• known empty space 230, representing areas within the real-world environment where it is known that there are no objects, obstructions and/or surfaces;
• known occupied space 210 as described above representing the edges of objects, obstructions and/or surfaces within the real-world environment; and
  • unknown space 240 representing areas within the real-world environment where the one or more sensors are unable to obtain data, for example based on a current field-of-view of the sensor at the location of the mobile robot platform 220.
The indication to update at least a portion of the occupancy grid may be received by the mobile robot platform via a wired or wireless network connection, including but not limited to 802.11, Ethernet, Bluetooth®, or a near-field communication ('NFC') protocol. The portion of the occupancy grid may represent a predefined area within the real-world environment, such as a room in a dwelling, may represent a group of predefined areas, such as a whole apartment within an apartment block, or may represent a given area within another area, such as part of a larger room, as indicated by area 250 in the occupancy grid 200 of Figure 2. In some examples, a user of the mobile robotic platform 220 can use a network-connected computing device, such as a mobile telephone, laptop computer, desktop computer, or wearable device to indicate the portions of the occupancy grid to update. Alternatively, in other examples, indications may be received periodically, or at predetermined times. In other examples, the indications may be received from one or more sensors in the real-world environment which detect changes, and/or may be received from a remote device such as a user's smartphone. These indications may be used to notify the mobile robotic platform that a portion of the occupancy grid is to be updated, thereby ensuring an accurate representation of the real-world environment is maintained.
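The three-way categorisation above can be illustrated with a small sketch of how a single range reading updates grid cells: cells the sensor ray passes through become known empty space, the cell at the measured range becomes known occupied space, and cells beyond the hit remain unknown. The axis-aligned ray, the dictionary grid, and all names are simplifications assumed for illustration only.

```python
# Category labels for occupancy grid cells (illustrative encoding).
UNKNOWN, EMPTY, OCCUPIED = "unknown", "empty", "occupied"

def categorise_along_ray(grid, start, hit_distance):
    """Categorise cells along a sensor ray fired in the +x direction
    from `start`: free space up to the obstacle, then the obstacle
    edge. Cells beyond the hit are left untouched (still unknown)."""
    x0, y = start
    for step in range(hit_distance):
        grid[(x0 + step, y)] = EMPTY          # free space before the obstacle
    grid[(x0 + hit_distance, y)] = OCCUPIED   # edge of the obstacle
    return grid

# A 1-D strip of eight unknown cells; the sensor reports a hit 4 cells away.
grid = {(x, 0): UNKNOWN for x in range(8)}
categorise_along_ray(grid, (0, 0), 4)
```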
Returning again to Figure 1, following the receipt of the indication to update a portion of the occupancy grid, at step 120 a distance map is generated. Carrying on from the exemplary occupancy grid 200 of Figure 2, a distance map 300 for this given real-world environment is shown in Figure 3. The distance map 300 represents the portion of the occupancy grid 200 for which the indication was received, in this example the entirety of the real-world environment, and represents the maximum distance in the real-world environment where the mobile robot platform 220 can move. By moving within this maximum distance, the mobile robot platform 220 is able to clearly capture data regarding the object(s), surface(s), and/or obstruction(s) represented in the occupancy grid 200. For example, the mobile robot platform 220 is able to position itself such that it is close enough to the object, surface and/or obstruction for the one or more sensors to be able to capture in-focus data regarding said object, surface and/or obstruction. Positions, such as positions 310, 320, 330, are represented on the distance map 300 as areas surrounding the known occupied space 210 in the occupancy grid 200.
The positions 310, 320, 330 may be based on characteristics of the mobile robot platform 220 itself, for example, a clearance required for any locomotion components such as a wheel assembly, or in some examples, may be based on visibility characteristics of one or more sensors associated with the mobile robot platform 220. For example, a given sensor associated with the mobile robot platform 220 may have a set focal length. As such, the distance map 300 may represent areas within the occupancy grid 200 where the mobile robot platform 220 may move to a position within the real-world environment. This enables the mobile robot platform 220 to accurately capture in-focus data of whatever object, obstruction, and/or surface is at a given location. Where the position is based on the visibility characteristics of more than one sensor, multiple positions 310, 320, 330 may be generated, such that each sensor is able to obtain accurate in-focus data regarding the known occupied space 210. In some examples, based on this distance, the distance map 300 may be refined, such that only positions within the real-world environment where the mobile robot platform 220 can capture accurate in-focus data are represented.
In some examples, the distance map 300 may be refined such that it represents areas of the occupancy grid that are known empty space 230 or unknown space 240. In yet further examples, the distance map may be refined such that it only includes areas which are known empty space 230. This may involve removing, from the distance map 300, areas categorised as unknown space 240 and/or known occupied space 210. This ensures that the distance map 300 includes areas which the mobile robot platform 220 is able to traverse. However, it will be appreciated that this need not be undertaken, and that other means of ensuring the mobile robot platform 220 does not enter unknown space 240 when mapping the real-world environment may be used. Examples of this include the use of a location positioning system in combination with geofencing, which may be provided by a simultaneous localization and mapping method performed by a simultaneous localization and mapping module, as described below. In yet further examples, the movement of the mobile robot platform 220 itself may be used to update the occupancy grid 200 as it traverses the real-world environment.
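The distance map generation of step 120 and the refinement just described can be sketched as follows. This is a brute-force illustration, not the patented implementation: the numeric cell encoding, the Euclidean metric, and the use of NaN to mark removed areas are all assumptions, and a real system would likely use an efficient distance transform instead.

```python
import numpy as np

UNKNOWN, EMPTY, OCCUPIED = 0, 1, 2  # cell encoding assumed for this sketch

def generate_distance_map(grid):
    """Distance of every cell to the nearest known occupied cell,
    computed by brute force (adequate only for tiny grids)."""
    occupied = np.argwhere(grid == OCCUPIED)
    dist = np.full(grid.shape, np.inf)
    for cell in np.ndindex(grid.shape):
        if occupied.size:
            dist[cell] = np.sqrt(((occupied - cell) ** 2).sum(axis=1)).min()
    return dist

def refine_distance_map(grid, dist, appropriate_distance):
    """Refinement as described above: keep only known empty cells within
    the sensor's appropriate distance; unknown and occupied areas are
    removed (marked NaN here)."""
    keep = (grid == EMPTY) & (dist <= appropriate_distance)
    return np.where(keep, dist, np.nan)
```

On a grid with an occupied wall down one column, the refined map retains only the empty cells within the chosen distance of that wall; unknown cells are dropped even when they are close to it.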
Following the generation of the distance map 300, at step 130 one or more contours are generated. The contours represent appropriate paths in the known empty space 230 that a mobile robot platform 220 may follow, so as to avoid the objects, obstructions and/or surfaces 210 in the real-world environment. As mentioned above, the distance map 300 comprises positions 310, 320, 330 which represent a distance from known occupied space 210 identified within the occupancy grid. The positions 310, 320, 330 may be based on a number of characteristics, and represent a location of the mobile robot platform 220 in the real-world environment. This ensures that accurate data can be captured relating to objects, surfaces and/or obstructions in the real-world environment. As such, some of the positions 310, 320, 330 may fall within unknown space 240. It would, therefore, be undesirable to position the mobile robot platform 220 at such a position since it is unknown whether there is a surface, object and/or obstacle at that position. As such, one or more contours, such as contours 410, 420, 430, are generated based on the positions 310, 320, 330 as shown in Figure 4A. The contours 410, 420, 430 represent portions of the positions 310, 320, 330 that fall within known empty space 230. Accordingly, it is possible to move the mobile robot platform 220 to that location within the real-world environment without interacting with any of the objects, surfaces and/or obstructions.
Associated with each contour 410, 420, 430 is a start waypoint 410A, 420A, 430A. In some examples, the start waypoint represents a starting waypoint of the contour 410, 420, 430 closest to the geographical location of the mobile robot platform 220 in the real-world environment. However, it will be appreciated that other methods for selecting a start waypoint 410A, 420A, 430A may be used. For example, the start waypoint 410A, 420A, 430A may be selected in accordance with other characteristics, such as determining whether the selected waypoint is on a blacklist of waypoints, such as the blacklist of waypoints described below, or based on the length of the path the mobile robot platform 220 must traverse in order to reach the start waypoint 410A, 420A, 430A. It will be appreciated that the methodology for selecting a start waypoint may be based on a combination of methods, such as the above-mentioned closest methodology and the blacklist methodology. Following the selection of a start waypoint 410A, 420A, 430A, the contour 410, 420, 430 may be separated into a plurality of waypoints between the start waypoint 410A, 420A, 430A and an ending waypoint 410Z, 420Z, 430Z representing the furthest point along the contour 410, 420, 430. This, therefore, represents a continuous route from the start waypoint 410A, 420A, 430A to the respective end waypoint 410Z, 420Z, 430Z. An example of the contours 410, 420, 430 being separated into a plurality of waypoints each with a start waypoint 410A, 420A, 430A and an end waypoint 410Z, 420Z, 430Z is shown in Figure 4B. It will be appreciated that there are a number of methodologies for separating the contour into a plurality of waypoints. One such example may be to divide the contour 410, 420, 430 evenly, such that each waypoint is equidistant from another.
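The even-division example above can be sketched as follows, under the assumption that the contour is an ordered list of adjacent cells running from its start waypoint to its end waypoint; the function name and the keep-every-nth strategy are illustrative only.

```python
def space_waypoints(contour, spacing):
    """Separate a contour (an ordered list of cells, from its start to
    its end) into waypoints at roughly equal spacing, always keeping
    the end waypoint so the route covers the whole contour."""
    waypoints = contour[::spacing]
    if waypoints[-1] != contour[-1]:
        waypoints.append(contour[-1])  # ensure the end waypoint is kept
    return waypoints
```

For a 10-cell contour and a spacing of 3 this yields four waypoints: the start waypoint, two intermediate waypoints, and the end waypoint.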
This waypoint information may then be stored in association with the occupancy grid to enable easy and quick subsequent access during operations requiring the mobile robot platform 220 to transit the waypoints.
Once the contours 410, 420, 430 have been generated and the waypoints determined, at step 140 the waypoint information is used to navigate the mobile robot platform 220 to a waypoint in one of the contours 410, 420, 430. In some examples, this may be the start waypoint 410A, 420A, 430A associated with the contour 410, 420, 430. Navigating the mobile robot platform 220 to the waypoint may include initiating a locomotion-enabled component associated with the mobile robot platform 220. The locomotion-enabled component enables the mobile robot platform 220 to physically move around the real-world environment in accordance with the occupancy grid 200 and the contours 410, 420, 430. This is achieved by navigating the mobile robot platform 220 to the start waypoint 410A, 420A, 430A of one of the contours 410, 420, 430, and then traversing the contour 410, 420, 430 by navigating to each subsequent waypoint of the contour 410, 420, 430 until the end waypoint 410Z, 420Z, 430Z is reached. This will be described in further detail below with reference to Figure 5.
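A minimal driver loop for the traversal described above might look like the following; `move_to` is a purely hypothetical stand-in for whatever command interface the locomotion-enabled component exposes, and `on_arrive` is an optional per-waypoint hook (for example, a sensor sweep).

```python
def traverse_contour(waypoints, move_to, on_arrive=None):
    """Navigate from the start waypoint to the end waypoint in order.
    `move_to(wp)` commands the locomotion-enabled component; `on_arrive`
    is an optional action performed at each waypoint reached."""
    last = None
    for wp in waypoints:
        move_to(wp)               # physically drive to the next waypoint
        if on_arrive is not None:
            on_arrive(wp)         # e.g. capture sensor data at the waypoint
        last = wp
    return last                   # the end waypoint, or None if empty
```

Keeping the per-waypoint action as a callback means the same loop serves both plain transit and the update-at-each-waypoint behaviour described in step 150.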
Referring again to Figure 1, at step 150, the occupancy grid 200 is updated whilst navigating along the contours 410, 420, 430, for example at a waypoint and/or between waypoints. This enables data to be captured using one or more sensors associated with the mobile robot platform 220, such that a more accurate representation of the real-world environment can be obtained. This is possible because areas of the real-world environment which were previously outside the field-of-view of the one or more sensors (that is, areas of unknown space 240) may come within the field-of-view (and can thus be classed as either known occupied space 210 or known empty space 230) as the mobile robot platform 220 traverses the contours 410, 420, 430. The updating process may occur as each waypoint is visited by the mobile robot platform 220, and, in other examples, may occur at selected waypoints of the contour 410, 420, 430. In yet further examples, the mobile robot platform 220 may traverse the entire contour 410, 420, 430 and perform the updating process when the end waypoint 410Z, 420Z, 430Z is reached, and/or may perform the updating process whilst navigating between the waypoints. In some examples, the process can be repeated, such that new contours are determined, enabling further exploration of the real-world environment and further increasing the accuracy of the map and occupancy grid 200.
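A common way to reclassify unknown cells from a single range reading, offered here only as an illustrative sketch, is to trace the sensor beam through the grid: cells before the echo become known empty space and the echo cell becomes known occupied space. The integer-cell Bresenham trace and the cell-state encoding are assumptions, not the claimed update procedure.

```python
# Illustrative cell states matching the three categories in the text.
UNKNOWN, EMPTY, OCCUPIED = -1, 0, 1

def update_along_ray(grid, origin, hit):
    """Reclassify cells along one sensor ray: every cell between the
    sensor origin and the returned range becomes known empty space,
    and the cell containing the echo becomes known occupied space."""
    (r0, c0), (r1, c1) = origin, hit
    dr = abs(r1 - r0)
    dc = -abs(c1 - c0)
    sr = 1 if r0 < r1 else -1
    sc = 1 if c0 < c1 else -1
    err = dr + dc
    r, c = r0, c0
    while (r, c) != (r1, c1):
        grid[r][c] = EMPTY        # free space along the beam
        e2 = 2 * err
        if e2 >= dc:
            err += dc
            r += sr
        if e2 <= dr:
            err += dr
            c += sc
    grid[r1][c1] = OCCUPIED       # the surface that produced the echo
```

Repeating this for every reading in a scan, at each waypoint visited, progressively shrinks the unknown space 240 as described above.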
As the mobile robot platform 220 traverses along the contour 410, 420, 430 and visits the waypoints, performing the update action described above, each waypoint visited may be added to a blacklist, and the updated blacklist is then persisted in storage. By recording the visited waypoints in this way, the system can track which waypoints have and have not been visited, and for which waypoints updated information has been obtained. This enables a start waypoint to be selected from the waypoints which are not contained within the blacklist, and for which updated information has not already been obtained. Furthermore, this enables the mapping process to be stopped and restarted as required whilst maintaining an understanding of the current progress of the mapping process.
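The persisted blacklist could be kept as a simple set of waypoints written to storage, sketched below with JSON as the (assumed) on-disk format; the tuple representation of a waypoint is likewise an illustrative choice.

```python
import json
from pathlib import Path

def mark_visited(blacklist, waypoint):
    """Record a waypoint once its update action has completed."""
    blacklist.add(tuple(waypoint))

def save_blacklist(blacklist, path):
    """Persist the blacklist so a stopped mapping run can resume later."""
    Path(path).write_text(json.dumps(sorted(blacklist)))

def load_blacklist(path):
    """Restore prior progress; a fresh mapping run starts empty."""
    p = Path(path)
    if not p.exists():
        return set()
    return {tuple(w) for w in json.loads(p.read_text())}
```

On restart, loading the blacklist and excluding its members from start-waypoint selection resumes the mapping process exactly where it stopped.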
During the exploration of the real-world environment by the mobile robot platform 220, and whilst updating the occupancy grid 200, the data obtained by the one or more sensors may be analysed to identify objects within the real-world environment. By analysing the data and identifying objects therein, the mobile robot platform can obtain further information about the real-world environment and thereby generate a more accurate mapping of the locations of objects, surfaces and/or other obstructions. In some examples, information associated with the identity and/or representation of the objects may be stored as part of, or separately from, the occupancy grid, enabling subsequent access and analysis to be undertaken. In addition, the identity and/or representation of the object may be associated with its geographical location in the real-world environment.
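Associating an object's identity with its geographical location, stored alongside the occupancy grid, could be as simple as the following sketch; the record structure and field names are illustrative assumptions.

```python
from dataclasses import dataclass, field

@dataclass(frozen=True)
class DetectedObject:
    """Identity and geographical location of an object recognised
    whilst updating the occupancy grid. Fields are illustrative."""
    label: str
    location: tuple           # e.g. an occupancy-grid cell (row, col)

@dataclass
class EnvironmentMap:
    """Occupancy grid plus the object records associated with it."""
    grid: list
    objects: list = field(default_factory=list)

    def record(self, label, location):
        """Store an identified object against its location."""
        self.objects.append(DetectedObject(label, tuple(location)))

    def objects_at(self, location):
        """Subsequent analysis: what was identified at this location?"""
        return [o for o in self.objects if o.location == tuple(location)]
```

Keeping the object records in a side list, keyed by grid cell, allows them to be stored either as part of the occupancy grid structure or entirely separately, as the text contemplates.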
As mentioned above, in some examples, the mobile robot platform 220 may need to adjust the position of one or more armatures or other moveable components associated with it in order to traverse a contour and/or to perform an action at a given location. In such examples, the position of the armature or other moveable component may also be tracked and stored alongside the identity of the object, the representation of the object and/or a given location in the occupancy grid 200. It will be appreciated that other characteristics of the mobile robot platform 220 may also be stored, not just the position of the armature and/or other moveable components.

Figure 5 shows a schematic representation of a system 500, such as the mobile robot platform 220 described above with reference to Figures 1 through 4B. The components 510, 520, 530, 540, 550 of the system 500 may be interconnected by a bus or, in some examples, may be separate such that data is transmitted to and from each component via a network.
The system 500 comprises at least one sensor 510A, 510Z for capturing information associated with the real-world environment. The one or more sensors 510A, 510Z may include a camera unit for capturing frames of image data representing the real-world environment. The camera unit may be a visual camera unit configured to capture data in the visible light frequencies. Alternatively, and/or additionally, the camera unit may be configured to capture image data in the infra-red frequencies. It will be appreciated that other types of camera unit may be used. In some examples, the camera unit may comprise multiple individual cameras each configured differently, such as with different lens configurations, and may be mounted in such a way as to be a 360-degree camera. In other examples, the camera unit may be arranged to rotate such that it scans the real-world environment, thereby increasing its field of view. Again, it will be appreciated that other configurations may be possible.
In addition to, or instead of, a camera unit, the at least one sensor 510A, 510Z may comprise a time of flight sensor unit or array distance sensor unit configured to measure the distance from the sensor unit to objects, surfaces and/or obstacles in the real-world environment. An example of such time of flight or array distance sensors is laser imaging, detection, and ranging (LIDAR). Other time of flight and/or array distance sensors may also be used. In addition to detecting objects within the environment, the one or more sensors 510A, 510Z may also include an inertial measuring unit for measuring the movement of the mobile robot platform 220 around the real-world environment. The one or more sensors 510A, 510Z provide the captured data to a processor 530 for processing. The processor 530 is arranged to use the captured data to update the occupancy grid 200 accordingly, such that the occupancy grid 200 represents the real-world environment within the field-of-view of the one or more sensors 510A, 510Z.
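As an illustration of how a single time of flight reading might be placed into the occupancy grid, the sketch below combines the platform pose (as might be estimated from the inertial measuring unit) with the beam's bearing and range; the pose convention `(x, y, heading)` and the cell size are assumptions introduced for the example.

```python
import math

def reading_to_cell(pose, bearing, distance, cell_size):
    """Convert one range reading into the grid cell containing the
    detected surface. `pose` is (x, y, heading) in world coordinates;
    `bearing` is the beam angle relative to the heading, in radians."""
    x, y, heading = pose
    hx = x + distance * math.cos(heading + bearing)   # world x of the echo
    hy = y + distance * math.sin(heading + bearing)   # world y of the echo
    return int(hx // cell_size), int(hy // cell_size)
```

Each cell returned this way would be marked as known occupied space, with the cells along the beam before it marked as known empty space.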
The system 500 also comprises storage 520 which may include any type of storage medium such as a solid-state drive (SSD) or other semiconductor-based RAM; a ROM, for example, a CD ROM or a semiconductor ROM; a magnetic recording medium, for example, a floppy disk or hard disk; optical memory devices in general; etc. The storage 520 is configured to store at least an occupancy grid representing the real-world environment, such as the occupancy grid 200 described above. In some examples, and as explained above, the storage 520 may be configured to store characteristics associated with the mobile robot platform, such as the position of armatures and other moveable components, in addition to the identities and characteristics of objects within the real-world environment.
The processor 530 is configured to perform at least the method 100 described above with reference to Figures 1 - 4B, and is configured to receive the occupancy grid data from storage 520 and data representing the real-world environment from the one or more sensors 510A, 510Z. The processor 530 comprises at least an updating module 532 for updating the occupancy grid based on the data obtained by the one or more sensors 510A, 510Z, as described above in relation to step 150 of method 100. Updating the occupancy grid comprises analysing the data obtained by the one or more sensors 510A, 510Z and categorising portions of the occupancy grid as known empty space, unknown space, and known occupied space.
A distance map generating module 534 associated with the processor 530 is configured to generate the distance map, such as distance map 300, based on the known occupied space represented in the occupancy grid in accordance with step 120 of method 100 described above. A contour determination module 536 is used to generate a plurality of contours based on the generated distance map, where each contour comprises a plurality of waypoints as described above in accordance with step 130 of method 100.
The processor 530 also comprises a motion planning component 538 for determining a route comprising the plurality of waypoints of a given contour. The motion planning component 538 analyses the contours generated by the contour determination module 536, and maps a route for a mobile robot platform, such as the mobile robot platform 220 described above, to take. In some examples, where the occupancy grid comprises other information, such as data relating to the positioning of armatures and/or other moveable components of the mobile robot platform 220, the motion planning component 538 may be arranged to indicate this positioning data in accordance with a given waypoint on the determined route.
Once the route has been determined by the motion planning component 538, it is output to a locomotion-enabled component 540 for navigating the waypoints of the route. The locomotion-enabled component 540 may be a wheel assembly, propeller assembly, or other controllable means for moving a mobile robot platform around the real-world environment. This enables the one or more sensors 510A, 510Z to capture data relating to areas of the real-world environment that were previously outside the field-of-view of the one or more sensors 510A, 510Z.
In some examples, the system 500 may comprise a simultaneous localization and mapping (SLAM) module 550 for locating the system 500 in the real-world environment. The SLAM module 550 may comprise several additional sensors and/or components such as a local positioning sensor. This may be used in combination with other sensors such as the inertial measuring sensor described above, and/or a satellite radio-navigation system. Examples of such satellite radio-navigation systems include the Global Positioning System, Galileo, or GLONASS. These sensors, either individually or together, are capable of tracking the location of the system 500 in the real-world environment as the locomotion-enabled component 540 moves the system around the real-world environment. It will be appreciated that the simultaneous localization and mapping module may comprise other components for performing these functions.
At least some aspects of the examples described herein with reference to Figures 1 - 5 comprise computer processes performed in processing systems or processors. However, in some examples, the disclosure also extends to computer programs, particularly computer programs on or in an apparatus, adapted for putting the disclosure into practice. The program may be in the form of non-transitory source code, object code, a code intermediate source and object code such as in partially compiled form, or in any other non-transitory form suitable for use in the implementation of processes according to the disclosure. The apparatus may be any entity or device capable of carrying the program. For example, the apparatus may comprise a storage medium, such as a solid-state drive (SSD) or other semiconductor-based RAM; a ROM, for example, a CD ROM or a semiconductor ROM; a magnetic recording medium, for example, a floppy disk or hard disk; optical memory devices in general; etc.
In the preceding description, for purposes of explanation, numerous specific details of certain examples are set forth. Reference in the specification to "an example" or similar language means that a particular feature, structure, or characteristic described in connection with the example is included in at least that one example, but not necessarily in other examples.
The above examples are to be understood as illustrative examples of the disclosure. Further examples of the disclosure are envisaged. It is to be understood that any feature described in relation to any one example may be used alone, or in combination with other features described, and may also be used in combination with one or more features of any other of the examples, or any combination of any other of the examples. Furthermore, equivalents and modifications not described above may also be employed without departing from the scope of the disclosure, which is defined in the accompanying claims.

Claims

1. A method of mapping a real-world environment using a mobile robot platform, the method comprising: receiving, by the mobile robot platform, at least one indication of an update to at least a portion of an occupancy grid, the occupancy grid comprising a representation of the real-world environment; generating a distance map representative of at least the portion of the occupancy grid; generating at least one contour based on the distance map and comprising a plurality of waypoints spaced along the contour, each representing a geographical location in the real-world environment; navigating a locomotion-enabled component of the mobile robot platform to at least one of the waypoints; and updating the portion of the occupancy grid based on an input received from one or more sensors that are mounted on the locomotion-enabled component of the mobile robot platform.
2. The method of claim 1, wherein the distance map comprises a plurality of areas, each categorised as at least one of: occupied space representative of an area in the real-world environment comprising at least a part of an object; known empty space representative of an area in the real-world environment comprising no objects; and unknown space representative of an area in the real-world environment which has not been mapped by the mobile robot platform.
3. The method of claim 1, comprising: selecting an appropriate distance, wherein the appropriate distance is representative of a visibility characteristic of the one or more sensors of the mobile robot platform; and refining the distance map based on the appropriate distance.
4. The method of claim 3, wherein the step of refining the distance map comprises removing, from the distance map, areas categorised as unknown space.
5. The method of claim 3 or claim 4, wherein the visibility characteristic of one or more sensors comprises determining a minimum distance such that at least one object within the real-world environment is in the field of view of the one or more sensors.
6. The method of any previous claim, comprising selecting a nearest waypoint of the plurality of waypoints, wherein the nearest waypoint is a waypoint geographically closest to a current location of the mobile robot platform in the real-world environment.
7. The method of claim 6, wherein the nearest waypoint is not in a blacklist map, wherein the blacklist map represents portions in the occupancy grid already visited by the mobile robot platform.
8. The method of claim 7, comprising adding a visited portion to the blacklist map, the visited portion being the portion of the occupancy grid that has been updated.
9. The method of any previous claim, wherein updating the portion of the occupancy grid comprises identifying at least one object in the field-of-view of the one or more sensors.
10. The method of claim 9, comprising storing a representation of the at least one object in storage associated with the mobile robot platform.
11. The method of claim 10, wherein the representation of the at least one object comprises an indication of a geographical location of the at least one object in the real-world environment.
12. The method of claim 10 or claim 11, comprising storing characteristics of the mobile robot platform and associating the characteristics with the representation of the at least one object.
13. The method of any previous claim, whereby the occupancy grid is updated at each, or at least some, of the plurality of waypoints.
14. A system for mapping a real-world environment, the system comprising: at least one sensor to capture information associated with the real-world environment; storage to store at least an occupancy grid representative of the real-world environment; at least one processor arranged to: update the occupancy grid representative of the real-world environment based on information captured by the at least one sensor; generate a distance map representative of at least a portion of the occupancy grid; determine at least one contour in the distance map comprising a plurality of waypoints spaced along the contour, each waypoint being representative of a geographic location in the real-world environment; and determine a route comprising a plurality of the waypoints; and a locomotion-enabled component to navigate to at least one of the waypoints of the route determined by the at least one processor.
15. The system according to claim 14, comprising a simultaneous localization and mapping module for determining a location of at least the locomotion-enabled component within the real-world environment.
16. The system according to claim 14 or claim 15, wherein the at least one sensor for capturing information associated with the real-world environment comprises at least one of: a camera unit; a time of flight sensor unit; an array distance sensor unit; and an inertial measuring unit.
17. A non-transitory computer-readable storage medium comprising a set of computer-readable instructions stored thereon which, when executed by at least one processor, are arranged to control a mobile robot platform to map a real-world environment, wherein the instructions, when executed, cause the processor to: receive, by the mobile robot platform, at least one indication of an update to at least a portion of an occupancy grid, the occupancy grid comprising a representation of the real-world environment; generate a distance map representative of at least the portion of the occupancy grid; generate at least one contour based on the distance map, and comprising a plurality of waypoints spaced along the contour, each representing a geographical location in the real-world environment; navigate a locomotion-enabled component of the mobile robot platform to at least one of the waypoints; and update the portion of the occupancy grid based on an input received from one or more sensors of the locomotion-enabled component of the mobile robot platform.
PCT/IB2023/057997 2022-08-10 2023-08-08 A method and system for mapping a real-world environment WO2024033801A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
GB2211684.2 2022-08-10
GB2211684.2A GB2621564A (en) 2022-08-10 2022-08-10 A method and system for mapping a real-world environment

Publications (1)

Publication Number Publication Date
WO2024033801A1 true WO2024033801A1 (en) 2024-02-15

Family

ID=84546196

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2023/057997 WO2024033801A1 (en) 2022-08-10 2023-08-08 A method and system for mapping a real-world environment

Country Status (2)

Country Link
GB (1) GB2621564A (en)
WO (1) WO2024033801A1 (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10571925B1 (en) * 2016-08-29 2020-02-25 Trifo, Inc. Autonomous platform guidance systems with auxiliary sensors and task planning
SG11202009494YA (en) * 2018-03-28 2020-10-29 Agency Science Tech & Res Method and system for returning a displaced autonomous mobile robot to its navigational path
US20220024034A1 (en) * 2020-07-24 2022-01-27 Samsung Electronics Co., Ltd. Method of predicting occupancy of unseen areas for path planning, associated device, and network training method

Family Cites Families (2)

Publication number Priority date Publication date Assignee Title
US11314262B2 (en) * 2016-08-29 2022-04-26 Trifo, Inc. Autonomous platform guidance systems with task planning and obstacle avoidance
CN114859932A (en) * 2022-05-20 2022-08-05 郑州大学产业技术研究院有限公司 Exploration method and device based on reinforcement learning and intelligent equipment


Also Published As

Publication number Publication date
GB2621564A (en) 2024-02-21
GB202211684D0 (en) 2022-09-21
