GB2584839A - Mapping of an environment - Google Patents

Mapping of an environment

Info

Publication number
GB2584839A
Authority
GB
United Kingdom
Prior art keywords
room
environment
doorway
endpoint
map
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
GB1908432.6A
Other versions
GB2584839B (en)
GB201908432D0 (en)
Inventor
Yuan Haifeng
Chuan Lim Han
Chang To
Hai Huan Zheng Daniel
Collingwood Watson Andrew
Ruth Voisey Stephanie
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Dyson Technology Ltd
Original Assignee
Dyson Technology Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Dyson Technology Ltd filed Critical Dyson Technology Ltd
Priority to GB1908432.6A priority Critical patent/GB2584839B/en
Publication of GB201908432D0 publication Critical patent/GB201908432D0/en
Priority to CN202010528248.0A priority patent/CN112087573B/en
Publication of GB2584839A publication Critical patent/GB2584839A/en
Application granted granted Critical
Publication of GB2584839B publication Critical patent/GB2584839B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/70Determining position or orientation of objects or cameras
    • G06T7/73Determining position or orientation of objects or cameras using feature-based methods
    • G06T7/74Determining position or orientation of objects or cameras using feature-based methods involving reference images or patches
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/38Electronic maps specially adapted for navigation; Updating thereof
    • G01C21/3804Creation or updating of map data
    • G01C21/3807Creation or updating of map data characterised by the type of data
    • G01C21/383Indoor data
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/698Control of cameras or camera modules for achieving an enlarged field of view, e.g. panoramic image capture
    • AHUMAN NECESSITIES
    • A47FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
    • A47LDOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
    • A47L11/00Machines for cleaning floors, carpets, furniture, walls, or wall coverings
    • A47L11/40Parts or details of machines not provided for in groups A47L11/02 - A47L11/38, or not restricted to one of these groups, e.g. handles, arrangements of switches, skirts, buffers, levers
    • A47L11/4061Steering means; Means for avoiding obstacles; Details related to the place where the driver is accommodated
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C15/00Surveying instruments or accessories not provided for in groups G01C1/00 - G01C13/00
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/005Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 with correlation of navigation data from several sources, e.g. map or contour matching
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/10Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
    • G01C21/12Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
    • G01C21/16Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
    • G01C21/165Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/20Instruments for performing navigational calculations
    • G01C21/206Instruments for performing navigational calculations specially adapted for indoor navigation
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/38Electronic maps specially adapted for navigation; Updating thereof
    • G01C21/3804Creation or updating of map data
    • G01C21/3833Creation or updating of map data characterised by the source of data
    • G01C21/3837Data obtained from a single source
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/02Systems using the reflection of electromagnetic waves other than radio waves
    • G01S17/06Systems determining position data of a target
    • G01S17/46Indirect determination of position data
    • G01S17/48Active triangulation systems, i.e. using the transmission and reflection of electromagnetic waves other than radio waves
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88Lidar systems specially adapted for specific applications
    • G01S17/93Lidar systems specially adapted for specific applications for anti-collision purposes
    • G01S17/931Lidar systems specially adapted for specific applications for anti-collision purposes of land vehicles
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/0011Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots associated with a remote control arrangement
    • G05D1/0044Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots associated with a remote control arrangement by providing the operator with a computer generated representation of the environment of the vehicle, e.g. virtual reality, maps
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0231Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D1/0238Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using obstacle or wall sensors
    • G05D1/024Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using obstacle or wall sensors in combination with a laser
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0231Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D1/0246Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0268Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means
    • G05D1/0274Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means using mapping information stored in a memory device
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/11Region-based segmentation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/12Edge-based segmentation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/13Edge detection
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/136Segmentation; Edge detection involving thresholding
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/162Segmentation; Edge detection involving graph-based methods
    • AHUMAN NECESSITIES
    • A47FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
    • A47LDOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
    • A47L2201/00Robotic cleaning machines, i.e. with automatic control of the travelling movement or the cleaning operation
    • A47L2201/04Automatic control of the travelling movement; Automatic obstacle detection
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S15/00Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems
    • G01S15/88Sonar systems specially adapted for specific applications
    • G01S15/89Sonar systems specially adapted for specific applications for mapping or imaging
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S15/00Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems
    • G01S15/88Sonar systems specially adapted for specific applications
    • G01S15/93Sonar systems specially adapted for specific applications for anti-collision purposes
    • G01S15/931Sonar systems specially adapted for specific applications for anti-collision purposes of land vehicles

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Electromagnetism (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Multimedia (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • General Engineering & Computer Science (AREA)
  • Optics & Photonics (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

A method 100 for mapping an environment is disclosed. The method includes obtaining navigation data from an autonomous robotic device (200, Fig. 2A). This data is used to identify a first room and a second, adjacent, room (502, 506, Fig. 7) within the environment, by determining at least one endpoint associated with a doorway between the rooms. An internal wall is identified using the endpoint. A path corresponding to the internal wall may be found, possibly optimising the trajectory using a cost function. Feature data may be recovered, representing objects (504a, Fig. 7) occupying locations in the environment, whose image may be deleted via flood-filling. Simultaneous localisation and mapping (SLAM) techniques may be employed. The width and/or height of features may be measured. Distance metrics may be defined and distances between doorways or other locations can be measured. The navigation data may represent a map. Doorways or halls may be identified by recognising overlapping areas/regions between rooms (Fig. 10). Graphs may be employed to illustrate trajectories. A remote terminal (1806, Fig. 27A) may be employed by a user to interact with the robot. An associated system and computer program product are also disclosed.

Description

MAPPING OF AN ENVIRONMENT
Technical Field
The present invention relates to mapping an environment. The invention has particular, but not exclusive, relevance to mapping an environment using navigation data obtained from an autonomous robotic device.
Background
Low cost robotic devices, such as floor cleaning robots, generally rely on limited perception and simple algorithms to map, and in certain cases navigate, a three-dimensional space, such as the interior of a house or other building. For example, a robotic device may include an infra-red or ultrasonic sensor that detects objects within a line of sight that may then be avoided. While great progress has been made around techniques such as simultaneous localisation and mapping (SLAM), many of the solutions rely on the substantial computational resources that are available to research laboratories. This makes it difficult to translate these solutions to the embedded computing devices that control real-world commercial robotic devices. Additionally, certain solutions require a suite of specialised sensor devices such as LAser Detection And Ranging (LADAR) sensors, structured light sensors, or time-of-flight depth cameras. These specialised sensor devices add expense and complexity that makes them less suitable for real-world robotic applications.
The paper "A Solution to Room-by-Room Coverage for Autonomous Cleaning Robots" by Kleiner et al., published in the Proceedings of the IEEE International Conference on Intelligent Robots and Systems (IROS) in 2017 describes a method for segmenting occupancy grid maps into regions that represent rooms and corridors in the real world.
Given existing techniques, there is still a desire to accurately and efficiently map an environment, for example to improve navigation or cleaning of the environment by a robotic device.
Summary
According to a first aspect of the present invention, there is provided a method for mapping an environment. The method comprises obtaining navigation data from an autonomous robotic device. The method further comprises using the navigation data to identify a first room and a second room within the environment; wherein the first room is adjacent to the second room, comprising determining at least one endpoint associated with a doorway between the first room and the second room and identifying an internal wall between the first room and the second room using the at least one endpoint. With the method of the first aspect, the internal wall may be identified more efficiently. For example, the at least one endpoint may be used to constrain a position of the internal wall, allowing the internal wall to be identified more readily than other approaches. This may also improve the accuracy with which the internal wall is identified. In this way, the environment can be accurately mapped, which can improve subsequent interactions between the robotic device and the environment.
In certain examples, the navigation data is used to identify an external wall associated with a boundary of the environment. In such examples, identifying the internal wall comprises determining a path from an endpoint of the at least one endpoint to the external wall and identifying that the path corresponds to the internal wall.
In certain examples, the doorway is a first doorway, the internal wall is a first internal wall, and the method comprises determining a path from a first endpoint of the at least one endpoint to a second endpoint associated with a second doorway between the first room and a third room of the environment, and identifying that the path corresponds to a second internal wall.
In certain examples, identifying the internal wall comprises optimising a cost function to identify a path from an endpoint of the at least one endpoint to a target position within the environment and associating the path with the internal wall. In such examples, the cost function may comprise at least one of a first cost term to penalise an input path that extends outside of at least one predetermined direction or a second cost term to penalise a change in direction of an input path. In such examples, feature data representative of a position of a feature within the environment may be obtained, wherein the feature indicates occupancy of space within the environment. In these examples, the cost function may comprise a third cost term to reward an input path that coincides with the position of the feature. The feature data may be obtained from a simultaneous localisation and mapping system. The method may include determining that the feature is positioned at a height that satisfies a height condition.
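By way of illustration, the following Python sketch shows one way the three cost terms described above could be combined for a candidate wall path on an occupancy-style grid. It is a minimal example rather than the patented implementation: the grid representation, the weight values, the `path_cost` name and the assumption that the predetermined directions are horizontal and vertical are all choices made for this sketch.

```python
def path_cost(path, feature_mask, axis_weight=1.0, turn_weight=1.0, feature_weight=2.0):
    """Score a candidate wall path given as a list of (row, col) grid cells.

    Lower cost is better; feature_mask is a 2-D boolean grid marking cells
    that coincide with observed features. All weight values are illustrative.
    """
    cost = 0.0
    prev_step = None
    for (r0, c0), (r1, c1) in zip(path[:-1], path[1:]):
        step = (r1 - r0, c1 - c0)
        # First term: penalise steps outside the assumed predetermined
        # directions (here, purely horizontal or purely vertical moves).
        if step[0] != 0 and step[1] != 0:
            cost += axis_weight
        # Second term: penalise a change in direction along the path.
        if prev_step is not None and step != prev_step:
            cost += turn_weight
        prev_step = step
        # Third term: reward cells that coincide with the position of a
        # feature, e.g. a SLAM landmark recorded in feature_mask.
        if feature_mask[r1][c1]:
            cost -= feature_weight
    return cost
```

A wall path from a doorway endpoint to a target position could then be selected by minimising this cost over a set of candidate paths, for example with a shortest-path search whose edge weights encode the same terms.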
In certain examples, the doorway is a first doorway, the internal wall is a first internal wall, and the method comprises identifying a set of candidate doorways containing the first doorway, sorting the set of candidate doorways into a processing order, according to a doorway characteristic, and processing the set of candidate doorways in the processing order, to identify a set of candidate internal walls within the environment, the set of candidate internal walls containing the first internal wall. In such examples, the doorway characteristic may comprise at least one of: a distance between a doorway and a further doorway, a distance between a doorway and an external wall associated with a boundary of the environment, or a size of a room associated with a doorway.
In certain examples" determining the at least one endpoint comprises determining an area of overlap between a first area associated with the first room and a second area associated with the second room, and determining a location of the doorway based on the area of overlap.
In certain examples, the method comprises determining a plurality of endpoints associated with the doorway, the plurality of endpoints comprising the at least one endpoint, determining that each of the plurality of endpoints associated with the doorway connects to at least one of an external wall associated with a boundary of the environment or an endpoint associated with a further doorway within the environment, and, subsequently, adding the internal wall to a map of the environment.
In certain examples, the method comprises determining that a wall area associated with the internal wall overlaps at least a first area associated with the first room and a second area associated with the second room.
In certain examples, the method comprises identifying the first room and the second room before determining the at least one endpoint.
In certain examples, using the navigation data comprises using the navigation data to identify predicted wall regions associated with the environment, and using the predicted wall regions to identify the doorway between the first room and the second room.
In certain examples, using the navigation data comprises using the navigation data to determine the at least one endpoint, and using the at least one endpoint to identify the first room and the second room. In such examples, the at least one endpoint may comprise a first endpoint corresponding to an end of a first predicted wall region associated with the environment and a second endpoint corresponding to an end of a second predicted wall region associated with the environment. In such examples, using the at least one endpoint to identify the first room and the second room may comprise using the first endpoint and the second endpoint to determine the location of the doorway and identifying the first room and the second room based on the location of the doorway.
In certain examples, the navigation data represents a navigation map and identifying the first room and the second room comprises identifying a location of the doorway using the at least one endpoint, and applying a clustering process, based on the location of the doorway, to the navigation map.
In certain examples, the method comprises determining a distance metric between a first location on a first side of the doorway and a second location on a second side of the doorway, opposite to the first side. In such examples, a value of the distance metric may be larger than a distance between the first location and the second location. Such examples may include using the navigation data to generate a graph representation of the environment, wherein the first location corresponds to a first node of the graph representation and the second location corresponds to a second node of the graph representation, weighting an edge between the first node and the second node using the distance metric, and dividing the graph representation into at least a first portion corresponding to the first room and a second portion corresponding to the second room.
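As a rough illustration of the graph-based division described above, the Python sketch below builds a 4-connected grid graph over navigable cells, gives edges that touch the doorway an inflated distance metric, and then separates the graph into room portions by cutting those heavy edges. It assumes the networkx library and a simple cell-set representation; the penalty value, the `split_rooms` name and the cut-based partition are illustrative choices rather than details taken from the patent.

```python
import networkx as nx

def split_rooms(free_cells, doorway_cells, doorway_penalty=50.0):
    """Divide navigable grid cells into room portions.

    free_cells and doorway_cells are sets of (row, col) tuples; the penalty
    value is an arbitrary choice for this sketch.
    """
    g = nx.Graph()
    for (r, c) in free_cells:
        for dr, dc in ((0, 1), (1, 0)):            # 4-connected neighbours
            nb = (r + dr, c + dc)
            if nb in free_cells:
                weight = 1.0                        # ordinary adjacent-cell distance
                # Edges touching the doorway are given a distance metric much
                # larger than the geometric distance between the two cells.
                if (r, c) in doorway_cells or nb in doorway_cells:
                    weight = doorway_penalty
                g.add_edge((r, c), nb, weight=weight)
    # Cutting the heavily weighted edges separates the graph into portions
    # corresponding to the first room and the second room.
    g.remove_edges_from([(u, v) for u, v, d in g.edges(data=True)
                         if d["weight"] >= doorway_penalty])
    rooms = [set(comp) for comp in nx.connected_components(g)]
    # Cells lying exactly on the doorway line end up in their own small
    # components; discard those so only the room portions remain.
    return [room for room in rooms if not room <= doorway_cells]
```

Other partitioning schemes (for example clustering on the weighted graph) could equally be used; the essential point is that the inflated weight makes locations on opposite sides of the doorway appear further apart than their geometric distance.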
In certain examples" the method comprises using the first room and the second room identified using the at least one endpoint to identify the internal wall. In such examples, using the first room and the second room identified using the at least one endpoint to identify the internal wall may comprise generating a first rectilinear representation of the first room, generating a second rectilinear representation of the second room, and identifying the internal wall using the first rectilinear representation, the second rectilinear representation and the doorway.
In certain examples, the method comprises determining that a width of the doorway satisfies a width condition before identifying the internal wall.
In certain examples, using the navigation data to identify a first room and a second room within the environment comprises identifying a plurality of rooms within a first area associated with the first room, determining that at least one of the plurality of rooms fails to satisfy an area condition, and merging the plurality of rooms to define the first room before determining the at least one endpoint.
In certain examples, at least one of the first room or the second room comprises a hallway or an entrance room.
In certain examples, the navigation data represents a navigation map of the environment, and the method comprises removing a representation of an object from the navigation map, wherein the object is disconnected from a boundary of the environment. In such examples, removing the representation of the object may use flood-filling.
According to a second aspect of the present invention, there is provided a system arranged to process navigation data from an autonomous robotic device to map an environment. The system comprises a room identification engine to identify a first room and a second room within the environment, wherein the first room is adjacent to the second room, a doorway identification engine to determine at least one endpoint associated with a doorway between the first room and the second room, and a wall identification engine to identify an internal wall between the first room and the second room, using the at least one endpoint. As explained with reference to the first aspect, the system of the second aspect can improve the accuracy and/or efficiency with which the internal wall is identified, which may in turn improve mapping of the environment and subsequent interaction between the robotic device and the environment.
In certain examples, the system is arranged to generate map data representative of a map of the environment, and the system comprises an interface to receive user-defined environment data representative of a user-defined wall, doorway or object within the environment. In such examples, the system is arranged to update the map of the environment based on the user-defined environment data.
In certain examples, the navigation data represents a navigation map of the environment and the system comprises an orientation engine to process the navigation data to orient the navigation map of the environment such that an axis associated with the navigation map is aligned along a predetermined direction.
In certain examples, the navigation data represents a two-dimensional binary map. In such examples, pixels of the two-dimensional binary map may indicate at least one of an occupancy or an accessibility of a corresponding region of the environment. An endpoint of the at least one endpoint may correspond to at least one pixel in the two-dimensional binary map.
According to a third aspect of the present invention, there is provided a computing system comprising the system according to the second aspect, wherein the system is arranged to generate a map of the environment comprising the doorway and the internal wall, and a capture device to obtain image data representative of an observation of the environment by the autonomous robotic device.
In certain examples, the computing system comprises a simultaneous localisation and mapping system to provide feature data representative of observed features of the environment, wherein the system is arranged to use the feature data to identify the internal wall.
According to a fourth aspect of the present invention, there is provided a robotic device comprising the computing system according to the third aspect, one or more actuators to enable the robotic device to interact with the environment, and an interaction engine comprising at least one processor to control the one or more actuators, wherein the interaction engine is configured to use the map of the environment to interact with the environment.
According to a fifth aspect of the present invention, there is provided a user device comprising the system according to the second aspect, wherein the system is arranged to generate a map of the environment comprising the doorway and the internal wall, an image data interface arranged to receive image data representative of an observation of the environment from the autonomous robotic device, and a display device arranged to display the map of the environment.
According to a sixth aspect of the present invention, there is provided a non-transitory computer-readable storage medium comprising computer-executable instructions which, when executed by a processor, cause a computing device to perform any of the methods described above.
Further features will become apparent from the following description, which is made with reference to the accompanying drawings.
Brief Description of the Drawings
Figure 1 is a flow diagram showing a method for mapping an environment according to examples; Figures 2A and 2B are schematic diagrams showing two examples of robotic devices; Figures 3A and 3B are schematic diagrams showing motion of a robotic device according to examples; Figure 4 is a flow diagram showing a method for mapping an environment according to first examples; Figure 5 is a schematic diagram showing a navigation map according to examples; Figure 6 is a schematic diagram showing the navigation map of Figure 5 after reorientation according to examples; Figure 7 is a schematic diagram showing the navigation map of Figure 6 after identification of a boundary of an environment according to examples; Figure 8 is a schematic diagram showing the navigation map of Figure 7 after removal of a representation of an object from the navigation map; Figure 9 is a schematic diagram showing room identification according to examples; Figure 10 is a schematic diagram showing doorway detection according to examples; Figure 11 is a schematic diagram showing detected doorways within an environment according to examples; Figure 12 is a schematic diagram showing endpoints of a doorway of Figure 11; Figure 13 is a flow diagram showing a method for identifying an internal wall according to examples; Figure 14 is a flow diagram showing a method for identifying internal walls according to further examples; Figure 15 is a flow diagram showing a method for identifying internal walls according to yet further examples; Figure 16 is a schematic diagram showing detected internal walls according to examples; Figure 17 is a schematic diagram showing a feature within an environment according to examples; Figure 18 is a flow diagram showing a method for identifying internal walls according to still further examples; Figure 19 is a schematic diagram showing certain components of a system for use in mapping an environment according to examples; Figure 20 is a schematic diagram showing components of a system for use in mapping an environment according to further examples; Figure 21 is a schematic diagram showing the addition of a doorway within a representation of an environment according to examples; Figure 22 is a schematic diagram showing the internal walls generated in the representation of the environment after the addition of the doorway illustrated in Figure 21; Figure 23 is a flow diagram showing a method for mapping an environment according to second examples; Figure 24 is a schematic diagram showing room identification according to further examples; Figure 25 is a schematic diagram showing doorway identification according to further examples; Figure 26 is a schematic diagram showing a rectilinear map of an environment according to examples; Figures 27A to 27I are schematic diagrams showing various systems for use in mapping an environment according to examples; Figure 28A is a schematic diagram showing components of a computing system for mapping an environment according to still further examples; Figure 28B is a schematic diagram showing components of a robotic device according to examples; Figure 29 is a schematic diagram showing components of a user device according to examples; and Figure 30 is a schematic diagram showing a non-transitory computer readable medium according to examples.
Detailed Description
Certain examples described herein enable an environment to be mapped. A map of the environment generated using the examples herein can be used by a robotic device, such as an autonomous robotic device, for example to navigate or interact with the environment.
Figure 1 is a flow diagram showing a method 100 for mapping an environment according to examples.
At block 102 of Figure 1, navigation data is obtained from an autonomous robotic device. The navigation data for example indicates regions of space within the environment that have been observed or traversed by the robotic device. An autonomous robotic device is for example a robot that performs tasks without the need for human intervention, or with limited human intervention or instruction. For example, an autonomous robotic device may be pre-programmed or otherwise configured to perform a particular task or other routine. The robotic device may then carry out this task without requiring further input from a human or from another source. Such a robotic device may have learning capabilities and may therefore be able to adapt its performance of the task based on feedback or input received over time. For example, a robotic device may continue to obtain navigation data as it navigates around an environment. This navigation data may be used to update or generate a map of the environment, which may be used by the robotic device for subsequent navigation (which may be improved compared to its original navigation, e.g. by covering a greater proportion of the environment, or by more accurately avoiding collisions such as with internal walls within the environment). Examples of navigation data are discussed further below.
At block 104 of Figure 1, the navigation data is used to identify a first room and a second room within the environment, which are adjacent to each other. Identification of the first room and the second room includes determination of at least one endpoint associated with a doorway between the first room and the second room, and identification of an internal wall between the first room and the second room using the at least one endpoint.
Use of the at least one endpoint associated with the doorway allows the internal wall to be identified more efficiently. For example, this approach exploits the likely structure of an environment (in which an end of a wall tends to correspond to an endpoint associated with a doorway) to more efficiently identify internal walls than other approaches that are not constrained in this way.
By identifying internal walls within an environment, examples herein allow a more accurate map of the environment to be generated. For example, the navigation data may not represent the entirety of the environment. Certain parts of a room may not have been explored or observed by the robotic device, for example if they are inaccessible to the robotic device due to obstacles (such as objects) within the room. However, by identifying an internal wall of the room using at least one endpoint associated with a doorway of the room, the shape of the room may be more accurately determined, and may include regions of the room that are not represented by the navigation data. This may allow the robotic device to more effectively navigate the room in future. For example, if a corner of the room was previously inaccessible to the robotic device due to an object which is subsequently removed, this corner of the room may be absent from a navigation map represented by the navigation data. However, the corner of the room may be included in the map of the environment generated by the methods herein, and may therefore be navigable by the robotic device in future, using the map.
Example Robotic Devices
Figure 2A shows a first example 200 of a robotic device 205 that may be used to obtain navigation data, obtain a map of an environment or use the map of the environment to navigate the environment, as described herein. This robotic device is provided for ease of understanding the following examples and should not be seen as limiting; other robotic devices of different configurations may equally apply the operations described in the following passages. Although certain methods and systems are described in the context of an environment explored by a robotic device, the same methods and systems may alternatively be applied using data obtained from a handheld or other mobile device, e.g. a device with an inbuilt camera device that is moved by a human being or other robotic device.
The robotic device 205 of Figure 2A includes a camera device 210 to capture an image of an environment. The camera device 210 may comprise a wide-angle lens and/or comprise a monocular multi-directional camera to capture image data from a plurality of angular positions. In use, multiple images may be captured, one after each other. In certain cases, the plurality of angular positions cover a wide field of view. In one case, the camera device 210 may include an omni-directional camera, e.g. a device arranged to capture a field of view of substantially 360 degrees. In this case, the omnidirectional camera may include a device with a panoramic-annular-lens, e.g. the lens may be mounted in relation to a charge-coupled array. In the example of Figure 2A, the camera device 210 is mounted on a configurable arm above the robotic device; in other cases, the camera device 210 may be statically mounted within a body portion of the robotic device 205. In one case, the camera device may include a still image device configured to capture a sequence of images; in another case, the camera device may include a video device to capture video data comprising a sequence of images in the form of video frames.
The robotic device 205 of Figure 2A further includes at least one movement actuator 215 that in this case includes a set of driven wheels arranged in relation to the body portion of the robotic device 205. The at least one movement actuator 215, which may include at least one electric motor coupled to one or more wheels, tracks and/or rollers, is arranged to move the robotic device within an environment. An example of such an environment is described later with reference to Figures 3A and 3B. The robotic device 205 also includes a controller 220. This may include an embedded computing device as indicated by the dashed lines in Figure 2A. For example, the controller 220 may be implemented using at least one processor and memory and/or one or more system-on-chip controllers. In certain cases, the controller 220 may be implemented by way of machine-readable instructions, e.g. firmware as retrieved from a read-only or programmable memory such as an erasable programmable read-only memory (EPROM). The controller 220 controls movement of the robotic device 205 within the environment. For example, the controller 220 may instruct the at least one movement actuator to propel the robotic device 205 forwards or backwards, or to differentially drive the wheels of the robotic device 205 so as to turn or rotate the device. In Figure 2A, the robotic device 205 also has a rotatable free-wheel 225 that allows rotation of the robotic device 205. In operation, the controller 220 may be configured to map an environment, which is for example an enclosed environment. For example, the controller 220 may include a memory or other machine-readable medium where data representing the map of the environment is stored. The controller 220 may additionally or alternatively be configured to obtain navigation data as the robotic device 205 navigates the environment. This navigation data may be stored in the memory or machine-readable medium, and used to generate the map of the environment.
Figure 2B shows another example 250 of a robotic device 255. The robotic device 255 of Figure 2B includes a domestic cleaning robot. Like the robotic device 205 of Figure 2A, the cleaning robotic device 255 includes a camera device 260. The camera device 260 may again comprise a monocular multi-directional camera device or one or more Red Green Blue (RGB) camera devices. In the example of Figure 2B, the camera device 260 is mounted on the top of the cleaning robotic device 255. In one implementation, the cleaning robotic device 255 may have a height of around 10 to 15cm; however, other sizes are possible. The cleaning robotic device 255 also includes at least one movement actuator 265; in the present case this includes at least one electric motor arranged to drive two sets of tracks mounted on either side of the device to propel the device forwards and backwards. These tracks may further be differentially driven to steer the cleaning robotic device 255. In other examples, different drive and/or steering components and technologies may be provided. As in Figure 2A, the cleaning robotic device 255 includes a controller 270 and a rotatable free-wheel 275.
In addition to the components of the robotic device 205 shown in Figure 2A, the cleaning robotic device includes a cleaning element 280. This cleaning element 280 may include an element to clean a floor of a room. It may include rollers or brushes 285 and/or wet or dry elements. In one case, the cleaning element 280 may include a vacuum device, e.g. arranged to capture dirt and dust particles. In this case, the controller 270 may be configured to use the map of the environment, either directly or indirectly, to determine a cleaning pattern for the environment and instruct activation of the cleaning element 280 according to the cleaning pattern. For example, a vacuum device may be activated to clean an area of environment defined by the map. The robotic device may use the map of the environment to determine, amongst others, one or more of: required levels of cleaning fluid; required battery power to clean the environment (if it is determined that this power is not available, an alert may be provided); a cleaning device or system to use for a particular room (e.g. a kitchen may use a wet element that is suitable for a carpeted room of different dimensions); and a cleaning pattern for the environment (e.g. a proposed route to cover the area of the environment).
Example Motion for Robotic Device
Figures 3A and 3B schematically show motion 300, 350 of a robotic device 305 within an environment 310 according to two examples. The robotic device 305 may, in some examples, include a device as shown in Figure 2A or Figure 2B. In Figures 3A and 3B, the environment 310 includes a three-dimensional environment in the form of an interior room. This is merely illustrative, though. In general, in the methods herein, the robotic device navigates an environment including at least two rooms. A room is for example an at least partially enclosed physical space, which is surrounded by one or more surfaces. A typical room is surrounded by four surfaces (excluding the floor and ceiling), which each correspond to a wall. In other cases, though, a room may be surrounded by more or fewer walls. For example, a room may be hexagonal in a plan view, and therefore surrounded by six walls. In certain cases, a room may be enclosed or surrounded by surfaces on two sides, wherein the other sides are estimated by assuming a regular polygonal cross section such as a square or rectangle. For example, a room may include a hallway, or an entrance room. A hallway (sometimes referred to as a hall) is typically a long passage with doors on one or both sides leading to neighbouring rooms. An entrance room (sometimes referred to as an entryway, lobby or vestibule) is for example a small chamber, generally located at the entrance to a building, which leads into other rooms of the building, such as the hallway or other rooms. A room as described herein may not have a ceiling, and may therefore lack a surface above the robotic device. Similarly, if the robotic device is an aerial device, such as a (multi rotor) helicopter, a floor, e.g. a surface under the robotic device, may also not be required to apply the examples described herein.
The environment 310 in Figures 3A and 3B includes a number of physical objects 320 (labelled as objects 320-A, 320-B, 320-C and 320-D in Figure 3A) that are located within the environment. Not all enclosed environments need include physical objects such as the objects 320; however, many real-world environments will include such objects. The objects 320 may include one or more of, amongst others: furniture, building portions, equipment, raised floor portions, interior wall portions, people, electronic devices, animals, etc. Although the environment 310 in Figures 3A and 3B is shown from above as being planar with a lower surface, this need not be the case in all implementations; for example, an environment may be aerial or within an extra-terrestrial environment. The lower surface of the environment also need not be a level floor, e.g. it may comprise an inclined plane and/or multi-level series of planes.
In the example of Figure 3A, the robotic device 305 is adapted to move around a point 330 in the environment. For example, a controller 220 or 270 as shown in Figures 2A or 2B may be configured to instruct a movement 340 using at least one movement actuator, e.g. 215 or 265. In one example, during the movement 340, the robotic device 305 is configured to obtain a sequence of images at a plurality of different angular positions using an equipped camera device, e.g. 210 or 260 in Figures 2A or 2B. For example, the movement 340 may include a substantially circular motion within a portion of the environment. In certain cases, the movement 340 may include a complete loop, e.g. a rotation of 360 degrees around the point 330; in other cases, the movement may include a portion of a loop, e.g. a rotation of less than 360 degrees around the point 330. The movement 340 need not be circular; it may be a circumferential movement around at least a portion of a perimeter of any shape, e.g. any polygon including those with equal and unequal sides. In a relatively small-size room of around 4 or 5 metres square (e.g. an average domestic room), the movement 340 may be on the order of 0.5 metres across, e.g. it may comprise a roughly circular motion with a diameter of 0.5 metres. This may take between 10 and 20 seconds. In certain examples, for a small-size room, a sequence of images may include on the order of 100 or 200 frames. In other examples, different movements 340 may be instructed depending on the configuration of the robotic device 305 and/or an installed camera device, e.g. a camera device having a small-angle field of view may lead to a different motion from a camera device having a wide-angle field of view.
In general, in the example of Figure 3A, the robotic device 305 is controlled so as to perform at least one motion to enable the camera device to capture at least one sequence of images (e.g. video frames) of the environment that have disparity in a plurality of directions. For example, in an environment with an approximately horizontal floor, i.e. forming a plane of movement for the robotic device 305, the sequence of images may have disparity in a plurality of horizontal directions. Comparatively, in environments with an angled plane for movement, or in aerial or extra-terrestrial environments, the disparity may be in a plurality of directions that are parallel with the plane of movement. This movement 340 may be seen as a brief exploratory movement, e.g. analogous to a (sub-conscious) human or animal ability to glance around a room to orientate themselves within the room. The movement 340 allows a robotic device 305 to obtain navigation data representative of the shape of the environment. This then provides a robotic device 305 with an ability to map and as such subsequently "understand" the global environment within a room, and facilitates intelligent high-level planning and semantic understanding of environments.
In some cases, the movement 340 of Figure 3A may be performed for a plurality of rooms of a multi-room environment. For example, the robotic device 305 may move to a first room and perform the movement 340 to obtain navigation data for the first room. The robotic device 305 may then navigate to a second room within the environment and perform the same movement 340 (or a similar or different movement) to obtain navigation data for the second room.
Figure 3B shows another example motion 350 that may be used in larger environments, e.g. multi-segment interior environments. For example, the environment 355 in Figure 3B may include a room with at least one wall of 10-20 metres. In certain examples, as shown in Figure 3B, the environment may include a plurality of environment portions that are separated by visual barriers, e.g. partition 360 may include, amongst others, a partial or full wall, a desk unit or an item of furniture. In Figure 3B, the motion 350 includes a plurality of movements 370, 380 and 390, e.g. a plurality of movements as described with respect to Figure 3A. In Figure 3B, three movements are shown; however, this is not intended to be limiting. In this case, the movements may include a set of similar or dissimilar movements, e.g. selected from a set of circular or circumferential movements around a point or at least a portion of a perimeter of a shape. For larger rooms, the movements may be larger than those described for smaller rooms, e.g. a circular movement may be around 1 metre in diameter. The plurality of movements may be controlled such that visual occlusions, such as partition 360, are at least partially circumnavigated. For example, data obtained from the first movement 370 may be used to detect partition 360 and instruct, e.g. by way of a controller, a second movement 380 that takes place beyond the partition. The number of movements and/or the spacing between different movements may depend on the size of the environment and/or the location of objects within the environment. In a room with at least one wall of 10-20 metres the spacing may be of the order of 1-3 metres. In certain cases, additional movements may be performed until a predefined portion of environment has been mapped. In the example of Figure 3B, the robotic device 305 is configured to make several small circular scans in sequence, moving to a new viewpoint in-between, whereby additional parts of the environment are revealed since occluding obstacles are being rounded. The information obtained from all of these scans may be used to map an environment, as is described in more detail in the following sections.
As described with reference to Figure 3A, the motion 350 of Figure 3B may be performed by the robotic device for each room of a multi-room environment that the robotic device enters or otherwise observes. In some cases, the motion performed by the robotic device may differ for different rooms within the environment. For example, within a first room (which may be relatively small), the robotic device may perform the movement 340 of Figure 3A. However, the robotic device may perform the motion 350 of Figure 3B in a second room, which is larger than the first room.
In the examples of Figures 3A and 3B, the robotic device obtains images of the environment as the robotic device traverses the environment. These images can then be processed to obtain navigation data representative of the environment. In other cases, though, the robotic device need not obtain images of the environment to generate the navigation data. Instead, the navigation data may be generated based on motion of the robotic device within the environment. For example, the navigation data may represent a region of the environment traversed by the robotic device, or a region of the environment that the robotic device has interacted with (e.g. by cleaning, in cases where the robotic device is a cleaning robotic device). In certain examples, a robotic device may not comprise a camera device, and may process data obtained based on a movement path that is not constrained by obstacles, e.g. as measured using a collision and/or distance sensing device (such as an infra-red or ultrasonic sensor).
Example Methods for Mapping an Environment
Figure 4 is a flow diagram illustrating a method 400 for mapping an environment according to first examples herein.
At item 402 of Figure 4, navigation data is obtained from an autonomous robotic device. The navigation data typically provides a representation of an environment obtained during exploration of the environment by the robotic device. The navigation data for example represents a navigation map, which for example indicates spatial locations within the environment that have been observed or interacted with by the robotic device. For example, the navigation map may correspond with an occupancy map indicating regions of the environment that are occupied (e.g. by an object, such as furniture or a wall) or a cleaning map (representing a path within the environment that the robotic device has traversed, e.g. during cleaning). The navigation data may be generated in various different ways, as the skilled person will appreciate. For example, images captured by a camera device of the robotic device as the robotic device moves around the environment may be processed using SLAM techniques to obtain a visual representation of the environment. Alternatively or additionally, further information obtained by the robotic device (e.g. from sensors, such as collision sensors arranged to detect collisions between the robotic device and objects, or depth sensors arranged to obtain depth data) may be used to generate or refine a navigation map. In one case, the navigation data may be obtained by movements similar to those shown in Figures 3A and 3B.
The navigation data may represent a two-dimensional binary map, which may be stored as an image (e.g. as a bitmap). Pixels of the two-dimensional binary map for example indicate at least one of an occupancy or an accessibility of a corresponding region of the environment. Each pixel of the binary map may have one of two values. For example, a value of 1 (for example corresponding to a dark pixel intensity such as black) may indicate that the region of space corresponding to that pixel is occupied, and a value of 0 may indicate that that region of space is unoccupied (and for example corresponds to a light pixel intensity, such as white). In such cases, regions of the environment which are unexplored or unobserved by the robotic device may be considered to be occupied, and may therefore be indicated with a value of 1 in the binary map. This is merely an example, though. In other cases, the navigation data may represent a non-binary map, in which pixel values indicate a likelihood or probability that the corresponding region of the environment is navigable and/or occupied (which for example may take non-integer values between 0 and 1).
An example of a navigation map 500 is shown schematically in Figure 5. The navigation map 500 includes navigable areas 502, and non-navigable areas 504a, 504b (collectively referred to with the reference numeral 504), which are indicated in Figure 5 with a diagonal stripe filling. The navigable areas 502 may represent areas that are unoccupied by objects in the environment, e.g. that may be navigated by a robotic device. The non-navigable areas 504 may be areas that are deemed to be occupied by objects, e.g. that cannot be accessed by the robotic device. In certain cases, areas 502 may be areas that are observed and/or measured by the robotic device and areas 504 may be areas that are not observed and/or measured by the robotic device. In certain cases, a measurement may include whether there is a surface boundary within a portion of space, e.g. an object that has a volume. In Figure 5, the boundary of the outer non-navigable area 504b is arbitrary: in this case, it is assumed that any areas that are not identified by the robotic device as being navigable 502 are considered to be non-navigable. However, in other cases, a boundary of the environment may be determined, for example as discussed further below with reference to Figure 7.
To obtain a navigation map 500 such as that of Figure 5, initial processing may be applied to navigation data obtained by the robotic device, for example to remove image noise, image artefacts, blurring or other image defects. In one case, a low-pass filter may be applied to an initial navigation map (in the form of a two-dimensional image), to remove incorrectly identified occupied regions. Incorrect identifications may occur due to the particular shape of the robotic device, which may affect its navigation within the environment, or the cleaning behaviour of the robotic device (if it is a cleaning robot pre-programmed to perform a particular cleaning routine). For example, a Gaussian blur filter may be applied to the initial navigation map, and a new binary map may be obtained by assigning a value of 1 to pixel intensity values of greater than or equal to a threshold value (e.g. 0.5), which e.g. may be considered to correspond to occupied regions of the environment, and a value of 0 otherwise, which e.g. may be considered to correspond to unoccupied regions.
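As a concrete illustration of the low-pass filtering step described above, the following Python sketch applies a Gaussian blur to a binary navigation map and re-thresholds it. The use of scipy, the function name and the sigma value are assumptions made for this sketch; only the 0.5 threshold follows the example given in the text.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def clean_binary_map(nav_map, sigma=1.0, threshold=0.5):
    """Low-pass filter a binary navigation map and re-threshold it.

    nav_map: 2-D array with 1 for occupied cells and 0 for unoccupied cells.
    """
    blurred = gaussian_filter(nav_map.astype(float), sigma=sigma)
    # Pixels at or above the threshold are treated as occupied (value 1),
    # everything else as unoccupied (value 0).
    return (blurred >= threshold).astype(np.uint8)
```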
In other examples, image opening and closing may be used to reduce noise in the navigation map 500. Image opening for example involves the removal of small objects from an image (which for example correspond to bright pixels), by placing them in a background (which for example corresponds with dark pixels). This may involve altering a pixel value of these pixels from that which corresponds to a non-navigable value (e.g. 1) to a navigable value (e.g. 0). Image closing for example removes small holes of background in the foreground by converting these holes into foreground. This may involve altering a pixel value of these pixels from that which corresponds to a navigable value (e.g. 0) to a non-navigable value (e.g. 1).
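A minimal sketch of these morphological operations, assuming the same 0/1 occupancy convention as above and an illustrative 3x3 structuring element:

```python
import numpy as np
from scipy.ndimage import binary_opening, binary_closing

def clean_occupancy(nav_map: np.ndarray) -> np.ndarray:
    """Remove small occupied specks (opening) and fill small holes (closing)."""
    occupied = nav_map.astype(bool)
    structure = np.ones((3, 3), dtype=bool)
    opened = binary_opening(occupied, structure=structure)   # drop isolated occupied pixels
    closed = binary_closing(opened, structure=structure)     # fill small unoccupied holes
    return closed.astype(np.uint8)
```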
In the example of Figure 5, the navigation map 500 is angled with respect to the vertical. In other words, non-navigable areas 504b of the navigation map 500 (which may correspond to walls) are at a non-zero angle with respect to the vertical or horizontal. To aid processing of the navigation map 500 to identify internal walls within the environment, the navigation map 500 may be rotated to align non-navigable areas more closely with the horizontal and vertical directions.
In order to rotate the navigation map 500 in this way, the orientation of the navigation map 500 in which longitudinal axes of respective portions of a perimeter 506 of the navigable area 502 are, on average, closest to horizontal or vertical may be determined. For example, the navigation data representative of the navigation map 500 may be processed to orient the navigation map 500 such that an axis associated with the navigation map 500 (e.g. corresponding to a direction of a portion of the perimeter 506) is aligned along a predetermined direction (e.g. horizontal or vertical).
Figure 5 illustrates an example of axes 508 associated with the navigation map 500. In Figure 5, the axes 508 include a first axis 510 aligned along a portion of the perimeter 506 which is closest to the vertical (e.g. the leftmost portion of the perimeter 506 as shown in Figure 5). The axes 508 also include a second axis 512 aligned along a portion of the perimeter 506 which is closest to the horizontal (e.g. the bottom portion of the perimeter 506 as shown in Figure 5).
Various different methods may be used to orient the navigation map 500 of Figure 5 so that longitudinal axes of a navigable area 502 are more closely aligned with predetermined directions (in this case, vertical and horizontal), as the skilled person will appreciate.
For example, the navigation map 500 may be processed to estimate a local image gradient at each point (or a subset of points) within the navigation map 500. Before such processing, the navigation map 500 may be processed, for example using a low-pass filter, to further enhance the sharpness of edges in the navigation map 500 (which may correspond to walls). This can stabilise the rotation of the navigation map 500. Pixels with a gradient magnitude which meets a gradient threshold (for example with a magnitude of greater than 0.5) may be deemed to be close to boundaries of the image, for example corresponding to the perimeter 506 of the navigable area 502. A smoothing process may be applied to these pixels, such as a kernel smoothing estimate.
These pixels may then be orientated so as to align with a predetermined direction (in this case, horizontal or vertical).
In other examples, the navigation map 500 may be oriented using a Hough line transform. An edge detector may be used to detect edge pixels in the navigation map 500. For example, pixels of the non-navigable area 504 that are adjacent to one or more pixels of the navigable area 502 may be marked as edge pixels. Based on these edge pixels, the Hough line transform may then be used to find straight lines in the navigation map 500. Each of the straight lines has an inclination angle of between -90 and 90 degrees, for example. Based on the inclination angles of the straight lines, a dominant angle of the navigation map 500 may be determined, and taken as a longitudinal axis of the navigation map 500, such as one of the axes 508. The navigation map 500 may be rotated or otherwise re-oriented to more closely align the dominant angle with a predetermined direction, such as the horizontal or vertical.
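The following sketch shows one way such a Hough-based re-orientation might be implemented with scikit-image; the edge detector, the angle statistic (here a median) and the sign of the correction depend on the angle convention in use and are assumptions rather than prescribed choices.

```python
import numpy as np
from skimage.feature import canny
from skimage.transform import hough_line, hough_line_peaks, rotate

def align_navigation_map(nav_map: np.ndarray) -> np.ndarray:
    """Rotate a binary navigation map so its dominant wall direction lies along an image axis."""
    edges = canny(nav_map.astype(float))
    h, theta, d = hough_line(edges, theta=np.deg2rad(np.arange(-90, 90, 0.5)))
    _, angles, _ = hough_line_peaks(h, theta, d)
    dominant = np.rad2deg(np.median(angles))
    # Walls are symmetric under 90 degree turns, so reduce to the smallest correction.
    correction = ((dominant + 45.0) % 90.0) - 45.0
    # The sign of the rotation may need flipping depending on the angle convention used.
    return rotate(nav_map.astype(float), correction, order=0, preserve_range=True)
```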
Figure 6 illustrates the navigation map 500 of Figure 5 after rotation to align the longitudinal axes of the perimeter 506 of the navigable area 502 more closely with the horizontal and vertical directions.
Examples herein include the determination of a boundary of an environment represented by a navigation map. Various different methods may be used to identify such a boundary, which for example may be considered to correspond with external walls of the environment. For example, a bounding box surrounding the navigable area 502 or the navigable area 502 with a buffer of a few pixels either side of the navigable area 502 (to account for any inaccuracies during rotation or warping in the navigation map 500) may be obtained. The portion of the navigation map 500 within the bounding box may then be processed to identify the largest rectangles that will fit within the bounding box (but outside the navigable area 502). The boundary of the environment may then be taken as the outer perimeter of the rectangles. However, this is merely an example, and other methods for determining a boundary of an environment may be used in other cases. Furthermore, the boundary of the environment may be identified at other stages of the methods described herein. An example of a boundary is shown schematically in Figure 7, which illustrates the boundary 514 of the environment represented by the navigation map 500 of Figure 6.
The navigation map 500 of Figures 5 to 7 includes a non-navigable area 504a which is disconnected from the non-navigable area 504b surrounding the navigable area 502. This area 504a may be referred to as a disconnected non-navigable area 504a. A non-navigable area such as this may correspond to a region of the environment that is occupied by a moveable object, such as furniture, rather than by a wall, which is typically fixed or otherwise unchanging in position. In some cases, the method includes removing a representation of an object such as this, which is disconnected from a boundary 514 of the environment. This is shown in Figure 8, which shows the navigation map 500 of Figure 7 after removal of the representation of the object (corresponding to the disconnected non-navigable area 504a which is disconnected from the boundary 514 of the environment, and hence from the non-navigable area 504b surrounding the navigable area 502). In this way, internal objects, such as chairs, foot stools, laundry baskets or other furniture or objects (including people or animals) which typically change position within the environment, can be removed from the navigation map 500. The navigation map 500 therefore more accurately represents an underlying layout of the environment. In other cases, though, a representation of such an object may be retained in the navigation map 500. In such cases, the representation of the object may be tagged or otherwise identified as furniture.
In this case, objects that are connected to the boundary 514 of the environment (e.g. by being connected to the non-navigable area 504b surrounding the navigable area 502) are not removed from the navigation map 500. Such objects may correspond to large furniture such as beds or cupboards that are usually in contact with a wall of a room. The position of such items generally does not change on a regular basis. Hence, retaining a representation of these objects allows the navigation map 500 to accurately reflect a likely configuration of the environment.
Various different approaches may be used to remove representations of an object from the navigation map 500. In one case, flood-filling may be used. Flood-filling in this case may be used to fill the pixels of the navigation map 500 that are connected to a pixel of the disconnected non-navigable area 504a and that have the same pixel value (in this case, a value of 1, indicating that a region of the environment corresponding to these pixels is non-navigable), with a value of 0. By altering the pixel values for these pixels to navigable pixel values, the disconnected non-navigable area 504a is removed from the navigation map 500, as shown in Figure 8.
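An equivalent way of expressing this removal, sketched below under the assumption that non-navigable components touching the image border belong to the environment boundary, is to label the occupied connected components and keep only those that reach the border; everything else is flood-filled to the navigable value.

```python
import numpy as np
from scipy.ndimage import label

def remove_disconnected_objects(nav_map: np.ndarray) -> np.ndarray:
    """Set occupied components not connected to the map border to 0 (navigable)."""
    occupied = nav_map.astype(bool)
    labels, _ = label(occupied)
    border_labels = set(labels[0, :]) | set(labels[-1, :]) | set(labels[:, 0]) | set(labels[:, -1])
    border_labels.discard(0)  # 0 is the unoccupied background, not a component
    keep = np.isin(labels, list(border_labels))
    return keep.astype(np.uint8)
```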
Referring back to Figure 4, item 404 involves using the navigation data (which in this case is representative of the navigation map 500) to identify a first room and a second room within the environment, wherein the first room is adjacent to the second room. Various different methods may be used to identify the first and second rooms, including image erosion and watershed splitting.
Image erosion involves narrowing the navigable area 502, for example by increasing the size of the non-navigable area 504 surrounding the navigable area 502 by a predetermined amount, such as a number of pixels corresponding to a width of a standard doorway. By eroding the navigation map 500 in this way, the navigation map 500 is for example divided into a set of disconnected areas, each of which may be taken as a room. The areas between each of the rooms may be considered to correspond to doorways within the environment.
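A sketch of this erosion-based splitting, assuming an illustrative erosion amount of roughly half a doorway width expressed in pixels:

```python
import numpy as np
from scipy.ndimage import binary_erosion, label

def rooms_by_erosion(nav_map: np.ndarray, erosion_px: int = 5):
    """Erode the navigable area so rooms joined only through doorways become separate components."""
    free = nav_map == 0
    eroded = binary_erosion(free, iterations=erosion_px)
    room_labels, n_rooms = label(eroded)  # each connected component is a candidate room
    return room_labels, n_rooms
```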
Watershed splitting may additionally or alternatively be used to partition the environment into rooms. Watershed splitting, for example, treats the navigation map 500 like a topographic map, with a brightness of each pixel representing its height. As will be appreciated, pixels of the navigation map 500 of Figures 5 to 8 indicate whether a corresponding region of the environment is occupied or has been explored, rather than a height. Hence, where watershed splitting is used, a distance transform may be calculated as the "height" of a pixel in the navigation map 500. In one case, the distance transform represents, for each pixel, the distance between that pixel and the nearest pixel which is non-navigable or is occupied (e.g. with a pixel value of 1). However, this is merely an example, and other distance transforms may be used in other examples, as the skilled person will appreciate.
After generating a topographic map in this way, based on the distance transform for the pixels of the navigation map 500, lines that run along the tops of ridges in the topographic map may then be determined. For example, if one imagines filling up such a topographic map with water, at some point the water from one basin will overflow into a neighbouring basin. This point may be considered to correspond to a top of a ridge in the map (which may be referred to as a watershed or transition region between neighbouring basin regions). Each basin region may be considered to correspond to a room of the environment, and the transition region may be considered to correspond to a doorway between two neighbouring or adjacent rooms. In some cases, Gaussian low-pass filtering may be applied to the topographic map before determination of the transition regions, although this need not be the case.
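The following sketch illustrates one way the distance-transform-based watershed split might be realised with scikit-image; the smoothing sigma and peak separation are illustrative parameters, not values taken from the description.

```python
import numpy as np
from scipy import ndimage
from skimage.feature import peak_local_max
from skimage.segmentation import watershed

def split_into_rooms(nav_map: np.ndarray) -> np.ndarray:
    """Label navigable pixels by room using a distance-transform 'height' and watershed basins."""
    free = nav_map == 0
    height = ndimage.distance_transform_edt(free)         # distance to the nearest occupied pixel
    height = ndimage.gaussian_filter(height, sigma=2.0)   # optional low-pass filtering
    peaks = peak_local_max(height, min_distance=10, labels=free)  # one seed per basin (room)
    markers = np.zeros(nav_map.shape, dtype=int)
    markers[tuple(peaks.T)] = np.arange(1, len(peaks) + 1)
    return watershed(-height, markers, mask=free)          # basins meet along ridges (doorways)
```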
Figure 9 illustrates the navigation map 500 after identification of rooms (in this case five rooms: 516a, 516b, 516c, 516d, 516e, which may be referred to collectively with the reference numeral 516). Transition regions between the rooms 516 are indicated schematically in Figure 9 using dotted lines. Identification of the rooms for example involves outputting an indication of which pixels of the navigation map 500 belong to a given room 516. An indication of which rooms are connected to each other (and which are for example adjacent to or neighbouring each other) may also be output.
After identifying the rooms 516 within the environment, further processing may be applied to the navigation map 500 to further improve the accuracy of the navigation map 500 and to remove or merge regions that have been identified as being a "room" but are nevertheless unlikely to correspond to a physical room within the environment. For example, rooms which are not connected to any other rooms may be removed from the navigation map 500.
In some cases, rooms which are unrealistically small may be merged with other rooms. For example, if a plurality of rooms are identified, it may be determined that at least one of the plurality of rooms fails to satisfy an area condition. An area condition may correspond to a predetermined threshold area, such as an average smallest room area which is representative of the average area of the smallest room in houses or buildings of a certain type or within a certain location or country. For example, a room may fail to satisfy the area condition if the area of the room (as determined from the navigation map 500) is beneath the predetermined area threshold. In other cases, the area condition may be a relative threshold area, to remove rooms that are relatively small compared to another, adjacent, room. For example, if an area of a room is less than a predetermined proportion (e.g. 30%, 20% or 10%) of an area of a neighbouring room, the room may fail to satisfy the area condition. In yet further cases, whether the area condition is satisfied may be determined based on a radius of a room, rather than its total area. For example, the radius of the maximum possible inscribed circle within the room may be calculated, and taken as the radius of the room. If this radius is less than a predetermined threshold (which may be absolute or relative to a radius of a neighbouring room), the room may fail to satisfy the area condition. If at least one of the plurality of rooms fails to satisfy the area condition, the plurality of rooms may be merged.
This is shown in Figure 9, in which the room 516e fails to satisfy the area condition. In this case, the room 516e is merged with the room 516a to define a first room 516f shown in Figure 10 (discussed further below). In some cases, before merging at least two rooms, it is determined that each of the rooms individually fails to satisfy a respective area condition before the rooms are merged. Such an approach may be used to merge small areas, which together correspond to a corridor. Indeed, it is to be appreciated that rooms herein may include various room types, including corridors, hallways or entrance rooms.
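A minimal sketch of such an area condition is given below, with the absolute and relative thresholds as illustrative values; the inscribed-circle radius is obtained here as the maximum of the distance transform inside the room mask.

```python
import numpy as np
from scipy.ndimage import distance_transform_edt

def room_radius(room_mask: np.ndarray) -> float:
    """Radius of the largest inscribed circle: the greatest distance to the room boundary."""
    return float(distance_transform_edt(room_mask).max())

def fails_area_condition(room_mask: np.ndarray, neighbour_mask: np.ndarray,
                         min_area_px: int = 400, min_ratio: float = 0.2) -> bool:
    """True if a room is too small in absolute terms or relative to a neighbouring room."""
    area = int(room_mask.sum())
    return area < min_area_px or area < min_ratio * int(neighbour_mask.sum())
```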
Referring back to Figure 4, item 406 involves determining at least one endpoint associated with a doorway between a first room and a second room. In some cases, determining the at least one endpoint involves identifying the first room and the second room before determining the at least one endpoint associated with the doorway. An example of doorway identification is shown schematically in Figure 10.
In examples such as Figure 10, an area of overlap between a first area associated with the first room 516f and a second area associated with the second room 516b can be determined, and the location of the doorway can be determined based on the area of overlap. Determining such an area of overlap (which may be referred to as an intersection) can be performed in various ways. In one case, a boundary of each room is widened, and an area of overlap 518a between the widened boundaries associated with the rooms 516f, 516b is identified. This area of overlap 518a may correspond to a larger area of the environment than a typical doorway, so may therefore be reduced in size, for example using an image thinning operation to reduce this area of overlap 518a to a more appropriate size. Areas of overlap 518a, 518b, 518c (collectively referred to with the reference numeral 518) may be found between each set of neighbouring rooms in the environment, and may then be reduced in size to obtain doorways. The doorways 520a, 520b, 520c (collectively referred to with the reference numeral 520) found using such a method are shown schematically in Figure 11. As can be seen from Figure 11, a doorway is for example an entrance or other point of entry or egress to a room or building. A doorway may be an opening that may be closed by a door (although a door need not be located in a doorway, as some doorways may instead correspond to a space, passage or opening between rooms or between an external and internal environment). In some cases, endpoints associated with the doorways 520 are determined after identifying the doorways 520. An endpoint for example corresponds with an edge of a doorframe surrounding a doorway. For example, a typical doorway is rectangular in shape, and has two opposing vertical frame portions. An endpoint may be coincident with each doorframe, so that the doorway is associated with two endpoints on the left and right sides of the doorway. In other examples, an endpoint may correspond to an end of a wall of the environment, where the end of the wall defines the beginning of the doorway. In other words, the doorway may be a gap or opening in the wall, with the endpoints corresponding to the location of the gap. An endpoint of a doorway may be found in various ways. In one case, the endpoint of a doorway 520 may be determined by identifying an end of the doorway 520 (such as the pixel of the navigation map 500 that corresponds to the end of a line representing the doorway 520). For example, the standard MATLAB® functions bwmorph(..., 'thin') and bwmorph(..., 'endpoints') may be used to thin the areas of overlap 518 to identify the doorways 520 and to identify the endpoints of the doorways 520, respectively. Hence, in some cases, an endpoint may correspond to at least one pixel of the navigation map 500, which in this case is a two-dimensional binary map.
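For readers not using MATLAB, the thinning and endpoint steps can be sketched as follows with scikit-image and SciPy; the endpoint test (a skeleton pixel with exactly one skeleton neighbour) is a standard definition rather than something prescribed by the description.

```python
import numpy as np
from scipy.ndimage import convolve
from skimage.morphology import thin

def doorway_endpoints(overlap: np.ndarray):
    """Thin an area of overlap to a line (the doorway) and return the line's endpoints."""
    skeleton = thin(overlap.astype(bool))
    kernel = np.ones((3, 3), dtype=int)
    neighbours = convolve(skeleton.astype(int), kernel, mode='constant') - skeleton
    endpoints = skeleton & (neighbours == 1)   # skeleton pixels with a single neighbour
    return skeleton, np.argwhere(endpoints)    # (doorway line, endpoint coordinates)
```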
In some cases, the doorway 520 (and the endpoints associated with the doorway 520) may be rotated to a predetermined direction (such as a horizontal or vertical direction), for example if a direction of the doorway 520 is sufficiently close to the horizontal or vertical prior to rotation. This may simplify further processing of the navigation map 500 in future.
The doorways identified in this way may, however, not necessarily correspond to physical doorways in the environment, for example due to a particular environmental configuration, due to missing or erroneous navigation data or due to image defects such as noise in the navigation map 500. To identify and remove incorrectly identified doorways, the navigation map 500 may undergo further processing. For example, some cases may involve determining that a width of the doorway satisfies a width condition before retaining the doorway 520. The width of the doorway may be taken as the distance between two endpoints associated with the doorway. The width condition may be satisfied where the width of the doorway meets or is less than a predetermined width threshold (such as the width of a standard double doorway, e.g. 1.5 metres). Hence, the width condition may not be satisfied where there are wide gaps between neighbouring rooms as identified previously. This may be the case where the rooms are not actually separate rooms but are instead different regions of the same, larger, room, which may be of a non-rectangular shape. This can occur in open-plan buildings, with large rooms that may nevertheless include multiple separate zones, such as a combined kitchen-diner with a kitchen zone and a dining zone. Wide doorways such as this may be removed. In some cases, the rooms separated by wide doorways may also be merged with each other (although this need not be the case). In other cases, the width condition may be satisfied where the width of the doorway is less than a certain proportion of an area or a radius of a room to either side of the doorway. For example, if the width of the doorway is greater than 90% of the diameter of either room, the doorway may be considered an insufficiently clear narrowing and may therefore be discarded.
Some cases involve determining that the number of endpoints associated with a doorway satisfies an endpoint condition. A typical doorway has two endpoints (although in some cases a doorway may have more endpoints than this, for example if there is a three-way room join). Hence, in some cases, if a doorway is associated with more than two endpoints, some of the endpoints may be spurious. In such cases, upon identifying more than two endpoints associated with a doorway, a distance between each pair of endpoints is determined. If two of the endpoints are relatively close together (for example, with a distance less than a particular distance threshold, e.g. within 5 pixels of each other in the navigation map 500), one of the endpoints may be discarded as being spurious.
Figure 12 shows schematically two endpoints 522a, 522b of the doorway 520a of Figure 11 which is between the first room 516f and the second room 516b. For ease of visualisation, Figure 12 illustrates a portion of the navigation map 500 including the endpoints 522a, 522b; other portions of the navigation map 500 are omitted.
Referring back to Figure 4, item 408 of Figure 4 involves identifying an internal wall between the first room and the second room using the at least one endpoint associated with the doorway between the first room and the second room.
Identifying the internal wall may include determining a path from an endpoint of the at least one endpoint to a target position within the environment, and associating the path with the internal wall. Figure 13 is a flow diagram illustrating an example method 600 of identifying an internal wall in which various paths are determined to identify the internal wall. At item 602 of Figure 13, an external wall associated with a boundary of the environment is identified using the navigation data. The external wall for example corresponds with the non-navigable area 504 illustrated in Figure 11. For example, an inner perimeter of the external wall corresponds with the innermost pixels of the non-navigable area 504.
At item 604 of Figure 13, a path from an endpoint of the at least one endpoint (such as the endpoint 522a shown in Figure 12) to the external wall is determined. If such a path is determined, the method of Figure 13 involves, at item 606, identifying that the path corresponds to an internal wall. An external wall for example corresponds to a wall forming an exterior enclosure of a building, and may include a roof. In contrast, an internal wall corresponds to a wall which is within the interior of a building (notwithstanding that the building may be lacking a roof and/or floor). An internal wall need not be full height; it may instead extend partway from a lower boundary of a room (such as a floor) to an upper boundary of a room (such as a ceiling). An internal wall may therefore correspond to a dividing element between rooms or regions of the environment.
The path may be determined based on various environmental constraints, to find an appropriate path from the endpoint to the external wall. As an example, such constraints may constrain the path from travelling into the navigable area 502 (as this would be contrary to the navigation data obtained by the robotic device, indicating that this area is unoccupied by objects, such as walls). These constraints may constrain the path to travel in straight lines or with a limited number of curved sections, as this corresponds to the typical construction of internal walls. Such straight lines may be constrained to be preferentially either along the longitudinal axes 508 associated with the navigation map 500 (which may be horizontal or vertical after reorientation of the navigation map 500 as described with reference to Figures 5 and 6), or at a predetermined angle with respect to these axes 508 (such as at a 45 degree angle). It should be noted that references to "horizontal" and "vertical" herein may equally apply to any two orthogonal axes within two dimensions. The path may be constrained to be as short as possible, subject to other constraints. By determining the path from the endpoint (which for example corresponds to one pixel or a relatively small number of pixels within the navigation map 500) to the external wall (which typically corresponds to a much larger number of pixels than the endpoint), rather than vice versa, the number of paths to be explored or otherwise investigated may be reduced. This allows the path to be determined more efficiently.
In some cases, identification of the internal wall includes optimising a cost function to identify a path from an endpoint of the at least one endpoint to a target position within the environment. The target position for example corresponds with an external wall corresponding to a boundary of the environment. For example, the cost function may be used to constrain the path as described above. The cost function may include at least one of a first cost term to penalise an input path that extends outside of at least one predetermined direction (such as a path that extends outside of horizontal or vertical), or a second cost term to penalise a change in direction of an input path.
Various different paths across the navigation map 500, starting from an endpoint 522a of the doorway 520a, may be investigated. At each point along a path, there may be a predetermined number of permissible steps. For example, at each point, the path may move up one pixel, down one pixel, left one pixel, right one pixel or in the four diagonal directions, giving a total of eight permissible steps at each point. In addition, the permissible steps may be constrained to be within the occupied area 504. For each permissible step, the following cost may be assigned:
* Horizontal/vertical step, cost = 1
* Diagonal step, cost = 2
* Change direction to horizontal or vertical, cost = 5
* Change direction to diagonal, cost = 8
By combining the cost terms (such as adding the costs per step, for each step of a path), the total cost may be obtained for any path. The path that optimises the cost function (which for example corresponds to a sum or other combination of the cost terms) may be selected as the internal wall. For example, the internal wall may be taken as the path with the smallest total cost.
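One way to search for the lowest-cost path is a Dijkstra-style expansion over (pixel, direction) states restricted to the occupied area, sketched below. The treatment of the direction-change penalty as additive to the step cost, and the handling of the very first step, are assumptions; the step costs mirror the example values above.

```python
import heapq
import itertools

STEPS = [(-1, 0), (1, 0), (0, -1), (0, 1),       # horizontal/vertical steps
         (-1, -1), (-1, 1), (1, -1), (1, 1)]     # diagonal steps

def step_cost(step, prev_step):
    """Cost of one step, mirroring the example values given above."""
    diagonal = step[0] != 0 and step[1] != 0
    cost = 2 if diagonal else 1
    # The change-of-direction penalty is treated here as additive to the step
    # cost; the exact combination of the terms is an assumption.
    if prev_step is not None and step != prev_step:
        cost += 8 if diagonal else 5
    return cost

def cheapest_wall_path(occupied, start, targets):
    """Lowest-cost path from a doorway endpoint to any target pixel (e.g. the external wall)."""
    rows, cols = len(occupied), len(occupied[0])
    tie = itertools.count()                      # tie-breaker so the heap never compares states
    best = {}
    queue = [(0, next(tie), start, None, (start,))]
    while queue:
        cost, _, pos, prev_step, path = heapq.heappop(queue)
        if pos in targets:
            return list(path), cost
        if best.get((pos, prev_step), float("inf")) <= cost:
            continue
        best[(pos, prev_step)] = cost
        for step in STEPS:
            r, c = pos[0] + step[0], pos[1] + step[1]
            if 0 <= r < rows and 0 <= c < cols and occupied[r][c]:
                heapq.heappush(queue, (cost + step_cost(step, prev_step), next(tie),
                                       (r, c), step, path + ((r, c),)))
    return None, float("inf")                    # no path: the endpoint may be spurious
```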
In some cases, the non-navigable area 504 may be expanded in size (for example by a few pixels) to account for any uneven edges in the non-navigable area 504, e.g. due to incomplete or inaccurate navigation data. After obtaining the path corresponding to the internal wall, the path may be increased in size, for example by thickening a line corresponding to the path in the navigation map 500. This may improve navigation of the environment by a robotic device using a map including the internal walls, by reducing the likelihood of collisions that may otherwise occur if the position of the internal wall in the map differs slightly from an actual position of the internal wall in the environment.
In some cases, the cost function optimisation (or other investigation of possible paths from the endpoint to the external wall) may be ceased upon meeting a predetermined criterion, such as the total cost meeting or being beneath a cost threshold value. This may reduce the number of paths investigated, improving the efficiency of the method.
In some cases, it may not be possible to determine a path from the endpoint 522a to an external wall. This may be the case if the endpoint 522a is located within the navigable area 502. In such cases, the method 600 of Figure 13 involves trying to identify an internal wall between two doorways rather than from a doorway to an external wall. Item 608 of Figure 13 includes determining a path from the endpoint (which is, for example, a first endpoint of a first doorway) to a second endpoint associated with a second doorway between the first room and a third room of the environment. If such a path is determined, item 610 involves identifying that the path corresponds to a second internal wall (which extends from the first doorway to the second doorway), rather than a first internal wall between the first room and the second room.
The determination of the path in this case may be performed similarly to the determination of the path from the endpoint to the external wall, but investigating paths from the first endpoint to the second endpoint (or to an endpoint of other doorways within the environment) rather than from the first endpoint to the external wall.
If no path can be determined from the endpoint to the external wall or to an endpoint of a further doorway, the endpoint is discarded at item 612 of Figure 13. In this way, spurious endpoints may be removed, such as extra endpoints that do not correspond with an endpoint of a physical doorway within the environment.
It is to be appreciated that items 602 to 606 of Figure 13 may be performed without items 608 to 612 or vice versa.
In some cases, unnecessary internal walls may remain after the method 600 of Figure 13 has been performed, for example where two rooms have more than one doorway between them but one of these doorways is too wide and has therefore been discarded. To remove such spurious internal walls, the method 700 of Figure 14 may be performed.
At item 702 of Figure 14, it is determined whether a wall area associated with an internal wall overlaps at least a first area associated with the first room and a second area associated with the second room. If this is the case, the internal wall is retained in a map of the environment at item 704. Otherwise, the internal wall is discarded at item 706.
Another example of removal of internal walls that may be spurious is shown in the method 800 of Figure 15. At item 802 of Figure 15, a plurality of endpoints of a doorway are identified. At item 804, it is determined whether each of a plurality of endpoints associated with the doorway connects to at least one of an external wall associated with a boundary of the environment or an endpoint associated with a further doorway within the environment. If this is the case, the internal walls corresponding to respective paths between the doorway and the external wall or the endpoint associated with the further doorway are retained in a map of the environment. Otherwise, at item 806 of Figure 15, the internal walls are discarded. The doorway itself may also be removed or indicated as no longer corresponding to a doorway within the environment. This may be used to remove spurious internal walls and/or doorways from the navigation map 500. In other cases, the internal walls may be added to a map of the environment after it is determined that they connect to the external wall or to an endpoint associated with a further doorway, at item 804.
An illustration of the navigation map 500 after identification of the internal walls 524 is shown schematically in Figure 16. The doorways 520 of Figure 11 are omitted from Figure 16 to enhance the visibility of the internal walls 524. The navigation map 500 after the processing to identify the internal walls 524 may be considered to correspond to a map of the environment, which may be used by the robotic device for further interaction with the environment.
In some cases, identification of internal walls may be improved based on features within the environment. The example of Figure 17 illustrates schematically a navigation map 900 including a representation of a feature 924 representative of an occupancy of space within an environment. Elements of Figure 17 similar to corresponding elements of Figure 8 are labelled with the same reference numeral but prefaced by a 9 instead of a 5; Figure 17 represents a navigation map 900 at the same stage of processing as the navigation map 500 of Figure 8, but for a navigation map 900 including the feature 924. The feature 924 for example corresponds to furniture or another object which is located along or near to a wall of the environment. By using the location of the feature 924, the position of the internal wall can be further constrained.
For example, a cost function optimised to identify a path from an endpoint associated with a doorway to a target position within the environment may include a cost term to reward an input path that coincides with a position of the feature 924. This cost term may be a third cost term, in addition to the first and second cost terms discussed above.
The feature 924 may be represented by feature data, which is for example obtained from a simultaneous localisation and mapping (SLAM) system. In such cases, the feature is for example a SLAM feature, representing an environmental feature identified by the SLAM system. These features may correspond to locations within the environment that have not necessarily been explored by the robotic device, e.g. during cleaning. For example, the feature 924 may be located at a height on a wall of a room.
For example, the feature 924 may correspond to a shelf or an object on a shelf. A shelf may not necessarily be observed or collided with by a relatively small robotic device moving across a floor of the room, but may be detected using the SLAM system. Hence, by using the feature 924, the internal wall identification may be improved. To specifically identify features corresponding to objects located at least partway up a wall of the environment, it may be determined that the feature is positioned at a height that satisfies a height condition. The height condition is for example a minimum height, e.g. 1 metre. For example, features located at a height above the minimum height may be retained and used during the internal wall identification. Other features may be discarded.
In some cases in which the feature 924 is a SLAM feature, a low-pass filter may be applied to an image of SLAM points, and a negative cost (or positive reward) may be accumulated at each pixel of an input path, which is proportional to a value of the low-pass-filtered SLAM feature map, so that walls travelling through more feature points receive a greater reward. In such cases, a correspondence between the pixels of the low-pass-filtered SLAM feature map and the pixels of the navigation map may be determined, so as to determine a total cost for a given path, which for example includes the first and second cost terms obtained by traversing the navigation map and the third cost term obtained by traversing the SLAM feature map.
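The reward term can be sketched as follows, assuming `slam_points` is a binary image of SLAM feature locations already projected onto the navigation map grid; the filter width and reward weight are illustrative.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def feature_reward(slam_points: np.ndarray, path, weight: float = 3.0, sigma: float = 2.0) -> float:
    """Negative cost accumulated along a path, proportional to local SLAM feature density."""
    feature_map = gaussian_filter(slam_points.astype(float), sigma=sigma)
    return -weight * sum(feature_map[r, c] for r, c in path)
```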
In some cases, the endpoints of the doorways are processed to identify the internal walls in the order in which the doorways are identified, which typically depends on the size of the area of overlap between the two rooms the doorway is between.
However, as an internal wall can connect to other internal walls, the order in which internal walls are identified may affect the output map of the environment. The method 1000 of Figure 18 is a flow diagram illustrating an example order of identifying internal walls.
At item 1002, a set of candidate doorways is identified. The set of candidate doorways is for example the doorways 520 shown in Figure 11. At item 1004, the set of candidate doorways is sorted into a processing order according to a doorway characteristic. As an example, the doorway characteristic may include at least one of a distance between a doorway and a further doorway, a distance between a doorway and an external wall, or a size of a room associated with a doorway. At item 1006, the set of candidate doorways is processed in the processing order to identify a set of candidate internal walls within the environment. The set of candidate internal walls for example includes the internal wall between the first room 516f and the second room 516b.
Example System for Mapping an Environment
Figure 19 is a schematic diagram showing certain components of a system 1100 for mapping an environment according to examples. The system 1100 is arranged to process navigation data 1102 from an autonomous robotic device to map an environment. The system 1100 includes a mapping engine 1104 to map the environment. The system 1100 may form part of the robotic devices 200, 250 shown in Figures 2A and 2B, and/or part of a control system that is remote from the autonomous robotic device, e.g. a server computing device.
The mapping engine 1104 includes an orientation engine 1106 to process the navigation data 1102 to orient a navigation map represented by the navigation data 1102 such that an axis associated with the navigation map is aligned along a predetermined direction, as described with reference to Figures 5 and 6 above. The orientation engine 1106 may be omitted in some cases, for example where the navigation map is already orientated before receipt by the mapping engine 1104, or in cases where reorientation of the navigation map is omitted. The mapping engine 1104 also includes a room identification engine 1108 to identify a first room and a second room within the environment, where the first room is adjacent to the second room. In addition, the mapping engine 1104 includes a doorway identification engine 1110 to determine at least one endpoint associated with a doorway between the first room and the second room. The mapping engine 1104 also includes a wall identification engine 1112 to identify an internal wall between the first room and the second room, using the at least one endpoint. The mapping engine 1104 therefore includes components to perform any of the methods described herein of mapping an environment.
The system 1100 of Figure 19 is arranged to generate map data 1114 representative of a map of the environment. For example, the map data 1114 may represent a navigation map after processing to identify internal walls.
Figure 20 illustrates schematically another example of a system 1200 for processing navigation data from an autonomous robotic device to map an environment. The system 1200 is arranged to receive user-defined environment data 1202 representative of a user-defined wall, doorway or object within the environment. The user-defined environment data 1202 is received via an interface 1204 of the system 1200. The interface 1204 is for example an interface of a computing system or a system such as the system 1100 of Figure 19. This is discussed further with reference to Figure 28. The system 1200 of Figure 20 is arranged to update the map of the environment based on the user-defined environment data 1202, to generate an updated map 1208.
This is shown schematically in Figure 21. Figure 21 illustrates the navigation map 500 of Figure 16, which may be presented to a user via a graphical user interface (GUI). For example, the navigation map 500 may be displayed on a display screen of a user device such as a computer or mobile device using a GUI associated with the user device. In this example, the user has added a representation of an additional, user-defined, doorway 526 to the navigation map 500. For example, this may be performed using the GUI, by drawing or otherwise indicating the position of the doorway 526 on the navigation map 500.
After receiving the user-defined environment data 1202, the internal wall identification process may be at least partly re-performed, to identify new internal walls in the environment. In this case, upon receipt of the user-defined environment data 1202, which in this case represents a user-defined doorway, the endpoints associated with the user-defined doorway may be identified as described herein. Internal walls within the environment may then be identified using these endpoints. In this way, two new internal walls 528a, 528b may be identified, and added to the navigation map 500, as shown schematically in Figure 22. In this way, the navigation map 500 may be updated.
A similar approach may be taken if a user-defined wall (which may be an internal wall or external wall) or feature is obtained by the system 1200. If a user-defined internal wall is received, the user-defined internal wall may be added to the navigation map 500. This may be performed either without re-identifying other internal walls in the environment, or after attempting to re-identify further additional internal walls (as adding new internal walls may affect whether other internal walls are identified or not). Similarly, if a user-defined external wall or feature (such as a SLAM feature) is added, the internal walls within the environment may be re-identified.
Second Example Methods for Mapping an Environment
Figure 23 is a flow diagram illustrating a method 1300 for mapping an environment according to second examples herein.
At item 1302 of Figure 23, navigation data is obtained from an autonomous robotic device. The navigation data and the way in which it is obtained may be similar to that described with reference to item 402 of Figure 4.
At item 1304 of Figure 23, the navigation data is used to identify predicted wall regions within the environment. In this example, the predicted wall regions may be taken as the occupied, non-navigable or unexplored regions of the environment, such as the non-navigable regions 1404 of the navigation map 1400 shown in Figure 24. The navigation map 1400 of Figure 24 is similar to the navigation map 500 of Figure 8 and in this example has undergone the same processing as the navigation map 500 of Figure 8. Elements of Figure 24 which are the same as corresponding elements of Figure 8 are labelled with the same reference but prefixed by a 14 instead of a 5.
At item 1306 of Figure 23, the predicted wall regions are used to identify a doorway between a first room and a second room within the environment. An example of identification of a doorway in accordance with second examples is shown schematically in Figure 24. In this example, doorway identification involves identification of relatively small openings or other gaps between the predicted wall regions (which in this case correspond to the non-navigable regions 1404). This may involve identifying wall openings that are in a predetermined direction, such as along a longitudinal axis associated with the navigation map 1400. In this case, wall openings in a horizontal or vertical direction are identified, although this is merely an example. To identify the doorways, a width of wall openings may be assessed against a wall opening condition. This may correspond to an absolute or relative width value against which the width may be compared. This may therefore be similar to determining whether a doorway satisfies a width condition. For example, if a wall opening has a width which exceeds the width of a standard double doorway, the wall opening may be discarded. In addition, a length of a wall portion to either side of a wall opening (in a longitudinal direction of the wall opening) may be determined to assess whether the wall opening corresponds to a doorway. For example, if such a wall portion is relatively small (or if the wall portions to either side of a wall opening are relatively small), such as less than a predetermined absolute or relative threshold value, the wall opening may be considered not to correspond with a doorway.
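A simplified sketch of the search for horizontal wall openings is given below; it only applies the width test, with the flanking wall-length test described above omitted for brevity, and the maximum width in pixels is a caller-supplied assumption. A corresponding search over columns would handle vertical openings.

```python
import numpy as np

def horizontal_wall_openings(occupied: np.ndarray, max_width_px: int):
    """Find gaps in each row bounded by occupied pixels on both sides and no wider than max_width_px."""
    openings = []
    for r in range(occupied.shape[0]):
        row = occupied[r].astype(bool)
        c = 0
        while c < len(row):
            if not row[c]:
                start = c
                while c < len(row) and not row[c]:
                    c += 1
                # A gap bounded by occupied pixels on both sides is a candidate opening.
                if start > 0 and c < len(row) and (c - start) <= max_width_px:
                    openings.append((r, start, c - 1))
            else:
                c += 1
    return openings
```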
Figure 24 illustrates the doorways (1430a-1430f, collectively referred to as 1430) that are identified using this method. This approach may identify a larger number of putative doorways than there are physical doorways within the environment. The number of doorways may be reduced by further processing of the navigation map, as discussed further below. In some cases, the doorways identified may be associated with a respective likelihood, indicating an estimated likelihood that a wall opening actually corresponds to a physical doorway within the environment. Such a likelihood may be based on the size or location of the doorway, or other features such as the length of wall portions to one or both sides of the doorway.
By identifying the doorways within the environment, endpoints of the doorways may also be identified. For example, an end of a wall opening may be considered to correspond to an endpoint associated with a doorway. In this way, the navigation data may be used to determine at least one endpoint associated with a doorway.
The at least one endpoint associated with a doorway between a first room and a second room within the environment may then be used to identify the first room and the second room, as illustrated schematically in Figure 25. Figure 25 illustrates a portion of a navigation map 1600 in simplified form, for ease of illustration. In Figure 25, a doorway 1602 is identified, for example using the methods described with reference to Figures 23 and 24. For example, the doorway 1602 may be identified by identifying at least one endpoint associated with the doorway 1602. In this case, the at least one endpoint includes a first endpoint 1604a corresponding to an end of a first predicted wall region 1606a associated with the environment, and a second endpoint 1604b corresponding to an end of a second predicted wall region 1606b associated with the environment. The endpoints are used to determine a location of the doorway 1602 (for example by determining that a width between the first and second endpoints 1604a, 1604b satisfies a wall opening condition). A first room 1608a and a second room 1608b within the environment are identified based on the location of the doorway 1602 in this case. For example, a room segmentation process may be applied to the navigation map 1600 after finding the doorway 1602.
In the example of Figure 25, a clustering process is applied to the navigation map 1600 based on the location of the doorway 1602. Such a clustering process is for example a "spatial clustering", in that units of two-dimensional space (for example corresponding to pixels of the navigation map 1600, representative of corresponding regions of the environment) are clustered into rooms. This clustering may include spectral clustering, such as spectral clustering applied to spatial portions, as represented by corresponding portions of the navigation map 1600.
In general, a spectral clustering process takes each navigable or unoccupied pixel as a node on a bidirectional graph, and uses a distance metric such as the Euclidean distance between pixels as a weight associated with edges among different nodes. The graph may then be divided into clusters, with nodes that are relatively close to each other, based on the distance metric, being grouped into the same cluster.
In this case, a distance metric between a first location 1610a on a first side of the doorway 1602 (for example within an area of the navigation map 1600 corresponding to the first room 1608a) and a second location 1610b on a second side of the doorway 1602 (for example within an area of the navigation map 1600 corresponding to the second room 1608b) is determined. Rather than the distance metric representing an actual distance 1612 between the first and second locations 1610a, 1610b within the environment, in this case, a value of the distance metric may differ from this distance 1612 in order to encourage the locations to be grouped into different clusters corresponding to different rooms. For example, the value of the distance metric may be larger than the distance 1612 between the first and second locations 1610a, 1610b. This may be generalised across the pixels of the navigation map 1600, such that if two pixels of the navigation map 1600 are separated by the doorway 1602, their pairwise distance is larger than their actual distance (for example as captured by the pairwise Euclidean distance between the locations in the environment that correspond to the pixels).
The distance metric between two locations may be proportional to a likelihood of at least one doorway between those locations. For example, the pairwise distance between the first and second locations 1610a, 1610b may be taken as the weighted sum of a geometric distance between these locations (such as the Euclidean distance) and a likelihood that the doorway 1602 corresponds to a physical doorway within the environment. If it is highly likely that the doorway 1602 corresponds to a physical doorway, pixels of the navigation map 1600 corresponding to the first and second locations 1610a, 1610b will typically be located in two different clusters after clustering. Conversely, if it is less likely that the doorway 1602 corresponds to a physical doorway, the contribution of the likelihood to the pairwise distance is reduced relative to the contribution of the geometric distance. Hence, in this case, the first and second locations 1610a, 1610b are clustered primarily based on their geometric distance.
This distance metric may be used within a clustering process by using the navigation data to generate a graph representation of the environment. Within the graph representation, the first location 1610a may be taken to correspond to a first node of the graph representation and the second location 1610b may be taken to correspond to a second node of the graph representation. An edge of the graph representation between the first node and the second node may be weighted using the distance metric. The graph representation may then be divided into at least a first portion corresponding to the first room 1608a and a second portion corresponding to the second room 1608b. The pixels corresponding to the locations within the first portion of the graph representation (as represented by respective nodes) may then be associated with the first room 1608a. Similarly, the pixels corresponding to the locations within the second portion of the graph representation (as represented by respective nodes) may then be associated with the second room 1608b.
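As a sketch of how the modified metric might feed a spectral clustering, assume a small set of sampled free-space points, a caller-supplied predicate `separated_by_doorway`, a per-doorway likelihood and a penalty weight; all of these names and values are illustrative rather than taken from the description.

```python
import numpy as np
from sklearn.cluster import SpectralClustering

def cluster_into_rooms(points, separated_by_doorway, doorway_likelihood,
                       n_rooms, penalty=50.0):
    """Cluster sampled free-space points into rooms using a doorway-inflated distance metric."""
    n = len(points)
    distance = np.zeros((n, n))
    for i in range(n):
        for j in range(i + 1, n):
            d = np.linalg.norm(np.asarray(points[i]) - np.asarray(points[j]))
            if separated_by_doorway(points[i], points[j]):
                d += penalty * doorway_likelihood   # inflate pairwise distance across a doorway
            distance[i, j] = distance[j, i] = d
    affinity = np.exp(-distance / distance.mean())   # convert distances to similarities
    return SpectralClustering(n_clusters=n_rooms, affinity="precomputed").fit_predict(affinity)
```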
By using such a distance metric (which may be considered to be a modified distance metric compared with other distance metrics which represent an actual distance between pixels), segmentation of the navigation map 1600 into rooms may be improved. For example, the navigation map 1600 may be divided into a more appropriate number of rooms than with other distance metrics, which may divide the navigation map 1600 into a larger number of rooms than those present within the environment.
After identifying rooms within the environment, small or spurious rooms may be removed, for example using the methods described above with respect to the first examples and as shown schematically in Figure 9.
The first and second rooms identified using the at least one endpoint may be used to identify an internal wall within the environment. For example, whereas the predicted wall regions may have jagged or uneven edges, which correspond to an edge of an occupied, non-navigable or unexplored region, the internal wall may have a shape which is closer to or otherwise more representative of a likely or actual wall shape. In some cases, the internal wall may be straight or rectilinear in shape or with a limited number of curved sections. A direction of the internal wall may align with a direction of an axis associated with the navigation map 1600, such as a horizontal or vertical direction. By identification of internal walls in this way, a more accurate representation of the environment may be obtained, improving future navigation of or interaction with the environment by the robotic device.
In some cases, identification of the internal wall includes generating a first rectilinear representation of the first room 1608a and generating a second rectilinear representation of the second room 1608b. A rectilinear representation of a room is for example formed of a series of straight lines, where such straight lines may be aligned along one of a set of predetermined directions (such as horizontal or vertical). A rectilinear representation may correspond with a rectilinear polygon, in which all edges of the polygon intersect at right angles. In such cases, the room may be represented as a rectangle or as other rectilinear polygonal shapes.
The internal wall may be identified using the first rectilinear representation, the second rectilinear representation and the doorway 1602, for example so that the internal wall has an appropriate gap or opening corresponding to the doorway 1602, and so that the internal wall is also rectilinear in shape. Further constraints may also be applied. For example, the first and second rectilinear representations may be constrained to be of particular shapes. If two rooms are adjacent to each other, a width of the internal wall between the two rooms may be determined and compared to a wall width condition. If the width fails to satisfy the wall width condition (for example if it is too wide or if two rooms are separated by two separate walls), a shape of the rooms and/or internal wall may be adjusted appropriately until the wall width condition is satisfied. Similarly, an alignment between two rooms which each adjoin the same other room (which for example corresponds to a corridor within the environment) may be checked. If the internal walls associated with the rooms are misaligned, the shape or size of the rooms may be altered or the position of the internal walls may be altered so that the internal wall running along the side of the corridor room is straight. A rectilinear representation 1700 of the environment represented by the navigation map 1400 of Figure 24 is shown schematically in Figure 26, which illustrates the walls identified, which include internal walls 1702 as well as external walls 1704. The rectilinear representation 1700 of Figure 26 may be considered to correspond to a rectilinear map of the environment.
Examples of Systems and Apparatus for Use with the Methods Herein
Figures 27A to 27D are schematic diagrams showing various systems for use in mapping an environment according to examples. The system 1800 of Figure 27A includes a robotic device 1802, a system 1804 for mapping an environment, such as the systems 1100, 1200 of Figures 19 and 20, and a user device 1806. The robotic device 1802 is for example an autonomous robotic device as described with reference to Figures 2A and 2B. The system 1804 is for example a computing system, which includes any suitable electronic device with data processing capabilities. The computing system 1804 may be a single computing device (e.g. a desktop, laptop, mobile and/or embedded computing device) or may be a distributed computing system, which is distributed over multiple discrete computing devices (e.g. certain components may be implemented by one or more server computing devices based on requests from one or more client computing devices made over a network). The user device 1806 may be any suitable user computing system, and may be a mobile device, such as a mobile phone, for example a smartphone, a tablet, laptop or personal computer.
Each of these components is communicatively coupled to each other via a network 1808. The network 1808 may be a wireless network 1808. For example, communication of data between these components via the network may include wireless transmissions via a wireless network or Bluetooth® connection. In other cases, some or all of these components may be directly coupled to each other, e.g. via a universal serial bus (USB) connection, rather than indirectly coupled via the network 1808 (which may include one or more computer networks).
The system 1804 in this case may be located on a server system which is remote from the robotic device 1802, which may be a distributed server system. Such a server system may have more computing resources than the robotic device 1802 and may therefore be able to identify the internal walls more rapidly than the robotic device 1802. For example, the robotic device 1802 may obtain the navigation data and may transmit the navigation data via the network 1808 to the system 1804. The system 1804 may process the navigation data to map the environment, thereby generating map data representative of a map of the environment which includes a representation of internal walls within the environment. The map may be transmitted to the robotic device 1802 and/or the user device 1806, via the network 1808. In some cases, the map may be transmitted to the user device 1806 to obtain user input such as the user-defined environment data discussed above. The user-defined environment data may be transmitted to the system 1804, which may then update the map of the environment.
The updated map of the environment may be returned to the user device 1806 and/or the robotic device 1802 for further use or processing.
Figures 27B, 27C and 27D illustrate further examples of systems for mapping an environment. Features of Figures 27B, 27C and 27D which are similar to corresponding features of Figure 27A are labelled with the same reference numeral but prefixed by a 19, 20 and 21, respectively, instead of an 18.
The system 1900 of Figure 27B includes a robotic device 1902 and a user device 1906 connected via a network 1908 (however, in other cases, the robotic device 1902 and the user device 1906 may be connected directly). In this case, the robotic device 1902 may generate a map of the environment which is transmitted to the user device 1906 via the network 1908. Alternatively, the robotic device 1902 may generate or otherwise obtain the navigation data. The navigation data may be transmitted to the user device 1906 via the network 1908. The user device 1906 may process the navigation data to generate the map of the environment, which may be transmitted back to the robotic device 1902 for further exploration of or navigation within the environment.
The system 2000 of Figure 27C includes a robotic device 2002 and a system 2004, which is for example a remote server system. The robotic device 2002 and the system 2004 are connected via the network 2008. The system 2000 of Figure 27C is the same as that of Figure 27B, except that the user device 1906 is replaced by the system 2004.

In the system 2100 of Figure 27D, the mapping of the environment is performed locally to the robotic device. In this example, the system 2100 for example corresponds to a computing system including the system 2104 arranged to generate a map of the environment including the doorway and internal wall. The system 2100 also includes a capture device, in this example a camera 2110, which is arranged to obtain image data representative of an observation of the environment by the autonomous robotic device.
The image data may be taken to correspond to the navigation data, or may be otherwise processed to obtain the navigation data (e.g. to determine an occupancy of the environment observed by the robotic device).
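Purely as an illustrative sketch (none of these names or thresholds are taken from this disclosure), deriving such occupancy-style navigation data from an observation might resemble the following Python, where depth_points_world is an assumed helper returning (x, y, z) points in the map frame:

import numpy as np

def update_occupancy(grid, depth_points, cell_size=0.05, min_z=0.02, max_z=0.30):
    # Mark a cell of the 2-D grid as occupied when an obstacle point falls within it,
    # ignoring points on the floor (below min_z) or well above the robot body (above max_z).
    for x, y, z in depth_points:
        if min_z < z < max_z:
            row, col = int(y / cell_size), int(x / cell_size)
            if 0 <= row < grid.shape[0] and 0 <= col < grid.shape[1]:
                grid[row, col] = 1
    return grid

# Example usage: a 20 m x 20 m environment at 5 cm resolution.
# grid = np.zeros((400, 400), dtype=np.uint8)
# grid = update_occupancy(grid, depth_points_world(frame, pose))   # assumed helper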
Figure 28A shows schematically components of a computing system 2200 for mapping an environment according to further examples. The computing system 2200 may be a single computing device or may be a distributed computing system.
The computing system 2200 includes a camera 2202, which in this case is a video camera arranged to provide frames of video, which for example include observations of a scene. The computing system 2200 includes an image processing system 2204, which is arranged to implement methods in accordance with those described herein. In Figure 28A, the image processing system 2204 is arranged to process image data obtained by the camera 2202 to obtain the navigation data.
The computing system 2200 also includes a tracking system 2206 arranged to determine poses of the camera 2202 during observation of the scene. The computing system 2200 includes a mapping system 2208 arranged to generate a map of the environment using the navigation data. In other examples, the mapping system 2208 may be arranged to generate at least one further map of the environment, such as a depth or semantic map of the environment.
The tracking and mapping systems 2206, 2208 may form part of a simultaneous localisation and mapping (SLAM) system. A SLAM system within the field of robotic mapping and navigation acts to construct and update a map of an unknown environment while simultaneously locating a robotic device associated with the map within the environment. For example, the robotic device may be the device that is constructing, updating and/or using the map. As explained with reference to Figure 17, such a SLAM system may be used to provide feature data representative of observed features of the environment. These features may be used to identify internal walls within the environment, which may be used to generate or update a map of the environment.
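As a loose sketch only, feature data from such a SLAM system could be combined with the candidate-wall paths described earlier; here world_to_cell is an assumed grid helper, and the height bounds and weights are illustrative values rather than values from this disclosure:

def wall_evidence_cells(features, min_height=0.5, max_height=2.0):
    # Keep only features whose height satisfies a height condition, since such
    # features are more likely to lie on walls than on floor-level clutter.
    return {world_to_cell(f.x, f.y) for f in features
            if min_height <= f.z <= max_height}                     # assumed helper

def candidate_wall_cost(path_cells, evidence_cells, step_cost=1.0, feature_reward=0.6):
    # Lower cost for candidate internal-wall paths that coincide with observed features.
    cost = 0.0
    for cell in path_cells:
        cost += step_cost
        if cell in evidence_cells:
            cost -= feature_reward
    return cost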
Figure 28B is a schematic diagram showing components of a robotic device 2300 according to an example. The robotic device 2300 includes the computing system 2200 of Figure 28A. The robotic device 2300 also includes one or more actuators 2302 to enable the robotic device 2300 to interact with a surrounding three-dimensional environment. At least a portion of the surrounding three-dimensional environment may be shown in the scene captured by the camera 2202 of the computing system 2200. In the case of Figure 28B, the robotic device 2300 may be configured to capture image data as the robotic device 2300 navigates a particular environment. In another case, though, the robotic device 2300 may scan an environment, or operate on image data received from a third party, such as a user with a mobile device or another robotic device. As the robotic device 2300 processes the image data, it may be arranged to obtain navigation data from which a map of the environment may be obtained.
The robotic device 2300 also includes an interaction engine 2304 including at least one processor to control the one or more actuators 2302. The interaction engine 2304 of Figure 28B may be configured to use a map obtained by the robotic device 2300 to control the robotic device 2300 to interact with the surrounding three-dimensional environment. For example, the map may be used to identify a particular cleaning routine for the robotic device 2300 to perform, which for example is more efficient than otherwise, with fewer collisions between the robotic device 2300 and walls or objects within the environment.
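One hypothetical way an interaction engine might use such a map is sketched below; plan_coverage and drive are placeholders for whatever coverage planning and actuator control the robotic device provides, and are not defined in this disclosure:

def clean_room_by_room(environment_map, start_room):
    # Clean each identified room in turn, using the internal walls from the map to
    # keep coverage within one room at a time and doorways to cross between rooms.
    order = [start_room] + [room for room in environment_map.rooms if room != start_room]
    for room in order:
        for waypoint in plan_coverage(room, environment_map.internal_walls):  # assumed
            drive(waypoint)                                                   # assumed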
Examples of functional components as described herein with reference to Figures 28A and 28B may include dedicated processing electronics and/or may be implemented by way of computer program code executed by a processor of at least one computing device. In certain cases, one or more embedded computing devices may be used. Components as described herein may include at least one processor operating in association with memory to execute computer program code loaded onto a computer readable medium. This medium may include solid state storage such as an erasable programmable read only memory and the computer program code may include firmware. In other cases, the components may include a suitably configured system-on-chip, application-specific integrated circuit and/or one or more suitably programmed field-programmable gate arrays. In one case, the components may be implemented by way of computer program code and/or dedicated processing electronics in a mobile computing device and/or a desktop computing device. In one case, the components may be implemented, as well as or instead of the previous cases, by one or more graphics processing units executing computer program code. In certain cases, the components may be implemented by way of one or more functions implemented in parallel, e.g. on multiple processors and/or cores of a graphics processing unit.
Figure 29 is a schematic diagram showing components of a user device 2400 according to examples. The user device 2400 includes storage 2402. The storage 2402 may include at least one of volatile memory, such as a Random Access Memory (RAM), and non-volatile memory, such as Read Only Memory (ROM) or a solid state drive (SSD) such as Flash memory. At least one processor 2404 is communicatively coupled to the storage 2402.
The storage 2402 in the example of Figure 29 includes computer program instructions configured to, when processed by the at least one processor 2404, map an environment as described in the examples herein. The computer program instructions may be stored in an accessible non-transitory computer-readable medium and loaded into memory, for example the storage 2402, to implement these methods. These instructions are illustrated schematically in Figure 29 as corresponding to a system 2406 for implementing the methods herein. A map 2408 obtained by the system 2406 is also stored in the storage 2402 in this example. Other data may additionally be stored in the storage 2402, such as the navigation data that is to be processed to generate the map 2408, or user-defined environment data received from a user.
In this example, the user device 2400 includes an image data interface 2410 to receive image data from an autonomous robotic device, which the user device 2400 may process (or may send to a remote device for processing) to obtain the navigation data. The image data interface 2410 may be any suitable interface to allow communication of data between the user device 2400 and the robotic device.
The components of the user device 2400 in the example of Figure 29 are interconnected using a systems bus 2412. This allows data to be transferred between the various components. For example, the map of the environment generated by the method according to examples can be stored in the storage 2402 and subsequently transmitted via the systems bus 2412 from the storage 2402 to a display device interface 2414 for transfer to a display device 2416 for display. The display device interface 2414 may include a display port and/or an internal electronics interface, e.g. where the display device 2416 is part of the user device 2400, such as a display screen of a smartphone. Therefore, when instructed by the at least one processor 2404 via the display device interface 2414, the display device 2416 will display the map of the environment.

Figure 30 is a schematic diagram showing an example 2500 of a processor 2502 and a non-transitory computer-readable storage medium 2504 comprising computer-executable instructions 2506. The computer-executable instructions 2506, when executed by the processor 2502, cause a computing system, such as a computing system including the processor 2502, to perform any of the example methods described above.
For example, the computer-readable storage medium 2504 may be arranged to store navigation data 2508 obtained from an autonomous robotic device. The computer-executable instructions 2506, when executed by the processor 2502, may be configured to cause the computing system to process the navigation data 2508 to obtain a map 2510 of an environment, which may be stored in the computer-readable storage medium 2504. Although in Figure 30, the navigation data 2508 and the map 2510 are shown as being stored on the computer-readable storage medium 2504, in other examples, at least one of the navigation data 2508 and the map 2510 may be stored in storage which is external to (but accessible by) the computer-readable storage medium 2504.
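A toy sketch of the top-level flow such instructions could follow is given below; map_environment stands in for the mapping method described above, and the storage interface is invented purely for illustration:

def run_mapping(storage):
    navigation_data = storage.load("navigation_data")     # e.g. a 2-D binary map
    environment_map = map_environment(navigation_data)    # assumed: method described above
    storage.save("map", environment_map)
    return environment_map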
The above examples are to be understood as illustrative. Further examples are envisaged.
It is to be understood that any feature described in relation to any one example may be used alone, or in combination with other features described, and may also be used in combination with one or more features of any other of the examples, or any combination of any other of the examples. Furthermore, equivalents and modifications not described above may also be employed without departing from the scope of the accompanying claims.

Claims (39)

  1. A method for mapping an environment, the method comprising: obtaining navigation data from an autonomous robotic device; using the navigation data to identify a first room and a second room within the environment, wherein the first room is adjacent to the second room, comprising: determining at least one endpoint associated with a doorway between the first room and the second room; and identifying an internal wall between the first room and the second room using the at least one endpoint.
  2. The method according to claim 1, comprising using the navigation data to identify an external wall associated with a boundary of the environment; wherein identifying the internal wall comprises: determining a path from an endpoint of the at least one endpoint to the external wall; and identifying that the path corresponds to the internal wall.
  3. The method according to claim 1 or claim 2, wherein the doorway is a first doorway, the internal wall is a first internal wall, and the method comprises: determining a path from a first endpoint of the at least one endpoint to a second endpoint associated with a second doorway between the first room and a third room of the environment; and identifying that the path corresponds to a second internal wall.
  4. The method according to any one of claims 1 to 3, wherein identifying the internal wall comprises: optimising a cost function to identify a path from an endpoint of the at least one endpoint to a target position within the environment; and associating the path with the internal wall.
  5. The method according to claim 4, wherein the cost function comprises at least one of: a first cost term to penalise an input path that extends outside of at least one predetermined direction; or a second cost term to penalise a change in direction of an input path.
  6. The method according to claim 4 or claim 5, comprising obtaining feature data representative of a position of a feature within the environment, wherein the feature indicates occupancy of space within the environment, wherein the cost function comprises: a third cost term to reward an input path that coincides with the position of the feature.
  7. The method according to claim 6, comprising obtaining the feature data from a simultaneous localisation and mapping system.
  8. The method according to claim 6 or claim 7, comprising determining that the feature is positioned at a height that satisfies a height condition.
  9. The method according to any one of claims 1 to 6, wherein the doorway is a first doorway, the internal wall is a first internal wall, and the method comprises: identifying a set of candidate doorways containing the first doorway; sorting the set of candidate doorways into a processing order, according to a doorway characteristic; and processing the set of candidate doorways in the processing order, to identify a set of candidate internal walls within the environment, the set of candidate internal walls containing the first internal wall.
  10. The method according to claim 9, wherein the doorway characteristic comprises at least one of: a distance between a doorway and a further doorway; a distance between a doorway and an external wall associated with a boundary of the environment; or a size of a room associated with a doorway.
  11. The method according to any one of claims 1 to 10, wherein determining the at least one endpoint comprises: determining an area of overlap between a first area associated with the first room and a second area associated with the second room; and determining a location of the doorway based on the area of overlap.
  12. The method according to any one of claims 1 to 11, comprising: determining a plurality of endpoints associated with the doorway, the plurality of endpoints comprising the at least one endpoint; determining that each of the plurality of endpoints associated with the doorway connects to at least one of an external wall associated with a boundary of the environment or an endpoint associated with a further doorway within the environment; and, subsequently, adding the internal wall to a map of the environment.
  13. The method according to any one of claims 1 to 12, comprising determining that a wall area associated with the internal wall overlaps at least a first area associated with the first room and a second area associated with the second room.
  14. The method according to any one of claims 1 to 13, comprising identifying the first room and the second room before determining the at least one endpoint.
  15. The method according to claim 1, wherein using the navigation data comprises: using the navigation data to identify predicted wall regions associated with the environment; and using the predicted wall regions to identify the doorway between the first room and the second room.
  16. The method according to claim 1 or claim 15, wherein using the navigation data comprises: using the navigation data to determine the at least one endpoint; and using the at least one endpoint to identify the first room and the second room.
  17. The method according to claim 16, wherein: the at least one endpoint comprises: a first endpoint corresponding to an end of a first predicted wall region associated with the environment; and a second endpoint corresponding to an end of a second predicted wall region associated with the environment, and using the at least one endpoint to identify the first room and the second room comprises: using the first endpoint and the second endpoint to determine the location of the doorway; and identifying the first room and the second room based on the location of the doorway.
  18. The method according to claim 1, or any one of claims 15 to 17, wherein the navigation data represents a navigation map and identifying the first room and the second room comprises: identifying a location of the doorway using the at least one endpoint; and applying a clustering process, based on the location of the doorway, to the navigation map.
  19. The method according to claim 1, or any one of claims 15 to 18, comprising determining a distance metric between a first location on a first side of the doorway and a second location on a second side of the doorway, opposite to the first side.
  20. The method according to claim 19, wherein a value of the distance metric is larger than a distance between the first location and the second location.
  21. The method according to claim 19 or claim 20, comprising: using the navigation data to generate a graph representation of the environment, wherein the first location corresponds to a first node of the graph representation and the second location corresponds to a second node of the graph representation; weighting an edge between the first node and the second node using the distance metric; and dividing the graph representation into at least a first portion corresponding to the first room and a second portion corresponding to the second room.
  22. The method according to claim 1, or any one of claims 15 to 21, comprising using the first room and the second room identified using the at least one endpoint to identify the internal wall.
  23. The method according to claim 22, wherein using the first room and the second room identified using the at least one endpoint to identify the internal wall comprises: generating a first rectilinear representation of the first room; generating a second rectilinear representation of the second room; and identifying the internal wall using the first rectilinear representation, the second rectilinear representation and the doorway.
  24. The method according to any one of claims 1 to 23, comprising determining that a width of the doorway satisfies a width condition before identifying the internal wall.
  25. The method according to any one of claims 1 to 24, wherein using the navigation data to identify a first room and a second room within the environment comprises: identifying a plurality of rooms within a first area associated with the first room; determining that at least one of the plurality of rooms fails to satisfy an area condition; and merging the plurality of rooms to define the first room before determining the at least one endpoint.
  26. The method according to any one of claims 1 to 25, wherein at least one of the first room or the second room comprises a hallway or an entrance room.
  27. The method according to any one of claims 1 to 26, wherein the navigation data represents a navigation map of the environment, and the method comprises removing a representation of an object from the navigation map, wherein the object is disconnected from a boundary of the environment.
  28. The method according to claim 27, wherein removing the representation of the object uses flood-filling.
  29. A system arranged to process navigation data from an autonomous robotic device to map an environment, the system comprising: a room identification engine to identify a first room and a second room within the environment, wherein the first room is adjacent to the second room; a doorway identification engine to determine at least one endpoint associated with a doorway between the first room and the second room; and a wall identification engine to identify an internal wall between the first room and the second room, using the at least one endpoint.
  30. The system according to claim 29, wherein the system is arranged to generate map data representative of a map of the environment, and the system comprises an interface to receive user-defined environment data representative of a user-defined wall, doorway or object within the environment, wherein the system is arranged to update the map of the environment based on the user-defined environment data.
  31. The system according to claim 29 or claim 30, wherein the navigation data represents a navigation map of the environment and the system comprises an orientation engine to process the navigation data to orient the navigation map of the environment such that an axis associated with the navigation map is aligned along a predetermined direction.
  32. The system according to any one of claims 29 to 31, wherein the navigation data represents a two-dimensional binary map.
  33. The system according to claim 32, wherein pixels of the two-dimensional binary map indicate at least one of an occupancy or an accessibility of a corresponding region of the environment.
  34. The system according to claim 32 or claim 33, wherein an endpoint of the at least one endpoint corresponds to at least one pixel in the two-dimensional binary map.
  35. A computing system comprising: the system according to any one of claims 29 to 34, wherein the system is arranged to generate a map of the environment comprising the doorway and the internal wall; and a capture device to obtain image data representative of an observation of the environment by the autonomous robotic device.
  36. The computing system according to claim 35, comprising: a simultaneous localisation and mapping system to provide feature data representative of observed features of the environment, wherein the system is arranged to use the feature data to identify the internal wall.
  37. A robotic device comprising: the computing system according to claim 35 or claim 36; one or more actuators to enable the robotic device to interact with the environment; and an interaction engine comprising at least one processor to control the one or more actuators; wherein the interaction engine is configured to use the map of the environment to interact with the environment.
  38. A user device comprising: the system according to any one of claims 29 to 34, wherein the system is arranged to generate a map of the environment comprising the doorway and the internal wall; an image data interface arranged to receive image data representative of an observation of the environment from the autonomous robotic device; and a display device arranged to display the map of the environment.
  39. A non-transitory computer-readable storage medium comprising computer-executable instructions which, when executed by a processor, cause a computing device to perform the method according to any one of claims 1 to 28.
GB1908432.6A 2019-06-12 2019-06-12 Mapping of an environment Active GB2584839B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
GB1908432.6A GB2584839B (en) 2019-06-12 2019-06-12 Mapping of an environment
CN202010528248.0A CN112087573B (en) 2019-06-12 2020-06-11 Drawing of an environment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GB1908432.6A GB2584839B (en) 2019-06-12 2019-06-12 Mapping of an environment

Publications (3)

Publication Number Publication Date
GB201908432D0 GB201908432D0 (en) 2019-07-24
GB2584839A true GB2584839A (en) 2020-12-23
GB2584839B GB2584839B (en) 2022-12-21

Family

ID=67386319

Family Applications (1)

Application Number Title Priority Date Filing Date
GB1908432.6A Active GB2584839B (en) 2019-06-12 2019-06-12 Mapping of an environment

Country Status (2)

Country Link
CN (1) CN112087573B (en)
GB (1) GB2584839B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115393234A (en) * 2021-05-25 2022-11-25 速感科技(北京)有限公司 Map region fusion method and device, autonomous mobile equipment and storage medium
CN113359766B (en) * 2021-07-05 2023-06-23 杭州萤石软件有限公司 Mobile robot movement control method and mobile robot

Family Cites Families (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2398394B (en) * 2003-02-14 2006-05-17 Dyson Ltd An autonomous machine
JP5302016B2 (en) * 2009-01-15 2013-10-02 株式会社日立製作所 Spatial information management system, map information server device, and program
US9140559B2 (en) * 2009-10-01 2015-09-22 Qualcomm Incorporated Routing graphs for buildings using schematics
US8698671B2 (en) * 2009-10-16 2014-04-15 Qualcomm Incorporated Binning venues into categories based on propagation characteristics
US9020191B2 (en) * 2012-11-30 2015-04-28 Qualcomm Incorporated Image-based indoor position determination
KR102158695B1 (en) * 2014-02-12 2020-10-23 엘지전자 주식회사 robot cleaner and a control method of the same
KR102527645B1 (en) * 2014-08-20 2023-05-03 삼성전자주식회사 Cleaning robot and controlling method thereof
DE102015119501A1 (en) * 2015-11-11 2017-05-11 RobArt GmbH Subdivision of maps for robot navigation
WO2018024897A1 (en) * 2016-08-05 2018-02-08 RobArt GmbH Method for controlling an autonomous mobile robot
DE112017006018T5 (en) * 2016-12-30 2019-09-12 Robert Bosch Gmbh MOBILE ROBOTIC DEVICE PROCESSING UNSTRUCTURED DATA OF INTERIOR ENVIRONMENTS TO SEGMENT ROOMS IN A FACILITY TO IMPROVE THE MOVEMENT OF THE DEVICE THROUGH THE EQUIPMENT
CN109308838A (en) * 2018-09-11 2019-02-05 中国人民解放军战略支援部队信息工程大学 A kind of interior space topology road network generation method and device based on indoor map

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050010331A1 (en) * 2003-03-14 2005-01-13 Taylor Charles E. Robot vacuum with floor type modes
US20190094870A1 (en) * 2014-12-16 2019-03-28 AI Incorporated Methods and systems for robotic surface coverage
CN104898660A (en) * 2015-03-27 2015-09-09 中国科学技术大学 Indoor map building method for improving robot path planning efficiency
CN106325266A (en) * 2015-06-15 2017-01-11 联想(北京)有限公司 Spatial distribution map building method and electronic device
EP3428885A1 (en) * 2016-03-09 2019-01-16 Guangzhou Airob Robot Technology Co., Ltd. Map construction method, and correction method and apparatus

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Kleiner et al., "A Solution to Room-by-Room Coverage for Autonomous Cleaning Robots", Proceedings of the IEEE International Conference on Intelligent Robots and Systems (IROS), (2017) *

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2621371A (en) * 2022-08-10 2024-02-14 Dyson Technology Ltd A method and system for exploring a real-world environment

Also Published As

Publication number Publication date
GB2584839B (en) 2022-12-21
GB201908432D0 (en) 2019-07-24
CN112087573A (en) 2020-12-15
CN112087573B (en) 2022-04-19

Similar Documents

Publication Publication Date Title
Borrmann et al. A mobile robot based system for fully automated thermal 3D mapping
US10717193B2 (en) Artificial intelligence moving robot and control method thereof
Adán et al. An autonomous robotic platform for automatic extraction of detailed semantic models of buildings
US10852729B2 (en) Moving robot and control method thereof
CN104536445B (en) Mobile navigation method and system
US20190120633A1 (en) Discovering and plotting the boundary of an enclosure
US11054839B2 (en) Mobile robotic device that processes unstructured data of indoor environments to segment rooms in a facility to improve movement of the device through the facility
Rusu et al. Laser-based perception for door and handle identification
US9978149B1 (en) System and method for door detection for corridor exploration
CN112087573B (en) Drawing of an environment
Quintana et al. Semantic scan planning for indoor structural elements of buildings
CN111127500A (en) Space partitioning method and device and mobile robot
Fiala et al. Robot navigation using panoramic tracking
CN111679661A (en) Semantic map construction method based on depth camera and sweeping robot
US11734883B2 (en) Generating mappings of physical spaces from point cloud data
WO2020038155A1 (en) Autonomous movement device, control method and storage medium
CN114365974B (en) Indoor cleaning and partitioning method and device and floor sweeping robot
Mason et al. Textured occupancy grids for monocular localization without features
Maurović et al. Autonomous exploration of large unknown indoor environments for dense 3D model building
KR20230035363A (en) Method, Apparatus, and Device for Generating Maps for Autonomous Mobile Devices
CN111609854A (en) Three-dimensional map construction method based on multiple depth cameras and sweeping robot
Wong et al. Visual gaze analysis of robotic pedestrians moving in urban space
Li et al. Indoor layout estimation by 2d lidar and camera fusion
An et al. Ceiling vision-based active SLAM framework for dynamic and wide-open environments
Langer et al. On-the-fly detection of novel objects in indoor environments