WO2007051972A1 - Navigation system - Google Patents

Navigation system

Info

Publication number
WO2007051972A1
WO2007051972A1 (PCT application PCT/GB2006/003958)
Authority
WO
WIPO (PCT)
Prior art keywords
environment
robotic device
map
navigation system
mode
Application number
PCT/GB2006/003958
Other languages
French (fr)
Inventor
Tej Paul Kaushal
Ronald Vincent Smith
Christopher Andrew Smith
David Bryan Mullin
James Edward Carroll
Original Assignee
Qinetiq Limited
Priority claimed from GB0522153A external-priority patent/GB0522153D0/en
Application filed by Qinetiq Limited filed Critical Qinetiq Limited
Publication of WO2007051972A1 publication Critical patent/WO2007051972A1/en

Classifications

    • G: PHYSICS
        • G01: MEASURING; TESTING
            • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
                • G01C 21/00: Navigation; Navigational instruments not provided for in groups G01C 1/00 - G01C 19/00
                    • G01C 21/38: Electronic maps specially adapted for navigation; Updating thereof
                        • G01C 21/3804: Creation or updating of map data
                            • G01C 21/3833: Creation or updating of map data characterised by the source of data
                                • G01C 21/3837: Data obtained from a single source
            • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
                • G01S 17/00: Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
                    • G01S 17/88: Lidar systems specially adapted for specific applications
                        • G01S 17/89: Lidar systems specially adapted for mapping or imaging
                        • G01S 17/93: Lidar systems specially adapted for anti-collision purposes
                            • G01S 17/933: Lidar systems specially adapted for anti-collision purposes of aircraft or spacecraft
        • G05: CONTROLLING; REGULATING
            • G05D: SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
                • G05D 1/00: Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
                    • G05D 1/02: Control of position or course in two dimensions
                        • G05D 1/021: Control of position or course in two dimensions specially adapted to land vehicles
                            • G05D 1/0212: Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory
                                • G05D 1/0221: Control of position or course with means for defining a desired trajectory involving a learning process
                            • G05D 1/0231: Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
                                • G05D 1/0246: Control of position or course using a video camera in combination with image processing means
                                    • G05D 1/0248: Control of position or course using a video camera in combination with image processing means in combination with a laser
                                    • G05D 1/0251: Control of position or course using a video camera in combination with image processing means extracting 3D information from a plurality of images taken from different locations, e.g. stereo vision
                                    • G05D 1/0253: Control of position or course using a video camera in combination with image processing means extracting relative motion information from a plurality of images taken successively, e.g. visual odometry, optical flow
                            • G05D 1/0268: Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means
                                • G05D 1/0272: Control of position or course using internal positioning means comprising means for registering the travel distance, e.g. revolutions of wheels
                                • G05D 1/0274: Control of position or course using internal positioning means using mapping information stored in a memory device
    • A: HUMAN NECESSITIES
        • A47: FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
            • A47L: DOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
                • A47L 2201/00: Robotic cleaning machines, i.e. with automatic control of the travelling movement or the cleaning operation
                    • A47L 2201/04: Automatic control of the travelling movement; Automatic obstacle detection

Definitions

  • the present invention relates to a navigation system, and in particular to a navigation system for a mobile robotic device.
  • the invention relates specifically, but not exclusively, to a navigation system for a domestic robotic device, for example an autonomous cleaner or lawn mower.
  • Existing domestic robots for example autonomous cleaners and lawn mowers, are primitive and typically require a large number of basic sensors in order to operate autonomously. Even so, the operation of such domestic robots is simplistic.
  • conventional autonomous cleaners move in a random motion over an area to be cleaned.
  • the motion of the cleaner is typically controlled by a random-motion algorithm stored within the device.
  • the cleaner will implement the random-motion algorithm in response to internal events (elapse of a random or predetermined time period, battery condition etc.) and external stimuli (for example, detection of obstacles such as walls and furniture using simple bump sensors and detection of precipices, such as the top of a flight of stairs, using an infrared emitter and detector etc.).
  • the efficiency of an autonomous cleaner may be enhanced by incorporating a deterministic component to the otherwise random motion of the device.
  • the autonomous cleaner follows a defined path that is calculated to completely cover the area to be cleaned while minimising redundant cleaning.
  • In order to implement deterministic cleaning, the autonomous cleaner must have a preprogrammed map of the area to be cleaned or create such a map by exploring the environment within which it is located. This prerequisite reduces the ease of use of the cleaner and requires the map to be updated to reflect changes in the environment.
  • the autonomous cleaner must also maintain a knowledge of its current position and remember where it has cleaned, which in turn requires a sophisticated, accurate positioning and navigation system.
  • a suitable positioning system might rely on marker beacons positioned around the space to be cleaned, a differential Global Positioning System (GPS), scanning laser ranging systems or other sophisticated systems.
  • a laser ranging system could be used in conjunction with simultaneous localisation and mapping (SLAM) software to construct a detailed map of the space to be cleaned.
  • Simultaneous localisation and mapping is a process of concurrently building a map of an environment based on stationary features or landmarks within the environment and using this map to obtain estimates of the location of a vehicle (an autonomous cleaner in this example).
  • Simultaneous localisation and mapping is already used in the field of mobile robotics as a tool to enable fully autonomous navigation of a vehicle. In essence, the vehicle relies on its ability to extract useful navigation information from data returned by sensors mounted on the vehicle. Typical sensors might include a dead reckoning system (for example an odometry sensor or inertial measurement system) in combination with a laser rangefinder.
  • the vehicle starts at an unknown location with no a priori knowledge of landmark locations. From relative observations of landmarks, it simultaneously computes an estimate of vehicle location and an estimate of landmark locations. While continuing in motion, the vehicle builds a complete map of landmarks and uses these to provide continuous estimates of the vehicle location. By tracking the relative position between the vehicle and identifiable features in the environment, both the position of the vehicle and the position of the features can be estimated simultaneously. In the absence of external information about the vehicle's position, this algorithm presents an autonomous system with the tools necessary to navigate in unknown environments.
  • VSLAM uses passive sensing to provide a low power and dynamic localisation system.
  • VSLAM uses a simple video camera as the input device and pattern processing algorithms to locate features in the images acquired by the video camera.
  • the features are input to the SLAM algorithm which is then able to accurately compute the three-dimensional location of each feature and hence start to build a three-dimensional map as the cleaner moves around the space to be cleaned.
  • the data processing load is high (proportional to the square of the number of features selected).
  • a navigation system comprising a primary mapping apparatus adapted to detect features within an environment and to create a summary map of the environment including an estimate of a point of current location within the environment; a secondary mapping apparatus adapted to provide a detailed three-dimensional map of the local environment in the vicinity of the point of current location; and a processor adapted to determine navigable points within the environment by combining information from the summary map and the detailed map.
  • the present navigation system utilises a summary map of the environment (a sparse map having few details therein) rather than a detailed map of the environment. Deliberately degrading the quality of the environmental map in this way is counter-intuitive, since conventional navigation systems typically aim to maximise environmental details in a map.
  • the present navigation system is predicated on the realisation that, for many applications, adequate navigational performance may be achieved using an approximate knowledge of the global environment in conjunction with a detailed map of the environment in the vicinity of the point of current location.
  • the present navigation system confers an unexpected advantage in terms of providing reliable navigation within the environment despite the paucity of detail within the environmental map.
  • the summary map of the environment and the detailed three-dimensional map in the vicinity of the point of current location mutually support each other to such an extent that a new technical result is achieved.
  • the foregoing navigation system is advantageous in that the data processing load is reduced in comparison with conventional navigation systems having detailed maps of the environment.
  • the hardware requirements of the navigation system (processor specification, memory etc.) are correspondingly reduced.
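By way of illustration, the following is a minimal Python sketch (with names and thresholds that are illustrative assumptions, not taken from the patent) of how navigable points might be determined by combining free-space candidates inferred from the sparse summary map with the detailed local point cloud supplied by the secondary mapping apparatus:

```python
import numpy as np

def mark_navigable(candidate_cells, local_cloud, floor_z=0.0,
                   clearance=0.05, cell_size=0.10):
    """For each (x, y) cell suggested as free space by the sparse summary
    map, consult the detailed local point cloud (an N x 3 array from the
    secondary mapping apparatus): the cell is confirmed navigable only if
    it contains returns and none rise above the floor by more than
    `clearance` metres."""
    navigable = []
    for cx, cy in candidate_cells:
        in_cell = local_cloud[
            (np.abs(local_cloud[:, 0] - cx) < cell_size / 2)
            & (np.abs(local_cloud[:, 1] - cy) < cell_size / 2)
        ]
        if in_cell.size == 0:
            continue  # no returns: possible precipice, leave unconfirmed
        if np.all(in_cell[:, 2] < floor_z + clearance):
            navigable.append((cx, cy))  # flat floor confirmed locally
    return navigable
```

The division of labour matches the text above: the summary map proposes candidate points, the detailed local map confirms or vetoes them.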
  • the processor is configured to provide instructions to a motion control system so as to navigate from the point of current location to another navigable point within the environment.
  • the primary mapping apparatus has an optical sensor adapted to detect the features within the environment and wherein the primary mapping apparatus utilises a visual simultaneous localisation and mapping (VSLAM) process to create the summary map of the environment.
  • VSLAM (visual simultaneous localisation and mapping) uses a passive optical sensor, for example a simple video camera, together with dead reckoning to make visual measurements of an environment.
  • VSLAM uses pattern processing algorithms to locate visual features in the images acquired by the video camera. The features are input to the SLAM algorithm which is then able to accurately compute the three-dimensional location of the visual features and hence start to build a three-dimensional map of the environment. Navigation within the environment is based on recognised visual features or landmarks on the VSLAM map.
  • Upon entering a new environment, the VSLAM-enabled navigation system starts creating a new map through exploration of the environment. By this process, the navigation system detects features in the environment, creates landmarks there-from, and corrects its position information as necessary using its VSLAM map.
  • the secondary mapping apparatus may comprise any suitable means of simply generating a localised three-dimensional map.
  • the secondary mapping apparatus may comprise a passive system, e.g. a stereoscopic imaging system.
  • the secondary mapping apparatus may comprise an imaging apparatus having a structured light generator capable of creating a structured pattern of light within the environment.
  • the imaging apparatus typically includes an imaging sensor, for example a camera, arranged to capture images of the environment including the structured pattern of light projected therein.
  • the camera is capable of detecting the position relative thereto of a plurality of points in the pattern projected within the environment, for example using triangulation / trigonometry.
  • the secondary mapping apparatus preferably includes a pattern processor arranged in use to determine from the detected position of a point in the pattern within the environment, the range to that point. The detected position and range information are then used to create a detailed three-dimensional map of the local environment in the vicinity of the point of current location.
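The range recovery described here is ordinary triangulation. As a hedged sketch (the baseline, focal length and disparity values are illustrative, not taken from the patent):

```python
def spot_range(disparity_px, focal_length_px, baseline_m):
    """Triangulation: a projected spot appears displaced in the image by an
    amount that shrinks as the surface it strikes recedes, so range is
    inversely proportional to the observed pixel disparity."""
    if disparity_px <= 0:
        raise ValueError("no measurable displacement for this spot")
    return baseline_m * focal_length_px / disparity_px

# e.g. 60 mm projector-camera baseline, 500 px focal length, 12 px disparity:
print(spot_range(12, 500, 0.06))  # -> 2.5 (metres)
```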
  • the imaging apparatus comprises at least one of a spot projector and a pattern projector adapted to project a geometric shape.
  • the spots may comprise intersections between continuous lines.
  • the projector may project two sets of regularly spaced lines, the two sets preferably being substantially orthogonal.
  • the projector may comprise at least one of a kaleidoscope and a holographic element through which light is transmitted into the environment.
  • the summary VSLAM map merely comprises a collection of features or landmarks within the environment with associated vectors defining their relative positions with respect to the point of current location and with respect to each other.
  • the summary map is an abstract representation of the environment since it contains little or no information regarding the environment between the sparse features or landmarks within the map.
  • In order to navigate within the environment, the navigation system infers information about the environment based on the relationship(s) between features or landmarks in the summary map. For example, a plurality of features lying within a common plane may be indicative of a solid planar feature within the environment, e.g. a floor, a wall etc. (a sketch of such a coplanarity test is given below).
  • ambiguities remain in the summary map which give rise to uncertainties as to whether points within the environment are navigable.
  • the information produced by the secondary mapping apparatus is a detailed three-dimensional representation of the environment local to the point of current location.
  • the detailed three-dimensional map is complementary to the summary map and may be used to remove ambiguities in the summary map by confirming navigable points in the summary map.
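One plausible realisation of the coplanarity inference mentioned above is a least-squares plane fit; the SVD method below is a standard technique, offered as an assumption since the patent does not specify one:

```python
import numpy as np

def fit_plane(landmarks):
    """Least-squares plane through 3-D landmark positions via SVD. Returns
    (centroid, unit normal, rms residual); a small residual suggests the
    landmarks lie on a single solid planar feature such as a floor or wall."""
    pts = np.asarray(landmarks, dtype=float)
    centroid = pts.mean(axis=0)
    _, singular_values, vt = np.linalg.svd(pts - centroid)
    normal = vt[-1]                                # least-variance direction
    rms = singular_values[-1] / np.sqrt(len(pts))  # out-of-plane spread
    return centroid, normal, rms
```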
  • the imaging sensor of the secondary mapping apparatus and the optical sensor of the primary mapping apparatus comprise a single common sensor. It is noteworthy that the level of detail held within the global environmental map remains low at all times, despite the use of optical sensor(s) for both primary and secondary mapping apparatuses and the availability of detailed environmental information from the latter.
  • the common sensor is one of a video camera, a CMOS camera and a charge-coupled device (CCD).
  • the optical sensor may be arranged to have a field of view which includes an upward direction.
  • the optical sensor may be arranged, in use, to detect features disposed on a substantially planar surface within the environment, for example a portion of a ceiling or roof.
  • the primary mapping apparatus is adapted to create from said detected features a summary map of the environment underlying said planar surface.
  • the mobile robotic device comprises any of a domestic robotic device, an industrial robotic device, a security robotic device, a floor cleaner, a vacuum cleaner, a lawn mower, a robotic entertainment device, a maintenance device, a surveying device, a fork lift truck, and a car.
  • the present navigation system represents a cost-effective way of delivering information about the immediate and general environment in which a robot, particularly a domestic mobile robot, operates to allow it to undertake navigation and therefore domestic and other tasks (e.g. cleaning, mowing, repair work, maintenance, surveillance, surveying tasks etc.) more quickly and effectively.
  • the present navigation system enables deterministic cleaning of an area to be cleaned.
  • in deterministic cleaning, the robotic cleaner follows a defined path that is calculated to completely cover the area to be cleaned while minimising redundant cleaning.
  • the present navigation system confers benefits in terms of reduced cost, a reduction in the number of sensors required, operational simplicity, faster cleaning of a given area, cleaning of larger areas, lower power consumption (or, in the case of a vacuum cleaner, increased power for vacuuming), lower use of consumables, less noise and reduced wear.
  • the secondary mapping apparatus ensures safe operation by helping to determine whether the robotic cleaner can move into a given space. This is particularly important within environments inhabited by people or pets, in which case the secondary mapping apparatus facilitates safe and intelligent movements around inhabitants within the environment.
  • the secondary mapping apparatus provides a capability to detect precipices within the environment; a vital safety feature when used in an environment having different levels separated by stairs.
  • the secondary mapping apparatus enables the robotic cleaner to avoid precipices and stop in the vicinity of an object without colliding therewith. This capability confers a high degree of confidence when navigating within the environment, allowing the robotic cleaner to accelerate and decelerate gracefully as it moves within the environment.
  • deterministic cleaning allows a robotic cleaner to have a shape, e.g. a flat or acutely shaped front with corners, which facilitates cleaning of edges and corners.
  • the robotic cleaner may be arranged to have a substantially semi-circular rear portion.
  • conventional robotic cleaners which move in a random motion are typically circular so that they can extricate themselves from small spaces by merely rotating about their own axis.
  • the present navigation system enables the performance of a domestic robotic cleaner to be improved by a factor of approximately twelve over conventional devices (based on the following assessments: the robot cleans a given area six times faster than a cleaner employing random motion; the size of the battery is halved owing to the reduction in redundant cleaning).
  • the time taken to clean a given area may be reduced by a factor of approximately twelve over conventional devices (based on the following assessments: the robot cleans a given area six times faster than a cleaner employing random motion; the velocity at which the cleaner moves within the environment can be doubled due to the high degree of confidence with which the cleaner navigates within the environment; the size of the battery is maintained).
  • the time taken to clean a room may be reduced from approximately one hour using a conventional device to approximately five minutes using the present robotic cleaner.
  • the mobile robotic device may be adapted to receive a detachable accessory. With the accessory fitted the robotic device may adopt a supplementary mode of operation.
  • the detachable accessory may be a lead or a leash.
  • for example, a robotic pet could then be safely taken for a walk with the lead attached thereto.
  • the mobile robotic device comprises a vacuum cleaner adapted to receive a detachable accessory.
  • the mobile robotic device has a vacuum pressure variable between a minimum and a maximum, wherein maximum vacuum pressure may be automatically selected in the supplementary mode of operation.
  • the robotic device may be configured to operate with maximum suction (maximum vacuum pressure) in the supplementary mode of operation.
  • the accessory includes an accessory hose adapted to engage in fluid communication with a vacuum inlet on the vacuum cleaner.
  • the accessory hose is manoeuvrable by a user and the navigation system may be adapted to provide instructions to the motion control system so as to follow the user, while preferably concurrently avoiding obstacles and inhabitants within the environment.
  • alternatively, the navigation system may be disabled in the supplementary mode of operation.
  • the accessory may include a handle. This facilitates manual cleaning of areas inaccessible to the robot when under autonomous control, for example stairs etc.
  • the robotic vacuum cleaner of the invention can be a primary cleaner in a domestic dwelling, obviating an additional manual cleaner.
  • the mobile robotic device comprises a lawnmower.
  • the ability to manually mow areas of the lawn obviates an additional manual lawnmower.
  • according to a fourth aspect of the present invention there is now proposed a method of controlling a robotic device within an area to be traversed, the robotic device having a variable power requirement and a navigation system adapted to map features in an environment, the method comprising the steps of:
  • the secondary mapping apparatus ensures safe operation of the robotic device during all operating modes, by helping to determine whether the robotic device can move into a given space. As will be readily appreciated, this is of particular importance during the first and second modes of operation, while the summary map of the area to be traversed is being created and refined. Hence, the robotic device is able to avoid precipices, to stop in the vicinity of objects within said area without colliding with them, and to move in an intelligent manner around inhabitants within the area to be traversed.
  • the device may be configured only to use sufficient power to traverse the area and map the environment during said first mode of operation.
  • the robotic device may operate in the first, second and third modes of operation in numerical sequence.
  • the mode within which the robotic device operates is selected in response to a status condition.
  • the status condition is derived from a plurality of variables, each variable having a changeable weighting factor applied thereto so as to optimise the behaviour of the robotic device.
  • the variables are selected from: exploration of the area to be traversed, operation of the device, localisation within the environment, efficiency of operation and operating time (a sketch of one such weighted selection follows this list).
  • the robotic device reverts to the first mode of operation in the event of a failure in the navigation system, or alternatively may switch itself off.
  • the device may be a cleaner or mower or any other device mentioned above, and the power consumption may relate, for example, to suction power for a cleaner, or to brush action or dispensation of cleaning fluid.
  • minimum suction or no suction may be used when operating in the first mode of operation (in particular where the device is configured only to use sufficient power to traverse the area and map the environment during said first mode of operation), and maximum suction may be used when operating in at least one of the second or third operating modes.
  • the power consumption may relate to the speed at which the device traverses the area.
  • power consumption may relate to operation of a cutting blade, strimmer or the like.
  • power consumption may relate to operation of features on the device for carrying out such operations.
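A minimal sketch of the weighted status condition described above, assuming (since the patent gives neither the weights nor the combination rule) a simple weighted sum with one weight vector per mode; all numeric values are purely illustrative:

```python
def select_mode(status, weights):
    """Score each mode as the weighted sum of the status variables and pick
    the highest. Variable names follow the text above."""
    scores = {mode: sum(w[v] * status[v] for v in status)
              for mode, w in weights.items()}
    return max(scores, key=scores.get)

status = {"exploration": 0.8, "operation": 0.1, "localisation": 0.3,
          "efficiency": 0.5, "operating_time": 0.2}
weights = {
    "first_mode_explore": {"exploration": 1.0, "operation": 0.0,
                           "localisation": 0.5, "efficiency": 0.2,
                           "operating_time": 0.1},
    "second_mode_refine": {"exploration": 0.4, "operation": 0.5,
                           "localisation": 1.0, "efficiency": 0.4,
                           "operating_time": 0.2},
    "third_mode_operate": {"exploration": 0.1, "operation": 1.0,
                           "localisation": 0.6, "efficiency": 1.0,
                           "operating_time": 0.5},
}
print(select_mode(status, weights))  # -> "first_mode_explore" here
```

Changing the weighting factors shifts the device between exploration-dominated and operation-dominated behaviour, as the bullets above describe.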
  • Figure 1 shows a block diagram of a navigation system according to one embodiment of the present invention.
  • Figure 2 shows a schematic diagram of a mobile robotic device incorporating the navigation system shown in figure 1.
  • Figure 3 illustrates the architecture of a robotic vacuum cleaner incorporating the navigation system according to an embodiment of the present invention.
  • Figure 4 shows an example of a map of an area to be cleaned created by the robotic vacuum cleaner of figure 3.
  • figure 1 shows a block diagram of a navigation system 2 according to one embodiment of the present invention.
  • the navigation system 2 incorporates a primary mapping apparatus comprising a visual simultaneous localisation and mapping (VSLAM) apparatus and a secondary mapping apparatus consisting of an accurate three-dimensional mapping apparatus.
  • the VSLAM apparatus and the three-dimensional mapping apparatus work in parallel and have complementary functions within the navigation system 2.
  • the VSLAM apparatus comprises a camera 4 arranged to capture images of the environment.
  • the images are passed from the camera 4 to a feature detector 6 which processes the image to detect naturally occurring features within the environment, e.g. edges etc.
  • the VSLAM apparatus uses a sophisticated vision algorithm 8 to build a summary map of the environment consisting of a set of unique "landmarks" derived from the features observed in the environment.
  • the VSLAM summary map typically comprises a vector map of landmarks within the environment.
  • The skilled person would be aware of a variety of VSLAM apparatuses. However, it should be noted that conventional VSLAM apparatuses are aimed at high accuracy applications, whereas the present VSLAM apparatus is configured to produce a summary map of the environment (a sparse map having few landmarks therein) rather than a detailed map of the environment. This is based on the realisation that, for many applications, adequate navigational performance may be achieved using an approximate knowledge of the global environment in conjunction with a separate detailed map of the local environment in order to assess whether the immediate environment is navigable, for example is free from encumbrances, obstacles, inhabitants etc.
  • although the VSLAM apparatus produces a summary map of the environment having few landmarks, the location of each landmark within the VSLAM map is accurately known.
  • the accurate three-dimensional mapping apparatus provides the requisite detailed map of the local environment.
  • the three-dimensional mapping apparatus comprises three-dimensional sensors 10 arranged to capture detailed information about the local environment. Electromechanical sensors and/or optical sensors are used. Typically, the sensors 10 could comprise scanning laser rangefinders. Preferably the sensors 10 are low cost, simple and eye safe.
  • the accurate three-dimensional mapping apparatus incorporates a processor 12 for processing the raw information from the three-dimensional sensor 10.
  • the processor 12 provides a detailed map of the local environment in the vicinity of a point of current location within the environment.
  • the detailed map typically comprises a collection of points within the environment measured with respect to the three-dimensional sensor 10 (also referred to hereinafter as point cloud information).
  • the summary map produced by the VSLAM apparatus and the detailed map from the accurate three-dimensional mapping apparatus are passed to a navigation controller 14 which provides instructions to a motion control system 16 in order to control movement of a mobile platform, for example a robot, within the mapped environment.
  • the navigation system 2 uses recognised landmarks to estimate the position of the robot within the environment. Furthermore, the VSLAM apparatus updates and refines the summary map as the robot revisits mapped areas, allowing it to adapt to changes in the environment. In the case of a domestic mobile robot, such changes in the environment may arise due to rearrangement of furniture or objects within a room etc.
  • Referring to figure 2, a schematic block diagram of a mobile robotic device 20 incorporating the navigation system of figure 1 is shown.
  • the robotic device 20 incorporates the elements of the navigation system described above.
  • the separate camera 4 and three-dimensional sensors 10 are replaced by a single common sensor in the form of a camera 22.
  • the camera is a conventional video camera, typically a CMOS camera or alternatively a charge-coupled device (CCD).
  • the camera is mounted on the robotic device 20 so as to capture images of the environment in front of the robotic device within the field of view 24 of the camera 22.
  • the robotic device 20 also incorporates a structured light generator comprising a spot projector 26, operating at visible or infrared wavelengths, to provide information about the local environment to the accurate three-dimensional mapping apparatus.
  • the principle of operation of the accurate three-dimensional mapping apparatus is as follows.
  • the spot projector 26 projects a plurality of spots into the environment.
  • the camera 22 is arranged to capture images of the environment including the array of spots projected therein.
  • the processor 12 analyses the images from the camera 22 and determines the position with respect to the camera of each projected spot within the environment, for example using triangulation / trigonometry.
  • the processor 12 preferably includes a pattern processor adapted to determine from the detected position of a spot in the environment, the range to that spot. The detected position and range information are then used to create a detailed three-dimensional map of the environment local to the robotic device.
  • the spot projector 26 is adapted to project a two dimensional array of spots into the environment.
  • a suitable spot projector is described in PCT patent application publication WO 2004/044523, the content of which is hereby incorporated by reference.
  • the spot projector 26 comprises a light source arranged to illuminate part of an input face of a light guide, the light guide comprising a tube having substantially reflective sides and being arranged together with projection optics so as to project an array of distinct images of the light source into the environment.
  • the light guide in effect operates as a kaleidoscope. Light from the source is reflected from the sides of the tube and can undergo a number of reflection paths within the tube. The result is that multiple images of the light source are produced and projected into the environment. Thus the environment is illuminated with an array of images of the light source. Where the source is a simple light emitting diode the scene is therefore illuminated with an array of spots of light.
  • the light guide kaleidoscope gives very good image replication characteristics and projects images of the input face of the light guide in a wide angle, i.e. a large number of spots are projected in all directions. Further the kaleidoscope produces a large depth of field and so delivers a large operating window.
  • the light guide comprises a tube with substantially reflective walls and a constant cross section which is conveniently a regular polygon. Having a regular cross section means that the array of images of the light source will also be regular which is advantageous for ensuring the whole environment is covered and eases processing. A square section tube is most preferred.
  • the light guide has a cross sectional area in the range of a few square millimetres to a few tens of square millimetres, for instance the cross sectional area may be in the range of 1-50 mm².
  • the light guide preferably has a regular shape cross section with a longest dimension of a few millimetres, say 1-5 mm.
  • One embodiment as mentioned is a square section tube having a side length of 2-3 mm.
  • the light guide may have a length of a few tens of millimetres; for example, a light guide may be between 10 and 70 mm long.
  • Such light guides can generate a grid of spots over an angle of 50-100 degrees (typically about twice the total internal angle within the light guide).
  • the tube comprises a hollow tube having reflective internal surfaces, i.e. mirrored internal walls.
  • the tube is fabricated from a solid material and arranged such that a substantial amount of light incident at an interface between the material of the tube and surrounding material undergoes total internal reflection.
  • the tube material is either coated in a coating with a suitable refractive index or designed to operate in air, in which case the refractive index of the light guide material should be such that total internal reflection occurs at the material air interface.
  • Using a tube like this as a light guide results in multiple images of the light source being generated which can be projected to the environment to form the array of spots.
  • the light guide is easy to manufacture and assemble and couples the majority of the light from the source to the scene. Thus low power sources such as light emitting diodes can be used.
  • the exit aperture can be small, the apparatus also has a large depth of field which makes it useful for ranging applications which require spots projected that are separated over a wide range of distances.
  • Either individual light sources are used close to the input face of the light guide to illuminate just part of the input face or one or more light sources are used to illuminate the input face of the light guide through a mask.
  • Using a mask with a transmissive portion for passing light to a part of the light guide can be easier than using individual light sources. Accurate alignment of the mask is required at the input face of the light guide, but this may be easier than accurately aligning an LED or LED array.
  • the illumination means comprises a homogeniser located between the light source and the mask so as to ensure that the mask is evenly illuminated.
  • the light source may therefore be any light source giving an acceptable level of brightness and does not need accurate alignment.
  • an LED with oversized dimensions is used to relax tolerances in manufacture/alignment.
  • the projection optics comprise a projection lens.
  • the projection lens is located adjacent the output face of the light guide.
  • the lens may be integral to the light guide, i.e. the tube may be shaped at the output face to form a lens.
  • All beams of light projected by the spot projector 26 pass through the end of the light guide and can be thought of as originating from the point at the centre of the end face of the light guide.
  • the projection optics can then comprise a hemispherical lens and if the centre of the hemisphere coincides with the centre of the light guide output face the apparent origin of the beams remains at the same point, i.e. each projected image has a common projection origin.
  • the projector does not have an axis as such, since it can be thought of as a source of beams radiating across a wide angle. What matters for the imaging apparatus therefore is the geometrical relationship between the point of origin of the beams and the principal point of the lens of the camera 22.
  • the spot projector 26 is controlled by a master clock 32 which provides synchronisation between the monochromatic camera 22, the spot projector 26 and a de-multiplexer 30 connected to the output of the camera 22.
  • the output from the camera 22 is switched between the primary mapping apparatus (the VSLAM apparatus, denoted by numerals 6 and 8) and a secondary mapping apparatus (the accurate three-dimensional mapping apparatus, denoted by numerals 12 and 14).
  • the clock 32 controls the spot projector 26 and the de-multiplexer 30 such that a pattern of spots is projected into the local environment at the same time as the output from camera 22 is routed to the accurate three-dimensional mapping apparatus.
  • the clock 32 and the de-multiplexer 30 are configured such that images (frames) produced by the camera 22 are sent alternately to the VSLAM apparatus 6, 8 and the accurate three-dimensional mapping apparatus 12, 14.
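The alternating-frame scheme can be sketched as follows; the object interfaces are assumptions for illustration, since the patent describes the synchronisation, not an API:

```python
def run_camera_cycle(camera, projector, vslam, mapper_3d, n_frames):
    """Master-clock behaviour: even frames are captured with the spot
    projector on and routed to the three-dimensional mapper; odd frames are
    captured with the projector off and routed to the VSLAM pipeline."""
    for i in range(n_frames):
        structured = (i % 2 == 0)
        projector.set(on=structured)   # synchronised before capture
        frame = camera.grab()
        if structured:
            mapper_3d.process(frame)   # spots visible: triangulate ranges
        else:
            vslam.process(frame)       # plain image: detect landmarks
```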
  • the robotic device 20 includes a motion control system 16 comprising a driving unit having wheels and at least one motor arranged to provide a motive force to the wheels.
  • the motion control system 16 provides control signals to the motor(s) in response to instructions received from the navigation controller 14 so as to move the robotic device 20 within the environment.
  • it is advantageous if the VSLAM algorithm 8 has some knowledge of the movement of the robotic vehicle 20. Accordingly, information about the robot's movement, either from the use of a stepper motor or a wheel rotation encoder, is fed back to the VSLAM algorithm 8.
  • the field of view 24 of the camera optionally extends to image a portion of the ceiling or roof of the room or internal space within which the robot is located.
  • a separate camera may be used to capture images of the ceiling or roof.
  • by viewing the ceiling, the VSLAM apparatus maps it and indirectly determines the floor plan directly below.
  • This hitherto unknown technique provides the benefit that individual rooms can more easily be defined, for example since door openings do not normally extend to the ceiling.
  • since the ceiling of a room is typically large, plain and white, it may be an easy object to see even with a low performance camera. In use, doors can now be open or shut, as the user desires, with the robot still able to map and navigate one room before knowingly moving on to the next one.
  • Using a ceiling to create a VSLAM map of a room overcomes other disadvantages; for example, when trying to map a room from a video image taken at floor level, the obstacles (chairs, tables, etc.) obstruct a large portion of the view of the walls, making it hard to immediately generate a map of room boundaries. Also, from floor level it may not be obvious to the navigation system whether a gap is a doorway or merely a space between two pieces of furniture. Hence it would not be clear where the room boundaries were if the doors are left open. This becomes important in the case of a robotic cleaner, which should logically clean rooms one at a time, and not wander accidentally out of the room before it has finished cleaning it. The foregoing obviates the use of light beam units which are traditionally placed across a doorway in the prior art and which trigger a sensor on the robot to stop it leaving the room.
  • Referring to figure 3, the architecture of a robotic vacuum cleaner incorporating the navigation system according to an embodiment of the present invention is illustrated.
  • the robotic vacuum cleaner architecture is based broadly on that shown in figure 2 for a generic mobile robotic device.
  • the present robotic vacuum cleaner comprises a vacuum unit (not shown in figure 3) installed on a main body of the cleaner.
  • the vacuum unit is arranged to collect particles from an underlying floor surface by drawing-in air.
  • the vacuum unit can be of known construction, for example including a vacuum motor and a vacuum chamber for collecting particles from the air which is drawn in through a vacuum hole or pipe positioned adjacent the floor surface.
  • the vacuum unit includes a supplementary vacuum inlet adapted to engage in fluid communication with an accessory, for example an external accessory hose.
  • the vacuum unit has a dirt sensor 56 adapted to provide information about the level and rate of uptake of dirt from the floor, thereby giving a measure of how dirty the area of floor currently being cleaned is.
  • the VSLAM algorithm 8 is configured to extract features from the environment using mechanisms such as scale invariant feature transforms (SIFTs) and Harris corners. These mechanisms can be extended to make use of contextual information, such as plane constrained features from ceilings and walls, to reduce the computational complexity (and hence reduce the data processing load) and potentially increase reliability.
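As a sketch of such a feature detector, OpenCV's Harris-scored corner extraction is one readily available realisation; the parameter values are illustrative, since the patent names the mechanisms rather than an implementation:

```python
import cv2

def detect_corner_features(gray_image, max_corners=50):
    """Extract Harris-style corners as candidate landmarks; returns a list
    of (x, y) image positions, or an empty list if none are found."""
    corners = cv2.goodFeaturesToTrack(
        gray_image, maxCorners=max_corners, qualityLevel=0.01,
        minDistance=10, useHarrisDetector=True, k=0.04)
    return [] if corners is None else corners.reshape(-1, 2).tolist()
```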
  • the robotic vacuum cleaner When instructed to commence cleaning, the robotic vacuum cleaner initially adopts a first mode of operation, moving in a substantially random motion within the area to be cleaned whilst concurrently mapping the environment and creating a summary VSLAM map of the area to be cleaned.
  • the robotic vacuum cleaner is configured to use a minimum vacuum pressure (minimum suction) during said first mode of operation. Obstacles within the area to be cleaned are detected by the three dimensional mapping apparatus 12 during this first mode of operation and consequently avoided.
  • the foregoing un-delayed initialisation technique gives a quick, robust and efficient three-dimensional estimate of the environment from the two-dimensional images produced by the camera 22 while also allowing fast low memory filter initialisation.
  • odometry sensing for example from wheel rotation encoders etc. is used to reduce the computational complexity of the VSLAM system.
  • the VSLAM system 8 produces an estimate of the position of the robot within the environment with associated confidences, and a summary map of spatially located features within the environment.
  • the position estimate is used to fuse sensor information into the internal map 44, to aid trajectory following, and, using information theory, to provide trajectory constraints on the planning algorithm 46 to maximise map and localisation quality whilst the vacuum cleaner is cleaning.
  • an optical flow system 40 uses passive sensing to provide an estimate of the relative distances of objects moving relative to the camera 22.
  • the optical flow system 40 monitors the movement of objects or texture within the images from the camera 22 that result from the camera's movements within the environment, and derives motion cues there-from.
  • the optical flow system 40 uses these motion cues to provide an estimate of the distance to an object within the environment based on the rate at which the object moves in the images produced by the camera.
  • This relative information is used in conjunction with the VSLAM summary map by a position registration system 42 to produce a more accurate, absolute estimate of the position and orientation of surfaces within the sensor field of view.
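A hedged sketch of such a motion-parallax cue, using dense Farneback optical flow from OpenCV (the patent does not specify the flow algorithm, and the inverse-magnitude depth cue assumes a translating camera viewing a static scene):

```python
import cv2
import numpy as np

def relative_depth_from_flow(prev_gray, next_gray):
    """For a translating camera, nearby surfaces sweep across the image
    faster than distant ones, so per-pixel flow magnitude acts as an
    inverse relative-depth cue (up to an unknown scale)."""
    flow = cv2.calcOpticalFlowFarneback(prev_gray, next_gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    magnitude = np.linalg.norm(flow, axis=2)  # pixels per frame
    return 1.0 / (magnitude + 1e-6)           # larger value = farther surface
```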
  • An internal mapping system 44 provides an estimate of the state of the environment within which the robotic vacuum cleaner is operating; it is updated by fusing point cloud information from the three-dimensional mapping apparatus 12 and the optical flow system 40 with contextual information from the VSLAM summary map. In this respect, the internal mapping system 44 creates an internal map of the environment.
  • This set of sensor information provides a variable accuracy estimate of conditions within the environment dependent on the locality of the robot; this allows high accuracy trajectory control at short range, and room size estimation.
  • the information from the detailed three-dimensional mapping apparatus 12 is not incorporated into the VSLAM summary map; rather, the information from the detailed three-dimensional mapping apparatus is used in the internal map 44 to identify navigable and non-navigable points within the VSLAM summary map.
  • the robotic vacuum cleaner will rapidly know the size of the room within which it is operating (and the location of any obstacles within the room) and will have a navigation strategy to cover the floor space. Accordingly, it can calculate how long it will take to clean a particular room. So, instead of moving randomly about the room until the power runs out, a robotic vacuum cleaner having the present navigation system can display an estimate of how many minutes are left before it finishes cleaning the room.
  • because the robotic vacuum cleaner knows how much power it has in its battery, it can optimise the use of that power to clean a given area. For example, if it is in a small room, then rather than finish the room at normal suction and still have some capacity left in its battery, it could automatically boost the suction power so it completes the job in the same time, but cleans better. It can then fully recharge its battery, thereby avoiding any memory effect which could reduce battery capacity.
  • the robotic cleaner includes a utility calculation system 48 which uses information relating to current and predicted states of the robotic vacuum cleaner and the environment within which it is operating, and calculated optimal strategy, to produce an estimate of current and future system utility costs.
  • the use of system utility costs provides a means by which the robotic vacuum cleaner can assess and prioritise a set of available actions, and select an appropriate course of action there-from.
  • the above-mentioned utility costs are calculated for the variables exploration, cleaning, localisation, efficiency, and cleaning time. These utility costs are then combined as per Equation 1 to determine the appropriate set of actions (e.g. route) to take next (see the sketch following this list).
  • the use of a utility based selection scheme allows the behaviour of the robotic vacuum cleaner to be optimised by switching between exploration at system start-up and cleaning once localised by altering the performance weights.
  • the utility based selection scheme also enables fine control of the behaviour within those strategies to be automatically controlled while the robot is operating.
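Equation 1 itself is not reproduced in this text, so the following assumes the natural reading of the passage above: the total utility cost of a candidate action is a weighted combination of the per-variable costs, and the cheapest candidate wins. All names and numbers are illustrative assumptions:

```python
def total_cost(action_costs, weights):
    """Weighted combination of the per-variable utility costs (assumed
    linear); lower is better."""
    return sum(weights[v] * action_costs[v] for v in weights)

weights = {"exploration": 0.5, "cleaning": 1.0, "localisation": 0.8,
           "efficiency": 0.6, "cleaning_time": 0.7}
candidates = {
    "sweep_unexplored_corner": {"exploration": 0.2, "cleaning": 0.6,
                                "localisation": 0.5, "efficiency": 0.7,
                                "cleaning_time": 0.6},
    "re_cover_known_strip":    {"exploration": 0.9, "cleaning": 0.3,
                                "localisation": 0.2, "efficiency": 0.4,
                                "cleaning_time": 0.4},
}
best = min(candidates, key=lambda a: total_cost(candidates[a], weights))
print(best)
```

Altering the performance weights, as the bullets above describe, is then a matter of changing the `weights` dictionary while the robot is operating.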
  • the trajectory controller 16 (also referred to hereinbefore as the motion control system) is enabled by a robot localisation estimate provided by the VSLAM system 8, and provides local motion control of the robotic cleaner for negotiation of obstacles and emergency collision avoidance; this provides a second level of avoidance of all detectable obstacles.
  • the function of the navigation controller 14 referred to previously with respect to figures 1 and 2 is performed collectively by the internal mapping system 44, the global planner 46, the utility calculator 48, and the command fusion system 50.
  • Figure 4 shows an example of a map of an area to be cleaned created by the robotic vacuum cleaner illustrated in figure 3.
  • the information contained within the internal map includes cleaned, un-cleaned, cleanable, un-cleanable, unknown, un-passable areas, cleaning and exploration frontiers, and optionally dirt level.
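The cell states just listed lend themselves to a simple enumerated encoding; the enum below is an illustrative data structure, not one given in the patent:

```python
from enum import Enum, auto

class CellState(Enum):
    CLEANED = auto()
    UNCLEANED = auto()
    CLEANABLE = auto()
    UNCLEANABLE = auto()
    UNKNOWN = auto()
    UNPASSABLE = auto()
    CLEANING_FRONTIER = auto()
    EXPLORATION_FRONTIER = auto()

# A grid of such states (with a dirt level stored alongside where sensed)
# is one way to hold the internal map:
grid = [[CellState.UNKNOWN] * 40 for _ in range(30)]
```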
  • the optional dirt sensor 56 provides information about the level and rate of uptake of dirt, thereby giving a measure of how dirty the area of floor currently being cleaned is. This information is used during a single cleaning cycle to optimise the intensity or repetition of cleaning a particular area, and used over multiple cycles to predict both when and where the floor needs cleaning.
  • a detailed knowledge of the area to be cleaned and delineation of the space into separate rooms enables the user interface to comprise merely two buttons; one button labelled "clean this room", and the other "clean the apartment" / "clean the floor", i.e. all rooms on this floor.
  • the navigation system according to the present invention has been described herein primarily in terms of its applications to domestic robotic devices, in particular with reference to an autonomous domestic vacuum cleaner. However, the present navigation system is not restricted to such applications and the skilled person will readily appreciate alternative applications of the present navigation system. By way of further example, a robotic device incorporating the present navigation system could be programmed to act as a guard, searching rooms, looking for activity out of hours, etc.
  • Such a robotic device could be used to move material around a warehouse, if connected to a trolley or forklift, for example.
  • Additional domestic robotic applications include infotainment / entertainment, robotic pets, etc.
  • Further uses envisaged for the navigation system of the present invention are for robots used for painting work, for repair and maintenance work, for surveying, e.g. topology and for flaw and crack detection in difficult-to-access areas.

Abstract

A navigation system (2) comprising a primary mapping apparatus adapted to detect features within an environment and to create a summary map of the environment including an estimate of a point of current location within the environment; a secondary mapping apparatus adapted to provide a detailed three-dimensional map of the local environment in the vicinity of the point of current location; and a processor adapted to determine navigable points within the environment by combining information from the summary map and the detailed map. Specifically, the primary mapping apparatus detects features within the environment using an optical detector and utilises a visual simultaneous localisation and mapping (VSLAM) process (8) to create the summary map of the general environment.

Description

NAVIGATION SYSTEM
The present invention relates to a navigation system, and in particular to a navigation system for a mobile robotic device. The invention relates specifically, but not exclusively, to a navigation system for a domestic robotic device, for example an autonomous cleaner or lawn mower.
Existing domestic robots, for example autonomous cleaners and lawn mowers, are primitive and typically require a large number of basic sensors in order to operate autonomously. Even so, the operation of such domestic robots is simplistic. By way of example, conventional autonomous cleaners move in a random motion over an area to be cleaned. The motion of the cleaner is typically controlled by a random-motion algorithm stored within the device. In use, the cleaner will implement the random-motion algorithm in response to internal events (elapse of a random or predetermined time period, battery condition etc.) and external stimuli (for example, detection of obstacles such as walls and furniture using simple bump sensors and detection of precipices, such as the top of a flight of stairs, using an infrared emitter and detector etc.).
Despite their unsophisticated nature, autonomous cleaners utilising random motion can provide an acceptable cleaning coverage, given enough cleaning time. However, such operation is inefficient since the cleaner has no knowledge of where it has cleaned and therefore needs to operate for an extended period to achieve an acceptable coverage of the area to be cleaned.
The efficiency of an autonomous cleaner may be enhanced by incorporating a deterministic component to the otherwise random motion of the device. In deterministic cleaning, the autonomous cleaner follows a defined path that is calculated to completely cover the area to be cleaned while minimising redundant cleaning.
In order to implement deterministic cleaning, the autonomous cleaner must have a preprogrammed map of the area to be cleaned or create such a map by exploring the environment within which it is located. This prerequisite reduces the ease of use of the cleaner and requires the map to be updated to reflect changes in the environment. The autonomous cleaner must also maintain a knowledge of its current position and remember where it has cleaned, which in turn requires a sophisticated, accurate positioning and navigation system. A suitable positioning system might rely on marker beacons positioned around the space to be cleaned, a differential Global Positioning System (GPS), scanning laser ranging systems or other sophisticated systems.
By way of further example, a laser ranging system could be used in conjunction with simultaneous localisation and mapping (SLAM) software to construct a detailed map of the space to be cleaned.
Simultaneous localisation and mapping (SLAM) is a process of concurrently building a map of an environment based on stationary features or landmarks within the environment and using this map to obtain estimates of the location of a vehicle (an autonomous cleaner in this example). Simultaneous localisation and mapping is already used in the field of mobile robotics as a tool to enable fully autonomous navigation of a vehicle. In essence, the vehicle relies on its ability to extract useful navigation information from data returned by sensors mounted on the vehicle. Typical sensors might include a dead reckoning system (for example an odometry sensor or inertial measurement system) in combination with a laser rangefinder.
The vehicle starts at an unknown location with no a priori knowledge of landmark locations. From relative observations of landmarks, it simultaneously computes an estimate of vehicle location and an estimate of landmark locations. While continuing in motion, the vehicle builds a complete map of landmarks and uses these to provide continuous estimates of the vehicle location. By tracking the relative position between the vehicle and identifiable features in the environment, both the position of the vehicle and the position of the features can be estimated simultaneously. In the absence of external information about the vehicle's position, this algorithm presents an autonomous system with the tools necessary to navigate in unknown environments.
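By way of illustration only, the following minimal sketch (written in Python with the numpy library; all names are illustrative and form no part of the invention) shows the structure of one cycle of an extended Kalman filter SLAM process of the kind described, in which the vehicle pose and the landmark positions share a single state vector and are refined together by each observation:

import numpy as np

# State vector: [x, y, heading, lx1, ly1, lx2, ly2, ...] -- the vehicle
# pose followed by one 2-D position per mapped landmark. P is the joint
# covariance over the whole vector. (A real system would also append new
# landmarks to the state when they are first observed.)

def predict(state, P, dist, dturn, Q):
    # Dead-reckoning prediction: advance the pose estimate and inflate
    # the pose uncertainty by the motion noise Q (3x3).
    x, y, th = state[:3]
    state = state.copy()
    state[0] = x + dist * np.cos(th)
    state[1] = y + dist * np.sin(th)
    state[2] = th + dturn
    F = np.eye(len(state))            # Jacobian of the motion model
    F[0, 2] = -dist * np.sin(th)
    F[1, 2] = dist * np.cos(th)
    P = F @ P @ F.T
    P[:3, :3] += Q
    return state, P

def update(state, P, i, z, R):
    # Correct the pose AND landmark i from one relative observation
    # z = (dx, dy), the landmark's offset from the vehicle (heading is
    # ignored in this measurement model purely for brevity).
    j = 3 + 2 * i
    H = np.zeros((2, len(state)))
    H[:, 0:2] = -np.eye(2)            # observation = landmark - pose
    H[:, j:j + 2] = np.eye(2)
    innovation = z - (state[j:j + 2] - state[0:2])
    S = H @ P @ H.T + R               # innovation covariance (R: sensor noise)
    K = P @ H.T @ np.linalg.inv(S)    # Kalman gain couples every landmark
    state = state + K @ innovation
    P = (np.eye(len(state)) - K @ H) @ P
    return state, P

Because the gain couples every landmark, each update touches the full joint covariance, which is why the processing load of such a process grows roughly with the square of the number of features retained.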
The prospect of deploying a system that can build a map of its environment while simultaneously using that map to localise itself promises to allow robotic vehicles to operate autonomously in unknown environments. However, this is an expensive solution and the data processing overhead is high. Alternatively, a vision based system could be used in conjunction with SLAM software to form a navigation system for a deterministic cleaner. This technique, known as visual SLAM (VSLAM), uses passive sensing to provide a low power and dynamic localisation system. VSLAM uses a simple video camera as the input device and pattern processing algorithms to locate features in the images acquired by the video camera. The features are input to the SLAM algorithm which is then able to accurately compute the three-dimensional location of each feature and hence start to build a three-dimensional map as the cleaner moves around the space to be cleaned. However, to do this in sufficient detail requires a large number of feature points in the image. In common with conventional SLAM, the data processing load is high (proportional to the square of the number of features selected).
Accordingly, the abovementioned navigation systems have hitherto been considered to be prohibitively complex and expensive for domestic robotic devices.
It is an object of the invention to provide a navigation system which mitigates at least some of the disadvantages of the conventional navigation systems described above.
According to a first aspect of the present invention there is now proposed a navigation system comprising a primary mapping apparatus adapted to detect features within an environment and to create a summary map of the environment including an estimate of a point of current location within the environment; a secondary mapping apparatus adapted to provide a detailed three-dimensional map of the local environment in the vicinity of the point of current location; and a processor adapted to determine navigable points within the environment by combining information from the summary map and the detailed map.
Contrary to conventional navigation theory, the present navigation system utilises a summary map of the environment (a sparse map having few details therein) rather than a detailed map of the environment. Deliberately degrading the quality of the environmental map in this way is counter-intuitive, since conventional navigation systems typically aim to maximise environmental details in a map.
The present navigation system is predicated on the realisation that, for many applications, adequate navigational performance may be achieved using an approximate knowledge of the global environment in conjunction with a detailed map of the environment in the vicinity of the point of current location.
The present navigation system confers an unexpected advantage in terms of providing reliable navigation within the environment despite the paucity of detail within the environmental map.
In this respect, the summary map of the environment and the detailed three-dimensional map in the vicinity of the point of current location mutually support each other to such an extent that a new technical result is achieved.
The foregoing navigation system is advantageous in that the data processing load is reduced in comparison with conventional navigation systems having detailed maps of the environment. The hardware requirements of the navigation system (processor specification, memory etc.) are correspondingly reduced. In a preferred embodiment, the processor is configured to provide instructions to a motion control system so as to navigate from the point of current location to another navigable point within the environment.
Advantageously, the primary mapping apparatus has an optical sensor adapted to detect the features within the environment and wherein the primary mapping apparatus utilises a visual simultaneous localisation and mapping (VSLAM) process to create the summary map of the environment.
As an enhancement of the conventional SLAM process, visual simultaneous localisation and mapping (VSLAM) uses a passive optical sensor, for example a simple video camera, and dead reckoning to make visual measurements of an environment. VSLAM uses pattern processing algorithms to locate visual features in the images acquired by the video camera. The features are input to the SLAM algorithm which is then able to accurately compute the three-dimensional location of the visual features and hence start to build a three-dimensional map of the environment. Navigation within the environment is based on recognised visual features or landmarks on the VSLAM map.
Upon entering a new environment, the VSLAM-enabled navigation system starts creating a new map through exploration of the environment. By this process, the navigation system detects features in the environment, creates landmarks there-from, and corrects its position information as necessary using its VSLAM map.
The secondary mapping apparatus may comprise any suitable means of simply generating a localised three-dimensional map. For example, the secondary mapping apparatus may comprise a passive system, e.g. a stereoscopic imaging system. By way of further example, the secondary mapping apparatus may comprise an imaging apparatus having a structured light generator capable of creating a structured pattern of light within the environment. The imaging apparatus typically includes an imaging sensor, for example a camera, arranged to capture images of the environment including the structured pattern of light projected therein. The camera is capable of detecting the position relative thereto of a plurality of points in the pattern projected within the environment, for example using triangulation / trigonometry. The secondary mapping apparatus preferably includes a pattern processor arranged in use to determine from the detected position of a point in the pattern within the environment, the range to that point. The detected position and range information are then used to create a detailed three-dimensional map of the local environment in the vicinity of the point of current location.
Conveniently, the imaging apparatus comprises at least one of a spot projector and a pattern projector adapted to project a geometric shape. Where the imaging apparatus comprises a spot projector, the spots may comprise intersections between continuous lines. In this case, the projector may project two sets of regularly spaced lines, the two sets preferably being substantially orthogonal. By way of further example, the projector may comprise at least one of a kaleidoscope and a holographic element through which light is transmitted into the environment.
Conventional SLAM navigation systems, if used with a secondary mapping means, would traditionally seek to produce an exhaustive environmental map by augmenting the summary information from the primary mapping apparatus with detailed information captured by the secondary mapping apparatus. The present navigation system overcomes this technical prejudice by maintaining a summary map in respect of the global environment and only seeking further detail where necessary, i.e. in the vicinity of the point of current location.
This may be more easily understood by considering the type of information contained respectively within the summary and detailed three-dimensional maps. By way of brief explanation, the summary VSLAM map merely comprises a collection of features or landmarks within the environment with associated vectors defining their relative positions with respect to the point of current location and with respect to each other. In this respect the summary map is an abstract representation of the environment since it contains little or no information regarding the environment between the sparse features or landmarks within the map.
In order to navigate within the environment, the navigation system infers information about the environment based on the relationship(s) between features or landmarks in the summary map. For example, a plurality of features lying within a common plane may be indicative of a solid planar feature within the environment, e.g. a floor, a wall etc. However, ambiguities remain in the summary map which give rise to uncertainties as to whether points within the environment are navigable. The information produced by the secondary mapping apparatus is a detailed three-dimensional representation of the environment local to the point of current location. Hence, the detailed three-dimensional map is complementary to the summary map and may be used to remove ambiguities in the summary map by confirming navigable points in the summary map.
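Purely as an illustrative sketch (Python with numpy; the function name and threshold values are assumptions rather than anything prescribed by the invention), a floor or wall hypothesis of the kind just described might be tested by fitting a plane to a few landmark positions and examining the residual:

import numpy as np

def fit_plane(points):
    # Least-squares plane through a set of 3-D landmark positions:
    # the smallest singular vector of the centred points is the normal.
    pts = np.asarray(points, dtype=float)
    centroid = pts.mean(axis=0)
    _, _, vt = np.linalg.svd(pts - centroid)
    normal = vt[-1]
    residual = np.abs((pts - centroid) @ normal).mean()
    return normal, residual

# Nearly coplanar landmarks (small residual) hint at a solid planar
# feature; a near-vertical normal then suggests a floor rather than a wall.
landmarks = [(0.1, 0.2, 0.0), (2.0, 0.4, 0.01), (1.1, 3.0, -0.02), (2.5, 2.8, 0.0)]
normal, residual = fit_plane(landmarks)
is_floor_like = residual < 0.05 and abs(normal[2]) > 0.9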
It should be noted that the information from the detailed three-dimensional map is not incorporated into the summary map; the information from the former is merely used to refine the latter in terms of identifying navigable and non-navigable points therein.
Preferably, the imaging sensor of the secondary mapping apparatus and the optical sensor of the primary mapping apparatus comprise a single common sensor. It is noteworthy that the level of detail held within the global environmental map remains low at all times, despite the use of optical sensor(s) for both primary and secondary mapping apparatuses and the availability of detailed environmental information from the latter.
Even more preferably, the common sensor is one of a video camera, a CMOS camera and a charge-coupled device (CCD).
In a preferred embodiment, the optical sensor may be arranged to have a field of view which includes an upward direction. In this configuration the optical sensor may be arranged, in use, to detect features disposed on a substantially planar surface within the environment, for example a portion of a ceiling or roof. In this case, the primary mapping apparatus is adapted to create from said detected features a summary map of the environment underlying said planar surface.
According to a second aspect of the present invention there is now proposed a vehicle having a navigation system according to the first aspect of the invention.
According to a third aspect of the present invention there is now proposed a mobile robotic device having a navigation system according to the first aspect of the invention.
In one embodiment, the mobile robotic device comprises any of a domestic robotic device, an industrial robotic device, a security robotic device, a floor cleaner, a vacuum cleaner, a lawn mower, a robotic entertainment device, a maintenance device, a surveying device, a fork lift truck, and a car.
The present navigation system represents a cost-effective way of delivering information about the immediate and general environment in which a robot, particularly a domestic mobile robot, operates to allow it to undertake navigation and therefore domestic and other tasks (e.g. cleaning, mowing, repair work, maintenance, surveillance, surveying tasks etc.) more quickly and effectively.
In the case of a robotic cleaner, the present navigation system enables deterministic cleaning of an area to be cleaned. In deterministic cleaning, the robotic cleaner follows a defined path that is calculated to completely cover the area to be cleaned while minimising redundant cleaning. When applied to a robotic cleaner, the present navigation system confers benefits in terms of reduced cost, a reduction in the number of sensors required, operational simplicity, faster cleaning of a given area, cleaning of larger areas, lower power consumption (or, in the case of a vacuum cleaner, increased power for vacuuming), lower use of consumables, less noise, and reduced wear. The secondary mapping apparatus ensures safe operation by helping to determine whether the robotic cleaner can move into a given space. This is particularly important within environments inhabited by people or pets, in which case the secondary mapping apparatus facilitates safe and intelligent movements around inhabitants within the environment.
Additionally, the secondary mapping apparatus provides a capability to detect precipices within the environment; a vital safety feature when used in an environment having different levels separated by stairs.
Hence, the secondary mapping apparatus enables the robotic cleaner to avoid precipices and stop in the vicinity of an object without colliding therewith. This capability confers a high degree of confidence when navigating within the environment, allowing the robotic cleaner to accelerate and decelerate gracefully as it moves within the environment.
Moreover, deterministic cleaning allows a robotic cleaner to have a shape, e.g. a flat or acutely shaped front with corners, which facilitates cleaning of edges and corners. In addition, the robotic cleaner may be arranged to have a substantially semi-circular rear portion. In contrast, conventional robotic cleaners which move in a random motion are typically circular so that they can extricate themselves from small spaces by merely rotating about their own axis.
It is estimated that the present navigation system enables the performance of a domestic robotic cleaner to be improved by a factor of approximately twelve over conventional devices (based on the following assessments: the robot cleans a given area six times faster than a cleaner employing random motion; the size of the battery is halved owing to the reduction in redundant cleaning). Alternatively, the time taken to clean a given area may be reduced by a factor of approximately twelve over conventional devices (based on the following assessments: the robot cleans a given area six times faster than a cleaner employing random motion; the velocity at which the cleaner moves within the environment can be doubled due to the high degree of confidence with which the cleaner navigates within the environment; the size of the battery is maintained). Hence, the time taken to clean a room may be reduced from approximately one hour using a conventional device to approximately five minutes using the present robotic cleaner.
In a preferred embodiment, the mobile robotic device may be adapted to receive a detachable accessory. With the accessory fitted the robotic device may adopt a supplementary mode of operation.
For example, in the case of a robotic entertainment device (e.g. a robotic pet), the detachable accessory may be a lead or a leash. The robotic pet could then be safely taken for a walk with the lead attached thereto.
Preferably, the mobile robotic device comprises a vacuum cleaner adapted to receive a detachable accessory.
Conveniently, the mobile robotic device has a vacuum pressure variable between a minimum and a maximum, wherein maximum vacuum pressure may be automatically selected in the supplementary mode of operation. Hence, the robotic device may be configured to operate with maximum suction (maximum vacuum pressure) in the supplementary mode of operation. Advantageously, the accessory includes an accessory hose adapted to engage in fluid communication with a vacuum inlet on the vacuum cleaner.
In a preferred embodiment, the accessory hose is manoeuvrable by a user and the navigation system may be adapted to provide instructions to the motion control system so as to follow the user, while preferably concurrently avoiding obstacles and inhabitants within the environment.
In another embodiment, the navigation system is disabled in the supplementary mode of operation. In this embodiment, the accessory may include a handle. This facilitates manual cleaning of areas inaccessible to the robot when under autonomous control, for example stairs etc. Hence, the robotic vacuum cleaner of the invention can be a primary cleaner in a domestic dwelling, obviating an additional manual cleaner. Similarly, where the mobile robotic device comprises a lawnmower, the ability to manually mow areas of the lawn obviates an additional manual lawnmower.
According to a fourth aspect of the present invention there is now proposed a method of controlling a robotic device within an area to be traversed, the robotic device having a variable power requirement and a navigation system adapted to map features in an environment, the method comprising the steps of:
(i) in a first mode of operation, moving the robotic device in a substantially random motion within the area to be traversed whilst concurrently mapping the environment so as to create a summary map of the area to be traversed, wherein the robotic device is configured to use minimum power consumption during said first mode of operation,
(ii) in a second mode of operation, moving the robotic device in at least one direction so as to map the environment in greater detail and to create a complete summary map of the area to be traversed, wherein the device is configured to use increased power consumption during said second mode of operation, and
(iii) in a third mode of operation, moving the robotic device in a deterministic motion so as to provide optimum traversing of the space to be traversed, wherein the device is configured to use increased power consumption during said third mode of operation.
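A minimal sketch of this mode sequencing follows (Python; the predicate names and the fall-back behaviour are illustrative assumptions drawn from the preferred embodiments described below):

from enum import Enum, auto

class Mode(Enum):
    EXPLORE = auto()        # mode (i): random motion, minimum power
    SURVEY = auto()         # mode (ii): directed motion, complete the map
    DETERMINISTIC = auto()  # mode (iii): planned coverage of the area

def next_mode(mode, summary_map_started, summary_map_complete, nav_failed):
    # Advance through the modes in numerical sequence; revert to
    # random-motion exploration if the navigation system fails.
    if nav_failed:
        return Mode.EXPLORE
    if mode is Mode.EXPLORE and summary_map_started:
        return Mode.SURVEY
    if mode is Mode.SURVEY and summary_map_complete:
        return Mode.DETERMINISTIC
    return mode

def power_level(mode):
    # Minimum consumption while exploring; increased in modes (ii), (iii).
    return "minimum" if mode is Mode.EXPLORE else "increased"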
The secondary mapping apparatus as hereinbefore described with respect to the first aspect of the invention ensures safe operation of the robotic device during all operating modes, by helping to determine whether the robotic device can move into a given space. As will be readily appreciated, this is of particular importance during the first and second modes of operation while the summary map of the area to be traversed is being created and refined. Hence, the robotic device is able to avoid precipices, to stop in the vicinity of objects within said area without colliding with them, and to move in an intelligent manner around inhabitants within the area to be traversed.
In a preferred embodiment, the device may be configured only to use sufficient power to traverse the area and map the environment during said first mode of operation.
In use, the robotic device may operate in the first, second and third modes of operation in numerical sequence.
Alternatively, or in addition, the mode within which the robotic device operates is selected in response to a status condition.
In a preferred embodiment, the status condition is derived from a plurality of variables, each variable having a changeable weighting factor applied thereto so as to optimise the behaviour of the robotic device.
Advantageously, the variables are selected from: exploration of the area to be traversed, operation of the device, localisation within the environment, efficiency of operation and operating time.
In another preferred embodiment, the robotic device reverts to the first mode of operation in the event of a failure in the navigation system, or alternatively may switch itself off. The device may be a cleaner, a mower or any other device mentioned above, and the power consumption may relate, for example, to suction power for a cleaner, or to brush action or the dispensation of cleaning fluid. In the case of a vacuum cleaner, minimum suction or no suction may be used when operating in the first mode of operation (in particular where the device is configured only to use sufficient power to traverse the area and map the environment during said first mode of operation), and maximum suction may be used when operating in at least one of the second and third operating modes. Similarly, the power consumption may relate to the speed at which the device traverses the area. For a mower, power consumption may relate to operation of a cutting blade, strimmer or the like. For maintenance, repair and survey equipment, power consumption may relate to operation of the features on the device for carrying out such operations.
According to a further aspect of the present invention, there is now proposed the use of a visual simultaneous localisation and mapping (VSLAM) process to create a summary map of an environment, and a localised three-dimensional imaging technique to generate a local map of the environment, for autonomous navigation.
The invention will now be described, by way of example only, with reference to the accompanying drawings, in which:
Figure 1 shows a block diagram of a navigation system according to one embodiment of the present invention.
Figure 2 shows a schematic diagram of a mobile robotic device incorporating the navigation system shown in figure 1.
Figure 3 illustrates the architecture of a robotic vacuum cleaner incorporating the navigation system according to an embodiment of the present invention.
Figure 4 shows an example of a map of an area to be cleaned created by the robotic vacuum cleaner of figure 3.
Referring now to the drawings wherein like reference numerals identify corresponding or similar elements throughout the several views, figure 1 shows a block diagram of a navigation system 2 according to one embodiment of the present invention. The navigation system 2 incorporates a primary mapping apparatus comprising a visual simultaneous localisation and mapping (VSLAM) apparatus and a secondary mapping apparatus consisting of an accurate three-dimensional mapping apparatus. The VSLAM apparatus and the three-dimensional mapping apparatus work in parallel and have complementary functions within the navigation system 2.
The VSLAM apparatus comprises a camera 4 arranged to capture images of the environment. The images are passed from the camera 4 to a feature detector 6 which processes the image to detect naturally occurring features within the environment, e.g. edges etc. The VSLAM apparatus uses a sophisticated vision algorithm 8 to build a summary map of the environment consisting of a set of unique "landmarks" derived from the features observed in the environment. The VSLAM summary map typically comprises a vector map of landmarks within the environment.
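By way of a hedged illustration only (Python; the field names are assumptions), such a summary map may amount to no more than a small collection of landmarks, each holding an accurately estimated position and an appearance descriptor for later re-recognition:

from dataclasses import dataclass, field
import numpy as np

@dataclass
class Landmark:
    position: np.ndarray     # accurately estimated 3-D location
    descriptor: np.ndarray   # appearance signature for re-recognition

@dataclass
class SummaryMap:
    # Sparse VSLAM map: a handful of landmarks and no dense geometry.
    landmarks: dict = field(default_factory=dict)   # landmark id -> Landmark

    def relative_vector(self, lm_id, pose_estimate):
        # Vector from the current pose estimate to a mapped landmark.
        return self.landmarks[lm_id].position - pose_estimate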
The skilled person would be aware of a variety of VSLAM apparatuses. However, it should be noted that conventional VSLAM apparatuses are aimed at high accuracy applications, whereas the present VSLAM apparatus is configured to produce a summary map of the environment (a sparse map having few landmarks therein) rather than a detailed map of the environment. This is based on the realisation that, for many applications, adequate navigational performance may be achieved using an approximate knowledge of the global environment in conjunction with a separate detailed map of the local environment in order to assess whether the immediate environment is navigable, for example is free from encumbrances, obstacles, inhabitants etc.
It is noteworthy that, although the VSLAM apparatus produces a summary map of the environment having few landmarks, the location of each landmark within the VSLAM map is accurately known.
The accurate three-dimensional mapping apparatus provides the requisite detailed map of the local environment. The three-dimensional mapping apparatus comprises three-dimensional sensors 10 arranged to capture detailed information about the local environment. Electromechanical sensors and / or optical sensors are used. Typically, the sensors 10 could comprise scanning laser rangefinders. Preferably the sensors 10 are low cost, simple and eye safe.
The accurate three-dimensional mapping apparatus incorporates a processor 12 for processing the raw information from the three-dimensional sensor 10. The processor 12 provides a detailed map of the local environment in the vicinity of a point of current location within the environment. The detailed map typically comprises a collection of points within the environment measured with respect to the three-dimensional sensor 10 (also referred to hereinafter as point cloud information).
The summary map produced by the VSLAM apparatus and the detailed map from the accurate three-dimensional mapping apparatus are passed to a navigation controller 14 which provides instructions to a motion control system 16 in order to control movement of a mobile platform, for example a robot, within the mapped environment.
When used in a mobile robotics application, the navigation system 2 uses recognised landmarks to estimate the position of the robot within the environment. Furthermore, the VSLAM apparatus updates and refines the summary map as the robot revisits mapped areas, allowing it to adapt to changes in the environment. In the case of a domestic mobile robot, such changes in the environment may arise due to rearrangement of furniture or objects within a room etc.
Referring to figure 2, a schematic block diagram of a mobile robotic device 20 incorporating the navigation system of figure 1 is shown. The robotic device 20 incorporates the elements of the navigation system described above. However, in this particular embodiment, the separate camera 4 and three-dimensional sensors 10 are replaced by a single common sensor in the form of a camera 22. The camera is a conventional video camera, typically a CMOS camera or, alternatively, a charge-coupled device (CCD).
The camera is mounted on the robotic device 20 so as to capture images of the environment in front of the robotic device within the field of view 24 of the camera 22. The robotic device 20 also incorporates a structured light generator comprising a spot projector 26, operating at visible or infrared wavelengths, to provide information about the local environment to the accurate three-dimensional mapping apparatus. The principle of operation of the accurate three-dimensional mapping apparatus is as follows. During use, the spot projector 26 projects a plurality of spots into the environment. The camera 22 is arranged to capture images of the environment including the array of spots projected therein. The processor 12 analyses the images from the camera 22 and determines the position with respect to the camera of each projected spot within the environment, for example using triangulation / trigonometry. The processor 12 preferably includes a pattern processor adapted to determine from the detected position of a spot in the environment, the range to that spot. The detected position and range information are then used to create a detailed three-dimensional map of the environment local to the robotic device.
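The triangulation step might be sketched as follows (Python; a deliberately simplified model assuming parallel projector and camera axes separated by a known baseline, which is an assumption for illustration rather than anything the embodiment prescribes):

import math

def camera_angle(pixel_x, principal_x, focal_px):
    # Angle of the returned ray, from the spot's position in the image
    # and the camera's focal length in pixels.
    return math.atan2(pixel_x - principal_x, focal_px)

def spot_range(baseline_m, proj_angle_rad, cam_angle_rad):
    # Two-ray triangulation: the projector casts the spot at a known
    # angle, the camera measures the angle at which the spot is seen,
    # and the two rays intersect at the illuminated surface.
    return baseline_m / (math.tan(proj_angle_rad) - math.tan(cam_angle_rad))

In practice the geometry would be calibrated rather than assumed, but the principle is the same: a known projection angle and a measured camera angle define two rays whose intersection gives the range to the spot.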
In one embodiment, the spot projector 26 is adapted to project a two dimensional array of spots into the environment. A suitable spot projector is described in PCT patent application publication WO 2004/044523 the content of which is hereby incorporated by reference thereto.
By way of a brief explanation, the spot projector 26 comprises a light source arranged to illuminate part of an input face of a light guide, the light guide comprising a tube having substantially reflective sides and being arranged together with projection optics so as to project an array of distinct images of the light source into the environment. The light guide in effect operates as a kaleidoscope. Light from the source is reflected from the sides of the tube and can undergo a number of reflection paths within the tube. The result is that multiple images of the light source are produced and projected into the environment. Thus the environment is illuminated with an array of images of the light source. Where the source is a simple light emitting diode the scene is therefore illuminated with an array of spots of light.

The light guide kaleidoscope gives very good image replication characteristics and projects images of the input face of the light guide in a wide angle, i.e. a large number of spots are projected in all directions. Further, the kaleidoscope produces a large depth of field and so delivers a large operating window. The light guide comprises a tube with substantially reflective walls and a constant cross section which is conveniently a regular polygon. Having a regular cross section means that the array of images of the light source will also be regular, which is advantageous for ensuring the whole environment is covered and eases processing. A square section tube is most preferred. Typically, the light guide has a cross sectional area in the range of a few square millimetres to a few tens of square millimetres, for instance in the range of 1-50 mm². As mentioned, the light guide preferably has a regular cross section with a longest dimension of a few millimetres, say 1-5 mm. One embodiment as mentioned is a square section tube having a side length of 2-3 mm. The light guide may have a length of a few tens of millimetres, for example between 10 and 70 mm. Such light guides can generate a grid of spots over an angle of 50-100 degrees (typically about twice the total internal angle within the light guide).
The tube may comprise a hollow tube having reflective internal surfaces, i.e. mirrored internal walls. Alternatively, the tube may be fabricated from a solid material and arranged such that a substantial amount of light incident at an interface between the material of the tube and the surrounding material undergoes total internal reflection. The tube material is either coated with a coating of suitable refractive index or designed to operate in air, in which case the refractive index of the light guide material should be such that total internal reflection occurs at the material-air interface.
Using a tube like this as a light guide results in multiple images of the light source being generated which can be projected into the environment to form the array of spots. The light guide is easy to manufacture and assemble and couples the majority of the light from the source to the scene. Thus low power sources such as light emitting diodes can be used. As the exit aperture can be small, the apparatus also has a large depth of field, which makes it useful for ranging applications that require spots to be projected over a wide range of distances.
Either individual light sources are used close to the input face of the light guide to illuminate just part of the input face, or one or more light sources are used to illuminate the input face of the light guide through a mask. Using a mask with a transmissive portion for passing light to a part of the light guide can be easier than using individual light sources. Accurate alignment of the mask is required at the input face of the light guide, but this may be easier than accurately aligning an LED or LED array.
Where a mask is used, the illumination means may comprise a homogeniser located between the light source and the mask so as to ensure that the mask is evenly illuminated. The light source may therefore be any light source giving an acceptable level of brightness and does not need accurate alignment. Alternatively, an LED with oversized dimensions may be used to relax tolerances in manufacture and alignment.
The projection optics comprise a projection lens. The projection lens is located adjacent the output face of the light guide. In some embodiments where the light guide is solid the lens may be integral to the light guide, i.e. the tube may be shaped at the output face to form a lens.
All beams of light projected by the spot projector 26 pass through the end of the light guide and can be thought of as originating from the point at the centre of the end face of the light guide. The projection optics can then comprise a hemispherical lens and, if the centre of the hemisphere coincides with the centre of the light guide output face, the apparent origin of the beams remains at the same point, i.e. each projected image has a common projection origin. In this arrangement the projector does not have an axis as such, as it can be thought of as a source of beams radiating across a wide angle. What matters for the imaging apparatus, therefore, is the geometrical relationship between the point of origin of the beams and the principal point of the lens of the camera 22.
The spot projector 26 is controlled by a master clock 32 which provides synchronisation between the monochromatic camera 22, the spot projector 26 and a de-multiplexer 30 connected to the output of the camera 22.
During operation, the output from the camera 22 is switched between the primary mapping apparatus (the VSLAM apparatus, denoted by numerals 6 and 8) and the secondary mapping apparatus (the accurate three-dimensional mapping apparatus, denoted by numerals 12 and 14). The clock 32 controls the spot projector 26 and the de-multiplexer 30 such that a pattern of spots is projected into the local environment at the same time as the output from camera 22 is routed to the accurate three-dimensional mapping apparatus. The clock 32 and the de-multiplexer 30 are configured such that images (frames) produced by the camera 22 are sent alternately to the VSLAM apparatus 6, 8 and the accurate three-dimensional mapping apparatus 12, 14.
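The alternating routing might be sketched as follows (Python; the projector and process interfaces are illustrative, and the exposure timing handled in hardware by the master clock 32 is glossed over):

def demultiplex(frames, vslam, mapper3d, projector):
    # Route alternate camera frames to the two mapping processes,
    # firing the spot projector only on the 3-D mapping frames.
    for n, frame in enumerate(frames):
        if n % 2 == 0:
            projector.off()
            vslam.process(frame)      # plain image: VSLAM feature extraction
        else:
            projector.on()
            mapper3d.process(frame)   # spot image: range by triangulation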
In order to move autonomously within the environment, the robotic device 20 includes a motion control system 16 comprising a driving unit having wheels and at least one motor arranged to provide a motive force to the wheels. The motion control system 16 provides control signals to the motor(s) in response to instructions received from the navigation controller 14 so as to move the robotic device 20 within the environment.
To constrain the computational power required, it is helpful if the VSLAM algorithm 8 has some knowledge of the movement of the robotic device 20. Accordingly, information about the robot's movement, either from the use of a stepper motor or a wheel rotation encoder, is fed back to the VSLAM algorithm 8.
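Such feedback might be computed as follows (Python; a standard differential-drive odometry model with illustrative parameter names, which the embodiment does not mandate):

import math

def odometry_step(x, y, heading, left_ticks, right_ticks,
                  metres_per_tick, wheel_base_m):
    # Dead-reckoning update from wheel rotation encoder counts.
    d_left = left_ticks * metres_per_tick
    d_right = right_ticks * metres_per_tick
    d_centre = (d_left + d_right) / 2.0
    d_theta = (d_right - d_left) / wheel_base_m
    # Midpoint approximation of the arc travelled during this step.
    x += d_centre * math.cos(heading + d_theta / 2.0)
    y += d_centre * math.sin(heading + d_theta / 2.0)
    heading += d_theta
    return x, y, heading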
When the mobile robotic device is used indoors, the field of view 24 of the camera optionally extends to image a portion of the ceiling or roof of the room or internal space within which the robot is located. In the event that the field of view 24 of the forward looking camera is insufficient, a separate camera may be used to capture images of the ceiling or roof.
Where the ceiling or roof is visible, the VSLAM apparatus maps it and indirectly determines the floor plan which is directly below. This hitherto unknown technique provides the benefit that individual rooms can more easily be defined, for example since door openings do not normally extend to the ceiling. Also, as the ceiling of a room is typically large, plain and white, it may be an easy object to see even with a low performance camera. In use, doors can now be open or shut, as the user desires, with the robot still able to map and navigate one room before knowingly moving on to the next one.
Using a ceiling to create a VSLAM map of a room overcomes other disadvantages; for example, when trying to map a room from a video image taken at floor level, the obstacles (chairs, tables, etc.) obstruct a large portion of the view of the walls, making it hard to immediately generate a map of room boundaries. Also, from floor level it may not be obvious to the navigation system whether a gap is a doorway or merely a space between two pieces of furniture. Hence it would not be clear where the room boundaries lie if the doors are left open. This becomes important in the case of a robotic cleaner, which should logically clean rooms one at a time, and not wander accidentally out of the room before it has finished cleaning it. The foregoing obviates the use of the light beam units which are traditionally placed across a doorway in the prior art and which trigger a sensor on the robot to stop it leaving the room.
Referring to Figure 3, the architecture of a robotic vacuum cleaner incorporating the navigation system according to an embodiment of the present invention is illustrated.
The robotic vacuum cleaner architecture is based broadly on that shown in figure 2 for a generic mobile robotic device. In addition, the present robotic vacuum cleaner comprises a vacuum unit (not shown in figure 3) installed on a main body of the cleaner. The vacuum unit is arranged to collect particles from an underlying floor surface by drawing-in air. The vacuum unit can be of known construction, for example including a vacuum motor and a vacuum chamber for collecting particles from the air which is drawn in through a vacuum hole or pipe positioned adjacent the floor surface.
Optionally, the vacuum unit includes a supplementary vacuum inlet adapted to engage in fluid communication with an accessory, for example an external accessory hose.
Optionally, the vacuum unit has a dirt sensor (56) adapted to provide information about the level and rate of uptake of dirt from the floor, thereby giving a measure of how dirty the area of floor currently being cleaned is.
With regard to the navigation system used in the robotic vacuum cleaner according to the present embodiment, the VSLAM algorithm 8 is configured to extract features from the environment using mechanisms such as scale-invariant feature transforms (SIFTs) and Harris corners. These mechanisms can be extended to make use of contextual information, such as plane constrained features from ceilings and walls, to reduce the computational complexity (and hence reduce the data processing load) and potentially increase reliability.
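Purely by way of example, such feature extraction might be written as follows using the OpenCV library (the parameter values are illustrative assumptions):

import cv2
import numpy as np

def extract_features(gray_image):
    # SIFT keypoints with descriptors; nfeatures kept small so that
    # the resulting VSLAM map stays sparse.
    sift = cv2.SIFT_create(nfeatures=100)
    keypoints, descriptors = sift.detectAndCompute(gray_image, None)
    # Harris corner response; keep points above a fraction of the maximum.
    harris = cv2.cornerHarris(np.float32(gray_image), blockSize=2, ksize=3, k=0.04)
    corners = np.argwhere(harris > 0.01 * harris.max())
    return keypoints, descriptors, corners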
When instructed to commence cleaning, the robotic vacuum cleaner initially adopts a first mode of operation, moving in a substantially random motion within the area to be cleaned whilst concurrently mapping the environment and creating a summary VSLAM map of the area to be cleaned. The robotic vacuum cleaner is configured to use a minimum vacuum pressure (minimum suction) during said first mode of operation. Obstacles within the area to be cleaned are detected by the three-dimensional mapping apparatus 12 during this first mode of operation and consequently avoided.
The foregoing un-delayed initialisation technique gives a quick, robust and efficient three-dimensional estimate of the environment from the two-dimensional images produced by the camera 22 while also allowing fast low memory filter initialisation. Optionally, odometry sensing (for example from wheel rotation encoders etc.) is used to reduce the computational complexity of the VSLAM system.
The VSLAM system 8 produces an estimate of the position of the robot within the environment with associated confidences, and a summary map of spatially located features within the environment. The position estimate is used to fuse sensor information into the internal map 44, to aid trajectory following, and, using information theory, to provide trajectory constraints on the planning algorithm 46 to maximise map and localisation quality whilst the vacuum cleaner is cleaning.
Optionally, an optical flow system 40 uses passive sensing to provide an estimate of the relative distances of objects moving relative to the camera 22. The optical flow system 40 monitors the movement of objects or texture within the images from the camera 22 that result from the camera's movements within the environment, and derives motion cues there-from. The optical flow system 40 uses these motion cues to provide an estimate of the distance to an object within the environment based on the rate at which the object moves in the images produced by the camera. By way of a simple example, objects close to the camera will appear to move within the images at a higher rate than objects distant there-from. This relative information is used in conjunction with the VSLAM summary map by a position registration system 42 to produce a more accurate, absolute estimate of the position and orientation of surfaces within the sensor field of view.
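A hedged sketch of such a motion cue, using dense optical flow from the OpenCV library (the parameter values are illustrative):

import cv2
import numpy as np

def relative_proximity(prev_gray, next_gray):
    # For a translating camera, nearby surfaces produce larger image
    # motion than distant ones, so the flow magnitude serves as a
    # relative (not absolute) proximity map.
    flow = cv2.calcOpticalFlowFarneback(prev_gray, next_gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    magnitude = np.hypot(flow[..., 0], flow[..., 1])
    return magnitude / (magnitude.max() + 1e-9)

The output is relative only; as noted above, it is the position registration system 42 that combines such cues with the VSLAM summary map to obtain absolute estimates.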
An internal mapping system 44 provides an estimate of the state of the environment within which the robotic vacuum cleaner is operating; it is updated by fusing point cloud information from the three-dimensional mapping apparatus 12 and the optical flow system 40 with contextual information from the VSLAM summary map. In this respect, the internal mapping system 44 creates an internal map of the environment. This set of sensor information provides a variable accuracy estimate of conditions within the environment dependent on the locality of the robot; this allows high accuracy trajectory control at short range, and room size estimation.
As pointed out previously, the information from the detailed three-dimensional mapping apparatus 12 is not incorporated into the VSLAM summary map; rather, the information from the detailed three-dimensional mapping apparatus is used in the internal map 44 to identify navigable and non-navigable points within the VSLAM summary map.
The foregoing confers additional benefits in terms of automatic timing and power control. For example, the robotic vacuum cleaner will rapidly know the size of the room within which it is operating (and the location of any obstacles within the room) and will have a navigation strategy to cover the floor space. Accordingly, it can calculate how long it will take to clean a particular room. So, instead of moving randomly about the room until the power runs out, a robotic vacuum cleaner having the present navigation system can display an estimate of how many minutes are left before it finishes cleaning the room.
Also, as the robotic vacuum cleaner knows how much power it has in its battery, it can optimise the use of that power to clean a given area. For example, if it is in a small room, then rather than finish the room at normal suction and still have some capacity left in its battery, it could automatically boost the suction power so it completes the job in the same time, but cleans better. It can then fully recharge its battery, thereby avoiding any memory effect which could reduce battery capacity.
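The arithmetic involved is straightforward; the following sketch (Python; units and function names are illustrative assumptions) shows both the time-remaining estimate and the suction boost:

def minutes_remaining(uncleaned_area_m2, nozzle_width_m, speed_m_s):
    # Time left to finish, from the internal map's estimate of the
    # area still to be cleaned.
    return uncleaned_area_m2 / (nozzle_width_m * speed_m_s) / 60.0

def boosted_suction_w(battery_wh, minutes_left, base_power_w, max_power_w):
    # If the battery holds more energy than the job needs at normal
    # suction, boost suction to finish in the same time but clean
    # better, leaving the battery ready for a full recharge.
    affordable_w = battery_wh * 60.0 / max(minutes_left, 1e-6)
    return min(max_power_w, max(base_power_w, affordable_w))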
The robotic cleaner includes a utility calculation system 48 which uses information relating to current and predicted states of the robotic vacuum cleaner and the environment within which it is operating, together with a calculated optimal strategy, to produce an estimate of current and future system utility costs. The use of system utility costs provides a means by which the robotic vacuum cleaner can assess and prioritise a set of available actions, and select an appropriate course of action there-from.
The above-mentioned utility costs are calculated for the variables exploration, cleaning, localisation, efficiency, and cleaning time. These utility costs are then combined as per Equation 1 below to determine the appropriate set of actions (e.g. route) to take next. The use of a utility based selection scheme allows the behaviour of the robotic vacuum cleaner to be optimised by altering the performance weights, switching between exploration at system start-up and cleaning once localised. The utility based selection scheme also enables fine, automatic control of the behaviour within those strategies while the robot is operating.
System Utility Cost = a^x · b^y · c^z (Equation 1)
(where a, b, c are individual utility costs and x, y, z are relative performance weights)
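Equation 1 and the resulting selection might be sketched as follows (Python; treating the best action as the one minimising the combined cost is an assumption, since the description leaves the sense of the comparison open):

def system_utility_cost(costs, weights):
    # Equation 1: the product of the individual utility costs, each
    # raised to its relative performance weight (a^x * b^y * c^z * ...).
    total = 1.0
    for cost, weight in zip(costs, weights):
        total *= cost ** weight
    return total

def choose_action(candidate_actions, weights):
    # candidate_actions: list of (action, costs) pairs, where costs holds
    # the values for exploration, cleaning, localisation, efficiency and
    # cleaning time; re-weighting switches behaviour between exploration
    # at start-up and cleaning once localised.
    return min(candidate_actions,
               key=lambda ac: system_utility_cost(ac[1], weights))[0]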
The trajectory controller 16 (also referred to hereinbefore as the motion control system) is enabled by a robot localisation estimate provided by the VSLAM system 8, and provides local motion control of the robotic cleaner for negotiation of obstacles and emergency collision avoidance; this provides a second level of avoidance of all detectable obstacles. In the interests of clarity, the function of the navigation controller 14 referred to previously with respect to figures 1 and 2 is performed collectively by the internal mapping system 44, the global planner 46, the utility calculator 48, and the command fusion system 50.
Figure 4 shows an example of a map of an area to be cleaned created by the robotic vacuum cleaner illustrated in figure 3.
The information contained within the internal map includes cleaned, un-cleaned, cleanable, un-cleanable, unknown and un-passable areas, cleaning and exploration frontiers, and optionally dirt level. The optional dirt sensor 56 provides information about the level and rate of uptake of dirt, thereby giving a measure of how dirty the area of floor currently being cleaned is. This information is used during a single cleaning cycle to optimise the intensity or repetition of cleaning a particular area, and used over multiple cycles to predict both when and where the floor needs cleaning.
A detailed knowledge of the area to be cleaned and delineation of the space into separate rooms enables the user interface to comprise merely two buttons: one button labelled "clean this room", and the other "clean the apartment" / "clean the floor", i.e. all rooms on this floor.

The navigation system according to the present invention has been described herein primarily in terms of its applications to domestic robotic devices, in particular with reference to an autonomous domestic vacuum cleaner. However, the present navigation system is not restricted to such applications and the skilled person will readily appreciate alternative applications of the present navigation system. By way of further example, a robotic device incorporating the present navigation system could be programmed to act as a guard, searching rooms, looking for activity out of hours, etc. Alternatively, such a robotic device could be used to move material around a warehouse, if connected to a trolley or forklift, for example. Additional domestic robotic applications include infotainment / entertainment, robotic pets, etc. Further uses envisaged for the navigation system of the present invention are for robots used for painting work, for repair and maintenance work, and for surveying, e.g. topology surveying and flaw and crack detection in difficult-to-access areas.
In view of the foregoing description it will be evident to a person skilled in the art that various modifications may be made within the scope of the invention.
The scope of the present disclosure includes any novel feature or combination of features disclosed therein either explicitly or implicitly or any generalisation thereof irrespective of whether or not it relates to the claimed invention or mitigates any or all of the problems addressed by the present invention. The applicant hereby gives notice that new claims may be formulated to such features during the prosecution of this application or of any such further application derived there from. In particular, with reference to the appended claims, features from dependent claims may be combined with those of the independent claims and features from respective independent claims may be combined in any appropriate manner and not merely in the specific combinations enumerated in the claims.

Claims

1. A navigation system comprising a primary mapping apparatus adapted to detect features within an environment and to create a summary map of the environment including an estimate of a point of current location within the environment; a secondary mapping apparatus adapted to provide a detailed three-dimensional map of the local environment in the vicinity of the point of current location; and a processor adapted to determine navigable points within the environment by combining information from the summary map and the detailed map.
2. A navigation system according to claim 1 wherein the processor is configured to provide instructions to a motion control system so as to navigate from the point of current location to another navigable point within the environment.
3. A navigation system according to claim 1 or 2, the primary mapping apparatus having an optical sensor adapted to detect the features within the environment and wherein the primary mapping apparatus utilises a visual simultaneous localisation and mapping (VSLAM) process to create the summary map of the environment.
4. A navigation system according to any of the preceding claims wherein the secondary mapping apparatus comprises an imaging apparatus having an imaging sensor and a structured light generator.
5. A navigation apparatus according to claim 4 wherein the imaging apparatus comprises at least one of a spot projector and a pattern projector.
6. A navigation apparatus according to claim 4 or 5 when directly or indirectly dependent on claim 3 wherein the imaging sensor and the optical sensor comprise a common sensor.
7. A navigation apparatus according to claim 6 wherein the common sensor is one of a video camera, a CMOS camera and a charge-coupled device (CCD).
8. A navigation apparatus according to any of claims 3 - 7 wherein the optical sensor is arranged to have a field of view which includes an upward direction.
9. A navigation apparatus according to claim 8 wherein the optical sensor is arranged, in use, to detect features disposed on a substantially planar surface within the environment, and the primary mapping apparatus is adapted to create from said detected features a summary map of the environment underlying said planar surface.
10. A vehicle having a navigation system according to any of claims 1 - 9.
11. A mobile robotic device having a navigation system according to any of claims 1 - 9.
12. A mobile robotic device according to claim 11 comprising any of a domestic robotic device, an industrial robotic device, a security robotic device, a floor cleaner, a vacuum cleaner, a lawn mower, a robotic entertainment device, a maintenance device, a surveying device, a fork lift truck, and a car.
13. A mobile robotic device according to claim 12 adapted to receive a detachable accessory, wherein the robotic device adopts a supplementary mode of operation when the accessory is attached to the robotic device.
14. A mobile robotic device according to claim 13 comprising a vacuum cleaner having a vacuum pressure variable between a minimum and a maximum, wherein maximum vacuum pressure is automatically selected in the supplementary mode of operation.
15. A mobile robotic device according to claim 14 wherein the accessory comprises an accessory hose adapted to engage in fluid communication with a vacuum inlet on the vacuum cleaner.
16. A mobile robotic device according to claim 15 in which the accessory hose is manoeuvrable by a user, wherein the navigation system is adapted to provide instructions to the motion control system so as to follow the user.
17. A mobile robotic device according to any of claims 13 - 15 wherein the navigation system is disabled in the supplementary mode of operation.
18. A method of controlling a robotic device within an area to be traversed, the robotic device having a variable power requirement and a navigation system adapted to map features in an environment, the method comprising the steps of:
(i) in a first mode of operation, moving the robotic device in a substantially random motion within the area to be traversed whilst concurrently mapping the environment and creating a summary map of the area to be traversed, wherein the device is configured to use minimum power consumption during said first mode of operation,
(ii) in a second mode of operation, moving the robotic device in at least one direction so as to map the environment in greater detail and to create a complete summary map of the area to be traversed, wherein the device is configured to use increased power consumption during said second mode of operation, and
(iii) in a third mode of operation, moving the robotic device in a deterministic motion so as to provide optimum traversing of the space, wherein the device is configured to use increased power consumption during said third mode of operation.
19. A method according to claim 18 wherein the device is configured only to use sufficient power to traverse the area and map the environment during said first mode of operation.
20. A method according to claim 18 or 19 wherein, in use, the robotic device operates in the first, second and third modes of operation in numerical sequence.
21. A method according to claim 18 or 19 wherein the mode within which the robotic device operates is selected in response to a status condition.
22. A method according to claim 21 wherein the status condition is derived from a plurality of variables, each variable having a changeable weighting factor applied thereto so as to optimise the behaviour of the robotic device.
23. A method according to claim 22 wherein the variables are selected from exploration of the area to be traversed, operation of the robotic device, localisation within the environment, efficiency of operation and operating time.
24. A method according to any of claims 21 - 23 wherein the robotic device reverts to the first mode of operation in the event of a failure in the navigation system.
25. A method according to any of claims 18 - 24 wherein the robotic device is a vacuum cleaner and the steps of configuring the device to use minimum and increased power consumption comprise configuring the vacuum cleaner to use minimum and increased suction power, respectively.
26. Use of a visual simultaneous localisation and mapping (VSLAM) process to create a summary map of an environment, and a localised three-dimensional imaging technique to generate a local map of the environment, for autonomous navigation.
PCT/GB2006/003958 2005-10-31 2006-10-25 Navigation system WO2007051972A1 (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US73181605P 2005-10-31 2005-10-31
US60/731,816 2005-10-31
GB0522153.6 2005-10-31
GB0522153A GB0522153D0 (en) 2005-10-31 2005-10-31 Navigation system

Publications (1)

Publication Number Publication Date
WO2007051972A1 true WO2007051972A1 (en) 2007-05-10

Family

ID=37559849

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/GB2006/003958 WO2007051972A1 (en) 2005-10-31 2006-10-25 Navigation system

Country Status (1)

Country Link
WO (1) WO2007051972A1 (en)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0358628A2 (en) * 1988-09-06 1990-03-14 Transitions Research Corporation Visual navigation and obstacle avoidance structured light system
US5323593A (en) * 1993-01-13 1994-06-28 Cline Lohn G Method and apparatus for mowing lawns
DE19753668A1 (en) * 1997-05-12 1998-11-19 Kwang Ju Electronics Co Ltd Remote-controlled vacuum cleaner
DE10323418A1 (en) * 2002-06-26 2004-01-22 Samsung Gwangju Electronics Co. Ltd. Automatic cleaning device and automatic cleaning system and method for controlling them

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
MALLOT H A et al., "View-based cognitive map learning by an autonomous robot", Proceedings of the International Conference on Artificial Neural Networks, 1995, pages 381-386, XP007901498 *
TOMONO M, "Environment modeling by a mobile robot with a laser range finder and a monocular camera", IEEE Workshop on Advanced Robotics and its Social Impacts, Nagoya, Japan, 12-15 June 2005, IEEE, Piscataway, NJ, USA, pages 133-138, XP010838219, ISBN 0-7803-8947-6 *

Cited By (169)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
USRE45854E1 (en) 2006-07-03 2016-01-19 Faro Technologies, Inc. Method and an apparatus for capturing three-dimensional data of an area of space
KR101461185B1 (en) * 2007-11-09 2014-11-14 삼성전자 주식회사 Apparatus and method for building 3D map using structured light
EP2058720A2 (en) * 2007-11-09 2009-05-13 Samsung Electronics Co., Ltd. Apparatus and method for generating three-dimensional map using structured light
EP2058720A3 (en) * 2007-11-09 2012-12-19 Samsung Electronics Co., Ltd. Apparatus and method for generating three-dimensional map using structured light
US9182763B2 (en) 2007-11-09 2015-11-10 Samsung Electronics Co., Ltd. Apparatus and method for generating three-dimensional map using structured light
US8838268B2 (en) 2008-01-28 2014-09-16 Seegrid Corporation Service robot and method of operating same
US8755936B2 (en) 2008-01-28 2014-06-17 Seegrid Corporation Distributed multi-robot system
EP2252190A2 (en) * 2008-01-28 2010-11-24 Seegrid Corporation Service robot and method of operating same
US8892256B2 (en) 2008-01-28 2014-11-18 Seegrid Corporation Methods for real-time and near real-time interactions with robots that service a facility
EP2252190A4 (en) * 2008-01-28 2011-04-06 Seegrid Corp Service robot and method of operating same
US9603499B2 (en) 2008-01-28 2017-03-28 Seegrid Corporation Service robot and method of operating same
US8433442B2 (en) 2008-01-28 2013-04-30 Seegrid Corporation Methods for repurposing temporal-spatial information collected by service robots
US9188980B2 (en) 2008-09-11 2015-11-17 Deere & Company Vehicle with high integrity perception system
US8989972B2 (en) 2008-09-11 2015-03-24 Deere & Company Leader-follower fully-autonomous vehicle with operator on side
US9235214B2 (en) 2008-09-11 2016-01-12 Deere & Company Distributed knowledge base method for vehicular localization and work-site management
EP2169507A2 (en) * 2008-09-11 2010-03-31 Deere & Company Distributed knowledge base method for vehicular localization and work-site management
EP2169505A3 (en) * 2008-09-11 2014-11-12 Deere & Company Distributed knowledge base for vehicular localization and work-site management
EP2169507A3 (en) * 2008-09-11 2014-11-12 Deere & Company Distributed knowledge base method for vehicular localization and work-site management
EP2169505A2 (en) * 2008-09-11 2010-03-31 Deere & Company Distributed knowledge base for vehicular localization and work-site management
US9274524B2 (en) 2008-09-11 2016-03-01 Deere & Company Method for machine coordination which maintains line-of-site contact
US8719474B2 (en) 2009-02-13 2014-05-06 Faro Technologies, Inc. Interface for communication between internal and external devices
US9074883B2 (en) 2009-03-25 2015-07-07 Faro Technologies, Inc. Device for optically scanning and measuring an environment
US9551575B2 (en) 2009-03-25 2017-01-24 Faro Technologies, Inc. Laser scanner having a multi-color light source and real-time color receiver
WO2010141209A1 (en) * 2009-06-01 2010-12-09 Robert Bosch Gmbh Method and apparatus for combining three-dimensional position and two-dimensional intensity mapping for localization
US8473187B2 (en) 2009-06-01 2013-06-25 Robert Bosch Gmbh Method and apparatus for combining three-dimensional position and two-dimensional intensity mapping for localization
JP2012529012A (en) * 2009-06-01 2012-11-15 ロベルト ボッシュ ゲーエムベーハー Method and apparatus for combining 3D position and 2D intensity mapping for localization
US8625106B2 (en) 2009-07-22 2014-01-07 Faro Technologies, Inc. Method for optically scanning and measuring an object
US8384914B2 (en) 2009-07-22 2013-02-26 Faro Technologies, Inc. Device for optically scanning and measuring an environment
EP2294960A3 (en) * 2009-09-11 2013-11-20 Vorwerk & Co. Interholding GmbH Method for operating a cleaning robot
US8896819B2 (en) 2009-11-20 2014-11-25 Faro Technologies, Inc. Device for optically scanning and measuring an environment
US9417316B2 (en) 2009-11-20 2016-08-16 Faro Technologies, Inc. Device for optically scanning and measuring an environment
US9210288B2 (en) 2009-11-20 2015-12-08 Faro Technologies, Inc. Three-dimensional scanner with dichroic beam splitters to capture a variety of signals
US9529083B2 (en) 2009-11-20 2016-12-27 Faro Technologies, Inc. Three-dimensional scanner with enhanced spectroscopic energy detector
US9113023B2 (en) 2009-11-20 2015-08-18 Faro Technologies, Inc. Three-dimensional scanner with spectroscopic energy detector
US8705016B2 (en) 2009-11-20 2014-04-22 Faro Technologies, Inc. Device for optically scanning and measuring an environment
US10060722B2 (en) 2010-01-20 2018-08-28 Faro Technologies, Inc. Articulated arm coordinate measurement machine having a 2D camera and method of obtaining 3D representations
US10281259B2 (en) 2010-01-20 2019-05-07 Faro Technologies, Inc. Articulated arm coordinate measurement machine that uses a 2D camera to determine 3D coordinates of smoothly continuous edge features
US9163922B2 (en) 2010-01-20 2015-10-20 Faro Technologies, Inc. Coordinate measurement machine with distance meter and camera to determine dimensions within camera images
US9607239B2 (en) 2010-01-20 2017-03-28 Faro Technologies, Inc. Articulated arm coordinate measurement machine having a 2D camera and method of obtaining 3D representations
US9628775B2 (en) 2010-01-20 2017-04-18 Faro Technologies, Inc. Articulated arm coordinate measurement machine having a 2D camera and method of obtaining 3D representations
US9009000B2 (en) 2010-01-20 2015-04-14 Faro Technologies, Inc. Method for evaluating mounting stability of articulated arm coordinate measurement machine using inclinometers
US9329271B2 (en) 2010-05-10 2016-05-03 Faro Technologies, Inc. Method for optically scanning and measuring an environment
US9684078B2 (en) 2010-05-10 2017-06-20 Faro Technologies, Inc. Method for optically scanning and measuring an environment
EP3360455A1 (en) * 2010-07-12 2018-08-15 LG Electronics Inc. Robot cleaner and controlling method of the same
EP3360456A1 (en) * 2010-07-12 2018-08-15 LG Electronics Inc. Robot cleaner and controlling method of the same
US8705012B2 (en) 2010-07-26 2014-04-22 Faro Technologies, Inc. Device for optically scanning and measuring an environment
US8730477B2 (en) 2010-07-26 2014-05-20 Faro Technologies, Inc. Device for optically scanning and measuring an environment
US8699007B2 (en) 2010-07-26 2014-04-15 Faro Technologies, Inc. Device for optically scanning and measuring an environment
GB2496087B (en) * 2010-07-29 2014-08-06 Faro Tech Inc Device for optically scanning and measuring an environment
US8699036B2 (en) 2010-07-29 2014-04-15 Faro Technologies, Inc. Device for optically scanning and measuring an environment
GB2496087A (en) * 2010-07-29 2013-05-01 Faro Tech Inc Device for optically scanning and measuring an environment
WO2012013280A1 (en) * 2010-07-29 2012-02-02 Faro Technologies Inc. Device for optically scanning and measuring an environment
US9910444B2 (en) 2010-09-24 2018-03-06 Irobot Corporation Systems and methods for VSLAM optimization
US9286810B2 (en) * 2010-09-24 2016-03-15 Irobot Corporation Systems and methods for VSLAM optimization
US20120121161A1 (en) * 2010-09-24 2012-05-17 Evolution Robotics, Inc. Systems and methods for vslam optimization
US9026315B2 (en) 2010-10-13 2015-05-05 Deere & Company Apparatus for machine coordination which maintains line-of-site contact
US9168654B2 (en) 2010-11-16 2015-10-27 Faro Technologies, Inc. Coordinate measuring machines with dual layer arm
EP2584522A3 (en) * 2011-10-23 2015-04-22 Bond Textile Research BV A system for assessing a spatial use load of a building, a method for using the system, a dirt collector device and a method of covering a floor, both for use in the said system or method
US9417056B2 (en) 2012-01-25 2016-08-16 Faro Technologies, Inc. Device for optically scanning and measuring an environment
CN103325296A (en) * 2012-03-19 2013-09-25 联想(北京)有限公司 Information processing method and equipment used for simultaneous localization and mapping
US9448062B2 (en) 2012-04-11 2016-09-20 Balyo S. A. Apparatus and method for determining reference elements of an environment
WO2013153415A1 (en) * 2012-04-11 2013-10-17 Balyo Apparatus and method for determining reference elements of an environment
US9511494B2 (en) 2012-06-18 2016-12-06 Lg Electronics Inc. Robot cleaner and controlling method of the same
EP2677386A1 (en) * 2012-06-18 2013-12-25 LG Electronics Inc. Robot cleaner and obstacle detection control method of the same
US8997362B2 (en) 2012-07-17 2015-04-07 Faro Technologies, Inc. Portable articulated arm coordinate measuring machine with optical communications bus
US9563204B2 (en) 2012-08-14 2017-02-07 Husqvarna Ab Mower with object detection system
US8830485B2 (en) 2012-08-17 2014-09-09 Faro Technologies, Inc. Device for optically scanning and measuring an environment
US9939529B2 (en) 2012-08-27 2018-04-10 Aktiebolaget Electrolux Robot positioning system
US9618620B2 (en) 2012-10-05 2017-04-11 Faro Technologies, Inc. Using depth-camera images to speed registration of three-dimensional scans
US9746559B2 (en) 2012-10-05 2017-08-29 Faro Technologies, Inc. Using two-dimensional camera images to speed registration of three-dimensional scans
US11815600B2 (en) 2012-10-05 2023-11-14 Faro Technologies, Inc. Using a two-dimensional scanner to speed registration of three-dimensional scan data
US11112501B2 (en) 2012-10-05 2021-09-07 Faro Technologies, Inc. Using a two-dimensional scanner to speed registration of three-dimensional scan data
US10739458B2 (en) 2012-10-05 2020-08-11 Faro Technologies, Inc. Using two-dimensional camera images to speed registration of three-dimensional scans
US10067231B2 (en) 2012-10-05 2018-09-04 Faro Technologies, Inc. Registration calculation of three-dimensional scanner data performed between scans based on measurements by two-dimensional scanner
US9372265B2 (en) 2012-10-05 2016-06-21 Faro Technologies, Inc. Intermediate two-dimensional scanning with a three-dimensional scanner to speed registration
DE102012109481A1 (en) * 2012-10-05 2014-04-10 Faro Technologies, Inc. Device for optically scanning and measuring an environment
US9513107B2 (en) 2012-10-05 2016-12-06 Faro Technologies, Inc. Registration calculation between three-dimensional (3D) scans based on two-dimensional (2D) scan data from a 3D scanner
US9739886B2 (en) 2012-10-05 2017-08-22 Faro Technologies, Inc. Using a two-dimensional scanner to speed registration of three-dimensional scan data
US10203413B2 (en) 2012-10-05 2019-02-12 Faro Technologies, Inc. Using a two-dimensional scanner to speed registration of three-dimensional scan data
EP2741161A3 (en) * 2012-12-10 2017-07-12 Miele & Cie. KG Autonomous apparatus for processing a surface and method for detecting the position of the autonomous apparatus
DE102012112401B3 (en) * 2012-12-17 2014-03-27 Miele & Cie. Kg Self-propelled robot and method for determining a rotational position of at least one drive wheel of a self-propelled robot
EP2755101A2 (en) 2013-01-10 2014-07-16 Miele & Cie. KG Self-propelled robot and method for distance determination in a self-propelled robot
EP2755101A3 (en) * 2013-01-10 2017-11-29 Miele & Cie. KG Self-propelled robot and method for distance determination in a self-propelled robot
DE102013100192A1 (en) * 2013-01-10 2014-07-10 Miele & Cie. Kg Self-propelled robot and method for distance determination in a self-propelled robot
US10448794B2 (en) 2013-04-15 2019-10-22 Aktiebolaget Electrolux Robotic vacuum cleaner
US10219665B2 (en) 2013-04-15 2019-03-05 Aktiebolaget Electrolux Robotic vacuum cleaner with protruding sidebrush
CN104161487A (en) * 2013-05-17 2014-11-26 恩斯迈电子(深圳)有限公司 Mobile device
EP2804065A1 (en) * 2013-05-17 2014-11-19 MSI Computer (Shenzhen) Co., Ltd. Mobile device for generating first map information for an electronic device and receiving second map information from the electronic device
US9357893B2 (en) 2013-05-17 2016-06-07 Msi Computer (Shenzhen) Co., Ltd. Mobile device generating map information to an electronic device
EP2894533A2 (en) 2013-10-31 2015-07-15 LG Electronics Inc. Mobile robot and operating method thereof
EP2894533B1 (en) * 2013-10-31 2019-07-31 LG Electronics Inc. Mobile robot and operating method thereof
EP3588230A1 (en) * 2013-10-31 2020-01-01 Lg Electronics Inc. Method of controlling a mobile robot
EP2884364A1 (en) 2013-12-12 2015-06-17 Hexagon Technology Center GmbH Autonomous gardening vehicle with camera
US9603300B2 (en) 2013-12-12 2017-03-28 Hexagon Technology Center Gmbh Autonomous gardening vehicle with camera
CN104714547A (en) * 2013-12-12 2015-06-17 赫克斯冈技术中心 Autonomous gardening vehicle with camera
US9811089B2 (en) 2013-12-19 2017-11-07 Aktiebolaget Electrolux Robotic cleaning device with perimeter recording function
US10617271B2 (en) 2013-12-19 2020-04-14 Aktiebolaget Electrolux Robotic cleaning device and method for landmark recognition
US10149589B2 (en) 2013-12-19 2018-12-11 Aktiebolaget Electrolux Sensing climb of obstacle of a robotic cleaning device
US9946263B2 (en) 2013-12-19 2018-04-17 Aktiebolaget Electrolux Prioritizing cleaning areas
CN105849660A (en) * 2013-12-19 2016-08-10 伊莱克斯公司 Robotic cleaning device
US10045675B2 (en) 2013-12-19 2018-08-14 Aktiebolaget Electrolux Robotic vacuum cleaner with side brush moving in spiral pattern
US10433697B2 (en) 2013-12-19 2019-10-08 Aktiebolaget Electrolux Adaptive speed control of rotating side brush
US10209080B2 (en) * 2013-12-19 2019-02-19 Aktiebolaget Electrolux Robotic cleaning device
WO2015090397A1 (en) * 2013-12-19 2015-06-25 Aktiebolaget Electrolux Robotic cleaning device
US10231591B2 (en) 2013-12-20 2019-03-19 Aktiebolaget Electrolux Dust container
DE102014012811B4 (en) * 2014-03-27 2017-09-21 Miele & Cie. Kg Floor cleaning apparatus and method and system for determining a floor plan by a self-propelled floor cleaning device
DE102014012811A1 (en) * 2014-03-27 2015-10-01 Miele & Cie. Kg Floor cleaning apparatus and method and system for determining a floor plan by a self-propelled floor cleaning device
US10638899B2 (en) * 2014-05-02 2020-05-05 Lg Electronics Inc. Cleaner
US20150313432A1 (en) * 2014-05-02 2015-11-05 Lg Electronics Inc. Cleaner
EP2945038A3 (en) * 2014-05-15 2016-07-06 LG Electronics Inc. Method of controlling a cleaner
CN105094127A (en) * 2014-05-15 2015-11-25 Lg电子株式会社 Cleaner and method of controlling cleaner
EP3407152A1 (en) * 2014-05-15 2018-11-28 LG Electronics Inc. Method of controlling a cleaner
US9851720B2 (en) 2014-05-15 2017-12-26 Lg Electronics Inc. Method of controlling a cleaner
US10518416B2 (en) 2014-07-10 2019-12-31 Aktiebolaget Electrolux Method for detecting a measurement error in a robotic cleaning device
US10112302B2 (en) 2014-09-03 2018-10-30 Dyson Technology Limited Mobile robot
US10144342B2 (en) 2014-09-03 2018-12-04 Dyson Technology Limited Mobile robot
JP2016052514A (en) * 2014-09-03 2016-04-14 ダイソン・テクノロジー・リミテッド Mobile robot
US10729297B2 (en) 2014-09-08 2020-08-04 Aktiebolaget Electrolux Robotic vacuum cleaner
US10499778B2 (en) 2014-09-08 2019-12-10 Aktiebolaget Electrolux Robotic vacuum cleaner
WO2016054691A1 (en) * 2014-10-07 2016-04-14 Commonwealth Scientific And Industrial Research Organisation A method of setting up a tracking system
US9378554B2 (en) 2014-10-09 2016-06-28 Caterpillar Inc. Real-time range map generation
US10877484B2 (en) 2014-12-10 2020-12-29 Aktiebolaget Electrolux Using laser sensor for floor type detection
US10874271B2 (en) 2014-12-12 2020-12-29 Aktiebolaget Electrolux Side brush and robotic cleaner
US10534367B2 (en) 2014-12-16 2020-01-14 Aktiebolaget Electrolux Experience-based roadmap for a robotic cleaning device
US10678251B2 (en) 2014-12-16 2020-06-09 Aktiebolaget Electrolux Cleaning method for a robotic cleaning device
US11099554B2 (en) 2015-04-17 2021-08-24 Aktiebolaget Electrolux Robotic cleaning device and a method of controlling the robotic cleaning device
CN104916216A (en) * 2015-06-26 2015-09-16 深圳乐行天下科技有限公司 Map construction method and system thereof
CN105123090A (en) * 2015-07-16 2015-12-09 张萍 Multifunctional intelligent mower based on internet of things
US11712142B2 (en) 2015-09-03 2023-08-01 Aktiebolaget Electrolux System of robotic cleaning devices
CN107920709A (en) * 2015-09-03 2018-04-17 伊莱克斯公司 Robotic cleaning device system
US10874274B2 (en) 2015-09-03 2020-12-29 Aktiebolaget Electrolux System of robotic cleaning devices
US10599149B2 (en) 2015-10-09 2020-03-24 SZ DJI Technology Co., Ltd. Salient feature based vehicle positioning
EP3168704A1 (en) * 2015-11-12 2017-05-17 Hexagon Technology Center GmbH 3d surveying of a surface by mobile vehicles
CN107024199A (en) * 2015-11-12 2017-08-08 赫克斯冈技术中心 Surveying and mapping by mobile vehicles
US10175037B2 (en) 2015-12-27 2019-01-08 Faro Technologies, Inc. 3-D measuring device with battery pack
EP3417755A4 (en) * 2016-02-16 2019-11-20 Toshiba Lifestyle Products & Services Corporation Autonomous traveling device
US11169533B2 (en) 2016-03-15 2021-11-09 Aktiebolaget Electrolux Robotic cleaning device and a method at the robotic cleaning device of performing cliff detection
US11122953B2 (en) 2016-05-11 2021-09-21 Aktiebolaget Electrolux Robotic cleaning device
US11172605B2 (en) 2016-06-30 2021-11-16 Tti (Macao Commercial Offshore) Limited Autonomous lawn mower and a system for navigating thereof
US11832552B2 (en) 2016-06-30 2023-12-05 Techtronic Outdoor Products Technology Limited Autonomous lawn mower and a system for navigating thereof
US11172608B2 (en) 2016-06-30 2021-11-16 Tti (Macao Commercial Offshore) Limited Autonomous lawn mower and a system for navigating thereof
EP3569125A4 (en) * 2017-01-13 2020-10-28 LG Electronics Inc. -1- Cleaner and method for controlling same
US11324371B2 (en) 2017-01-13 2022-05-10 Lg Electronics Inc. Robot and method for controlling same
US11474533B2 (en) 2017-06-02 2022-10-18 Aktiebolaget Electrolux Method of detecting a difference in level of a surface in front of a robotic cleaning device
DE102017118227A1 (en) * 2017-08-10 2019-02-14 Vorwerk & Co. Interholding Gmbh Soil cultivation device with an accumulator
WO2019063299A1 (en) * 2017-09-26 2019-04-04 Robert Bosch Gmbh Method and system for mapping and locating a vehicle based on radar measurements
US11921517B2 (en) 2017-09-26 2024-03-05 Aktiebolaget Electrolux Controlling movement of a robotic cleaning device
US11493624B2 (en) 2017-09-26 2022-11-08 Robert Bosch Gmbh Method and system for mapping and locating a vehicle based on radar measurements
CN111373338A (en) * 2017-11-27 2020-07-03 罗伯特·博世有限公司 Method and apparatus for operating a mobile system
WO2019101651A1 (en) * 2017-11-27 2019-05-31 Robert Bosch Gmbh Method and device for operating a mobile system
US11537142B2 (en) 2018-03-19 2022-12-27 Amicro Semiconductor Co., Ltd. Method for robot repositioning
EP3770711A4 (en) * 2018-03-19 2021-09-29 Amicro Semiconductor Co., Ltd. Method for repositioning robot
CN108692701A (en) * 2018-05-28 2018-10-23 佛山市南海区广工大数控装备协同创新研究院 Mobile robot Multi-sensor Fusion localization method based on particle filter
CN108896027B (en) * 2018-06-04 2020-08-04 河南工程学院 Topographic mapping device for building
CN108896027A (en) * 2018-06-04 2018-11-27 河南工程学院 Topographic mapping device for building
US11656630B2 (en) * 2018-10-12 2023-05-23 Boston Dynamics, Inc. Autonomous map traversal with waypoint matching
WO2020076422A3 (en) * 2018-10-12 2020-05-14 Boston Dynamics, Inc. Autonomous map traversal with waypoint matching
KR102504729B1 (en) * 2018-10-12 2023-03-02 보스턴 다이나믹스, 인크. Autonomous map driving using waypoint matching
KR20230035139A (en) * 2018-10-12 2023-03-10 보스턴 다이나믹스, 인크. Autonomous map traversal with waypoint matching
KR20210058998A (en) * 2018-10-12 2021-05-24 보스턴 다이나믹스, 인크. Autonomous Map Driving Using Waypoint Matching
CN112840285A (en) * 2018-10-12 2021-05-25 波士顿动力公司 Autonomous map traversal with waypoint matching
US11747825B2 (en) 2018-10-12 2023-09-05 Boston Dynamics, Inc. Autonomous map traversal with waypoint matching
KR102648771B1 (en) 2018-10-12 2024-03-18 보스턴 다이나믹스, 인크. Autonomous map traversal with waypoint matching
US11774247B2 (en) 2019-08-06 2023-10-03 Boston Dynamics, Inc. Intermediate waypoint generator
EP3794921A1 (en) 2019-09-23 2021-03-24 Fabrizio Bernini Lawn mower robot and corresponding process
CN111973076A (en) * 2020-08-21 2020-11-24 苏州三六零机器人科技有限公司 Room attribute identification method and device, sweeping robot and storage medium
CN112462768A (en) * 2020-11-25 2021-03-09 深圳拓邦股份有限公司 Mobile robot navigation map creating method and device and mobile robot
CN112462768B (en) * 2020-11-25 2024-03-29 深圳拓邦股份有限公司 Mobile robot navigation map creation method and device and mobile robot
SE2250834A1 (en) * 2022-07-04 2024-01-05 Husqvarna Ab Improved determination of pose for a robotic work tool

Similar Documents

Publication Publication Date Title
WO2007051972A1 (en) Navigation system
CN112739244B (en) Mobile robot cleaning system
JP7259015B2 (en) Mobile robot and its control method
JP6539699B2 (en) Autonomous coverage robot navigation system
US20200047337A1 (en) Robotic platform with event based mode change
US20200047343A1 (en) Remote planning and locally adaptive service mapping
JP7060813B2 (en) Floor treatment machine for treating floor surface and its method
US9229454B1 (en) Autonomous mobile robot system
JP5255448B2 (en) Autonomous coverage robot navigation system
US9360300B2 (en) Methods and apparatus for position estimation using reflected light sources
Prassler et al. A short history of cleaning robots
JP4256812B2 (en) Obstacle avoidance method for moving body and moving body
CN112004645A (en) Intelligent cleaning robot
KR20180080498A (en) Robot for airport and method thereof
JP2015092348A (en) Mobile human interface robot
KR20090077547A (en) Method and apparatus of path planning for a mobile robot
JP2014197403A (en) Self-propelled teleconferencing platform
JP2022540160A (en) Mobile robot using artificial intelligence and control method for mobile robot
US20200397202A1 (en) Floor treatment by means of an autonomous mobile robot
WO2020086557A1 (en) Apparatus and method for operations of a robotic platform
EP2325713B1 (en) Methods and systems for movement of robotic device using video signal
KR20220121483A (en) Method of intelligently generating map and mobile robot thereof

Legal Events

Date Code Title Description
121 EP: the EPO has been informed by WIPO that EP was designated in this application
NENP Non-entry into the national phase

Ref country code: DE

122 EP: PCT application non-entry in European phase

Ref document number: 06794890

Country of ref document: EP

Kind code of ref document: A1