WO2007051972A1 - Navigation system - Google Patents

Navigation system

Info

Publication number
WO2007051972A1
WO2007051972A1 (PCT/GB2006/003958)
Authority
WO
WIPO (PCT)
Prior art keywords
environment
robotic device
map
operation
navigation system
Application number
PCT/GB2006/003958
Other languages
French (fr)
Inventor
Tej Paul Kaushal
Ronald Vincent Smith
Christopher Andrew Smith
David Bryan Mullin
James Edward Carroll
Original Assignee
Qinetiq Limited
Priority date
Filing date
Publication date
Priority to GB 0522153.6
Priority to US 60/731,816
Application filed by Qinetiq Limited
Publication of WO2007051972A1

Classifications

    • G01C21/00 Navigation; Navigational instruments not provided for in preceding groups G01C1/00-G01C19/00
    • G01S17/89 Lidar systems specially adapted for specific applications for mapping or imaging
    • G01S17/933 Lidar systems specially adapted for specific applications for anti-collision purposes between aircrafts or spacecrafts; between aircrafts or spacecrafts and fixed obstacles
    • G05D1/0221 Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory involving a learning process
    • G05D1/0248 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means in combination with a laser
    • G05D1/0251 Control of position or course in two dimensions specially adapted to land vehicles using a video camera in combination with image processing means extracting 3D information from a plurality of images taken from different locations, e.g. stereo vision
    • G05D1/0253 Control of position or course in two dimensions specially adapted to land vehicles using a video camera in combination with image processing means extracting relative motion information from a plurality of images taken successively, e.g. visual odometry, optical flow
    • G05D1/0272 Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means comprising means for registering the travel distance, e.g. revolutions of wheels
    • G05D1/0274 Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means using mapping information stored in a memory device
    • G05D2201/0203 Cleaning or polishing vehicle
    • G05D2201/0208 Lawn mower
    • G05D2201/0215 Vacuum cleaner
    • A47L2201/04 Automatic control of the travelling movement; Automatic obstacle detection

Abstract

A navigation system (2) comprising a primary mapping apparatus adapted to detect features within an environment and to create a summary map of the environment including an estimate of a point of current location within the environment; a secondary mapping apparatus adapted to provide a detailed three-dimensional map of the local environment in the vicinity of the point of current location; and a processor adapted to determine navigable points within the environment by combining information from the summary map and the detailed map. Specifically, the primary mapping apparatus detects features within the environment using an optical detector and utilises a visual simultaneous localisation and mapping (VSLAM) process (8) to create the summary map of the general environment.

Description

NAVIGATION SYSTEM

The present invention relates to a navigation system, and in particular to a navigation system for a mobile robotic device. The invention relates specifically, but not exclusively, to a navigation system for a domestic robotic device, for example an autonomous cleaner or lawn mower.

Existing domestic robots, for example autonomous cleaners and lawn mowers, are primitive and typically require a large number of basic sensors in order to operate autonomously. Even so, the operation of such domestic robots is simplistic. By way of example, conventional autonomous cleaners move in a random motion over an area to be cleaned. The motion of the cleaner is typically controlled by a random-motion algorithm stored within the device. In use, the cleaner will implement the random-motion algorithm in response to internal events (elapse of a random or predetermined time period, battery condition, etc.) and external stimuli (for example, detection of obstacles such as walls and furniture using simple bump sensors, and detection of precipices, such as the top of a flight of stairs, using an infrared emitter and detector).

Despite their unsophisticated nature, autonomous cleaners utilising random motion can provide acceptable cleaning coverage, given enough cleaning time. However, such operation is inefficient since the cleaner has no knowledge of where it has already cleaned and therefore needs to operate for an extended period to achieve acceptable coverage of the area to be cleaned.

The efficiency of an autonomous cleaner may be enhanced by incorporating a deterministic component into the otherwise random motion of the device. In deterministic cleaning, the autonomous cleaner follows a defined path that is calculated to completely cover the area to be cleaned while minimising redundant cleaning.

In order to implement deterministic cleaning, the autonomous cleaner must have a preprogrammed map of the area to be cleaned or create such a map by exploring the environment within which it is located. This prerequisite reduces the ease of use of the cleaner and requires the map to be updated to reflect changes in the environment. The autonomous cleaner must also maintain a knowledge of its current position and remember where it has cleaned, which in turn requires a sophisticated, accurate positioning and navigation system. A suitable positioning system might rely on marker beacons positioned around the space to be cleaned, a differential Global Positioning System (GPS), scanning laser ranging systems or other sophisticated systems.

By way of further example, a laser ranging system could be used in conjunction with simultaneous localisation and mapping (SLAM) software to construct a detailed map of the space to be cleaned.

Simultaneous localisation and mapping (SLAM) is a process of concurrently building a map of an environment based on stationary features or landmarks within the environment and using this map to obtain estimates of the location of a vehicle (an autonomous cleaner in this example). Simultaneous localisation and mapping is already used in the field of mobile robotics as a tool to enable fully autonomous navigation of a vehicle. In essence, the vehicle relies on its ability to extract useful navigation information from data returned by sensors mounted on the vehicle. Typical sensors might include a dead reckoning system (for example an odometry sensor or inertial measurement system) in combination with a laser rangefinder.

The vehicle starts at an unknown location with no a priori knowledge of landmark locations. From relative observations of landmarks, it simultaneously computes an estimate of vehicle location and an estimate of landmark locations. While continuing in motion, the vehicle builds a complete map of landmarks and uses these to provide continuous estimates of the vehicle location. By tracking the relative position between the vehicle and identifiable features in the environment, both the position of the vehicle and the position of the features can be estimated simultaneously. In the absence of external information about the vehicle's position, this algorithm presents an autonomous system with the tools necessary to navigate in unknown environments.
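The estimation loop described above can be illustrated with a deliberately simplified toy, in which repeated relative observations refine the landmark estimates and a re-observed landmark corrects the vehicle's own position estimate. This is a 2-D sketch with no noise or covariance bookkeeping, not the full SLAM formulation; all names are illustrative.

```python
class ToySlam:
    """Toy joint estimation of vehicle and landmark positions (2-D)."""

    def __init__(self):
        self.pose = (0.0, 0.0)   # estimated vehicle position (unknown start = origin)
        self.landmarks = {}      # landmark id -> (estimated position, observation count)

    def predict(self, dx, dy):
        """Dead-reckoning step (e.g. odometry); this estimate drifts over time."""
        x, y = self.pose
        self.pose = (x + dx, y + dy)

    def observe(self, lid, rel):
        """Relative observation of landmark `lid` at offset `rel` from the vehicle."""
        ox, oy = self.pose[0] + rel[0], self.pose[1] + rel[1]
        if lid not in self.landmarks:
            self.landmarks[lid] = ((ox, oy), 1)   # first sighting: initialise landmark
        else:
            (lx, ly), n = self.landmarks[lid]
            # Refine the landmark estimate with a running average ...
            lx, ly, n = (lx * n + ox) / (n + 1), (ly * n + oy) / (n + 1), n + 1
            self.landmarks[lid] = ((lx, ly), n)
            # ... and correct the vehicle pose against the refined landmark.
            self.pose = (lx - rel[0], ly - rel[1])
```

The key property, as in the text, is that vehicle and landmark estimates support each other: neither is known in advance, yet both improve as observations accumulate.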

The prospect of deploying a system that can build a map of its environment while simultaneously using that map to localise itself promises to allow robotic vehicles to operate autonomously in unknown environments. However, this is an expensive solution and the data processing overhead is high. Alternatively, a vision-based system could be used in conjunction with SLAM software to form a navigation system for a deterministic cleaner. This technique, known as visual SLAM (VSLAM), uses passive sensing to provide a low-power and dynamic localisation system. VSLAM uses a simple video camera as the input device and pattern processing algorithms to locate features in the images acquired by the video camera. The features are input to the SLAM algorithm, which is then able to accurately compute the three-dimensional location of each feature and hence start to build a three-dimensional map as the cleaner moves around the space to be cleaned. However, to do this in sufficient detail requires a large number of feature points in the image. In common with conventional SLAM, the data processing load is high (proportional to the square of the number of features selected).
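The quadratic processing load mentioned above can be made concrete: in an EKF-style (V)SLAM filter the joint state holds the vehicle pose plus every map feature, so the covariance matrix that must be updated on each observation grows with the square of the feature count. The dimensions below assume a 2-D filter with three pose terms and two coordinates per feature; exact counts vary by formulation.

```python
def covariance_entries(n_features, pose_dim=3, feat_dim=2):
    """Number of covariance-matrix entries an EKF-style SLAM filter maintains.

    The joint state vector stacks the vehicle pose (pose_dim terms) with every
    feature (feat_dim terms each), and the covariance is square in that size.
    """
    state_dim = pose_dim + feat_dim * n_features
    return state_dim * state_dim
```

With these assumptions, going from 10 to 100 features grows the matrix from 23 x 23 to 203 x 203 entries, which is why a sparse summary map is so much cheaper to maintain than a dense one.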

Accordingly, the abovementioned navigation systems have hitherto been considered to be prohibitively complex and expensive for domestic robotic devices.

It is an object of the invention to provide a navigation system which mitigates at least some of the disadvantages of the conventional navigation systems described above.

According to a first aspect of the present invention there is now proposed a navigation system comprising a primary mapping apparatus adapted to detect features within an environment and to create a summary map of the environment including an estimate of a point of current location within the environment; a secondary mapping apparatus adapted to provide a detailed three-dimensional map of the local environment in the vicinity of the point of current location; and a processor adapted to determine navigable points within the environment by combining information from the summary map and the detailed map.

Contrary to conventional navigation theory, the present navigation system utilises a summary map of the environment (a sparse map having few details therein) rather than a detailed map of the environment. Deliberately degrading the quality of the environmental map in this way is counter-intuitive, since conventional navigation systems typically aim to maximise the environmental detail in a map.

The present navigation system is predicated on the realisation that, for many applications, adequate navigational performance may be achieved using an approximate knowledge of the global environment in conjunction with a detailed map of the environment in the vicinity of the point of current location.

The present navigation system confers an unexpected advantage in terms of providing reliable navigation within the environment despite the paucity of detail within the environmental map.

In this respect, the summary map of the environment and the detailed three-dimensional map in the vicinity of the point of current location mutually support each other to such an extent that a new technical result is achieved.

The foregoing navigation system is advantageous in that the data processing load is reduced in comparison with conventional navigation systems having detailed maps of the environment. The hardware requirements of the navigation system (processor specification, memory etc.) are correspondingly reduced. In a preferred embodiment, the processor is configured to provide instructions to a motion control system so as to navigate from the point of current location to another navigable point within the environment.

Advantageously, the primary mapping apparatus has an optical sensor adapted to detect the features within the environment, and utilises a visual simultaneous localisation and mapping (VSLAM) process to create the summary map of the environment.

As an enhancement of the conventional SLAM process, visual simultaneous localisation and mapping (VSLAM) uses a passive optical sensor, for example a simple video camera, and dead reckoning to make visual measurements of an environment. VSLAM uses pattern processing algorithms to locate visual features in the images acquired by the video camera. The features are input to the SLAM algorithm which is then able to accurately compute the three-dimensional location of the visual features and hence start to build a three-dimensional map of the environment. Navigation within the environment is based on recognised visual features or landmarks on the VSLAM map.

Upon entering a new environment, the VSLAM-enabled navigation system starts creating a new map through exploration of the environment. By this process, the navigation system detects features in the environment, creates landmarks therefrom, and corrects its position information as necessary using its VSLAM map.

The secondary mapping apparatus may comprise any suitable means of simply generating a localised three-dimensional map. For example, the secondary mapping apparatus may comprise a passive system, e.g. a stereoscopic imaging system. By way of further example, the secondary mapping apparatus may comprise an imaging apparatus having a structured light generator capable of creating a structured pattern of light within the environment. The imaging apparatus typically includes an imaging sensor, for example a camera, arranged to capture images of the environment including the structured pattern of light projected therein. The camera is capable of detecting the position relative thereto of a plurality of points in the pattern projected within the environment, for example using triangulation / trigonometry. The secondary mapping apparatus preferably includes a pattern processor arranged in use to determine from the detected position of a point in the pattern within the environment, the range to that point. The detected position and range information are then used to create a detailed three-dimensional map of the local environment in the vicinity of the point of current location.
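The triangulation step described above can be sketched as follows: a projector offset from the camera by a known baseline casts a spot into the scene, and the bearing at which the camera observes the spot fixes the range by the sine rule. The geometry and symbol names are illustrative, not taken from the patent.

```python
import math

def range_to_spot(baseline_m, proj_angle_rad, cam_angle_rad):
    """Range from the camera to a projected spot, by triangulation.

    The projector emits the spot at `proj_angle_rad` and the camera observes
    it at `cam_angle_rad`, both angles measured from the baseline joining the
    two devices. The camera, projector and spot form a triangle, so the sine
    rule gives the camera-to-spot side directly.
    """
    spot_angle = math.pi - proj_angle_rad - cam_angle_rad  # third angle, at the spot
    # The side opposite the projector's angle is the camera-to-spot range.
    return baseline_m * math.sin(proj_angle_rad) / math.sin(spot_angle)
```

For example, with a 1 m baseline and both angles at 60 degrees the triangle is equilateral and the computed range is 1 m. Repeating this for every detected point in the projected pattern yields the detailed local three-dimensional map.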

Conveniently, the imaging apparatus comprises at least one of a spot projector and a pattern projector adapted to project a geometric shape. Where the imaging apparatus comprises a spot projector, the spots may comprise intersections between continuous lines. In this case, the projector may project two sets of regularly spaced lines, the two sets preferably being substantially orthogonal. By way of further example, the projector may comprise at least one of a kaleidoscope and a holographic element through which light is transmitted into the environment.

Conventional SLAM navigation systems, if used with a secondary mapping means, would traditionally seek to produce an exhaustive environmental map by augmenting the summary information from the primary mapping apparatus with detailed information captured by the secondary mapping apparatus. The present navigation system overcomes this technical prejudice by maintaining a summary map in respect of the global environment and only seeking further detail where necessary, i.e. in the vicinity of the point of current location.

This may be more easily understood by considering the type of information contained respectively within the summary and detailed three-dimensional maps. By way of brief explanation, the summary VSLAM map merely comprises a collection of features or landmarks within the environment with associated vectors defining their relative positions with respect to the point of current location and with respect to each other. In this respect the summary map is an abstract representation of the environment since it contains little or no information regarding the environment between the sparse features or landmarks within the map.

In order to navigate within the environment, the navigation system infers information about the environment based on the relationship(s) between features or landmarks in the summary map. For example, a plurality of features lying within a common plane may be indicative of a solid planar feature within the environment, e.g. a floor, a wall etc. However, ambiguities remain in the summary map which give rise to uncertainties as to whether points within the environment are navigable. The information produced by the secondary mapping apparatus is a detailed three-dimensional representation of the environment local to the point of current location. Hence, the detailed three-dimensional map is complementary to the summary map and may be used to remove ambiguities in the summary map by confirming navigable points therein.
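The inference step described above, deciding whether a group of sparse map features suggests a solid planar surface such as a floor or wall, can be sketched as a coplanarity test. The plane is fitted from the first three points and the rest are checked against it; the tolerance and function names are illustrative assumptions.

```python
def cross(u, v):
    """Cross product of two 3-D vectors."""
    return (u[1] * v[2] - u[2] * v[1],
            u[2] * v[0] - u[0] * v[2],
            u[0] * v[1] - u[1] * v[0])

def coplanar(points, tol=1e-6):
    """True if all 3-D points lie (within tol) in a single plane.

    Assumes the first three points are not collinear; a robust implementation
    would fit the plane by least squares instead.
    """
    if len(points) < 4:
        return True
    p0 = points[0]
    u = tuple(a - b for a, b in zip(points[1], p0))
    v = tuple(a - b for a, b in zip(points[2], p0))
    n = cross(u, v)  # plane normal from the first three points
    # Each remaining point must have (almost) zero offset along the normal.
    return all(abs(sum(nc * (pc - qc) for nc, pc, qc in zip(n, p, p0))) <= tol
               for p in points[3:])
```

A positive result only suggests a floor or wall; as the text notes, it is the detailed local map from the secondary mapping apparatus that resolves whether the region really is navigable.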

It should be noted that the information from the detailed three-dimensional map is not incorporated into the summary map; the information from the former is merely used to refine the latter in terms of identifying navigable and non-navigable points therein.

Preferably, the imaging sensor of the secondary mapping apparatus and the optical sensor of the primary mapping apparatus comprise a single common sensor. It is noteworthy that the level of detail held within the global environmental map remains low at all times, despite the use of optical sensor(s) for both primary and secondary mapping apparatuses and the availability of detailed environmental information from the latter.

Even more preferably, the common sensor is one of a video camera, a CMOS camera and a charge-coupled device (CCD).

In a preferred embodiment, the optical sensor may be arranged to have a field of view which includes an upward direction. In this configuration the optical sensor may be arranged, in use, to detect features disposed on a substantially planar surface within the environment, for example a portion of a ceiling or roof. In this case, the primary mapping apparatus is adapted to create from said detected features a summary map of the environment underlying said planar surface.

According to a second aspect of the present invention there is now proposed a vehicle having a navigation system according to the first aspect of the invention.

According to a third aspect of the present invention there is now proposed a mobile robotic device having a navigation system according to the first aspect of the invention.

In one embodiment, the mobile robotic device comprises any of a domestic robotic device, an industrial robotic device, a security robotic device, a floor cleaner, a vacuum cleaner, a lawn mower, a robotic entertainment device, a maintenance device, a surveying device, a fork lift truck, and a car.

The present navigation system represents a cost-effective way of delivering, to a robot (particularly a domestic mobile robot), information about the immediate and general environment in which it operates, allowing it to undertake navigation, and therefore domestic and other tasks (e.g. cleaning, mowing, repair work, maintenance, surveillance, surveying, etc.), more quickly and effectively.

In the case of a robotic cleaner, the present navigation system enables deterministic cleaning of an area to be cleaned. In deterministic cleaning, the robotic cleaner follows a defined path that is calculated to completely cover the area to be cleaned while minimising redundant cleaning. When applied to a robotic cleaner, the present navigation system confers benefits in terms of reduced cost, a reduction in the number of sensors required, operational simplicity, faster cleaning of a given area, cleaning of larger areas, lower power consumption (or, in the case of a vacuum cleaner, increased power for vacuuming), lower use of consumables, less noise and reduced wear. The secondary mapping apparatus ensures safe operation by helping to determine whether the robotic cleaner can move into a given space. This is particularly important within environments inhabited by people or pets, in which case the secondary mapping apparatus facilitates safe and intelligent movement around inhabitants within the environment.

Additionally, the secondary mapping apparatus provides a capability to detect precipices within the environment: a vital safety feature when used in an environment having different levels separated by stairs.

Hence, the secondary mapping apparatus enables the robotic cleaner to avoid precipices and stop in the vicinity of an object without colliding therewith. This capability confers a high degree of confidence when navigating within the environment, allowing the robotic cleaner to accelerate and decelerate gracefully as it moves within the environment.

Moreover, deterministic cleaning allows a robotic cleaner to have a shape, e.g. a flat or acutely shaped front with corners, which facilitates cleaning of edges and corners. In addition, the robotic cleaner may be arranged to have a substantially semi-circular rear portion. In contrast, conventional robotic cleaners which move in a random motion are typically circular so that they can extricate themselves from small spaces by merely rotating about their own axes.

It is estimated that the present navigation system enables the performance of a domestic robotic cleaner to be improved by a factor of approximately twelve over conventional devices (based on the following assessments: the robot cleans a given area six times faster than a cleaner employing random motion; the size of the battery is halved owing to the reduction in redundant cleaning). Alternatively, the time taken to clean a given area may be reduced by a factor of approximately twelve over conventional devices (based on the following assessments: the robot cleans a given area six times faster than a cleaner employing random motion; the velocity at which the cleaner moves within the environment can be doubled due to the high degree of confidence with which the cleaner navigates within the environment; the size of the battery is maintained). Hence, the time taken to clean a room may be reduced from approximately one hour using a conventional device to approximately five minutes using the present robotic cleaner.
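The factor-of-twelve estimate above decomposes exactly as stated, and the arithmetic can be checked directly. The individual figures are the patent's own assessments, not measurements.

```python
# Performance variant: same cleaning speed assessment, smaller battery.
coverage_speedup = 6      # deterministic cleaning vs random motion (assessment)
battery_factor = 2        # battery size halved owing to less redundant cleaning
performance_gain = coverage_speedup * battery_factor

# Time variant: same battery, velocity doubled thanks to confident navigation.
velocity_factor = 2
time_factor = coverage_speedup * velocity_factor
cleaning_time_minutes = 60 / time_factor   # one hour reduced to five minutes
```

Both variants multiply out to the same overall factor of twelve, which is why the text quotes a single improvement figure for the two cases.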

In a preferred embodiment, the mobile robotic device may be adapted to receive a detachable accessory. With the accessory fitted the robotic device may adopt a supplementary mode of operation.

For example, in the case of a robotic entertainment device (e.g. a robotic pet), the detachable accessory may be a lead or a leash. The robotic pet could then be safely taken for a walk with the lead attached thereto.

Preferably, the mobile robotic device comprises a vacuum cleaner adapted to receive a detachable accessory.

Conveniently, the mobile robotic device has a vacuum pressure variable between a minimum and a maximum, wherein maximum vacuum pressure may be automatically selected in the supplementary mode of operation. Hence, the robotic device may be configured to operate with maximum suction (maximum vacuum pressure) in the supplementary mode of operation. Advantageously, the accessory includes an accessory hose adapted to engage in fluid communication with a vacuum inlet on the vacuum cleaner.

In a preferred embodiment, the accessory hose is manoeuvrable by a user and the navigation system may be adapted to provide instructions to the motion control system so as to follow the user, while preferably concurrently avoiding obstacles and inhabitants within the environment.

In another embodiment, the navigation system is disabled in the supplementary mode of operation. In this embodiment, the accessory may include a handle. This facilitates manual cleaning of areas inaccessible to the robot when under autonomous control, for example stairs etc. Hence, the robotic vacuum cleaner of the invention can be a primary cleaner in a domestic dwelling, obviating an additional manual cleaner. Similarly, where the mobile robotic device comprises a lawnmower, the ability to manually mow areas of the lawn obviates an additional manual lawnmower.

According to a fourth aspect of the present invention there is now proposed a method of controlling a robotic device within an area to be traversed, the robotic device having a variable power requirement and a navigation system adapted to map features in an environment, the method comprising the steps of:

(i) in a first mode of operation, moving the robotic device in a substantially random motion within the area to be traversed whilst concurrently mapping the environment so as to create a summary map of the area to be traversed, wherein the robotic device is configured to use a minimum power consumption during said first mode of operation;

(ii) in a second mode of operation, moving the robotic device in at least one direction so as to map the environment in greater detail and to create a complete summary map of the area to be traversed, wherein the device is configured to use an increased power consumption during said second mode of operation; and

(iii) in a third mode of operation, moving the robotic device in a deterministic motion so as to provide optimum traversing of the area to be traversed, wherein the device is configured to use an increased power consumption during said third mode of operation.
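The three modes of operation above can be sketched as a simple state machine: random exploration at minimum power, directed mapping at increased power, then deterministic traversal. The mode names, power labels and transition condition are illustrative assumptions, not terms from the patent.

```python
MODES = {
    "first":  {"motion": "random",        "power": "minimum"},
    "second": {"motion": "directed",      "power": "increased"},
    "third":  {"motion": "deterministic", "power": "increased"},
}
ORDER = ["first", "second", "third"]

def next_mode(mode, goal_met):
    """Advance through the modes in numerical sequence once each goal is met.

    The goal for the first two modes is completing/refining the summary map;
    the third (deterministic) mode is terminal.
    """
    i = ORDER.index(mode)
    if goal_met and i + 1 < len(ORDER):
        return ORDER[i + 1]
    return mode

def on_navigation_failure(mode):
    """Fall back to random first-mode motion if the navigation system fails."""
    return "first"
```

This captures the numerical-sequence behaviour; the text also allows mode selection from a status condition rather than strict sequencing.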

The secondary mapping apparatus as hereinbefore described with respect to the first aspect of the invention ensures safe operation of the robotic device during all operating modes, by helping to determine whether the robotic device can move into a given space. As will be readily appreciated, this is of particular importance during the first and second modes of operation, while the summary map of the area to be traversed is being created and refined. Hence, the robotic device is able to avoid precipices, to stop in the vicinity of objects within said area without colliding with them, and to move in an intelligent manner around inhabitants within the area to be traversed.

In a preferred embodiment, the device may be configured only to use sufficient power to traverse the area and map the environment during said first mode of operation.

In use, the robotic device may operate in the first, second and third modes of operation in numerical sequence.

Alternatively, or in addition, the mode within which the robotic device operates is selected in response to a status condition.

In a preferred embodiment, the status condition is derived from a plurality of variables, each variable having a changeable weighting factor applied thereto so as to optimise the behaviour of the robotic device.

Advantageously, the variables are selected from: exploration of the area to be traversed, operation of the device, localisation within the environment, efficiency of operation and operating time.

In another preferred embodiment, the robotic device reverts to the first mode of operation in the event of a failure in the navigation system, or alternatively may switch itself off. The device may be a cleaner or mower or any other device mentioned above and the power consumption may relate to suction power, for example, for a cleaner or to brush action or dispensation of cleaning fluid. In the case of a vacuum cleaner, minimum suction or no suction may be used when operating in the first mode of operation (in particular where the device is configured only to use sufficient power to traverse the area and map the environment during said first mode of operation), and maximum suction may be used when operating in at least one of the second or third operating modes. Similarly, the power consumption may relate to the speed at which the device traverses the area. For a mower, power consumption may relate to operation of a cutting blade, strimmer or the like. For maintenance and repair and survey equipment power consumption may relate to operation of features on the device for carrying out such operations.
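The mode-dependent power scheme described above can be sketched as follows. This is an illustrative sketch only: the mode names, power fractions and fallback behaviour are assumptions for the purposes of illustration, not values specified herein.

```python
# Illustrative sketch of mode-dependent suction power for the three
# operating modes; the mode names and power fractions are assumed.
MODE_SUCTION = {
    "explore": 0.0,  # first mode: traverse and map only, minimum power
    "refine": 0.5,   # second mode: detailed mapping, increased power
    "clean": 1.0,    # third mode: deterministic coverage, maximum power
}


def suction_for_mode(mode, navigation_ok=True):
    """Return the suction fraction for the current mode. On a navigation
    failure the device reverts to the first mode, as described above."""
    if not navigation_ok:
        return MODE_SUCTION["explore"]
    return MODE_SUCTION[mode]
```

A failure in the navigation system thus maps every mode back to the minimum-power exploratory behaviour (the alternative of switching off entirely is omitted from the sketch).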

According to a further aspect of the present invention, there is now proposed the use of a visual simultaneous localisation and mapping (VSLAM) process to create a summary map of an environment, and a localised three-dimensional imaging technique to generate a local map of the environment, for autonomous navigation.

The invention will now be described, by way of example only, with reference to the accompanying drawings, in which:

Figure 1 shows a block diagram of a navigation system according to one embodiment of the present invention.

Figure 2 shows a schematic diagram of a mobile robotic device incorporating the navigation system shown in figure 1.

Figure 3 illustrates the architecture of a robotic vacuum cleaner incorporating the navigation system according to an embodiment of the present invention.

Figure 4 shows an example of a map of an area to be cleaned created by the robotic vacuum cleaner of figure 3.

Referring now to the drawings wherein like reference numerals identify corresponding or similar elements throughout the several views, figure 1 shows a block diagram of a navigation system 2 according to one embodiment of the present invention. The navigation system 2 incorporates a primary mapping apparatus comprising a visual simultaneous localisation and mapping (VSLAM) apparatus and a secondary mapping apparatus consisting of an accurate three-dimensional mapping apparatus. The VSLAM apparatus and the three-dimensional mapping apparatus work in parallel and have complementary functions within the navigation system 2.

The VSLAM apparatus comprises a camera 4 arranged to capture images of the environment. The images are passed from the camera 4 to a feature detector 6 which processes the image to detect naturally occurring features within the environment, e.g. edges etc. The VSLAM apparatus uses a sophisticated vision algorithm 8 to build a summary map of the environment consisting of a set of unique "landmarks" derived from the features observed in the environment. The VSLAM summary map typically comprises a vector map of landmarks within the environment.
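Such a summary map can be pictured as a small collection of uniquely identified landmarks whose positions are accurately known. The sketch below is illustrative only; the class and field names are assumptions and do not form part of the disclosure:

```python
from dataclasses import dataclass, field


@dataclass
class Landmark:
    landmark_id: int
    x: float  # landmark position within the map
    y: float


@dataclass
class SummaryMap:
    """Sparse summary map: a small set of uniquely identified landmarks
    whose positions within the map are accurately known."""
    landmarks: dict = field(default_factory=dict)

    def add(self, lm):
        self.landmarks[lm.landmark_id] = lm

    def nearest(self, x, y):
        # Closest landmark to a query position; stands in here for the
        # landmark recognition used during localisation
        return min(self.landmarks.values(),
                   key=lambda lm: (lm.x - x) ** 2 + (lm.y - y) ** 2)
```

Because the map is sparse, such a structure stays small even for a whole dwelling, which is part of the reason a summary map suffices for global navigation.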

The skilled person would be aware of a variety of VSLAM apparatuses. However, it should be noted that conventional VSLAM apparatuses are aimed at high accuracy applications, whereas the present VSLAM apparatus is configured to produce a summary map of the environment (a sparse map having few landmarks therein) rather than a detailed map of the environment. This is based on the realisation that, for many applications, adequate navigational performance may be achieved using an approximate knowledge of the global environment in conjunction with a separate detailed map of the local environment in order to assess whether the immediate environment is navigable, for example is free from encumbrances, obstacles, inhabitants etc.

It is noteworthy that, although the VSLAM apparatus produces a summary map of the environment having few landmarks, the location of each landmark within the VSLAM map is accurately known.

The accurate three-dimensional mapping apparatus provides the requisite detailed map of the local environment. The three-dimensional mapping apparatus comprises three-dimensional sensors 10 arranged to capture detailed information about the local environment. Electromechanical sensors and / or optical sensors are used. Typically, the sensors 10 could comprise scanning laser rangefinders. Preferably the sensors 10 are low cost, simple and eye safe.

The accurate three-dimensional mapping apparatus incorporates a processor 12 for processing the raw information from the three-dimensional sensor 10. The processor 12 provides a detailed map of the local environment in the vicinity of a point of current location within the environment. The detailed map typically comprises a collection of points within the environment measured with respect to the three-dimensional sensor 10 (also referred to hereinafter as point cloud information).

The summary map produced by the VSLAM apparatus and the detailed map from the accurate three-dimensional mapping apparatus are passed to a navigation controller 14 which provides instructions to a motion control system 16 in order to control movement of a mobile platform, for example a robot, within the mapped environment.

When used in a mobile robotics application, the navigation system 2 uses recognised landmarks to estimate the position of the robot within the environment. Furthermore, the VSLAM apparatus updates and refines the summary map as the robot revisits mapped areas, allowing it to adapt to changes in the environment. In the case of a domestic mobile robot, such changes in the environment may arise due to rearrangement of furniture or objects within a room etc.

Referring to figure 2, a schematic block diagram of a mobile robotic device 20 incorporating the navigation system of figure 1 is shown. The robotic device 20 incorporates the elements of the navigation system described above. However, in this particular embodiment, the separate camera 4 and three-dimensional sensors 10 are replaced by a single common sensor in the form of a camera 22. The camera is a conventional video camera, typically a CMOS camera or, alternatively, a charge-coupled device (CCD).

The camera is mounted on the robotic device 20 so as to capture images of the environment in front of the robotic device within the field of view 24 of the camera 22. The robotic device 20 also incorporates a structured light generator comprising a spot projector 26, operating at visible or infrared wavelengths, to provide information about the local environment to the accurate three-dimensional mapping apparatus. The principle of operation of the accurate three-dimensional mapping apparatus is as follows. During use, the spot projector 26 projects a plurality of spots into the environment. The camera 22 is arranged to capture images of the environment including the array of spots projected therein. The processor 12 analyses the images from the camera 22 and determines the position with respect to the camera of each projected spot within the environment, for example using triangulation / trigonometry. The processor 12 preferably includes a pattern processor adapted to determine from the detected position of a spot in the environment, the range to that spot. The detected position and range information are then used to create a detailed three-dimensional map of the environment local to the robotic device.
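As a rough illustration of the triangulation step, one common simplification treats the displacement of a projected spot in the image, relative to its infinite-range position, as inversely proportional to range, as with stereo disparity. The function below is a sketch under that assumption only; it is not the exact geometry used by the processor 12.

```python
def spot_range(pixel_offset_px, focal_length_px, baseline_m):
    """Range to a projected spot by triangulation, assuming the spot's
    image displacement from its infinite-range position scales as
    baseline * focal_length / range (the stereo-disparity relation)."""
    if pixel_offset_px <= 0:
        return float("inf")  # no displacement: spot effectively at infinity
    return baseline_m * focal_length_px / pixel_offset_px
```

With an assumed 5 cm projector-to-camera baseline and a 500-pixel focal length, a 10-pixel spot displacement corresponds to a range of 2.5 m; repeating this for every spot in the array yields the point cloud.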

In one embodiment, the spot projector 26 is adapted to project a two-dimensional array of spots into the environment. A suitable spot projector is described in PCT patent application publication WO 2004/044523, the content of which is hereby incorporated by reference thereto.

By way of a brief explanation, the spot projector 26 comprises a light source arranged to illuminate part of an input face of a light guide, the light guide comprising a tube having substantially reflective sides and being arranged together with projection optics so as to project an array of distinct images of the light source into the environment. The light guide in effect operates as a kaleidoscope. Light from the source is reflected from the sides of the tube and can undergo a number of reflection paths within the tube. The result is that multiple images of the light source are produced and projected into the environment. Thus the environment is illuminated with an array of images of the light source. Where the source is a simple light emitting diode, the scene is therefore illuminated with an array of spots of light.

The light guide kaleidoscope gives very good image replication characteristics and projects images of the input face of the light guide over a wide angle, i.e. a large number of spots are projected in all directions. Further, the kaleidoscope produces a large depth of field and so delivers a large operating window.

The light guide comprises a tube with substantially reflective walls and a constant cross section, which is conveniently a regular polygon. Having a regular cross section means that the array of images of the light source will also be regular, which is advantageous for ensuring the whole environment is covered and eases processing. A square section tube is most preferred. Typically, the light guide has a cross-sectional area in the range of a few square millimetres to a few tens of square millimetres, for instance in the range of 1-50 mm². As mentioned, the light guide preferably has a regular cross section with a longest dimension of a few millimetres, say 1-5 mm. One embodiment, as mentioned, is a square section tube having a side length of 2-3 mm. The light guide may have a length of a few tens of millimetres, for example between 10 and 70 mm. Such light guides can generate a grid of spots over an angle of 50-100 degrees (typically about twice the total internal angle within the light guide).

The tube comprises a hollow tube having reflective internal surfaces, i.e. mirrored internal walls. Alternatively the tube is fabricated from a solid material and arranged such that a substantial amount of light incident at an interface between the material of the tube and surrounding material undergoes total internal reflection. The tube material is either coated in a coating with a suitable refractive index or designed to operate in air, in which case the refractive index of the light guide material should be such that total internal reflection occurs at the material air interface.

Using a tube like this as a light guide results in multiple images of the light source being generated which can be projected to the environment to form the array of spots. The light guide is easy to manufacture and assemble and couples the majority of the light from the source to the scene. Thus low power sources such as light emitting diodes can be used. As the exit aperture can be small, the apparatus also has a large depth of field which makes it useful for ranging applications which require spots projected that are separated over a wide range of distances.

Either individual light sources are used close to the input face of the light guide to illuminate just part of the input face, or one or more light sources are used to illuminate the input face of the light guide through a mask. Using a mask with a transmissive portion for passing light to a part of the light guide can be easier than using individual light sources. Accurate alignment of the mask is required at the input face of the light guide, but this may be easier than accurately aligning an LED or LED array.

Where a mask is used, the illumination means comprises a homogeniser located between the light source and the mask so as to ensure that the mask is evenly illuminated. The light source may therefore be any light source giving an acceptable level of brightness and does not need accurate alignment. Alternatively, an LED with oversized dimensions is used to relax tolerances in manufacture/alignment.

The projection optics comprise a projection lens. The projection lens is located adjacent the output face of the light guide. In some embodiments where the light guide is solid the lens may be integral to the light guide, i.e. the tube may be shaped at the output face to form a lens.

All beams of light projected by the spot projector 26 pass through the end of the light guide and can be thought of as originating from the point at the centre of the end face of the light guide. The projection optics can then comprise a hemispherical lens, and if the centre of the hemisphere coincides with the centre of the light guide output face, the apparent origin of the beams remains at the same point, i.e. each projected image has a common projection origin. In this arrangement the projector does not have an axis as such, as it can be thought of as a source of beams radiating across a wide angle. What matters for the imaging apparatus, therefore, is the geometrical relationship between the point of origin of the beams and the principal point of the lens of the camera 22.

The spot projector 26 is controlled by a master clock 32 which provides synchronisation between the monochromatic camera 22, the spot projector 26 and a de-multiplexer 30 connected to the output of the camera 22.

During operation, the output from the camera 22 is switched between the primary mapping apparatus (the VSLAM apparatus, denoted by numerals 6 and 8) and the secondary mapping apparatus (the accurate three-dimensional mapping apparatus, denoted by numerals 12 and 14). The clock 32 controls the spot projector 26 and the de-multiplexer 30 such that a pattern of spots is projected into the local environment at the same time as the output from camera 22 is routed to the accurate three-dimensional mapping apparatus. The clock 32 and the de-multiplexer 30 are configured such that images (frames) produced by the camera 22 are sent alternately to the VSLAM apparatus 6, 8 and the accurate three-dimensional mapping apparatus 12, 14.
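The alternating routing described above can be sketched as a simple frame counter; this is an illustrative model of the behaviour of the clock 32 and de-multiplexer 30, not their actual implementation, and the destination labels are assumed names.

```python
class FrameDemultiplexer:
    """Alternates camera frames between the VSLAM apparatus (spot
    projector off) and the 3D mapping apparatus (spot projector on),
    mimicking the roles of the master clock 32 and de-multiplexer 30."""

    def __init__(self):
        self.frame_index = 0

    def projector_on(self):
        # The spot pattern is projected only on frames routed to 3D mapping
        return self.frame_index % 2 == 1

    def route(self, frame):
        dest = "3d_mapping" if self.projector_on() else "vslam"
        self.frame_index += 1
        return dest, frame
```

Because the projector fires only on the 3D-mapping frames, the VSLAM frames remain free of projected spots that might otherwise be mistaken for natural features.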

In order to move autonomously within the environment, the robotic device 20 includes a motion control system 16 comprising a driving unit having wheels and at least one motor arranged to provide a motive force to the wheels. The motion control system 16 provides control signals to the motor(s) in response to instructions received from the navigation controller 14 so as to move the robotic device 20 within the environment.

To constrain the computational power required, it is helpful if the VSLAM algorithm 8 has some knowledge of the movement of the robotic vehicle 20. Accordingly, information about the robot's movement, either from the use of a stepper motor or a wheel rotation encoder, is fed back to the VSLAM algorithm 8.

When the mobile robotic device is used indoors, the field of view 24 of the camera optionally extends to image a portion of the ceiling or roof of the room or internal space within which the robot is located. In the event that the field of view 24 of the forward looking camera is insufficient, a separate camera may be used to capture images of the ceiling or roof.

Where the ceiling or roof is visible, the VSLAM apparatus maps it and indirectly determines the floor plan which is directly below. This hitherto unknown technique provides the benefit that individual rooms can more easily be defined, for example since door openings do not normally extend to the ceiling. Also, as the ceiling of a room is typically large, plain and white it may be an easy object to see even with a low performance camera. In use, doors can now be open or shut, as the user desires, with the robot still able to map and navigate one room before knowingly moving on to the next one.

Using a ceiling to create a VSLAM map of a room overcomes other disadvantages; for example, when trying to map a room from a video image taken at floor level, the obstacles (chairs, tables, etc.) obstruct a large portion of the view of the walls, making it hard to immediately generate a map of room boundaries. Also, from floor level it may not be obvious to the navigation system whether a gap is a doorway or merely a space between two pieces of furniture. Hence it would not be clear where the room boundaries were if the doors were left open. This becomes important in the case of a robotic cleaner, which should logically clean rooms one at a time, and not wander accidentally out of the room before it has finished cleaning it. The foregoing obviates the use of light beam units which are traditionally placed across a doorway in the prior art and which trigger a sensor on the robot to stop it leaving the room.

Referring to Figure 3, the architecture of a robotic vacuum cleaner incorporating the navigation system according to an embodiment of the present invention is illustrated.

The robotic vacuum cleaner architecture is based broadly on that shown in figure 2 for a generic mobile robotic device. In addition, the present robotic vacuum cleaner comprises a vacuum unit (not shown in figure 3) installed on a main body of the cleaner. The vacuum unit is arranged to collect particles from an underlying floor surface by drawing in air. The vacuum unit can be of known construction, for example including a vacuum motor and a vacuum chamber for collecting particles from the air which is drawn in through a vacuum hole or pipe positioned adjacent the floor surface.

Optionally, the vacuum unit includes a supplementary vacuum inlet adapted to engage in fluid communication with an accessory, for example an external accessory hose.

Optionally, the suction unit has a dirt sensor 56 adapted to provide information about the level and rate of uptake of dirt from the floor, thereby giving a measure of how dirty the area of floor currently being cleaned is.

With regard to the navigation system used in the robotic vacuum cleaner according to the present embodiment, the VSLAM algorithm 8 is configured to extract features from the environment using mechanisms such as scale invariant feature transforms (SIFTs) and Harris corners. These mechanisms can be extended to make use of contextual information, such as plane-constrained features from ceilings and walls, to reduce the computational complexity (and hence reduce the data processing load) and potentially increase reliability.

When instructed to commence cleaning, the robotic vacuum cleaner initially adopts a first mode of operation, moving in a substantially random motion within the area to be cleaned whilst concurrently mapping the environment and creating a summary VSLAM map of the area to be cleaned. The robotic vacuum cleaner is configured to use a minimum vacuum pressure (minimum suction) during said first mode of operation. Obstacles within the area to be cleaned are detected by the three dimensional mapping apparatus 12 during this first mode of operation and consequently avoided.

The foregoing un-delayed initialisation technique gives a quick, robust and efficient three-dimensional estimate of the environment from the two-dimensional images produced by the camera 22 while also allowing fast low memory filter initialisation. Optionally, odometry sensing (for example from wheel rotation encoders etc.) is used to reduce the computational complexity of the VSLAM system.

The VSLAM system 8 produces an estimate of the position of the robot within the environment with associated confidences, and a summary map of spatially located features within the environment. The position estimate is used to fuse sensor information into the internal map 44, to aid trajectory following, and, using information theory, to provide trajectory constraints on the planning algorithm 46 to maximise map and localisation quality whilst the vacuum cleaner is cleaning.

Optionally, an optical flow system 40 uses passive sensing to provide an estimate of the relative distances of objects moving relative to the camera 22. The optical flow system 40 monitors the movement of objects or texture within the images from the camera 22 that result from the camera's movements within the environment, and derives motion cues there-from. The optical flow system 40 uses these motion cues to provide an estimate of the distance to an object within the environment based on the rate at which the object moves in the images produced by the camera. By way of a simple example, objects close to camera will appear to move within the images at a higher rate than objects distant there-from. This relative information is used in conjunction with the VSLAM summary map by a position registration system 42 to produce a more accurate, absolute estimate of the position and orientation of surfaces within the sensor field of view.
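The depth-from-motion cue described above can be illustrated with the standard relation for a sideways-translating camera: the image flow of a static point is proportional to camera speed and inversely proportional to depth. The sketch below assumes pure lateral translation and a pinhole camera, which is a simplification of what the optical flow system 40 does.

```python
def depth_from_flow(flow_px_per_s, camera_speed_m_s, focal_length_px):
    """For a sideways-translating pinhole camera, the image flow of a
    static point is flow = f * v / Z, so depth Z = f * v / flow: nearby
    objects sweep across the image faster than distant ones."""
    if flow_px_per_s <= 0:
        return float("inf")  # no apparent motion: effectively at infinity
    return focal_length_px * camera_speed_m_s / flow_px_per_s
```

For example, with an assumed 500-pixel focal length and a camera moving at 0.5 m/s, a point flowing at 50 pixels per second lies about 5 m away, while a point flowing ten times faster is ten times closer.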

An internal mapping system 44 provides an estimate of the state of the environment within which the robotic vacuum cleaner is operating; it is updated by fusing point cloud information from the three dimensional mapping apparatus 12 and the optical flow system 40 with contextual information from the VSLAM summary map. In this respect, the internal mapping system 44 creates an internal map of the environment. This set of sensor information provides a variable accuracy estimate of conditions within the environment dependent on the locality of the robot; this allows high accuracy trajectory control at short range, and room size estimation.

As pointed out previously, the information from the detailed three dimensional mapping apparatus 12 is not incorporated into the VSLAM summary map; rather, the information from the detailed three dimensional mapping apparatus is used in the internal map 44 to identify navigable and non-navigable points within the VSLAM summary map.

The foregoing confers additional benefits in terms of automatic timing and power control. For example, the robotic vacuum cleaner will rapidly know the size of the room within which it is operating (and the location of any obstacles within the room) and will have a navigation strategy to cover the floor space. Accordingly, it can calculate how long it will take to clean a particular room. So, instead of moving randomly about the room until the power runs out, a robotic vacuum cleaner having the present navigation system can display an estimate of how many minutes are left before it finishes cleaning the room.

Also, as the robotic vacuum cleaner knows how much power it has in its battery, it can optimise the use of that power to clean a given area. For example, if it is in a small room, then rather than finish the room at normal suction and still have some capacity left in its battery, it could automatically boost the suction power so it completes the job in the same time, but cleans better. It can then fully recharge its battery, thereby avoiding any memory effect which could reduce battery capacity.
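The timing and power-boosting behaviour described in the last two paragraphs can be sketched as below. The function names, units and the policy of draining the battery by the end of the job are illustrative assumptions, not details specified herein.

```python
def minutes_remaining(uncleaned_area_m2, coverage_rate_m2_per_min):
    """Estimated minutes left, from the mapped un-cleaned area and an
    assumed coverage rate (enabled by knowing the room size)."""
    return uncleaned_area_m2 / coverage_rate_m2_per_min


def boosted_power(battery_wh, base_power_w, est_minutes, max_power_w):
    """If the battery holds more energy than the job needs at base power,
    raise the power so the job finishes with the battery nearly drained
    and the battery can then be fully recharged."""
    needed_wh = base_power_w * est_minutes / 60.0
    if battery_wh <= needed_wh:
        return base_power_w  # no surplus energy to spend
    return min(max_power_w, battery_wh * 60.0 / est_minutes)
```

For instance, a job estimated at 6 minutes needs 10 Wh at a base power of 100 W; an assumed 20 Wh battery therefore permits a boost, capped at the cleaner's maximum power.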

The robotic cleaner includes a utility calculation system 48 which uses information relating to current and predicted states of the robotic vacuum cleaner and the environment within which it is operating, and calculated optimal strategy, to produce an estimate of current and future system utility costs. The use of system utility costs provides a means by which the robotic vacuum cleaner can assess and prioritise a set of available actions, and select an appropriate course of action there-from.

The above-mentioned utility costs are calculated for the variables exploration, cleaning, localisation, efficiency, and cleaning time. These utility costs are then combined as per Equation 1 below to determine the appropriate set of actions (e.g. route) to take next. The use of a utility based selection scheme allows the behaviour of the robotic vacuum cleaner to be optimised by switching between exploration at system start-up and cleaning once localised by altering the performance weights. The utility based selection scheme also enables fine control of the behaviour within those strategies to be automatically controlled while the robot is operating.

System Utility Cost = a^x · b^y · c^z (Equation 1)

(where a, b, c are individual utility costs and x, y, z are relative performance weights)
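Equation 1 and the resulting action selection can be sketched as follows; the candidate actions and weight values in the usage below are illustrative, not taken from the disclosure.

```python
def system_utility_cost(costs, weights):
    """Equation 1: the product of individual utility costs, each raised
    to its relative performance weight."""
    total = 1.0
    for cost, weight in zip(costs, weights):
        total *= cost ** weight
    return total


def select_action(candidates, weights):
    """Pick the candidate action with the lowest system utility cost.
    `candidates` maps an action name to its tuple of individual costs."""
    return min(candidates,
               key=lambda name: system_utility_cost(candidates[name], weights))
```

Because the weights appear as exponents, setting a weight to zero removes that variable from consideration entirely, which is how the scheme can switch the robot between exploration-dominated and cleaning-dominated behaviour.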

The trajectory controller 16 (also referred to hereinbefore as the motion control system) is enabled by a robot localisation estimate provided by the VSLAM system 8, and provides local motion control of the robotic cleaner for negotiation of obstacles and emergency collision avoidance; this provides a second level of avoidance of all detectable obstacles. In the interests of clarity, the function of the navigation controller 14 referred to previously with respect to figures 1 and 2 is performed collectively by the internal mapping system 44, the global planner 46, the utility calculator 48, and the command fusion system 50.

Figure 4 shows an example of a map of an area to be cleaned created by the robotic vacuum cleaner illustrated in figure 3.

The information contained within the internal map includes cleaned, un-cleaned, cleanable, un-cleanable, unknown and un-passable areas, cleaning and exploration frontiers, and optionally dirt level. The optional dirt sensor 56 provides information about the level and rate of uptake of dirt, thereby giving a measure of how dirty the area of floor currently being cleaned is. This information is used during a single cleaning cycle to optimise the intensity or repetition of cleaning a particular area, and used over multiple cycles to predict both when and where the floor needs cleaning.

A detailed knowledge of the area to be cleaned and delineation of the space into separate rooms enables the user interface to comprise merely two buttons: one button labelled "clean this room", and the other "clean the apartment" / "clean the floor", i.e. all rooms on this floor.

The navigation system according to the present invention has been described herein primarily in terms of its applications to domestic robotic devices, in particular with reference to an autonomous domestic vacuum cleaner. However, the present navigation system is not restricted to such applications and the skilled person will readily appreciate alternative applications of the present navigation system. By way of further example, a robotic device incorporating the present navigation system could be programmed to act as a guard, searching rooms, looking for activity out of hours, etc. Alternatively, such a robotic device could be used to move material around a warehouse, if connected to a trolley or forklift, for example. Additional domestic robotic applications include infotainment / entertainment, robotic pets, etc. Further uses envisaged for the navigation system of the present invention are for robots used for painting work, for repair and maintenance work, for surveying, e.g. topology, and for flaw and crack detection in difficult-to-access areas.

In view of the foregoing description it will be evident to a person skilled in the art that various modifications may be made within the scope of the invention.

The scope of the present disclosure includes any novel feature or combination of features disclosed therein either explicitly or implicitly or any generalisation thereof irrespective of whether or not it relates to the claimed invention or mitigates any or all of the problems addressed by the present invention. The applicant hereby gives notice that new claims may be formulated to such features during the prosecution of this application or of any such further application derived therefrom. In particular, with reference to the appended claims, features from dependent claims may be combined with those of the independent claims and features from respective independent claims may be combined in any appropriate manner and not merely in the specific combinations enumerated in the claims.

Claims

1. A navigation system comprising a primary mapping apparatus adapted to detect features within an environment and to create a summary map of the environment including an estimate of a point of current location within the environment; a secondary mapping apparatus adapted to provide a detailed three-dimensional map of the local environment in the vicinity of the point of current location; and a processor adapted to determine navigable points within the environment by combining information from the summary map and the detailed map.
2. A navigation system according to claim 1 wherein the processor is configured to provide instructions to a motion control system so as to navigate from the point of current location to another navigable point within the environment.
3. A navigation system according to claim 1 or 2, the primary mapping apparatus having an optical sensor adapted to detect the features within the environment and wherein the mapping apparatus utilises a visual simultaneous localisation and mapping (VSLAM) process to create the summary map of the environment.
4. A navigation system according to any of the preceding claims wherein the secondary mapping apparatus comprises an imaging apparatus having an imaging sensor and a structured light generator.
5. A navigation apparatus according to claim 4 wherein the imaging apparatus comprises at least one of a spot projector and a pattern projector.
6. A navigation apparatus according to claim 4 or 5 when directly or indirectly dependent on claim 3 wherein the imaging sensor and the optical sensor comprise a common sensor.
7. A navigation apparatus according to claim 6 wherein the common sensor is one of a video camera, a CMOS camera and a charge-coupled device (CCD).
8. A navigation apparatus according to any of claims 3 - 7 wherein the optical sensor is arranged to have a field of view which includes an upward direction.
9. A navigation apparatus according to claim 8 wherein the optical sensor is arranged, in use, to detect features disposed on a substantially planar surface within the environment, and the mapping apparatus is adapted to create from said detected features a summary map of the environment underlying said planar surface.
10. A vehicle having a navigation system according to any of claims 1 - 9.
11. A mobile robotic device having a navigation system according to any of claims 1 - 9.
12. A mobile robotic device according to claim 11 comprising any of a domestic robotic device, an industrial robotic device, a security robotic device, a floor cleaner, a vacuum cleaner, a lawn mower, a robotic entertainment device, a maintenance device, a surveying device, a fork lift truck, and a car.
13. A mobile robotic device according to claim 12 adapted to receive a detachable accessory, wherein the robotic device adopts a supplementary mode of operation when the accessory is attached to the robotic device.
14. A mobile robotic device according to claim 13 comprising a vacuum cleaner having a vacuum pressure variable between a minimum and a maximum, wherein maximum vacuum pressure is automatically selected in the supplementary mode of operation.
15. A mobile robotic device according to claim 14 wherein the accessory comprises an accessory hose adapted to engage in fluid communication with a vacuum inlet on the vacuum cleaner.
16. A mobile robotic device according to claim 15 in which the accessory hose is manoeuvrable by a user, wherein the navigation system is adapted to provide instructions to the motion control system so as to follow the user.
17. A mobile robotic device according to any of claims 13 - 15 wherein the navigation system is disabled in the supplementary mode of operation.
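The accessory-driven behaviour of claims 13 to 17 amounts to a small state machine: attaching an accessory switches the device into a supplementary mode, selects maximum vacuum pressure, and either follows the user (claim 16) or disables navigation (claim 17). A minimal sketch follows; all names (`RoboticVacuum`, `attach_accessory`, the string mode labels) are illustrative assumptions, not terms from the patent.

```python
class RoboticVacuum:
    """Illustrative model of the supplementary mode of claims 13-17."""

    MIN_SUCTION = 1
    MAX_SUCTION = 10

    def __init__(self):
        self.mode = "autonomous"
        self.suction = self.MIN_SUCTION
        self.navigation_enabled = True
        self.navigation_target = None

    def attach_accessory(self, accessory):
        # Claims 13/14: an attached accessory triggers the supplementary
        # mode, in which maximum vacuum pressure is automatically selected.
        self.mode = "supplementary"
        self.suction = self.MAX_SUCTION
        if accessory == "hose":
            # Claim 16: with a user-held hose, the navigation system
            # instructs the motion control system to follow the user.
            self.navigation_target = "user"
        else:
            # Claim 17: alternatively, navigation is disabled in the
            # supplementary mode of operation.
            self.navigation_enabled = False

    def detach_accessory(self):
        # Detaching restores normal autonomous operation.
        self.mode = "autonomous"
        self.suction = self.MIN_SUCTION
        self.navigation_enabled = True
        self.navigation_target = None
```

The two branches of `attach_accessory` reflect that claims 16 and 17 describe alternative supplementary-mode behaviours, not a single mandatory one.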
18. A method of controlling a robotic device within an area to be traversed, the robotic device having a variable power requirement and a navigation system adapted to map features in an environment, the method comprising the steps of:
(i) in a first mode of operation, moving the robotic device in a substantially random motion within the area to be traversed whilst concurrently mapping the environment and creating a summary map of the area to be traversed, wherein the device is configured to use a minimum power consumption during said first mode of operation,
(ii) in a second mode of operation, moving the robotic device in at least one direction so as to map the environment in greater detail and to create a complete summary map of the area to be traversed, wherein the device is configured to use increased power consumption during said second mode of operation,
(iii) in a third mode of operation, moving the robotic device in a deterministic motion so as to provide optimum traversing of the space, wherein the device is configured to use increased power consumption during said third mode of operation.
19. A method according to claim 18 wherein the device is configured only to use sufficient power to traverse the area and map the environment during said first mode of operation.
20. A method according to claim 18 or 19 wherein, in use, the robotic device operates in the first, second and third modes of operation in numerical sequence.
21. A method according to claim 18 or 19 wherein the mode within which the robotic device operates is selected in response to a status condition.
22. A method according to claim 21 wherein the status condition is derived from a plurality of variables, each variable having a changeable weighting factor applied thereto so as to optimise the behaviour of the robotic device.
23. A method according to claim 22 wherein the variables are selected from exploration of the area to be traversed, operation of the robotic device, localisation within the environment, efficiency of operation and operating time.
24. A method according to any of claims 21 - 23 wherein the robotic device reverts to the first mode of operation in the event of a failure in the navigation system.
25. A method according to any of claims 18 - 24 wherein the robotic device is a vacuum cleaner and the steps of configuring the device to use minimum and increased power consumption comprise configuring the vacuum cleaner to use minimum and increased suction power, respectively.
26. Use of a visual simultaneous localisation and mapping (VSLAM) process to create a summary map of an environment, and a localised three-dimensional imaging technique to generate a local map of the environment, for autonomous navigation.
PCT/GB2006/003958 2005-10-31 2006-10-25 Navigation system WO2007051972A1 (en)

Priority Applications (4)

Application Number Priority Date Filing Date Title
US73181605P true 2005-10-31 2005-10-31
GB0522153A GB0522153D0 (en) 2005-10-31 2005-10-31 Navigation system
GB0522153.6 2005-10-31
US60/731,816 2005-10-31

Publications (1)

Publication Number Publication Date
WO2007051972A1 true WO2007051972A1 (en) 2007-05-10

Family

ID=37559849

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/GB2006/003958 WO2007051972A1 (en) 2005-10-31 2006-10-25 Navigation system

Country Status (1)

Country Link
WO (1) WO2007051972A1 (en)

Cited By (75)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2058720A2 (en) * 2007-11-09 2009-05-13 Samsung Electronics Co., Ltd. Apparatus and method for generating three-dimensional map using structured light
EP2169507A2 (en) * 2008-09-11 2010-03-31 Deere & Company Distributed knowledge base method for vehicular localization and work-site management
EP2169505A2 (en) * 2008-09-11 2010-03-31 Deere & Company Distributed knowledge base for vehicular localization and work-site management
EP2252190A2 (en) * 2008-01-28 2010-11-24 Seegrid Corporation Service robot and method of operating same
WO2010141209A1 (en) * 2009-06-01 2010-12-09 Robert Bosch Gmbh Method and apparatus for combining three-dimensional position and two-dimensional intensity mapping for localization
WO2012013280A1 (en) * 2010-07-29 2012-02-02 Faro Technologies Inc. Device for optically scanning and measuring an environment
US20120121161A1 (en) * 2010-09-24 2012-05-17 Evolution Robotics, Inc. Systems and methods for vslam optimization
US8384914B2 (en) 2009-07-22 2013-02-26 Faro Technologies, Inc. Device for optically scanning and measuring an environment
US8433442B2 (en) 2008-01-28 2013-04-30 Seegrid Corporation Methods for repurposing temporal-spatial information collected by service robots
CN103325296A (en) * 2012-03-19 2013-09-25 联想(北京)有限公司 Information processing method and equipment used for simultaneous localization and mapping
WO2013153415A1 (en) * 2012-04-11 2013-10-17 Balyo Apparatus and method for determining reference elements of an environment
EP2294960A3 (en) * 2009-09-11 2013-11-20 Vorwerk & Co. Interholding GmbH Method for operating a cleaning robot
EP2677386A1 (en) * 2012-06-18 2013-12-25 LG Electronics Inc. Robot cleaner and obstacle detection control method of the same
US8625106B2 (en) 2009-07-22 2014-01-07 Faro Technologies, Inc. Method for optically scanning and measuring an object
DE102012112401B3 (en) * 2012-12-17 2014-03-27 Miele & Cie. Kg Self-propelled robot and method for determining a rotational position of at least one drive wheel of a self-propelled robot
DE102012109481A1 (en) * 2012-10-05 2014-04-10 Faro Technologies, Inc. Device for optically scanning and measuring an environment
US8699007B2 (en) 2010-07-26 2014-04-15 Faro Technologies, Inc. Device for optically scanning and measuring an environment
US8705016B2 (en) 2009-11-20 2014-04-22 Faro Technologies, Inc. Device for optically scanning and measuring an environment
US8705012B2 (en) 2010-07-26 2014-04-22 Faro Technologies, Inc. Device for optically scanning and measuring an environment
US8719474B2 (en) 2009-02-13 2014-05-06 Faro Technologies, Inc. Interface for communication between internal and external devices
US8730477B2 (en) 2010-07-26 2014-05-20 Faro Technologies, Inc. Device for optically scanning and measuring an environment
DE102013100192A1 (en) * 2013-01-10 2014-07-10 Miele & Cie. Kg Self-propelled robot and method for distance determination in a self-propelled robot
US8830485B2 (en) 2012-08-17 2014-09-09 Faro Technologies, Inc. Device for optically scanning and measuring an environment
US8892256B2 (en) 2008-01-28 2014-11-18 Seegrid Corporation Methods for real-time and near real-time interactions with robots that service a facility
EP2804065A1 (en) * 2013-05-17 2014-11-19 MSI Computer (Shenzhen) Co., Ltd. Mobile device for generating first map information for an electronic device and receiving second map information from the electronic device
US8896819B2 (en) 2009-11-20 2014-11-25 Faro Technologies, Inc. Device for optically scanning and measuring an environment
US8989972B2 (en) 2008-09-11 2015-03-24 Deere & Company Leader-follower fully-autonomous vehicle with operator on side
US8997362B2 (en) 2012-07-17 2015-04-07 Faro Technologies, Inc. Portable articulated arm coordinate measuring machine with optical communications bus
US9009000B2 (en) 2010-01-20 2015-04-14 Faro Technologies, Inc. Method for evaluating mounting stability of articulated arm coordinate measurement machine using inclinometers
EP2584522A3 (en) * 2011-10-23 2015-04-22 Bond Textile Research BV A system for assessing a spatial use load of a building, a method for using the system, a dirt collector device and a method of covering a floor, both for use in the said system or method
US9026315B2 (en) 2010-10-13 2015-05-05 Deere & Company Apparatus for machine coordination which maintains line-of-site contact
CN104714547A (en) * 2013-12-12 2015-06-17 赫克斯冈技术中心 Autonomous gardening vehicle with camera
WO2015090397A1 (en) * 2013-12-19 2015-06-25 Aktiebolaget Electrolux Robotic cleaning device
US9074883B2 (en) 2009-03-25 2015-07-07 Faro Technologies, Inc. Device for optically scanning and measuring an environment
EP2894533A2 (en) 2013-10-31 2015-07-15 LG Electronics Inc. Mobile robot and operating method thereof
US9113023B2 (en) 2009-11-20 2015-08-18 Faro Technologies, Inc. Three-dimensional scanner with spectroscopic energy detector
CN104916216A (en) * 2015-06-26 2015-09-16 深圳乐行天下科技有限公司 Map construction method and system thereof
DE102014012811A1 (en) * 2014-03-27 2015-10-01 Miele & Cie. Kg Floor cleaning device, and method and system for determining a flat floor plan by a self-propelled floor cleaning device
US9163922B2 (en) 2010-01-20 2015-10-20 Faro Technologies, Inc. Coordinate measurement machine with distance meter and camera to determine dimensions within camera images
US9168654B2 (en) 2010-11-16 2015-10-27 Faro Technologies, Inc. Coordinate measuring machines with dual layer arm
US20150313432A1 (en) * 2014-05-02 2015-11-05 Lg Electronics Inc. Cleaner
US9188980B2 (en) 2008-09-11 2015-11-17 Deere & Company Vehicle with high integrity perception system
CN105094127A (en) * 2014-05-15 2015-11-25 Lg电子株式会社 Cleaner and method of controlling cleaner
US9210288B2 (en) 2009-11-20 2015-12-08 Faro Technologies, Inc. Three-dimensional scanner with dichroic beam splitters to capture a variety of signals
CN105123090A (en) * 2015-07-16 2015-12-09 张萍 Multifunctional intelligent mower based on internet of things
USRE45854E1 (en) 2006-07-03 2016-01-19 Faro Technologies, Inc. Method and an apparatus for capturing three-dimensional data of an area of space
WO2016054691A1 (en) * 2014-10-07 2016-04-14 Commonwealth Scientific And Industrial Research Organisation A method of setting up a tracking system
JP2016052514A (en) * 2014-09-03 2016-04-14 ダイソン・テクノロジー・リミテッド Mobile Robot
US9329271B2 (en) 2010-05-10 2016-05-03 Faro Technologies, Inc. Method for optically scanning and measuring an environment
US9378554B2 (en) 2014-10-09 2016-06-28 Caterpillar Inc. Real-time range map generation
US9417056B2 (en) 2012-01-25 2016-08-16 Faro Technologies, Inc. Device for optically scanning and measuring an environment
US9417316B2 (en) 2009-11-20 2016-08-16 Faro Technologies, Inc. Device for optically scanning and measuring an environment
US9513107B2 (en) 2012-10-05 2016-12-06 Faro Technologies, Inc. Registration calculation between three-dimensional (3D) scans based on two-dimensional (2D) scan data from a 3D scanner
US9529083B2 (en) 2009-11-20 2016-12-27 Faro Technologies, Inc. Three-dimensional scanner with enhanced spectroscopic energy detector
US9551575B2 (en) 2009-03-25 2017-01-24 Faro Technologies, Inc. Laser scanner having a multi-color light source and real-time color receiver
US9563204B2 (en) 2012-08-14 2017-02-07 Husqvarna Ab Mower with object detection system
US9607239B2 (en) 2010-01-20 2017-03-28 Faro Technologies, Inc. Articulated arm coordinate measurement machine having a 2D camera and method of obtaining 3D representations
US9628775B2 (en) 2010-01-20 2017-04-18 Faro Technologies, Inc. Articulated arm coordinate measurement machine having a 2D camera and method of obtaining 3D representations
EP3168704A1 (en) * 2015-11-12 2017-05-17 Hexagon Technology Center GmbH 3d surveying of a surface by mobile vehicles
EP2741161A3 (en) * 2012-12-10 2017-07-12 Miele & Cie. KG Autonomous apparatus for processing a surface and method for detecting the position of the autonomous apparatus
US9811089B2 (en) 2013-12-19 2017-11-07 Aktiebolaget Electrolux Robotic cleaning device with perimeter recording function
US9939529B2 (en) 2012-08-27 2018-04-10 Aktiebolaget Electrolux Robot positioning system
US9946263B2 (en) 2013-12-19 2018-04-17 Aktiebolaget Electrolux Prioritizing cleaning areas
US10045675B2 (en) 2013-12-19 2018-08-14 Aktiebolaget Electrolux Robotic vacuum cleaner with side brush moving in spiral pattern
EP3360455A1 (en) * 2010-07-12 2018-08-15 LG Electronics Inc. Robot cleaner and controlling method of the same
US10067231B2 (en) 2012-10-05 2018-09-04 Faro Technologies, Inc. Registration calculation of three-dimensional scanner data performed between scans based on measurements by two-dimensional scanner
US10144342B2 (en) 2014-09-03 2018-12-04 Dyson Technology Limited Mobile robot
US10149589B2 (en) 2013-12-19 2018-12-11 Aktiebolaget Electrolux Sensing climb of obstacle of a robotic cleaning device
US10175037B2 (en) 2015-12-27 2019-01-08 Faro Technologies, Inc. 3-D measuring device with battery pack
DE102017118227A1 (en) * 2017-08-10 2019-02-14 Vorwerk & Co. Interholding Gmbh Harrow with an accumulator
US10219665B2 (en) 2013-04-15 2019-03-05 Aktiebolaget Electrolux Robotic vacuum cleaner with protruding sidebrush
US10231591B2 (en) 2013-12-20 2019-03-19 Aktiebolaget Electrolux Dust container
WO2019063299A1 (en) * 2017-09-26 2019-04-04 Robert Bosch Gmbh Method and system for mapping and locating a vehicle based on radar measurements
US10281259B2 (en) 2010-01-20 2019-05-07 Faro Technologies, Inc. Articulated arm coordinate measurement machine that uses a 2D camera to determine 3D coordinates of smoothly continuous edge features
WO2019101651A1 (en) * 2017-11-27 2019-05-31 Robert Bosch Gmbh Method and device for operating a mobile system

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0358628A2 (en) * 1988-09-06 1990-03-14 Transitions Research Corporation Visual navigation and obstacle avoidance structured light system
US5323593A (en) * 1993-01-13 1994-06-28 Cline Lohn G Method and apparatus for mowing lawns
DE19753668A1 (en) * 1997-05-12 1998-11-19 Kwang Ju Electronics Co Ltd Remote-controlled vacuum cleaner
DE10323418A1 (en) * 2002-06-26 2004-01-22 Samsung Gwangju Electronics Co. Ltd. Automatic cleaning device, automatic cleaning system and method for controlling the same

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
MALLOT H A; ET AL: "View-based cognitive map learning by an autonomous robot", PROCEEDINGS OF THE INTERNATIONAL CONFERENCE ON ARTIFICIAL NEURAL NETWORKS, XX, XX, 1995, pages 381 - 386, XP007901498 *
TOMONO M: "Environment modeling by a mobile robot with a laser range finder and a monocular camera", ADVANCED ROBOTICS AND ITS SOCIAL IMPACTS, 2005. IEEE WORKSHOP ON NAGOYA, JAPAN JUNE 12-15, 2005, PISCATAWAY, NJ, USA,IEEE, 12 June 2005 (2005-06-12), pages 133 - 138, XP010838219, ISBN: 0-7803-8947-6 *

Cited By (117)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
USRE45854E1 (en) 2006-07-03 2016-01-19 Faro Technologies, Inc. Method and an apparatus for capturing three-dimensional data of an area of space
EP2058720A2 (en) * 2007-11-09 2009-05-13 Samsung Electronics Co., Ltd. Apparatus and method for generating three-dimensional map using structured light
KR101461185B1 (en) * 2007-11-09 2014-11-14 삼성전자 주식회사 Apparatus and method for building 3D map using structured light
EP2058720A3 (en) * 2007-11-09 2012-12-19 Samsung Electronics Co., Ltd. Apparatus and method for generating three-dimensional map using structured light
US9182763B2 (en) 2007-11-09 2015-11-10 Samsung Electronics Co., Ltd. Apparatus and method for generating three-dimensional map using structured light
US8433442B2 (en) 2008-01-28 2013-04-30 Seegrid Corporation Methods for repurposing temporal-spatial information collected by service robots
US8892256B2 (en) 2008-01-28 2014-11-18 Seegrid Corporation Methods for real-time and near real-time interactions with robots that service a facility
US9603499B2 (en) 2008-01-28 2017-03-28 Seegrid Corporation Service robot and method of operating same
US8838268B2 (en) 2008-01-28 2014-09-16 Seegrid Corporation Service robot and method of operating same
EP2252190A2 (en) * 2008-01-28 2010-11-24 Seegrid Corporation Service robot and method of operating same
EP2252190A4 (en) * 2008-01-28 2011-04-06 Seegrid Corp Service robot and method of operating same
EP2169505A3 (en) * 2008-09-11 2014-11-12 Deere & Company Distributed knowledge base for vehicular localization and work-site management
US9235214B2 (en) 2008-09-11 2016-01-12 Deere & Company Distributed knowledge base method for vehicular localization and work-site management
EP2169505A2 (en) * 2008-09-11 2010-03-31 Deere & Company Distributed knowledge base for vehicular localization and work-site management
US9188980B2 (en) 2008-09-11 2015-11-17 Deere & Company Vehicle with high integrity perception system
EP2169507A2 (en) * 2008-09-11 2010-03-31 Deere & Company Distributed knowledge base method for vehicular localization and work-site management
US8989972B2 (en) 2008-09-11 2015-03-24 Deere & Company Leader-follower fully-autonomous vehicle with operator on side
EP2169507A3 (en) * 2008-09-11 2014-11-12 Deere & Company Distributed knowledge base method for vehicular localization and work-site management
US9274524B2 (en) 2008-09-11 2016-03-01 Deere & Company Method for machine coordination which maintains line-of-site contact
US8719474B2 (en) 2009-02-13 2014-05-06 Faro Technologies, Inc. Interface for communication between internal and external devices
US9551575B2 (en) 2009-03-25 2017-01-24 Faro Technologies, Inc. Laser scanner having a multi-color light source and real-time color receiver
US9074883B2 (en) 2009-03-25 2015-07-07 Faro Technologies, Inc. Device for optically scanning and measuring an environment
US8473187B2 (en) 2009-06-01 2013-06-25 Robert Bosch Gmbh Method and apparatus for combining three-dimensional position and two-dimensional intensity mapping for localization
WO2010141209A1 (en) * 2009-06-01 2010-12-09 Robert Bosch Gmbh Method and apparatus for combining three-dimensional position and two-dimensional intensity mapping for localization
JP2012529012A (en) * 2009-06-01 2012-11-15 ロベルト ボッシュ ゲーエムベーハー Method and apparatus for combining three-dimensional position and two-dimensional intensity mapping for localization
US8384914B2 (en) 2009-07-22 2013-02-26 Faro Technologies, Inc. Device for optically scanning and measuring an environment
US8625106B2 (en) 2009-07-22 2014-01-07 Faro Technologies, Inc. Method for optically scanning and measuring an object
EP2294960A3 (en) * 2009-09-11 2013-11-20 Vorwerk & Co. Interholding GmbH Method for operating a cleaning robot
US8705016B2 (en) 2009-11-20 2014-04-22 Faro Technologies, Inc. Device for optically scanning and measuring an environment
US9417316B2 (en) 2009-11-20 2016-08-16 Faro Technologies, Inc. Device for optically scanning and measuring an environment
US9210288B2 (en) 2009-11-20 2015-12-08 Faro Technologies, Inc. Three-dimensional scanner with dichroic beam splitters to capture a variety of signals
US9113023B2 (en) 2009-11-20 2015-08-18 Faro Technologies, Inc. Three-dimensional scanner with spectroscopic energy detector
US8896819B2 (en) 2009-11-20 2014-11-25 Faro Technologies, Inc. Device for optically scanning and measuring an environment
US9529083B2 (en) 2009-11-20 2016-12-27 Faro Technologies, Inc. Three-dimensional scanner with enhanced spectroscopic energy detector
US10060722B2 (en) 2010-01-20 2018-08-28 Faro Technologies, Inc. Articulated arm coordinate measurement machine having a 2D camera and method of obtaining 3D representations
US9163922B2 (en) 2010-01-20 2015-10-20 Faro Technologies, Inc. Coordinate measurement machine with distance meter and camera to determine dimensions within camera images
US9628775B2 (en) 2010-01-20 2017-04-18 Faro Technologies, Inc. Articulated arm coordinate measurement machine having a 2D camera and method of obtaining 3D representations
US10281259B2 (en) 2010-01-20 2019-05-07 Faro Technologies, Inc. Articulated arm coordinate measurement machine that uses a 2D camera to determine 3D coordinates of smoothly continuous edge features
US9607239B2 (en) 2010-01-20 2017-03-28 Faro Technologies, Inc. Articulated arm coordinate measurement machine having a 2D camera and method of obtaining 3D representations
US9009000B2 (en) 2010-01-20 2015-04-14 Faro Technologies, Inc. Method for evaluating mounting stability of articulated arm coordinate measurement machine using inclinometers
US9684078B2 (en) 2010-05-10 2017-06-20 Faro Technologies, Inc. Method for optically scanning and measuring an environment
US9329271B2 (en) 2010-05-10 2016-05-03 Faro Technologies, Inc. Method for optically scanning and measuring an environment
EP3360456A1 (en) * 2010-07-12 2018-08-15 LG Electronics Inc. Robot cleaner and controlling method of the same
EP3360455A1 (en) * 2010-07-12 2018-08-15 LG Electronics Inc. Robot cleaner and controlling method of the same
US8730477B2 (en) 2010-07-26 2014-05-20 Faro Technologies, Inc. Device for optically scanning and measuring an environment
US8699007B2 (en) 2010-07-26 2014-04-15 Faro Technologies, Inc. Device for optically scanning and measuring an environment
US8705012B2 (en) 2010-07-26 2014-04-22 Faro Technologies, Inc. Device for optically scanning and measuring an environment
GB2496087B (en) * 2010-07-29 2014-08-06 Faro Tech Inc Device for optically scanning and measuring an environment
US8699036B2 (en) 2010-07-29 2014-04-15 Faro Technologies, Inc. Device for optically scanning and measuring an environment
GB2496087A (en) * 2010-07-29 2013-05-01 Faro Tech Inc Device for optically scanning and measuring an environment
WO2012013280A1 (en) * 2010-07-29 2012-02-02 Faro Technologies Inc. Device for optically scanning and measuring an environment
US9910444B2 (en) 2010-09-24 2018-03-06 Irobot Corporation Systems and methods for VSLAM optimization
US20120121161A1 (en) * 2010-09-24 2012-05-17 Evolution Robotics, Inc. Systems and methods for vslam optimization
US9286810B2 (en) * 2010-09-24 2016-03-15 Irobot Corporation Systems and methods for VSLAM optimization
US9026315B2 (en) 2010-10-13 2015-05-05 Deere & Company Apparatus for machine coordination which maintains line-of-site contact
US9168654B2 (en) 2010-11-16 2015-10-27 Faro Technologies, Inc. Coordinate measuring machines with dual layer arm
EP2584522A3 (en) * 2011-10-23 2015-04-22 Bond Textile Research BV A system for assessing a spatial use load of a building, a method for using the system, a dirt collector device and a method of covering a floor, both for use in the said system or method
US9417056B2 (en) 2012-01-25 2016-08-16 Faro Technologies, Inc. Device for optically scanning and measuring an environment
CN103325296A (en) * 2012-03-19 2013-09-25 联想(北京)有限公司 Information processing method and equipment used for simultaneous localization and mapping
WO2013153415A1 (en) * 2012-04-11 2013-10-17 Balyo Apparatus and method for determining reference elements of an environment
US9448062B2 (en) 2012-04-11 2016-09-20 Balyo S. A. Apparatus and method for determining reference elements of an environment
EP2677386A1 (en) * 2012-06-18 2013-12-25 LG Electronics Inc. Robot cleaner and obstacle detection control method of the same
US9511494B2 (en) 2012-06-18 2016-12-06 Lg Electronics Inc. Robot cleaner and controlling method of the same
US8997362B2 (en) 2012-07-17 2015-04-07 Faro Technologies, Inc. Portable articulated arm coordinate measuring machine with optical communications bus
US9563204B2 (en) 2012-08-14 2017-02-07 Husqvarna Ab Mower with object detection system
US8830485B2 (en) 2012-08-17 2014-09-09 Faro Technologies, Inc. Device for optically scanning and measuring an environment
US9939529B2 (en) 2012-08-27 2018-04-10 Aktiebolaget Electrolux Robot positioning system
US9618620B2 (en) 2012-10-05 2017-04-11 Faro Technologies, Inc. Using depth-camera images to speed registration of three-dimensional scans
US9372265B2 (en) 2012-10-05 2016-06-21 Faro Technologies, Inc. Intermediate two-dimensional scanning with a three-dimensional scanner to speed registration
DE102012109481A1 (en) * 2012-10-05 2014-04-10 Faro Technologies, Inc. Device for optically scanning and measuring an environment
US10203413B2 (en) 2012-10-05 2019-02-12 Faro Technologies, Inc. Using a two-dimensional scanner to speed registration of three-dimensional scan data
US9513107B2 (en) 2012-10-05 2016-12-06 Faro Technologies, Inc. Registration calculation between three-dimensional (3D) scans based on two-dimensional (2D) scan data from a 3D scanner
US10067231B2 (en) 2012-10-05 2018-09-04 Faro Technologies, Inc. Registration calculation of three-dimensional scanner data performed between scans based on measurements by two-dimensional scanner
US9739886B2 (en) 2012-10-05 2017-08-22 Faro Technologies, Inc. Using a two-dimensional scanner to speed registration of three-dimensional scan data
US9746559B2 (en) 2012-10-05 2017-08-29 Faro Technologies, Inc. Using two-dimensional camera images to speed registration of three-dimensional scans
EP2741161A3 (en) * 2012-12-10 2017-07-12 Miele & Cie. KG Autonomous apparatus for processing a surface and method for detecting the position of the autonomous apparatus
DE102012112401B3 (en) * 2012-12-17 2014-03-27 Miele & Cie. Kg Self-propelled robot and method for determining a rotational position of at least one drive wheel of a self-propelled robot
EP2755101A3 (en) * 2013-01-10 2017-11-29 Miele & Cie. KG Self-propelled robot and method for distance determination in a self-propelled robot
EP2755101A2 (en) 2013-01-10 2014-07-16 Miele & Cie. KG Self-propelled robot and method for distance determination in a self-propelled robot
DE102013100192A1 (en) * 2013-01-10 2014-07-10 Miele & Cie. Kg Self-propelled robot and method for distance determination in a self-propelled robot
US10219665B2 (en) 2013-04-15 2019-03-05 Aktiebolaget Electrolux Robotic vacuum cleaner with protruding sidebrush
US9357893B2 (en) 2013-05-17 2016-06-07 Msi Computer (Shenzhen) Co., Ltd. Mobile device generating map information to an electronic device
EP2804065A1 (en) * 2013-05-17 2014-11-19 MSI Computer (Shenzhen) Co., Ltd. Mobile device for generating first map information for an electronic device and receiving second map information from the electronic device
CN104161487A (en) * 2013-05-17 2014-11-26 恩斯迈电子(深圳)有限公司 Mobile device
EP2894533A2 (en) 2013-10-31 2015-07-15 LG Electronics Inc. Mobile robot and operating method thereof
EP2894533B1 (en) * 2013-10-31 2019-07-31 LG Electronics Inc. Mobile robot and operating method thereof
US9603300B2 (en) 2013-12-12 2017-03-28 Hexagon Technology Center Gmbh Autonomous gardening vehicle with camera
EP2884364A1 (en) 2013-12-12 2015-06-17 Hexagon Technology Center GmbH Autonomous gardening vehicle with camera
CN104714547A (en) * 2013-12-12 2015-06-17 赫克斯冈技术中心 Autonomous gardening vehicle with camera
US10045675B2 (en) 2013-12-19 2018-08-14 Aktiebolaget Electrolux Robotic vacuum cleaner with side brush moving in spiral pattern
US10209080B2 (en) * 2013-12-19 2019-02-19 Aktiebolaget Electrolux Robotic cleaning device
US9946263B2 (en) 2013-12-19 2018-04-17 Aktiebolaget Electrolux Prioritizing cleaning areas
US9811089B2 (en) 2013-12-19 2017-11-07 Aktiebolaget Electrolux Robotic cleaning device with perimeter recording function
CN105849660A (en) * 2013-12-19 2016-08-10 伊莱克斯公司 Robotic cleaning device
WO2015090397A1 (en) * 2013-12-19 2015-06-25 Aktiebolaget Electrolux Robotic cleaning device
US10149589B2 (en) 2013-12-19 2018-12-11 Aktiebolaget Electrolux Sensing climb of obstacle of a robotic cleaning device
US10231591B2 (en) 2013-12-20 2019-03-19 Aktiebolaget Electrolux Dust container
DE102014012811B4 (en) * 2014-03-27 2017-09-21 Miele & Cie. Kg Floor cleaning device, and method and system for determining a flat floor plan by a self-propelled floor cleaning device
DE102014012811A1 (en) * 2014-03-27 2015-10-01 Miele & Cie. Kg Floor cleaning device, and method and system for determining a flat floor plan by a self-propelled floor cleaning device
US20150313432A1 (en) * 2014-05-02 2015-11-05 Lg Electronics Inc. Cleaner
EP2945038A3 (en) * 2014-05-15 2016-07-06 LG Electronics Inc. Method of controlling a cleaner
US9851720B2 (en) 2014-05-15 2017-12-26 Lg Electronics Inc. Method of controlling a cleaner
CN105094127A (en) * 2014-05-15 2015-11-25 Lg电子株式会社 Cleaner and method of controlling cleaner
EP3407152A1 (en) * 2014-05-15 2018-11-28 LG Electronics Inc. Method of controlling a cleaner
US10112302B2 (en) 2014-09-03 2018-10-30 Dyson Technology Limited Mobile robot
US10144342B2 (en) 2014-09-03 2018-12-04 Dyson Technology Limited Mobile robot
JP2016052514A (en) * 2014-09-03 2016-04-14 ダイソン・テクノロジー・リミテッド Mobile Robot
WO2016054691A1 (en) * 2014-10-07 2016-04-14 Commonwealth Scientific And Industrial Research Organisation A method of setting up a tracking system
US9378554B2 (en) 2014-10-09 2016-06-28 Caterpillar Inc. Real-time range map generation
CN104916216A (en) * 2015-06-26 2015-09-16 深圳乐行天下科技有限公司 Map construction method and system thereof
CN105123090A (en) * 2015-07-16 2015-12-09 张萍 Multifunctional intelligent mower based on internet of things
EP3168704A1 (en) * 2015-11-12 2017-05-17 Hexagon Technology Center GmbH 3d surveying of a surface by mobile vehicles
CN107024199A (en) * 2015-11-12 2017-08-08 赫克斯冈技术中心 Surveying by mobile vehicles
US10175037B2 (en) 2015-12-27 2019-01-08 Faro Technologies, Inc. 3-D measuring device with battery pack
DE102017118227A1 (en) * 2017-08-10 2019-02-14 Vorwerk & Co. Interholding Gmbh Harrow with an accumulator
WO2019063299A1 (en) * 2017-09-26 2019-04-04 Robert Bosch Gmbh Method and system for mapping and locating a vehicle based on radar measurements
WO2019101651A1 (en) * 2017-11-27 2019-05-31 Robert Bosch Gmbh Method and device for operating a mobile system

Similar Documents

Publication Publication Date Title
US7974738B2 (en) Robotics virtual rail system and method
EP2998816B1 (en) Multi-code coverage for an autonomous robot
US8793020B2 (en) Navigational control system for a robotic device
EP1593011B1 (en) An autonomous machine
US8577538B2 (en) Method and system for controlling a remote vehicle
US7024278B2 (en) Navigational control system for a robotic device
CA2628657C (en) Landmark navigation for vehicles using blinking optical beacons
US6667592B2 (en) Mapped robot system
EP2834048B1 (en) Proximity sensing on mobile robots
JP5054010B2 (en) Method and apparatus for robot navigation and position determination
US8271132B2 (en) System and method for seamless task-directed autonomy for robots
US9538892B2 (en) Robot management systems for determining docking station pose including mobile robots and methods using same
KR101752190B1 (en) Robot cleaner and method for controlling the same
JP4951180B2 (en) Autonomous multi-platform robot system
US20030030399A1 (en) Robot touch shield
US7162056B2 (en) Systems and methods for the automated sensing of motion in a mobile robot using visual data
US7054716B2 (en) Sentry robot system
US20090182464A1 (en) Method and apparatus for planning path of mobile robot
US9020637B2 (en) Simultaneous localization and mapping for a mobile robot
US6836701B2 (en) Autonomous multi-platform robotic system
Topp et al. Tracking for following and passing persons
US20070019181A1 (en) Object detection system
JP5555737B2 (en) Autonomous coverage robot navigation system
EP3104194A1 (en) Robot positioning system
EP2336801A2 (en) System and method for deploying portable landmarks

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
NENP Non-entry into the national phase in:

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 06794890

Country of ref document: EP

Kind code of ref document: A1