WO2016048238A1 - Method and apparatus for navigation of a robotic device - Google Patents
- Publication number
- WO2016048238A1 (PCT Application No. PCT/SG2015/050332)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- robotic device
- navigation mode
- environment
- featured
- navigation
Classifications
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0268—Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means
- G05D1/0274—Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means using mapping information stored in a memory device
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0231—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
- G05D1/0238—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using obstacle or wall sensors
- G05D1/024—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using obstacle or wall sensors in combination with a laser
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0231—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
- G05D1/0246—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means
- G05D1/0253—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means extracting relative motion information from a plurality of images taken successively, e.g. visual odometry, optical flow
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0268—Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means
- G05D1/0272—Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means comprising means for registering the travel distance, e.g. revolutions of wheels
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/20—Instruments for performing navigational calculations
Abstract
A method for navigation of a robotic device is provided, the method comprising: storing a map of a place in which the robotic device moves around, the map including at least one featured environment having regular features and at least one non-featured environment not having regular features; setting a navigation mode of the robotic device to a comprehensive navigation mode before the robotic device starts to move; obtaining sensor data of an environment in which the robotic device moves around to calculate positional information of the robotic device; determining whether the robotic device is in a featured environment; and switching to a limited navigation mode when the robotic device is determined to be in a featured environment.
Description
Method and Apparatus for Navigation of a Robotic Device

Field of the Invention

The present invention relates to navigation of a robotic device, and particularly, relates to a method and apparatus for navigation that allow changing of navigation mode according to an environment in which the robotic device operates.
Background
Autonomous navigation is a key enabler for the widespread use of robotic devices in the consumer market and industry. These robotic devices move around a place to perform tasks relying on sensors and navigation systems without any human direction. Two types of navigation are commonly employed to navigate these robotic devices: onboard navigation and offboard navigation. In the onboard navigation systems, the navigation processing is performed onboard by the processors provided in the robotic devices so that the robotic devices are capable of working independently. In the offboard navigation systems, the robotic devices are coupled to a remote server via a communication link (i.e. a wireless network) and harness the processing power of the server such that the navigation processing is entirely or mainly performed offboard at the remote server.
Achieving navigation of the robotic devices with a satisfactory level of accuracy is costly, as it is computationally intensive and demanding in terms of the processing load and power required to provide navigation commands in real time. In onboard navigation systems, there is a trend to use more powerful computers and bigger batteries onboard. However, this approach results in higher costs. Besides the increase in cost, another disadvantage of this approach is the increase in the size of the robotic devices, which may make them unsuitable for applications where size is one of the main concerns. While offboard navigation may address the cost and size issues, it is subject to actual network conditions, because there is always a delay in transmitting and receiving signals between the robotic devices and the remote server when data is communicated between them continuously. In particular, sensor data of the environment in which the robotic devices move is transmitted from the robotic devices to the server, and positional information of the robotic devices and/or navigation commands are transmitted from the server to the robotic devices continuously. If the amount of data overwhelms the bandwidth of the wireless network, the robotic devices would be slowed down gradually until they come to a stop. If there are numerous robotic devices in the environment, the limited bandwidth is even more easily overwhelmed.
The common approaches to the above problem in offboard navigation are to reduce the number of robotic devices simultaneously deployed in the environment and/or to increase the bandwidth of the wireless network. However, due to the nature of the deployment or infrastructure limitations, either approach may not always be feasible. Therefore, it is desirable to have a navigation method and apparatus which reduce the amount of data transmitted between the robotic devices and the server without limiting the number of robotic devices simultaneously deployed or requiring an upgrade of the wireless infrastructure.

Summary
In accordance with one aspect of the present invention, there is provided a method for navigation of a robotic device, the method comprising: storing a map of a place in which the robotic device moves around, the map including at least one featured environment having regular features and at least one non-featured environment not having regular features; setting a navigation mode of the robotic device to a comprehensive navigation mode before the robotic device starts to move; obtaining sensor data of an environment in which the robotic device moves around to calculate positional information of the robotic device; determining whether the robotic device is in a featured environment; and switching to a limited navigation mode when the robotic device is determined to be in a featured environment.
In the limited navigation mode, the positional information of the robotic device may be estimated based on information of the regular features of the featured environment.
In the limited navigation mode, the positional information of the robotic device may be estimated using odometry.
In the limited navigation mode, the positional information of the robotic device may be estimated using the regular features of the featured environment.
In the comprehensive navigation mode, the positional information of the robotic device may be calculated continuously based on the sensor data.
The method may further comprise switching to the comprehensive navigation mode when the robotic device moves from the featured environment to a non-featured environment.
The method may further comprise switching to the comprehensive navigation mode when the robotic device detects an error in the limited navigation mode.

The regular features may include at least one of regular wall structures, repeating floor patterns, repeating ceiling patterns and repeating wall patterns.
In accordance with another aspect of the present invention, there is provided an apparatus for navigation of a robotic device, the apparatus comprising: a robotic device having a plurality of sensors configured to obtain sensor data of an environment in which the robotic device moves around; a memory configured to store a map of a place in which the robotic device moves around, the map including at least one featured environment having regular features and at least one non-featured environment not having regular features; and a processor configured to set a navigation mode of the robotic device to a comprehensive navigation mode before the robotic device starts to move, calculate positional information of the robotic device based on the sensor data, determine whether the robotic device is in a featured environment, and switch to a limited navigation mode when the robotic device is determined to be in a featured environment.

In the limited navigation mode, the robotic device may estimate the positional information of the robotic device based on information of the regular features of the featured environment.
The robotic device may further comprise an odometer configured to estimate the positional information of the robotic device in the limited navigation mode.
In the limited navigation mode, the robotic device may estimate the positional information of the robotic device using the regular features of the featured environment.

In the comprehensive navigation mode, the processor may continuously calculate the positional information of the robotic device based on the sensor data.
The processor may be configured to switch to the comprehensive navigation mode when the robotic device moves from the featured environment to a non-featured environment.
The processor may be configured to switch to the comprehensive navigation mode when the robotic device detects an error in the limited navigation mode.
The regular features may include at least one of regular wall structures, repeating floor patterns, repeating ceiling patterns and repeating wall patterns.

The processor may be provided in the robotic device.
The processor may be provided in a server that communicates with the robotic device.
Brief Description of the Drawings
Embodiments of the method and the apparatus will now be described with reference to the accompanying figures, in which:

Fig. 1 is a schematic diagram of an example of an apparatus of navigation according to an embodiment of the present invention;
Fig. 2 shows an example of an entire environmental map of a place in which a robotic device moves around;
Fig. 3 is a flowchart illustrating an example of a method of navigation according to an embodiment of the present invention;
Fig. 4 shows an example of a featured environment having regular wall structures;
Figs. 5A to 5C illustrate an example of a limited navigation mode in the featured environment of Fig. 4;
Fig. 6 shows examples of regular floor tiles in featured environments;
Fig. 7 illustrates estimation of position of the robotic device in the featured environments of Fig. 6;
Fig. 8 is a flowchart illustrating an estimation method of the position of the robotic device in the featured environments of Fig. 6; and

Figs. 9A to 9D are graphs illustrating total amount of network traffic and processing load in the limited navigation mode and in a comprehensive navigation mode respectively using the exemplary method of navigation in Fig. 3.
Detailed Description
Fig. 1 illustrates an exemplary apparatus 100 of navigation that allows changing of navigation mode according to the environment in which a robotic device 110 operates. In the exemplary apparatus 100, the amount of data being transmitted from the robotic device 110 to a server 130 is reduced.
The apparatus 100 includes the server 130 to which the robotic device 110 is coupled via a communication link such as a wireless network 140. The server 130 may be any type of device capable of receiving and sending data, and performing navigation and/or localization processing. The server 130 may be a cloud-computing platform so that the processing load is distributed among the cloud. Navigation applications having a user interface may also be installed at the server 130 so that a user is able to configure parameters related to navigation at the server 130 and monitor the entire navigation process. As illustrated, the robotic device 110 includes a plurality of sensors 112, a processor 114, a transmitter/receiver 118 and an odometer 120.
The plurality of sensors 112 may be any type of sensor capable of obtaining data surrounding the robotic device 110 or data of the environment in which the robotic device 110 moves. Some examples of the sensors are lidars, cameras, IMUs (Inertial Measurement Units) and sonar sensors. The sensor data obtained is then transmitted via the transmitter/receiver 118 to the server 130 for localization processing, i.e. the processing to derive the actual position of the robotic device 110. The transmitter/receiver 118 not only transmits the sensor data from the robotic device 110 to the server 130, but also receives the localization information and navigation instructions from the server 130. A transmitter and a receiver may be separately provided to replace the transmitter/receiver 118.
The processor 114 is used to perform the processing required at the robotic device 110, such as interacting with the other components of the robotic device 110 and performing routine functions. For example, it monitors the gathering of the sensor data obtained by the plurality of sensors 112, gives instructions to send sensor data to the server 130 at appropriate timings through the wireless network 140, monitors the navigation instructions received, and instructs the motors (not shown) of the robotic device 110 to move according to the navigation instructions.
The odometer 120 is used to estimate the distance or displacement travelled by the robotic device 110. Other than the conventional wheel odometry method, the visual odometry method in which the distance travelled is estimated by using sequential camera images may also be used in the robotic device 110. At least one camera is provided in the robotic device 110 when the visual odometry method is desired.
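Wheel odometry of the kind performed by the odometer 120 reduces to simple arithmetic on encoder counts. The following is a minimal sketch for a differential-drive robot; the function, its parameters and the drive geometry are illustrative assumptions and are not specified in the patent.

```python
import math

def wheel_odometry(ticks_left, ticks_right, ticks_per_rev,
                   wheel_radius_m, axle_length_m):
    """Estimate distance travelled and heading change from encoder ticks."""
    d_left = 2 * math.pi * wheel_radius_m * ticks_left / ticks_per_rev
    d_right = 2 * math.pi * wheel_radius_m * ticks_right / ticks_per_rev
    distance = (d_left + d_right) / 2            # displacement of the robot centre
    dtheta = (d_right - d_left) / axle_length_m  # change in orientation (radians)
    return distance, dtheta
```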
The server 130 includes a transmitter/receiver 132, a processor 134 and a memory 136. The transmitter/receiver 132 receives the sensor data from the robotic device 110 transmitted by the transmitter/receiver 118 and the sensor data is processed by the processor 134 to derive the actual position of the robotic device 110.
The processor 134 determines a navigation mode depending on the environment in which the robotic device 110 moves. Based on the determined navigation mode, the processor 134 then generates navigation instructions that are sent to the robotic device 110.
In many environments, there are regular environmental features that can be used to aid the navigation of the robotic device 110. These regular features include, but are not limited to, regular wall structures, repeating floor patterns, repeating ceiling patterns, repeating wall patterns, and door frame openings. Other regular features may be markers on walls and ceilings, entrances and exits, semi-permanent features which may be created manually, and known objects such as the docking station for charging the robotic device 110. The robotic device 110 may estimate its position to a satisfactory accuracy using information of these regular features.
Thus, the entire space in which the robotic device 110 moves can be broadly divided into two types: a type that has regular features, namely a featured environment, and a type that has no useable regular features, namely a non-featured environment. An entire environmental map or a global map, i.e. the map of the entire space in which the robotic device moves, is created and stored in the memory 136 of the server 130. Fig. 2 illustrates an example of a simple global map consisting of a non-featured environment 200, a featured environment 202 with regular wall structures and a featured environment 204 with repeating floor patterns.
The global map in which walls, structures and facilities are indicated may be adapted from the layout map of the building, or it may be derived from a mapping process using sensors, e.g. lidar. The identification of the featured environments and non-featured environments may be performed manually or automatically by algorithmic techniques such as detecting straight walls using lidar or detecting floor tiles using a camera. For example, when a working place for the robotic device 110 is first identified, it is manually surveyed for regular features and is then divided into a number of featured environments and non-featured environments. Each featured environment is identified by a set of regular features, and the appropriate mapping and construction of the regular features are carried out. The information of the featured environments and the non-featured environments is included in the global map, and the set of regular features may be stored in the memory 136 of the server 130. If the data of regular features is small, it may be stored entirely onboard, i.e. in a memory (not shown in Fig. 1) of the robotic device 110. If the data of regular features is large, the server 130 will transmit a subset of the data to the robotic device 110 that corresponds to its immediate surrounding environment. As the robotic device 110 changes its position, the server 130 will transmit an incremental subset of data to the robotic device 110. Some examples of the regular features data are the positions, lengths and widths of corridors on the global map, the distance to be moved in a featured environment, the patterns and dimensions of the floor tiles, and the parameters to generate a map of the area covered by these floor tiles. These pieces of data are used by the robotic device 110 to estimate its position locally and to move according to the desired path or planned route.
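To make this division of the map concrete, the following is a minimal sketch of how a global map with featured and non-featured regions and their regular-feature data might be represented. The class names, fields and the rectangular-bounds assumption are illustrative, not the patent's data format.

```python
from dataclasses import dataclass, field

@dataclass
class FeaturedEnvironment:
    """A region of the global map that has regular features."""
    region_id: str
    feature_type: str   # e.g. "wall_structure" or "floor_pattern"
    bounds: tuple       # (x_min, y_min, x_max, y_max) in map coordinates
    params: dict = field(default_factory=dict)

@dataclass
class GlobalMap:
    """Entire environmental map; anything outside `featured` is non-featured."""
    featured: list

    def region_at(self, x: float, y: float):
        """Return the featured environment containing (x, y), or None."""
        for env in self.featured:
            x0, y0, x1, y1 = env.bounds
            if x0 <= x <= x1 and y0 <= y <= y1:
                return env
        return None

# The simple map of Fig. 2: a corridor with regular walls and a tiled area.
global_map = GlobalMap(featured=[
    FeaturedEnvironment("corridor", "wall_structure", (0, 0, 20, 2),
                        params={"wall_distance_m": 0.5, "length_m": 20}),
    FeaturedEnvironment("lobby", "floor_pattern", (20, 0, 30, 10),
                        params={"tile_size_m": 0.6}),
])
```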
In the exemplary offboard navigation apparatus 100, when the server 130 determines that the robotic device 110 is navigating a non-featured environment, a comprehensive navigation mode is determined and the normal method of offboard navigation is used. Data is communicated between the server 130 and the robotic device 110 continuously, similar to the existing offboard navigation systems, in which the robotic device 110 transmits the sensor data to the server 130 continuously and the server 130 transmits positional information of the robotic device 110 derived from the sensor data and/or the navigation command to the robotic device 110.
When the server 130 determines that the robotic device 110 is navigating a featured environment (for example, by using the positional information of the robotic device 110 derived from the sensor data or by detecting regular features using a camera or lidar sensor), a limited navigation mode is determined by the server 130. The robotic device 110 switches to the limited navigation mode and relies on its onboard sensors 112, processor 114 and/or odometer 120 to maintain an awareness of its local position, i.e. to estimate its position. The estimated position of the robotic device 110 is transmitted to the server 130 periodically, but there is no need to transmit the sensor data continuously and calculate the position of the robotic device 110 constantly at the server 130.
Fig. 3 shows a flowchart 300 of a method of navigation using the exemplary apparatus 100.
The navigation apparatus 100 is initialized (step 302) and the robotic device 110 is always set to the comprehensive navigation mode immediately after initialization (step 304). The robotic device 110 obtains sensor data from the plurality of sensors 112 and transmits the sensor data to the server 130. The server 130 receives the sensor data and performs processing to calculate the position of the robotic device 110 (step 306) based on the sensor data. The server 130 uses the calculated position of the robotic device 110 to determine if the robotic device 110 is in or enters a featured environment (step 308). Alternatively, the identification of the featured environments and non-featured environments may be performed automatically by algorithmic techniques such as detecting straight walls using lidar or detecting floor tiles using a camera. When it is determined that the robotic device 110 is in or enters a featured environment, the server 130 determines that the navigation mode has to be switched from the comprehensive navigation mode to the limited navigation mode. The server 130 instructs the robotic device 110 to switch to the limited mode (step 310) and sends the information required for local navigation to the robotic device 110. Such information may include pieces of the global map that correspond to information of the regular features of the featured environment which is not already stored in the robotic device 110.
In the limited navigation mode, the robotic device 110 constantly estimates its position based on information of the regular features of the featured environment. For example, when navigating in a featured environment having regular wall structures, the robotic device 110 may estimate the distance moved by wheel or visual odometry while using lidar to keep a fixed distance from the wall. When navigating in a featured environment having repeating floor patterns, the robotic device 110 may use a vision algorithm, described later, to estimate its position. The server 130 receives the estimated position of the robotic device 110 instead of the sensor data. When the robotic device 110 has covered the predetermined distance in the limited navigation mode or detects that it has moved from the featured environment to a non-featured environment (step 314), the robotic device 110 switches from the limited navigation mode to the comprehensive navigation mode (step 304). The server 130 then starts to receive the sensor data obtained by the robotic device 110 to calculate the position of the robotic device 110 and resumes the wireless transmissions necessary in the comprehensive navigation mode. The calculated position is sent to the robotic device 110 so that the localization error accumulated in the limited mode may be corrected.
In the event of any unexpected error detected by the robotic device 110 during navigation in a featured environment (step 312), for example, when no walls or no tiles are sensed by the robotic device 110 or the navigation is aborted for any reason in the limited mode, the robotic device 110 automatically switches from the limited navigation mode to the comprehensive navigation mode (step 304). According to the above configuration, the amount of wireless transmission in the limited navigation mode is cut down, as the sensor data is not transmitted to derive the position of the robotic device 110 in the limited mode. The processing load of the processor 134 is greatly reduced as well. In addition, the robotic device 110 in the limited mode may move at a faster speed than in the comprehensive mode, since the navigation is largely independent of the latency of the wireless network.
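The switching logic of Fig. 3 amounts to a two-state machine. The sketch below is illustrative only: the robot and server methods (sensor_data, compute_position, estimate_position and so on) are hypothetical names standing in for the behaviour described above, and region_at reuses the illustrative GlobalMap sketch given earlier.

```python
COMPREHENSIVE, LIMITED = "comprehensive", "limited"

def navigation_step(mode, robot, server, global_map):
    """One iteration of the Fig. 3 loop (hypothetical robot/server API)."""
    if mode == COMPREHENSIVE:
        server.receive(robot.sensor_data())        # continuous sensor upload
        x, y = server.compute_position()           # offboard localization (step 306)
        env = global_map.region_at(x, y)           # featured environment? (step 308)
        if env is not None:
            robot.load_features(env.params)        # step 310: send regular features
            return LIMITED
    else:
        server.receive(robot.estimate_position())  # periodic local estimate only
        if robot.error_detected() or robot.left_featured_environment():
            return COMPREHENSIVE                   # steps 312/314: fall back
    return mode
```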
The speed of the robotic device 110 in the comprehensive mode may have to be bounded by the network conditions as described in the Singapore Patent Application No. 201304289-0, where a trade-off exists between the speed of the robotic device 110 and its localization accuracy due to the latency of the network.
Fig. 4 shows an example of a featured environment having regular wall structures: a long corridor. The desired path, or planned route, of the robotic device 110, shown by the dashed arrows, is parallel to the straight and perpendicular walls 400. A relatively long distance has to be traversed before reaching the next waypoint at the end of the corridor. In this case, the robotic device 110 may be configured to move parallel to the walls 400 at a pre-determined distance, i.e. perform wall-following, until the next waypoint is reached. A simple Proportional-Integral-Derivative (PID) loop or feedback loop with a lidar sensor is sufficient to guarantee accurate parallel tracking, while wheel or visual odometry tracks the distance moved in the simple control loop. Alternatively, the detection of certain features may be a sufficient cue to signal that the long corridor is ending, e.g. detecting the far edge of the opposite wall at a predetermined distance in front of the robotic device 110 using the lidar sensor or a sonar sensor. Figs. 5A to 5C illustrate an example of the limited navigation mode in the featured environment of Fig. 4.
In Fig. 5A, the robotic device 110 switches from the comprehensive navigation mode to the limited navigation mode when entering the long corridor, and maintains a distance of x metres away from the nearest wall by simple feedback loops, such as PID control, with a distance sensor.
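A minimal sketch of such a feedback loop is given below, assuming a lidar-derived lateral wall distance in metres as input and a signed steering correction as output; the interfaces and gain values are illustrative, not taken from the disclosure.

```python
# A minimal PID wall-following sketch under the stated assumptions: the caller
# feeds in the lidar distance to the nearest wall each cycle and applies the
# returned steering correction. Gains kp, ki, kd are illustrative values.
class WallFollowPID:
    def __init__(self, target_m: float, kp=1.2, ki=0.01, kd=0.4, dt=0.05):
        self.target = target_m   # desired distance x from the nearest wall
        self.kp, self.ki, self.kd = kp, ki, kd
        self.dt = dt             # control period in seconds
        self.integral = 0.0
        self.prev_error = 0.0

    def step(self, measured_m: float) -> float:
        """One control cycle: lidar distance in, steering correction out."""
        error = self.target - measured_m
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative
```

Wheel or visual odometry runs alongside this loop to accumulate the distance travelled since entering the corridor.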
When the robotic device 110 moves near the end of the long corridor as shown in Fig. 5B, the robotic device 110 detects the far edge 510 of the opposite wall and prepares to switch over from the limited navigation mode to the comprehensive navigation mode. While moving in the limited navigation mode, the robotic device 110 is also constantly estimating the distance of y metres it has moved in the limited navigation mode by wheel or visual odometry. The server 130 is constantly updated with this estimated position instead of calculating the positions of the robotic device 110 based on the sensor data. The detected distance to the far edge 510 may also be used concurrently to improve the accuracy of the estimated position.
At the end of the long corridor, the robotic device 110 switches from the limited navigation mode to the comprehensive navigation mode in Fig. 5C, and prepares the next maneuver. The robotic device 110 may switch to another limited navigation mode when it moves into another featured environment immediately after leaving the long corridor.
Fig. 6 shows examples of regular floor tiles in featured environments having regular template patterns that tessellate over the floor area. When the robotic device 110 enters the floor area covered by a template pattern, the server 130 instructs the robotic device 110 to switch to the limited navigation mode. The template pattern and its parameters (e.g. the physical dimensions of the template pattern) stored in the server 130 are transmitted to the robotic device 110, and a floor map of the area covered by the template pattern, i.e. the map of the featured environment, may be generated by the processor 114 of the robotic device 110. This method of generating the floor map is more efficient, in terms of time, effort and cost, than manually mapping the whole floor area covered by the template pattern. There is also a substantial saving in bandwidth compared with transmitting the whole floor map to the robotic device 110.
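As a sketch of this idea, the floor map may be generated onboard by tessellating the transmitted template, for example as follows, assuming a single-channel (grayscale) top-view template image; the function and argument names are illustrative.

```python
# A sketch of onboard floor-map generation by tessellating the transmitted
# template pattern; np.tile is one plausible realisation, assuming a
# single-channel (grayscale) top-view template image.
import numpy as np

def build_floor_map(template: np.ndarray, rows: int, cols: int) -> np.ndarray:
    """Tile the template pattern rows x cols times to cover the floor area."""
    return np.tile(template, (rows, cols))

# e.g. for a 20 x 30 tile area: floor_map = build_floor_map(template_img, 20, 30)
```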
Fig. 7 illustrates the estimation of the position of the robotic device 110 in the limited navigation mode in a featured environment having the regular template pattern. A camera is mounted on the robotic device 110 to capture images of the floor at an interval Δt. The position of the robotic device 110 at time T, shown in solid lines 700, is known, and the next camera image of the floor at time T+Δt is captured as indicated by the dotted lines 702. The position of the robotic device at time T+Δt may be determined by searching within the part of the floor map indicated by the dashed lines 704, i.e. the search window. Δt is designed to be small enough that the distance moved by the robotic device 110 in Δt is less than half the size of the regular template pattern. For example, in the featured environment of Fig. 7, the distance moved by the robotic device 110 in Δt is less than 0.5a metres, where a is the size of the regular template pattern. Fig. 8 is a flowchart illustrating an estimation method 800 of the position of the robotic device 110 in a featured environment having the regular template pattern, i.e. the vision algorithm mentioned earlier.
In step 802, the images of the floor at time T and time T+Δt are captured, and the pose of the robotic device 110 on the floor map at time T is known as (x_T, y_T, θ_T), where x_T, y_T indicate the position of the robotic device 110 and θ_T indicates the orientation of the robotic device 110.
In step 804, Inverse Perspective Mapping (IPM) is performed to obtain a bird's-eye view of the floor. As the camera is usually mounted on the robotic device 110 at a distance above the ground and at an angle to the horizontal plane, the camera image is usually distorted or warped by the perspective transform. The IPM step therefore recovers a top view of the floor (i.e. an IPM image) by reversing the perspective transform. One approach is to use the camera calibration parameters, such as the height, the angle and the intrinsic matrix of the camera, to determine the inverse perspective transformation matrix. Another approach is to find the plane-to-plane homography from 4 points on the image plane and the corresponding 4 points on the ground plane. If the camera is mounted vertically (i.e. directly facing the floor), this step can be omitted.
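The homography-based variant of this step might look as follows in OpenCV; the four image-plane points and their top-view correspondences are illustrative values that would come from calibration in practice.

```python
# A sketch of the homography-based IPM approach described above. The point
# coordinates and output resolution are illustrative assumptions.
import cv2
import numpy as np

def inverse_perspective_map(frame: np.ndarray) -> np.ndarray:
    # Four points on the camera image (pixels)...
    src = np.float32([[420, 360], [860, 360], [1180, 720], [100, 720]])
    # ...and where they land on a metric top-view grid (here 1 px = 1 cm).
    dst = np.float32([[0, 0], [400, 0], [400, 600], [0, 600]])
    H = cv2.getPerspectiveTransform(src, dst)
    # Reverse the perspective distortion to obtain the IPM image.
    return cv2.warpPerspective(frame, H, (400, 600))
```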
In step 806, the edges or contours of the floor tiles are extracted from the IPM image of the top view of the floor to obtain a threshold image. A variety of methods, such as adaptive thresholding, Sobel edge detection or Canny edge detection, can be used.
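One plausible realisation of this step with OpenCV is sketched below; the Canny threshold values are illustrative assumptions.

```python
# A minimal sketch of step 806: extract tile edges from the IPM image to
# obtain a binary threshold image. Threshold values are illustrative.
import cv2

def extract_tile_edges(ipm_image):
    gray = cv2.cvtColor(ipm_image, cv2.COLOR_BGR2GRAY)
    # cv2.adaptiveThreshold or a Sobel filter would work here as well.
    return cv2.Canny(gray, 50, 150)  # binary image of tile contours
```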
In step 808, the rotational component θ_T+Δt of the pose of the robotic device 110 is determined by detecting line segments in the threshold image using the probabilistic Hough transform or other known methods. The angles of the detected lines are accumulated into a histogram; for example, a histogram bin resolution of 5 degrees may be used. For square or rectangular tiles, the edge lines are perpendicular, so the pair of line angles 90 degrees apart (i.e. a°, a+90°) that has the highest combined bin score is selected. The rotational component θ_T+Δt is determined as the one of (a°, a+90°, a+180°, a+270°) that has the smallest absolute change from θ_T.
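A sketch of this step is given below, assuming OpenCV's probabilistic Hough transform and the 5-degree histogram described above; the Hough parameters are illustrative assumptions.

```python
# A sketch of step 808: accumulate Hough line angles into a 5-degree
# histogram, pick the perpendicular pair with the highest combined score,
# and choose the orientation closest to the previous heading.
import cv2
import numpy as np

def estimate_rotation(threshold_img, theta_prev: float) -> float:
    lines = cv2.HoughLinesP(threshold_img, 1, np.pi / 180, threshold=60,
                            minLineLength=40, maxLineGap=5)
    if lines is None:
        return theta_prev  # no line segments found; keep previous heading
    hist = np.zeros(36)  # 180 degrees in 5-degree bins
    for x1, y1, x2, y2 in lines[:, 0]:
        angle = np.degrees(np.arctan2(y2 - y1, x2 - x1)) % 180.0
        hist[int(angle // 5) % 36] += 1
    combined = hist + np.roll(hist, 18)  # pair each angle with the one 90° away
    a = 5.0 * np.argmax(combined)        # best pair (a, a+90), in degrees
    candidates = [(a + k * 90.0) % 360.0 for k in range(4)]
    # Select the candidate with the smallest absolute change from theta_prev.
    return min(candidates,
               key=lambda c: abs((c - theta_prev + 180.0) % 360.0 - 180.0))
```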
In step 810, the translational components x_T+Δt, y_T+Δt of the pose of the robotic device 110 are then determined. The floor map at the previous position (x_T, y_T) is rotated by the angle θ_T+Δt and a search window image is extracted, after compensating for the offset between the centre of rotation and the centre of the IPM image. The search window image is slightly larger than the IPM image. If the search window is too large, multiple ambiguous matches may be found; if it is too small, a good match may not be found. A suitable search window is equal to or larger than the IPM image by half the size of a tile on all 4 borders (i.e. search window size = IPM image size enlarged by 0.5a on all 4 borders); for example, all of b1 to b4 are 0.5a metres in Fig. 7. The threshold image, used as the template, is compared with the search window image using a template matching algorithm, and the matching score at each offset is stored in a result image. The peak location in the result image denotes the translational movement of the robotic device 110 from T to T+Δt, from which the translational components x_T+Δt, y_T+Δt of the updated pose, i.e. the position of the robotic device 110, can be determined.
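A sketch of the matching itself follows, assuming the search window image has already been cut from the rotated floor map as described; normalised cross-correlation is one plausible choice of template matching score.

```python
# A sketch of the template matching in step 810. The threshold image is the
# template; the search window exceeds it by 0.5a on all four borders, so
# every offset within +/- 0.5a of the previous position is scored.
import cv2

def estimate_translation(threshold_img, search_window):
    result = cv2.matchTemplate(search_window, threshold_img,
                               cv2.TM_CCOEFF_NORMED)
    _, _, _, peak = cv2.minMaxLoc(result)  # (x, y) of the best match
    return peak  # pixel offset of the device's movement from T to T+Δt
```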
All the steps above are performed by the processor 114 of the robotic device 110. Therefore, when the robotic device 110 enters an area covered with tiles having regular template patterns, the server 130 causes the robotic device 110 to switch to the limited mode, and the robotic device 110 navigates itself based on the estimated positions calculated onboard using the method described in Fig. 8. When the robotic device 110 moves from the featured environment with regular floor patterns to a non-featured environment in which regular features are absent, it switches to the comprehensive mode. The algorithm described in Figs. 7 and 8 may be extended to featured environments having regular patterns on ceilings or walls instead of the floor.
Figs. 9A to 9D are graphs illustrating the amount of network traffic and the processing load in the limited mode and the comprehensive mode respectively. Figs. 9A and 9C show the amount of total network traffic (i.e. data communicated between the server 130 and the robotic device 110) and the processing load of the processor 134 of the server 130 in the comprehensive mode. Since sensor data is not transmitted from the robotic device 110 to the server 130 and the position of the robotic device 110 is not calculated by the server 130 in the limited mode, the amount of total network traffic is cut down in the limited navigation mode as shown in Fig. 9B. The peaks in the traffic in Fig. 9B correspond to the transmissions of the set of regular features from the server 130 to the robotic device 110. The processing load of the processor 134 of the server 130 is greatly reduced as well, as shown in Fig. 9D.
Table 1 below compares the two types of environments and navigation modes.

Table 1 Comparison between Navigation Modes in Featured Environment and Non-Featured Environment

Aspect | Featured environment (limited navigation mode) | Non-featured environment (comprehensive navigation mode) |
---|---|---|
Position of robotic device 110 | Estimated onboard from the regular features and odometry | Calculated by the server 130 from the sensor data |
Sensor data | Not transmitted to the server 130 | Transmitted continuously to the server 130 |
Network traffic | Low | High |
Processing load of processor 134 | Low | High |
Speed of robotic device 110 | Faster; largely independent of network latency | Bounded by network conditions and latency |
By making use of the regular features present in the place in which the robotic device moves, simple onboard computations can be performed in real-time, without excessive computational cost, while the robotic device is navigating a featured environment. Instead of the robotic device sending sensor data to the server continuously, the server sends the robotic device only the pieces of the entire environmental map that correspond to the regular features. Accordingly, the amount of data that needs to be transmitted between the robotic device and the server is reduced drastically.
Therefore, with the navigation apparatus described above, it is possible to deploy more robotic devices to move simultaneously in a fixed wireless bandwidth environment without any additional upgrade of the wireless infrastructure. Of course, it remains feasible to switch to the comprehensive mode at any time under certain conditions, for example when there are fewer robotic devices in the same working place or when the bandwidth of the network permits.
The exemplary apparatus and method for navigation described above use offboard navigation. Nevertheless, the method of changing the navigation mode according to the environment in which a robotic device operates may also be used in onboard navigation systems to reduce the onboard computational load. In an onboard navigation apparatus that changes the navigation mode according to the environment in which the robotic device operates, the global map is stored completely onboard, i.e. within the robotic device itself, and the processing performed by the server 130 of the exemplary apparatus 100 is likewise performed onboard, i.e. by a processor within the robotic device. All the steps relating to communication performed by the exemplary apparatus 100 can be omitted in the onboard navigation apparatus.
Whilst there has been described in the foregoing description embodiments of the present invention, it will be understood by those skilled in the technology concerned that many variations or modifications in details of design or construction may be made without departing from the present invention.
Claims
1. A method for navigation of a robotic device, the method comprising:
storing a map of a place in which the robotic device moves around, the map including at least one featured environment having regular features and at least one non-featured environment not having regular features;
setting a navigation mode of the robotic device to a comprehensive navigation mode before the robotic device starts to move;
obtaining sensor data of an environment in which the robotic device moves around to calculate positional information of the robotic device;
determining whether the robotic device is in a featured environment; and
switching to a limited navigation mode when the robotic device is determined to be in a featured environment.
2. The method according to claim 1, wherein in the limited navigation mode, the positional information of the robotic device is estimated based on information of the regular features of the featured environment.
3. The method according to claim 2, wherein in the limited navigation mode, the positional information of the robotic device is estimated using odometry.
4. The method according to claim 2, wherein in the limited navigation mode, the positional information of the robotic device is estimated using the regular features of the featured environment.
5. The method according to any one of claims 1 to 4, wherein in the comprehensive navigation mode, the positional information of the robotic device is calculated continuously based on the sensor data.
6. The method according to any one of claims 1 to 5, further comprising switching to the comprehensive navigation mode when the robotic device moves from the featured environment to a non-featured environment.
7. The method according to any one of claims 1 to 6, further comprising switching to the comprehensive navigation mode when the robotic device detects an error in the limited navigation mode.
8. The method according to any one of claims 1 to 7, wherein the regular features include at least one of regular wall structures, repeating floor patterns, repeating ceiling patterns and repeating wall patterns.
9. An apparatus for navigation of a robotic device, the apparatus comprising:
a robotic device having a plurality of sensors configured to obtain sensor data of an environment in which the robotic device moves around;
a memory configured to store a map of a place in which the robotic device moves around, the map including at least one featured environment having regular features and at least one non-featured environment not having regular features; and
a processor configured to
set a navigation mode of the robotic device to a comprehensive navigation mode before the robotic device starts to move,
calculate positional information of the robotic device based on the sensor data,
determine whether the robotic device is in a featured environment, and
switch to a limited navigation mode when the robotic device is determined to be in a featured environment.
10. The apparatus according to claim 9, wherein in the limited navigation mode, the robotic device estimates the positional information of the robotic device based on information of the regular features of the featured environment.
11. The apparatus according to claim 10, wherein the robotic device further comprises an odometer configured to estimate the positional information of the robotic device in the limited navigation mode.
12. The apparatus according to claim 10, wherein in the limited navigation mode, the robotic device estimates the positional information of the robotic device using the regular features of the featured environment.
13. The apparatus according to any one of claims 9 to 12, wherein in the comprehensive navigation mode, the processor continuously calculates the positional information of the robotic device based on the sensor data.
14. The apparatus according to any one of claims 9 to 13, wherein the processor is configured to switch to the comprehensive navigation mode when the robotic device moves from the featured environment to a non-featured environment.
15. The apparatus according to any one of claims 9 to 14, wherein the processor is configured to switch to the comprehensive navigation mode when the robotic device detects an error in the limited navigation mode.
16. The apparatus according to any one of claims 9 to 15, wherein the regular features include at least one of regular wall structures, repeating floor patterns, repeating ceiling patterns and repeating wall patterns.
17. The apparatus according to any one of claims 9 to 16, wherein the processor is provided in the robotic device.
18. The apparatus according to any one of claims 9 to 16, wherein the processor is provided in a server that communicates with the robotic device.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
SG10201405949XA SG10201405949XA (en) | 2014-09-22 | 2014-09-22 | Method and apparatus for navigation of a robotic device |
SG10201405949X | 2014-09-22 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2016048238A1 true WO2016048238A1 (en) | 2016-03-31 |
Family
ID=55581578
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/SG2015/050332 WO2016048238A1 (en) | 2014-09-22 | 2015-09-22 | Method and apparatus for navigation of a robotic device |
Country Status (2)
Country | Link |
---|---|
SG (1) | SG10201405949XA (en) |
WO (1) | WO2016048238A1 (en) |
Patent Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20120215380A1 (en) * | 2011-02-23 | 2012-08-23 | Microsoft Corporation | Semi-autonomous robot that supports multiple modes of navigation |
US20140249695A1 (en) * | 2013-03-01 | 2014-09-04 | Robotex Inc. | Low latency data link system and method |
Non-Patent Citations (1)
Title |
---|
MANO, H. ET AL.: "Treaded Control System for Rescue Robots in Indoor Environment", Proceedings of the 2008 IEEE International Conference on Robotics and Biomimetics (ROBIO 2008), 22 February 2009 (2009-02-22), Bangkok, Thailand, pages 1836 - 1843 *
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2018107024A1 (en) * | 2016-12-09 | 2018-06-14 | Diversey, Inc. | Robotic cleaning device with operating speed variation based on environment |
US11402850B2 (en) | 2016-12-09 | 2022-08-02 | Diversey, Inc. | Robotic cleaning device with operating speed variation based on environment |
US10612934B2 (en) | 2018-01-12 | 2020-04-07 | General Electric Company | System and methods for robotic autonomous motion planning and navigation |
CN109115223A (en) * | 2018-08-30 | 2019-01-01 | 江苏大学 | A kind of full source integrated navigation system of full landform towards intelligent agricultural machinery |
CN114137949A (en) * | 2020-08-28 | 2022-03-04 | 德马科技集团股份有限公司 | Overhead visual navigation robot |
Also Published As
Publication number | Publication date |
---|---|
SG10201405949XA (en) | 2016-04-28 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP7353747B2 (en) | Information processing device, system, method, and program | |
US10809071B2 (en) | Method for constructing a map while performing work | |
EP3118705B1 (en) | Map production method, mobile robot, and map production system | |
EP3552072B1 (en) | Robotic cleaning device with operating speed variation based on environment | |
EP3627269A1 (en) | Target tracking method and apparatus, mobile device and storage medium | |
US10657833B2 (en) | Vision-based cooperative collision avoidance | |
US6496755B2 (en) | Autonomous multi-platform robot system | |
US8679260B2 (en) | Methods and systems for movement of an automatic cleaning device using video signal | |
CN112050810B (en) | Indoor positioning navigation method and system based on computer vision | |
EP3147629B1 (en) | Object detection device and object detection method | |
US8423225B2 (en) | Methods and systems for movement of robotic device using video signal | |
CN109773783B (en) | Patrol intelligent robot based on space point cloud identification and police system thereof | |
KR20150058679A (en) | Apparatus and method for localization of autonomous vehicle in a complex | |
KR20110122022A (en) | Apparatus for building map and method thereof | |
WO2016048238A1 (en) | Method and apparatus for navigation of a robotic device | |
WO2018228258A1 (en) | Mobile electronic device and method therein | |
Bao et al. | Outdoor navigation of a mobile robot by following GPS waypoints and local pedestrian lane | |
JP2004133882A (en) | Autonomous multi-platform robot system | |
WO2021246170A1 (en) | Information processing device, information processing system and method, and program | |
WO2020230410A1 (en) | Mobile object | |
WO2023010870A1 (en) | Robot navigation method, robot, robot system, apparatus and medium | |
US20210149412A1 (en) | Position estimating apparatus, method for determining position of movable apparatus, and non-transitory computer readable medium | |
JP2021103410A (en) | Mobile body and imaging system | |
CN104390642A (en) | Omnidirectional ranging indoor automatic detection and navigation equipment capable of being remotely monitored | |
KR102651108B1 (en) | Apparatus and Method for Estimating Position |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 15844078 Country of ref document: EP Kind code of ref document: A1 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
32PN | Ep: public notification in the ep bulletin as address of the addressee cannot be established |
Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 30/06/2017) |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 15844078 Country of ref document: EP Kind code of ref document: A1 |