GB2382251A - Mobile Robot - Google Patents

Mobile Robot

Info

Publication number
GB2382251A
Authority
GB
United Kingdom
Prior art keywords
robot
obstacle
control system
location
camera
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
GB0300113A
Other versions
GB0300113D0 (en)
GB2382251B (en)
Inventor
Jeong-Gon Song
Sang-Yong Lee
Seung-Bin Moon
Kyoung-Mu Lee
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Samsung Electronics Co Ltd
Original Assignee
Samsung Gwangju Electronics Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from KR1020000068446A external-priority patent/KR20020038296A/en
Priority claimed from KR1020000068445A external-priority patent/KR100632241B1/en
Priority claimed from KR1020000069621A external-priority patent/KR100632242B1/en
Application filed by Samsung Gwangju Electronics Co Ltd filed Critical Samsung Gwangju Electronics Co Ltd
Priority claimed from GB0115871A external-priority patent/GB2369511B/en
Publication of GB0300113D0 publication Critical patent/GB0300113D0/en
Publication of GB2382251A publication Critical patent/GB2382251A/en
Application granted granted Critical
Publication of GB2382251B publication Critical patent/GB2382251B/en
Anticipated expiration legal-status Critical
Expired - Fee Related legal-status Critical Current

Classifications

    • A HUMAN NECESSITIES
    • A47 FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
    • A47L DOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
    • A47L9/00 Details or accessories of suction cleaners, e.g. mechanical means for controlling the suction or for effecting pulsating action; Storing devices specially adapted to suction cleaners or parts thereof; Carrying-vehicles specially adapted for suction cleaners
    • A47L9/009 Carrying-vehicles; Arrangements of trollies or wheels; Means for avoiding mechanical obstacles
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/86 Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88 Lidar systems specially adapted for specific applications
    • G01S17/93 Lidar systems specially adapted for specific applications for anti-collision purposes
    • G01S17/931 Lidar systems specially adapted for specific applications for anti-collision purposes of land vehicles
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/02 Control of position or course in two dimensions
    • G05D1/021 Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0231 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D1/0246 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means
    • G05D1/0248 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means in combination with a laser
    • A HUMAN NECESSITIES
    • A47 FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
    • A47L DOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
    • A47L2201/00 Robotic cleaning machines, i.e. with automatic control of the travelling movement or the cleaning operation
    • A47L2201/04 Automatic control of the travelling movement; Automatic obstacle detection
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/02 Systems using the reflection of electromagnetic waves other than radio waves
    • G01S17/06 Systems determining position data of a target
    • G01S17/46 Indirect determination of position data

Abstract

A mobile robot (1), which may be in the form of an automatically guided vacuum cleaner, is capable of recognising its location within a room and adjusting its direction in response to obstacles in its way. The robot has a running device (30), an obstacle detecting device (41, 43, 45) for detecting the presence of obstacles, a location recognising device (21, 23), a control system (10), and a power supply (60). The location recognising device includes a first vision camera (21) directed towards the ceiling of the room and a first vision board (23). The first vision camera (21) detects a base mark on the ceiling. The first vision board (23) processes an image from the first camera (21) and transmits the resulting image data to the control system (10). The obstacle detecting device includes a line laser (41) for emitting a linear light beam towards an obstacle, a second vision camera (43) for recognising a linear light beam reflected from the obstacle, and a second vision board (45) for processing image data captured by the second vision camera (43). More than one laser may be provided.

Description

MOBILE ROBOT

The present invention relates to a mobile robot which automatically moves about a room and, more particularly, to a mobile robot constructed as a vacuum cleaner having a camera to recognise its location and to avoid collisions with obstacles in the room.
The invention also includes a course adjusting method for such a robot and an automatic cleaning system.
Generally, a mobile robot has a power source and a sensor mounted in its body and can, therefore, automatically move about a given area without an external power supply or manipulation. There are two main types of mobile robots that are used inside a house: robots that clean the rooms of the house; and robots that guard the house from possible intruders.
A conventional mobile robot uses a random motion method, whereby the robot moves in a random direction repeatedly shifting its direction whenever it encounters obstacles, such as a wall, table, etc. Such a robot includes a motive device for moving the mobile robot about a room, an obstacle detecting device for detecting the presence of an obstacle, such as a wall, table, etc., a control system for adjusting an orientation of the mobile robot by controlling the motive device and the obstacle detecting device, and a power supply for storing and supplying power to the respective devices.
The motive device can be a wheel-type device that employs a servo-motor or stepping motor to rotate a plurality of wheels and move the mobile robot, a caterpillar-type device that uses an endless track, or a joint-type device that uses a plurality of legs.
Among these types of devices, the wheel-type motive device is most widely used.
The obstacle detecting device detects obstacles with an ultrasonic or laser sensor, and sends out a corresponding signal to the control system. The sensor is preferably mounted on a front side of the robot, disposed parallel with the running surface over which the robot travels, so as to accurately detect obstacles located in the path of travel.
The control system includes a microprocessor and memory for controlling general operations of the robot, such as sending a start command to the motive device, controlling movement of the motive device to avoid obstacles in accordance with signals received from the obstacle detecting device and an internal, pre-loaded program, and electrically charging the power supply when it determines that the power level is below a predetermined value.
The power supply supplies power for operating various parts of the robot, such as the motor, which rotates the wheels of the motive device, the sensor, which detects the presence of obstacles, and the control system, etc. The power supply is usually a storage battery, enabling the robot to operate for a predetermined period of time without connection to an external power source.
In operation, when the robot receives a start command, the control system sends a running command and corresponding sensing signal to the motive device and the obstacle detecting device, respectively. In response to the signal from the control system, the motive device runs in a certain direction by driving the motor. At this time, by operating the sensor, the obstacle detecting device sends out a sensing signal to the control system. When the sensor senses the presence of an obstacle within a predetermined distance range, the control system sends a command to the motive device to shift the path or running direction of the robot. Then the motive device resumes running of the robot. Whenever the robot encounters an obstacle, the running direction of the robot is altered by the processes described above. That is, the robot runs according to its initial position and the locations of the obstacles, drawing a random track as shown in Figure 1.
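As a rough illustration, the conventional behaviour just described amounts to a very simple policy. The following is a minimal sketch only, assuming a hypothetical sensor that reports the obstacle distance in centimetres (or None when nothing is detected), a heading expressed in degrees, and an assumed trigger range; it is not taken from the patent itself.

```python
import random

OBSTACLE_RANGE_CM = 30.0  # assumed "predetermined distance range"

def next_heading(distance_to_obstacle_cm, current_heading_deg):
    """Conventional random-motion policy: keep the current heading until an
    obstacle is sensed within range, then shift to an arbitrary new direction."""
    if distance_to_obstacle_cm is not None and distance_to_obstacle_cm < OBSTACLE_RANGE_CM:
        return random.uniform(0.0, 360.0)   # shift the running direction at random
    return current_heading_deg              # otherwise keep running straight
```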
Such a random motion robot is found to be inefficient when running in a limited area, since it follows a random track. Another drawback is that it repeatedly travels across the same area.
The ultrasonic sensor of the conventional obstacle detecting device includes an ultrasonic transmitter for emitting ultrasonic waves, and an ultrasonic receiver for receiving ultrasonic waves reflected from obstacles. By measuring the time delay between transmission and reception, the control system calculates the distance from the robot to an obstacle, and accordingly controls the motor of the robot to avoid the obstacle.
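The time-of-flight calculation itself is a one-line relation. A minimal sketch, assuming an echo delay measured in seconds and an assumed speed of sound for room-temperature air:

```python
SPEED_OF_SOUND_M_PER_S = 343.0  # assumed value for air at about 20 degrees C

def ultrasonic_range_m(echo_delay_s):
    """Obstacle distance from the transmit-to-receive delay; the division by
    two accounts for the out-and-back path of the ultrasonic pulse."""
    return SPEED_OF_SOUND_M_PER_S * echo_delay_s / 2.0
```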
The robot is required to keep a predetermined orientation in order to perform a cleaning or guarding operation efficiently. For this, it is necessary to periodically check whether the robot is following the right course and to adjust the orientation of the robot if it is determined that the robot has deviated from its course.
Conventionally, this is achieved using a guide tape or the wall of the room as a reference. When using a guide tape attached to the floor, the robot checks for the presence of guide tape using a photo-sensor or a magnetic sensor. The robot runs along the guide tape, the relative location of the guide tape with respect to the sensor determining whether the robot is on the right course or not.
When using the wall of the room as a reference, whether the robot moves along the right course or not is determined according to the distance of the robot from the wall, as detected by a sensor, such as an ultrasonic sensor. Depending on the distance between the robot and the wall, the robot will adjust its orientation.
It is an object of the invention to provide a mobile robot capable of operating more efficiently.
According to one aspect of the present invention, a mobile robot comprises a motive device for moving the robot about a room, an obstacle detecting device for detecting the presence of an obstacle, a control system coupled to and arranged to control the motive device and the obstacle detecting device; a location recognising device coupled to the control system for recognising a current location of the robot, the location recognising device including a first vision camera and a first vision circuit, the camera being
arranged to image the ceiling of the room and to recognise a base mark on the ceiling, the first vision circuit being operable to process an image from the camera and to transmit data to the control system; and a power supply coupled to the control system, the power supply being arranged to store electrical energy and to supply electrical energy to the motive device, obstacle detecting device, location recognising device, and the control system.
The obstacle detecting device may include a line laser for emitting a linear light beam towards the obstacle, a second vision camera for recognising a reflected linear light beam from the obstacle, and a second vision board for processing image data captured by the second vision camera.
According to another aspect of the invention, a mobile robot comprises a motive device for moving the mobile robot about a room; a location recognising device for recognising a current location of the mobile robot; a control system for controlling the motive device and the location recognising device; an obstacle detecting device coupled to the control system for detecting the presence of an obstacle, the obstacle detecting device including a laser for emitting a linear light beam toward the obstacle, a camera for detecting a reflected light beam from the obstacle, and a circuit for processing image data from the camera; and a power supply coupled to the control system, the power supply being arranged to store electrical energy and supply the energy to the motive device, obstacle detecting device, location recognising device, and the control system.
The camera may include a filter for exclusively recognising light from the line laser.
According to yet a further aspect of the invention, there is provided a method for adjusting the course of a mobile robot, the mobile robot including a motive device for moving the robot about a room; an obstacle detecting device for detecting the presence of an obstacle; a location recognising device having a camera and a vision circuit and arranged to recognise a current location of the robot, a control system for controlling the motive device, the obstacle detecting device and the location recognising device, and a power supply for storing electrical energy and supplying the energy to the motive
device, the obstacle detecting device, the location recognising device and the control system, wherein the method comprises: (i) imaging a base mark using the camera of the location recognising device and generating image data relating to the base mark using the vision circuit; (ii) determining whether coordinates of the base mark, which are obtained by data processing in the control system, match coordinates of a predetermined moving route; and (iii) controlling the motive device of the robot to cause the robot to move in a direction by a corresponding distance to compensate for any deviation from the predetermined moving route when the coordinates of the base mark do not match the coordinates of the predetermined moving route.
The mobile robot disclosed herein includes a motive device for moving the mobile robot about a room, an obstacle detecting device for detecting a presence of an obstacle, a location recognising device for recognising a current location of the mobile robot, a control system for controlling the motive device, the obstacle detecting device and the location recognising device, and a power supply for storing and supplying electricity to each of the devices and the control system. The method includes photographing a base mark using a first vision camera of the location recognising device and generating image data of the base mark using a first vision board, determining whether coordinates of the base mark, which are obtained by data processing of the control system, match coordinates of a predetermined moving route, and controlling the motive device to move the mobile robot in a direction by a corresponding distance to compensate for any deviation from the predetermined moving route, when the coordinates of the base mark do not match the coordinates of the predetermined moving route.
The invention further includes an automatically guided vacuum cleaner which has an upward-viewing camera for imaging a reference mark on the ceiling of a room to be cleaned, a control system for processing image data representing the reference mark, and a motive unit coupled to the control system, the control system being configured to cause the motive unit to operate in response to the processed image data whereby the movement of the cleaner is guided according to the relative positions of the cleaner and the ceiling reference mark.
Such a cleaner may form part of an automatic cleaning system including a base element for mounting on the ceiling of a room to be cleaned, the base element including the reference mark.
The robot described in this specification has the advantage of being able to cover a room more efficiently by moving along a certain course while recognising its location and avoiding repeat passes in the same area. It is possible to give the robot the ability to recognise the status, e.g. the shape, of an obstacle and thereby determine whether to pass an obstacle or take avoiding action. Separate provision of a guide tape, a guide-tape sensor or an ultrasonic sensor is unnecessary, simplifying manufacture and reducing manufacturing costs.
The present invention will now be described by way of example with reference to the drawings, in which: Figure 1 is a diagram showing the movement track of a conventional mobile robot in a room; Figure 2 is a schematic perspective view showing a structure of a mobile robot in accordance with the invention; Figure 3 is a functional block diagram showing parts of a mobile robot in accordance with the present invention; Figure 4 is a flow chart showing a location determining and robot control method; Figure 5 is a plan view of a base mark; Figure 6 is a diagram showing the track of a robot which travels according to a mask image set by a teaching process; Figure 7 is a flow chart showing a method for detecting an obstacle using an obstacle detecting device; Figure 8 is a diagram illustrating the determination of the distance from the robot to an obstacle; Figure 9 is a diagram showing in simplified form a method for generating a three-dimensional image from a plurality of linear images; Figure 10 is a flow chart showing a method for adjusting the orientation of the robot; Figures 11A, 11B, and 11C are views showing lines formed by base marks which are shown in an image window of a camera mounted on the robot, according to the track followed by the robot; and Figure 12 is a diagram showing the movement track of a robot running along a certain course while recognising its location.
Referring firstly to Figures 2 and 3, a mobile robot 1 includes a motive device 30 for moving the robot 1 along a planar surface, a location recognising device 20 for recognising the location of the robot 1 using a first vision camera (CCD camera 21), an obstacle detecting device 40 for detecting the presence of obstacles in the path of the robot 1, a remotely controllable transceiver 50 for transmitting or receiving a start/stop command to/from the robot 1, and a power supply 60 for storing electrical energy and supplying it to the respective components of the robot 1.
The motive device 30 includes a pair of wheels 33, which are capable of moving forwards and backwards and left and right, a motor 32 for driving the wheels 33, and a motor driver 31 for controlling the motor 32 in response to signals that the motor driver 31 receives from the control system 10.
The first camera 21 is vertically disposed to image the ceiling, on which a base mark 70 (Figure 5) is attached. The location recognising device 20 further includes a first vision board 23 for setting relevant thresholds on images photographed by the camera 21.
The obstacle detecting device 40 includes a laser 41 for emitting a linear beam of light in the path or direction of travel of the robot 1, a second vision camera 43 for detecting a linear beam of light reflected from an obstacle located in the path of the robot 1, and a second vision board 45 for processing the images photographed by the second vision camera 43.
The laser 41 is of a kind often called a "line-emitter", since it emits a beam in the form of a straight line on an image plane. The image plane is perpendicular to the optical axis of the laser 41. The laser 41 and camera 43 are mounted on the front of the robot 1 and detect the presence of any obstacle which may be located in the robot's path. In this embodiment the second camera 43 is mounted above the laser 41 to capture any linear beam of the line laser 41 that is reflected from an obstacle, the camera 43 including an attached filter which passes only radiation of a wavelength corresponding to that of the linear beam from the laser 41, thereby permitting exclusive recognition of the beam of the laser 41 reflected by an object.
The second vision board 45 is mounted on one side of the second camera 43 and connected via wires to the control system 10 and the second camera 43.
The transceiver 50 enables a user to control starting and stopping of the robot from a remote location. That is, the transceiver 50 receives a start or stop command from the user and transmits a status signal of the robot 1 to the user.
The power supply 60 is a battery.
For overall control of the robot 1, the control system 10 is connected to the motor driver 31 of the motive device 30, the location recognising device 20, the obstacle detecting
device 40, the remotely controllable transceiver 50, and the power supply 60. The control system 10 includes an image data processor 11, which has a microprocessor for calculating positional data from the image data transmitted from the first and second vision boards 23 and 45. That is, the control system 10 uses its own location information and the position and shape information of an obstacle to set a target point and a running course for the mobile robot 1. The control system 10 further directs the robot 1 along the right course to the target point. The robot's location information is obtained using the image data relating to the base mark 70, which is obtained by imaging, with the first vision camera 21, the ceiling to which the base mark 70 is attached and processing the resulting image in the first vision board 23. The position and shape of the obstacle are obtained using linear image data obtained by imaging the obstacle with the second vision camera 43 and processing the obstacle image in the second vision board 45.
The operation of the robot 1 constructed as above will be described in greater detail below.
The movement and location recognising process of the robot 1 through the first vision camera 21 is now described with reference to Figure 4.
Firstly, when the robot 1 receives a start command, the control system 10 initialises and checks for a predetermined direction and distance (steps S10 and S11). When there is no data relating to the predetermined direction and distance, the control system 10 requests image data from the location recognising device 20 (step S12). Upon receipt of the request for image data from the control system 10, the location recognising device 20 uses the first camera 21 to view the ceiling from the current location of the robot 1. Based on the resulting image, a relevant threshold is set and transmitted to the image data processor 11 of the control system 10 (step S13). Upon receipt of the image data from the first vision board 23, the image data processor 11 detects the location and relative orientation of recognition dots 71 and 73 (Figure 5) of the base mark 70 by a region correlation, and outputs a distance and direction that the motive device 30 has to move (step S14). The base mark 70, which is attached to the ceiling, can be formed of any suitable material, so long as it is recognisable by the first vision camera 21. It is preferable to use recognition marks for clearer recognition. One exemplary base mark 70 is shown in Figure 5. The base mark 70 is a recognition mark which includes a plate 75, a larger reference dot 71 and a smaller reference dot 73. The larger reference dot 71 is for determining the base location, while the smaller reference dot 73 is for checking the direction of the robot 1 based on its relationship with the larger reference dot 71.
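By way of illustration only, once the two recognition dots have been located in the image window, their pixel coordinates give both a reference position and a direction. The following is a minimal sketch under that assumption; the dot-finding itself is assumed to be done by the region correlation described below, and the function and argument names are illustrative, not taken from the patent.

```python
import math

def mark_pose(large_dot_xy, small_dot_xy):
    """Base-mark position and orientation from the pixel coordinates of the two
    recognition dots: the larger dot 71 gives the reference position, and the
    vector from dot 71 to dot 73 gives the direction."""
    (x1, y1), (x2, y2) = large_dot_xy, small_dot_xy
    heading_deg = math.degrees(math.atan2(y2 - y1, x2 - x1))
    return (x1, y1), heading_deg

# e.g. mark_pose((120, 95), (134, 95)) -> ((120, 95), 0.0)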
The control system 10 transmits data about travel distance and direction from the image data processor 11 to the motive device 30, and the motive device 30 operates in the direction and over the distance determined by the signal from the control system 10 (step S15).
The process of searching for the recognition dots 71 and 73 through a region correlation of an image data threshold will be described in greater detail below.
Region correlation is a method of comparing mask image data representing the base mark 70 with image data obtained from the image of the ceiling viewed from a certain distance, and determining the position in the image window obtained by the first camera 21 that most resembles the mask image. As shown by the arrow in Figure 6, the teaching operation of the mask image of the base mark 70 is performed in a downward orientation.
The location having a mask image similar to that taught to the mobile robot 1 is determined as follows. Firstly, region correlation coefficients of the mask image, which is the result of the teaching operation, are obtained over the whole area of the image data of the image as viewed from a certain distance. Then, the area having the greatest correlation coefficient is selected, since it has the image which is most similar to the image of the recognition dots 71 and 73 of the base mark 70 that the robot 1 is targeting. The location of the base mark 70 is defined by the image photographed by the first camera 21 and formed in the image window (W) in pixel coordinates. Accordingly, using the original coordinates of the base mark 70 and the coordinates of the base mark 70 in the current image window (W), the current location and direction of the mobile robot 1 are obtained. Furthermore, since the location of the base mark 70 is obtained in pixel coordinates in every sampling period during which the ceiling is imaged by the first vision camera 21, the movement and path of the mobile robot 1 can also be obtained.
The region correlation coefficient is expressed by

$$r(dx,dy) = \frac{\sum_{(x,y)\in S}\bigl(f_1(x,y)-\bar{f}_1\bigr)\bigl(f_2(x+dx,\,y+dy)-\bar{f}_2\bigr)}{\sqrt{\sum_{(x,y)\in S}\bigl(f_1(x,y)-\bar{f}_1\bigr)^2\,\sum_{(x,y)\in S}\bigl(f_2(x+dx,\,y+dy)-\bar{f}_2\bigr)^2}}$$

where r(dx, dy) is the region correlation coefficient, f1 is the teaching mask image, f̄1 is the average value of the teaching mask image, f2 is the image as viewed from a certain distance, f̄2 is the average value of f2, (dx, dy) is the required moving distance of the mask image expressed in cartesian coordinates, (x, y) is a coordinate, and S is the original image.
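A minimal numpy sketch of this search, assuming the taught mask and the ceiling image are 2-D greyscale arrays; the exhaustive scan over every offset is written for clarity rather than speed, and the function names are illustrative.

```python
import numpy as np

def region_correlation(mask, image, dx, dy):
    """Normalised region correlation r(dx, dy) between the taught mask image f1
    and the patch of the ceiling image f2 shifted by (dx, dy), following the
    formula above."""
    h, w = mask.shape
    patch = image[dy:dy + h, dx:dx + w].astype(float)
    f1 = mask.astype(float) - mask.mean()
    f2 = patch - patch.mean()
    denom = np.sqrt((f1 ** 2).sum() * (f2 ** 2).sum())
    return (f1 * f2).sum() / denom if denom else 0.0

def locate_base_mark(mask, image):
    """Scan the whole image window and return the offset with the greatest
    correlation coefficient, i.e. the most likely base-mark position."""
    h, w = mask.shape
    H, W = image.shape
    best = max(((region_correlation(mask, image, dx, dy), (dx, dy))
                for dy in range(H - h + 1)
                for dx in range(W - w + 1)), key=lambda t: t[0])
    return best[1], best[0]   # pixel offset of the mark, and its score
```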
Next, a method for detecting the presence of an obstacle while travelling will be described with respect to Figure 7.
The obstacle detecting method includes the steps of: directing the line laser 41 to emit a linear beam toward an obstacle located in the robot's path (step S31); having the second camera 43 detect the linear beam reflected from the obstacle (step S32); having the second vision board 45 process the image from the second camera 43 to generate image data suitable for software calculation (step S33); and calculating a distance from the robot 1 to the obstacle using the image data (step S34).
In the light emitting step (S31), when the line laser 41 directs a linear beam at an obstacle, the shape of the obstacle distorts the beam. In the detection or recognition step (S32), the second camera 43 forms an image by recognising the distorted beam reflected through the filter. In the image data processing step (S33), the second vision board 45 performs a thresholding process in order to simplify the image detected in the recognising step (S32) and, using a thinning process, reduces the image to as small a size as possible. In the distance calculating step (S34), the distance from the robot 1 to the obstacle is calculated based on the image data obtained from the image data processing step (S33). The robot 1 repeats the above-mentioned obstacle detecting processes until it obtains all of the information relating to the obstacle in its path.
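As an illustration of steps S32 and S33, the thresholding and thinning can be pictured as reducing the reflected stripe to a single row coordinate per image column. This is a minimal numpy sketch only, assuming an 8-bit greyscale frame and an assumed brightness threshold; the actual processing performed by the second vision board 45 is not limited to this.

```python
import numpy as np

def extract_laser_line(frame, threshold=200):
    """Keep only pixels at or above the threshold, then thin the remaining
    stripe to its brightest pixel in each column (one y value per column)."""
    binary = np.where(frame >= threshold, frame, 0)
    rows = binary.argmax(axis=0).astype(float)   # brightest pixel per column
    rows[binary.max(axis=0) == 0] = np.nan       # columns with no laser return
    return rows
```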
The range from the robot 1 to the obstacle can now easily be obtained with a few values by trigonometry. As shown in Figure 8, those values are: the angle (θ_LP) between the line laser 41 and a reference line on the robot 1, the distance (y_LP) between the camera 43 and the laser 41, the distance (f0) between a lens 43a of the vision camera 43 and an image plane 43b on which the image of the obstacle is formed, and the distance (y) from the centre of the lens 43a to the image formed on the image plane 43b. With these values, the distance (Z) from the robot 1 to the imaged point on the obstacle is obtained by the trigonometric relation (refer to Figure 8):

$$\tan\theta_{LP} = \frac{y_{LP} - Y}{Z}$$

which, on substituting $x_0 = y_{LP}/\tan\theta_{LP}$ and $Y = -Z\,y/f_0$ and rearranging, gives

$$Z = \frac{x_0}{1 - \dfrac{y}{f_0\,\tan\theta_{LP}}}$$

Since the angle (θ_LP) between the laser 41 and the robot reference, the distance (f0) between the lens 43a of the camera 43 and the obstacle image plane 43b, and the value x0 = y_LP/tan θ_LP are constants, the distance (Z) from the robot 1 to the obstacle can be obtained simply by obtaining the value (y) corresponding to the horizontal distance from the centre of the lens 43a to an end of the image data of the image formed on the image plane.
By solving the above equations with the image data, the shape of the obstacle can be determined.
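The range computation itself is a one-liner once the mounting constants are known. The sketch below assumes the rearranged relation reconstructed above and uses illustrative parameter names; all lengths must be expressed in the same unit (e.g. millimetres on the image plane).

```python
import math

def obstacle_range(y, y_lp, f0, theta_lp_deg):
    """Distance Z from the robot to the imaged point, using
    Z = x0 / (1 - y / (f0 * tan(theta_LP))) with x0 = y_lp / tan(theta_LP)."""
    tan_t = math.tan(math.radians(theta_lp_deg))
    x0 = y_lp / tan_t                  # constant for a fixed mounting geometry
    return x0 / (1.0 - y / (f0 * tan_t))
```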
In an alternative embodiment, a three-dimensional image can also be obtained using a plurality of line lasers 41. A plurality of lasers 41 are positioned to emit laser beams towards the obstacle at angles of incidence such that the linear beams incident on the obstacle can be recognised by the vision camera 43. The plurality of lasers 41 emit the linear beams towards the obstacle, and the camera 43 recognises the beams of the line lasers 41 reflected from it. Then, by image processing of image data representing the reflected beams, the three-dimensional image is obtained. Figure 9 illustrates the processes of forming a three-dimensional image from a plurality of linear images. In this way, the robot 1 obtains more accurate data about the obstacle, such as its shape.
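A rough sketch of how several line profiles could be combined is given below. It assumes each laser has its own mounting geometry (baseline and angle) and that each extracted line has already been reduced to one image-plane coordinate per column; the exact geometry handling of the embodiment is not spelt out, so the parameters and names here are illustrative only.

```python
import numpy as np

def range_profile(line_y, y_lp, f0, tan_theta):
    """One range value per image column from one extracted laser line,
    using the triangulation relation above."""
    x0 = y_lp / tan_theta
    return x0 / (1.0 - line_y / (f0 * tan_theta))

def range_image(lines, geometries, f0):
    """Stack the profiles from several line lasers, each with its own
    (y_lp, tan_theta) mounting geometry, into a coarse range image of the
    obstacle: one row per laser, one column per image column."""
    return np.vstack([range_profile(line_y, y_lp, f0, tan_theta)
                      for line_y, (y_lp, tan_theta) in zip(lines, geometries)])
```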
Finally, the process of reaching a target location while maintaining the right course will be described in greater detail below.
When the robot 1 receives the start command, the control system 10 initialises and requests image data from the location recognising device 20 and the obstacle detecting device 40. Upon receipt of the image data request from the control system 10, the location recognising device 20 images the ceiling to which the base mark 70 is attached and creates an image. The first vision board 23 then processes the image by comparison with a threshold and transmits the processed image to the control system 10. The obstacle detecting device 40 uses the line laser 41 and the second vision camera 43 to generate image data representing the obstacle located in the robot's path and transmits the same to the control system 10.
Software in the control system 10 processes the image data received from the location recognising device 20 and the obstacle detecting device 40 to obtain information about the obstacle and the current location of the robot. The control system 10 then sets a target location and route to the target location based on the information obtained above.
The control system 10 transmits a run-command to the motive device 30 along a determined route, periodically checks the coordinates of the base mark 70 at predetermined intervals, and determines whether or not the mobile robot 1 is moving along the determined route. If the coordinates of the base mark 70 deviate from the determined route, the control system 10 causes the motive device 30 to move the robot 1 in a direction opposite to that of the deviation, and thereby maintains the proper route of the robot 1. After several route adjustments, and when the robot 1 reaches the target location, the robot 1 stops moving or keeps moving if there is a subsequent command.
The method of the control system 10 for obtaining the current position of the robot 1 corresponds generally to the location recognising process of the mobile robot 1, which has been described above. Accordingly, a detailed description thereof will be omitted.
The method of checking the course and adjusting the direction of the robot 1 when it deviates from its course will now be described in greater detail with reference to Figures 10, 11A, 11B, and 11C.
The control system 10 requests the location recognising device 20 for image data relating to the base mark 70. Upon receipt of the request from the control system 10, the location recognising device 20 images the ceiling to which the base mark 70 is attached, and generates an image of the base mark 70. Then, the first vision board 23 processes the image to generate image data that can be processed by software and transmits the image data to the control system 10 (step S51).
The control system 10 calculates the coordinates of the base mark 70 using the region correlation method, which is identical to the method for obtaining the robot location by using the image data transmitted from the location recognising device 20 (step S52).
Next, the control system 10 compares the coordinates of the base mark 70 obtained from the current location of the robot 1 with the coordinates of the route determined in the route determining step (step S53).
When the current coordinates of the base mark 70 do not agree with the coordinates of the determined course, the control system 10 calculates the deviations in direction and distance from the determined coordinates of the course. The control system 10 then controls the motor 32 of the motive device 30 to move the mobile robot 1 to compensate for the deviations, by moving in the opposite direction over the deviated distance (step S54). For example, if the mobile robot 1 is off course to the right of the base mark 70, the control system 10 causes the motor 32 to move the motive device 30 to the left, i.e., back onto the course. Such processes are shown in Figures 11A, 11B, and 11C. Figure 11A shows the locus of base marks 70 indicated on the image window (W) of the first camera 21 when the robot 1 moves in a straight route. Likewise, Figure 11B shows the locus of the base marks 70 on the image window (W) of the first camera 21 when the mobile robot 1 moves away from the straight route, while Figure 11C shows the locus of the base marks 70 when the mobile robot 1 returns to the movement route. The reference numerals 71 and 73 in Figure 11A refer to the two recognition dots of the base mark 70.
Next, the control system 10 determines whether the current location is the target location (step S55). If not, the control system 10 requests the location recognising device 20 for image data of the base mark 70 to determine whether the mobile robot 1 is at the same coordinates as the coordinates of the determined route.
The control system 10 periodically repeats the above-mentioned processes at predetermined intervals until the robot 1 reaches the target location, so as to keep the robot 1 running on the determined course.
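The periodic course check of steps S51 to S55 can be summarised as a simple control loop. The following is a hypothetical sketch only: capture_mark_xy() stands in for the camera, first vision board and region correlation chain, motive.move(dx, dy) for the motor driver command, and the tolerance and sampling interval are assumed values rather than figures from the patent.

```python
import time

TOLERANCE_PX = 2        # allowed base-mark deviation, in pixels (assumed)
SAMPLE_PERIOD_S = 0.5   # predetermined checking interval (assumed)

def follow_route(route, capture_mark_xy, motive, at_target):
    for planned_x, planned_y in route:
        while True:
            x, y = capture_mark_xy()                   # steps S51-S52
            dx, dy = planned_x - x, planned_y - y      # deviation from the route (S53)
            if abs(dx) <= TOLERANCE_PX and abs(dy) <= TOLERANCE_PX:
                break                                  # back on the determined course
            motive.move(dx, dy)                        # compensate for the deviation (S54)
            time.sleep(SAMPLE_PERIOD_S)
        if at_target(planned_x, planned_y):            # step S55
            return
```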
Figure 12 illustrates the movement of the robot 1, which is capable of recognising its location and manoeuvring around objects in a room. Such a robot 1 may be used as a home appliance, i.e., a vacuum cleaning mobile robot. Here, the robot 1 further includes a vacuum cleaner which has a suction port for sucking in contaminants, a dust collecting portion for collecting contaminants from the incoming air, and a motor-driven portion for generating a suction force. The course or moving route of the vacuum cleaning mobile robot 1 can be pre-programmed in various patterns according to the geography of the room.
As described above, the mobile robot 1 can recognise its current location, and can also follow a given course efficiently without repeatedly running in the same area. Since the robot 1 can obtain information about the shape of an obstacle using the laser 41 and the second camera 43, it can determine whether to pass or avoid the obstacle according to the status of the obstacle. Location determination is achieved using the first camera 21, allowing automatic determination of whether or not to maintain the current path of travel, and adjustment of the robot's orientation when it determines any deviation from the desired course.

Claims (15)

1. A mobile robot comprising: a motive device for moving the mobile robot about a room; a location recognising device for recognising a current location of the mobile robot; a control system for controlling the motive device and the location recognising device; an obstacle detecting device coupled to the control system for detecting the presence of an obstacle, the obstacle detecting device including a laser for emitting a linear light beam toward the obstacle, a camera for detecting a reflected light beam from the obstacle, and a circuit for processing image data from the camera; and a power supply coupled to the control system, the power supply being arranged to store electrical energy and supply the energy to the motive device, obstacle detecting device, location recognising device, and the control system.
2. A robot according to claim 1, wherein the camera comprises a filter for exclusively detecting radiation from the laser.
3. A robot according to claim 1 or claim 2, wherein the obstacle detecting device comprises a plurality of lasers for emitting linear light beams at predetermined respective angles.
4. A robot according to any of claims 1 to 3, constructed as a vacuum cleaner comprising a suction port for sucking in contaminants, a dust collecting portion for collecting the contaminants therein, and a motor portion for generating a suction force.
5. A mobile robot comprising: a motive device for moving the robot about a room; an obstacle detecting device for detecting the presence of an obstacle; a control system coupled to and arranged to control the motive device and the obstacle detecting device;
a location recognising device coupled to the control system for recognising a current location of the robot, the location recognising device including a first vision camera and a first vision circuit, the camera being arranged to image the ceiling of the room and to recognise a base mark on the ceiling, the first vision circuit being operable to process an image from the camera and to transmit data to the control system; and a power supply coupled to the control system, the power supply being arranged to store electrical energy and to supply electrical energy to the motive device, obstacle detecting device, location recognising device, and the control system.
6. A mobile robot according to claim 5, wherein the camera and the vision circuit are configured to recognise a base mark in the form of a recognition mark having a base plate, and a plurality of dots formed on the base plate at a predetermined distance from each other.
7. A robot according to claim 5 or claim 6, wherein the obstacle detecting device comprises: a laser for emitting a linear light beam towards the obstacle; a camera for detecting a reflected light beam from the obstacle; and a circuit for processing image data captured by the detecting camera.
8. A robot according to claim 5, constructed as a vacuum cleaner comprising a suction port for sucking in contaminants, a dust collecting portion for collecting the contaminants therein, and a motor portion for generating a suction force.
9. A method for adjusting the course of a mobile robot, the mobile robot including a motive device for moving the robot about a room; an obstacle detecting device for detecting the presence of an obstacle; a location recognising device having a camera and a vision circuit and arranged to recognise a current location of the robot, a control system for controlling the motive device, the obstacle detecting device and the location recognising device, and a power supply for storing electrical energy and supplying the energy to the motive device, the obstacle detecting device, the location recognising device and the control system, wherein the method comprises:
(i) imaging a base mark using the camera of the location recognising device and generating image data relating to the base mark using the vision circuit; (ii) determining whether coordinates of the base mark, which are obtained by data processing in the control system, match coordinates of a predetermined moving route; and (iii) controlling the motive device of the robot to cause the robot to move in a direction by a corresponding distance to compensate for any deviation from the predetermined moving route when the coordinates of the base mark do not match the coordinates of the predetermined moving route.
10. A method according to claim 9, wherein the imaging step includes generating an image of the base mark, and setting an image threshold using the vision circuit and generating the image data.
11. A method according to claim 9 or claim 10, wherein the determining step further includes performing region correlation in addition to calculating the coordinates of the base mark.
12. An automatically guided vacuum cleaner having an upward-viewing camera for imaging a reference mark on the ceiling of a room to be cleaned, a control system for processing image data representing the reference mark, and a motive unit coupled to the control system, the control system being configured to cause the motive unit to operate in response to the processed image data whereby the movement of the cleaner is guided according to the relative positions of the cleaner and the ceiling reference mark.
13. An automatic cleaning system comprising the combination of a vacuum cleaner as claimed in claim 12 and a base element for mounting on the ceiling of a room to be cleaned, the base element carrying the said reference mark.
14. A mobile robot constructed and arranged substantially as herein described and shown in Figures 2 to 12 of the drawings.
15. A method for adjusting the course of a mobile robot, the method being substantially as herein described with reference to Figures 2 to 12 of the drawings.
GB0300113A 2000-11-17 2001-06-28 Mobile robot Expired - Fee Related GB2382251B (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
KR1020000068446A KR20020038296A (en) 2000-11-17 2000-11-17 Apparatus for detecting obstacle in mobile robot and method for detecting the same
KR1020000068445A KR100632241B1 (en) 2000-11-17 2000-11-17 Mobile robot
KR1020000069621A KR100632242B1 (en) 2000-11-22 2000-11-22 Path correction method of mobile robot
GB0115871A GB2369511B (en) 2000-11-17 2001-06-28 Mobile robot

Publications (3)

Publication Number Publication Date
GB0300113D0 GB0300113D0 (en) 2003-02-05
GB2382251A true GB2382251A (en) 2003-05-21
GB2382251B GB2382251B (en) 2004-01-07

Family

ID=27447965

Family Applications (1)

Application Number Title Priority Date Filing Date
GB0300113A Expired - Fee Related GB2382251B (en) 2000-11-17 2001-06-28 Mobile robot

Country Status (1)

Country Link
GB (1) GB2382251B (en)



Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0943353A (en) * 1995-07-28 1997-02-14 Hazama Gumi Ltd Automated guided vehicle

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0358628A2 (en) * 1988-09-06 1990-03-14 Transitions Research Corporation Visual navigation and obstacle avoidance structured light system
EP0361188A2 (en) * 1988-09-29 1990-04-04 Fraunhofer-Gesellschaft Zur Förderung Der Angewandten Forschung E.V. Method for safeguarding a vehicle against collision, and vehicle so safeguarded
GB2264601A (en) * 1991-12-31 1993-09-01 3D Scanners Ltd Object inspection
GB2286696A (en) * 1994-02-16 1995-08-23 Fuji Heavy Ind Ltd Autonomous vehicle guidance system
JPH0883124A (en) * 1994-09-13 1996-03-26 Kobe Steel Ltd Unmanned carrier
US5673082A (en) * 1995-04-10 1997-09-30 The United States Of America As Represented By The Administrator Of The National Aeronautics And Space Administration Light-directed ranging system implementing single camera system for telerobotics applications

Cited By (41)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7480958B2 (en) 2002-07-26 2009-01-27 Samsung Gwangju Electronics Co., Ltd. Robot cleaner, robot cleaning system and method of controlling same
GB2392255B (en) * 2002-07-26 2004-12-22 Samsung Kwangju Electronics Co Robot cleaner,robot cleaning system,and method of controlling same
GB2392255A (en) * 2002-07-26 2004-02-25 Samsung Kwangju Electronics Co A robot cleaner
US7108731B2 (en) 2003-07-29 2006-09-19 Samsung Gwangju Electronics Co., Ltd. Air cleaning robot and system thereof
GB2404331B (en) * 2003-07-29 2005-06-29 Samsung Gwanju Electronics Co Robot cleaner equipped with negative-ion generator
FR2858204A1 (en) * 2003-07-29 2005-02-04 Samsung Kwangju Electronics Co ROBOT AND AIR CLEANER SYSTEM
GB2404331A (en) * 2003-07-29 2005-02-02 Samsung Gwanju Electronics Co Robot cleaner equipped with negative-ion generator
EP1684143A1 (en) * 2005-01-25 2006-07-26 Samsung Electronics Co., Ltd. Apparatus and method for estimating location of mobile body and generating map of mobile body environment using upper image of mobile body environment, and computer readable recording medium storing computer program controlling the apparatus
US8831872B2 (en) 2005-01-25 2014-09-09 Samsung Electronics Co., Ltd. Apparatus and method for estimating location of mobile body and generating map of mobile body environment using upper image of mobile body environment, and computer readable recording medium storing computer program controlling the apparatus
DE202006002432U1 (en) * 2006-02-16 2007-06-28 Westermann Kg Mobile waste bin, comprises base on castors electrically driven and operated with remote control
US9939529B2 (en) 2012-08-27 2018-04-10 Aktiebolaget Electrolux Robot positioning system
WO2014033055A1 (en) * 2012-08-27 2014-03-06 Aktiebolaget Electrolux Robot positioning system
EP3104194A1 (en) * 2012-08-27 2016-12-14 Aktiebolaget Electrolux Robot positioning system
US10448794B2 (en) 2013-04-15 2019-10-22 Aktiebolaget Electrolux Robotic vacuum cleaner
US10219665B2 (en) 2013-04-15 2019-03-05 Aktiebolaget Electrolux Robotic vacuum cleaner with protruding sidebrush
US10209080B2 (en) * 2013-12-19 2019-02-19 Aktiebolaget Electrolux Robotic cleaning device
US10433697B2 (en) 2013-12-19 2019-10-08 Aktiebolaget Electrolux Adaptive speed control of rotating side brush
US9946263B2 (en) 2013-12-19 2018-04-17 Aktiebolaget Electrolux Prioritizing cleaning areas
US10045675B2 (en) 2013-12-19 2018-08-14 Aktiebolaget Electrolux Robotic vacuum cleaner with side brush moving in spiral pattern
US10149589B2 (en) 2013-12-19 2018-12-11 Aktiebolaget Electrolux Sensing climb of obstacle of a robotic cleaning device
US9811089B2 (en) 2013-12-19 2017-11-07 Aktiebolaget Electrolux Robotic cleaning device with perimeter recording function
US10617271B2 (en) 2013-12-19 2020-04-14 Aktiebolaget Electrolux Robotic cleaning device and method for landmark recognition
WO2015090397A1 (en) * 2013-12-19 2015-06-25 Aktiebolaget Electrolux Robotic cleaning device
US10231591B2 (en) 2013-12-20 2019-03-19 Aktiebolaget Electrolux Dust container
US10518416B2 (en) 2014-07-10 2019-12-31 Aktiebolaget Electrolux Method for detecting a measurement error in a robotic cleaning device
US10499778B2 (en) 2014-09-08 2019-12-10 Aktiebolaget Electrolux Robotic vacuum cleaner
US10729297B2 (en) 2014-09-08 2020-08-04 Aktiebolaget Electrolux Robotic vacuum cleaner
EP3009907A1 (en) * 2014-10-02 2016-04-20 LG Electronics Inc. Robot cleaner
US9488983B2 (en) 2014-10-02 2016-11-08 Lg Electronics Inc. Robot cleaner
US10877484B2 (en) 2014-12-10 2020-12-29 Aktiebolaget Electrolux Using laser sensor for floor type detection
US10874271B2 (en) 2014-12-12 2020-12-29 Aktiebolaget Electrolux Side brush and robotic cleaner
US10678251B2 (en) 2014-12-16 2020-06-09 Aktiebolaget Electrolux Cleaning method for a robotic cleaning device
US11099554B2 (en) 2015-04-17 2021-08-24 Aktiebolaget Electrolux Robotic cleaning device and a method of controlling the robotic cleaning device
US10874274B2 (en) 2015-09-03 2020-12-29 Aktiebolaget Electrolux System of robotic cleaning devices
US11712142B2 (en) 2015-09-03 2023-08-01 Aktiebolaget Electrolux System of robotic cleaning devices
US11169533B2 (en) 2016-03-15 2021-11-09 Aktiebolaget Electrolux Robotic cleaning device and a method at the robotic cleaning device of performing cliff detection
US11122953B2 (en) 2016-05-11 2021-09-21 Aktiebolaget Electrolux Robotic cleaning device
US11474533B2 (en) 2017-06-02 2022-10-18 Aktiebolaget Electrolux Method of detecting a difference in level of a surface in front of a robotic cleaning device
US11921517B2 (en) 2017-09-26 2024-03-05 Aktiebolaget Electrolux Controlling movement of a robotic cleaning device
CN112578398A (en) * 2020-12-07 2021-03-30 中国工程物理研究院应用电子学研究所 Double-focal-plane detection and identification system and detection and identification method
CN112578398B (en) * 2020-12-07 2022-11-29 中国工程物理研究院应用电子学研究所 Double-focal-plane detection and identification system and detection and identification method

Also Published As

Publication number Publication date
GB0300113D0 (en) 2003-02-05
GB2382251B (en) 2004-01-07

Similar Documents

Publication Publication Date Title
GB2382251A (en) Mobile Robot
US6496754B2 (en) Mobile robot and course adjusting method thereof
US6597143B2 (en) Mobile robot system using RF module
US7438766B2 (en) Robot cleaner coordinates compensation method and a robot cleaner system using the same
KR100565227B1 (en) Position recognition apparatus and method for mobile robot
US6868307B2 (en) Robot cleaner, robot cleaning system and method for controlling the same
US20230346187A1 (en) Cleaning robot, cleaning robot system and operating method thereof
WO2022188364A1 (en) Line laser module and self-moving device
US20060129276A1 (en) Autonomous mobile robot
WO2022188365A1 (en) Self-moving apparatus
EP3795050A1 (en) Vacuum cleaner and control method thereof
US20210129335A1 (en) Method for operating an automatically moving cleaning device and cleaning device of this type
CN112909712A (en) Line laser module and self-moving equipment
KR100632242B1 (en) Path correction method of mobile robot
KR20020080898A (en) Apparatus for correcting obstacle detection error of robot cleaner and method therefor
KR20020038296A (en) Apparatus for detecting obstacle in mobile robot and method for detecting the same
TWI828176B (en) Line laser module and autonomous mobile device
KR102358758B1 (en) Recharging apparatus for Robot Cleaner
CN215119524U (en) Line laser module and self-moving equipment
TW202413888A (en) Line laser module and autonomous mobile device

Legal Events

Date Code Title Description
PCNP Patent ceased through non-payment of renewal fee

Effective date: 20100628