WO2019179176A1 - Type the title of the invention here - Google Patents

Type the title of the invention here

Info

Publication number
WO2019179176A1
WO2019179176A1 (PCT/CN2018/120198)
Authority
WO
WIPO (PCT)
Prior art keywords
robot
edge
obstacle
path
grid
Prior art date
Application number
PCT/CN2018/120198
Other languages
English (en)
French (fr)
Other versions
WO2019179176A8 (zh)
Inventor
李永勇
肖刚军
Original Assignee
珠海市一微半导体有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 珠海市一微半导体有限公司
Priority to EP18910525.7A (EP3770711A4)
Priority to US16/982,068 (US11537142B2)
Priority to JP2020550731A (JP7085296B2)
Priority to KR1020207028134A (KR102333984B1)
Publication of WO2019179176A1
Publication of WO2019179176A8

Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02 Control of position or course in two dimensions
    • G05D1/021 Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0268 Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means
    • G05D1/0274 Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means using mapping information stored in a memory device
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88 Lidar systems specially adapted for specific applications
    • G01S17/93 Lidar systems specially adapted for specific applications for anti-collision purposes
    • G01S17/931 Lidar systems specially adapted for specific applications for anti-collision purposes of land vehicles
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02 Control of position or course in two dimensions
    • G05D1/021 Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0212 Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory
    • G05D1/0214 Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory in accordance with safety or protection criteria, e.g. avoiding hazardous areas
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02 Control of position or course in two dimensions
    • G05D1/021 Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0231 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D1/0238 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using obstacle or wall sensors
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02 Control of position or course in two dimensions
    • G05D1/021 Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0231 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D1/0246 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means
    • G05D1/0251 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means extracting 3D information from a plurality of images taken from different locations, e.g. stereo vision
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02 Control of position or course in two dimensions
    • G05D1/021 Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0268 Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means
    • G05D1/027 Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means comprising inertial navigation means, e.g. azimuth detector

Definitions

  • The invention relates to the field of robots, and in particular to a method for repositioning a robot.
  • During cleaning, a sweeping robot accumulates walking errors caused by defects of devices such as the gyroscope or the code wheel, or by slipping on the floor, and these errors build up gradually. Errors accumulated over a long period of time cause the map constructed by the robot to contain large deviations as well. Among existing remedies, the more effective ones add a camera for visual positioning or use a laser radar for ranging-based positioning, but these approaches require high hardware cost and are not suitable for the widespread application of sweeping robots.
  • The invention provides a method for repositioning a robot which, without using expensive devices such as a camera or a laser radar, can re-determine the position of the robot, avoiding the problem of inaccurate positioning caused by excessive accumulation of walking errors and improving the accuracy of robot positioning. The specific technical solution of the present invention is as follows:
  • A method for repositioning a robot, comprising the following steps. Step S1: the robot detects an obstacle and proceeds to step S2. Step S2: the robot walks along the edge of the obstacle and determines whether the path walked along the edge satisfies the condition for determining that the obstacle is an isolated object; if not, proceed to step S3; if yes, proceed to step S4. Step S3: the robot adjusts its walking angle, leaves the obstacle and continues walking, and when an obstacle is detected again, returns to step S2. Step S4: the robot determines that the obstacle is an isolated object, records the edge path walked along the edge of the isolated object, and determines whether the currently recorded edge path is similar to a previously stored edge path; if not, proceed to step S5; if yes, proceed to step S6.
  • Step S5: the recorded edge path walked along the edge of the isolated object is taken as a stored edge path, and the process returns to step S3.
  • Step S6: the previously stored edge path that is similar to the currently recorded edge path is used as a reference positioning path; the first partial map where the current edge path is located and the second partial map where the reference positioning path is located are determined; the first partial map and the second partial map, which have the same shape and size, are overlapped; the grid cells of the current edge path in the first partial map that overlap with the reference positioning path in the second partial map are taken as positioning cells; and the grid coordinates detected while the robot is located in a positioning cell are replaced with the grid coordinates of the corresponding grid cell in the reference positioning path.
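  • Viewed as a whole, steps S1 to S6 form a simple control loop: follow an edge, decide whether the obstacle is isolated, compare the recorded edge path against stored paths, and either store it or use the matching path to correct the robot's coordinates. The following is a minimal, hypothetical Python sketch of that decision structure; the callables is_isolated and find_similar stand in for the tests detailed in the later embodiments and are assumptions, not part of the patent text.

```python
from typing import Callable, List, Optional, Sequence, Tuple

Cell = Tuple[int, int]        # grid coordinates (x, y) of one grid cell
EdgePath = List[Cell]         # ordered grid cells recorded while walking along an edge

def process_edge_path(
    current_path: EdgePath,
    stored_paths: List[EdgePath],
    is_isolated: Callable[[EdgePath], bool],
    find_similar: Callable[[EdgePath, Sequence[EdgePath]], Optional[EdgePath]],
) -> Optional[List[Tuple[Cell, Cell]]]:
    """One edge-following episode, covering steps S2 and S4-S6.

    Returns pairs (drifted_cell, reference_cell) for the positioning cells,
    or None when the robot should simply leave the obstacle (step S3/S5).
    """
    if not is_isolated(current_path):              # step S2 failed -> step S3
        return None
    reference = find_similar(current_path, stored_paths)   # step S4
    if reference is None:
        stored_paths.append(current_path)          # step S5: keep it for next time
        return None                                # -> step S3
    # Step S6: overlap equally sized partial maps around both paths. Aligning each
    # path to the origin of its own partial map compares their shapes; cells that
    # coincide in this local frame are the positioning cells.
    def local(path: EdgePath) -> dict:
        x0 = min(x for x, _ in path)
        y0 = min(y for _, y in path)
        return {(x - x0, y - y0): (x, y) for x, y in path}
    cur, ref = local(current_path), local(reference)
    return [(cur[k], ref[k]) for k in cur.keys() & ref.keys()]

# Example: a drifted square path matched against a stored one.
if __name__ == "__main__":
    stored: List[EdgePath] = [[(10, 10), (11, 10), (11, 11), (10, 11)]]
    drifted: EdgePath = [(12, 13), (13, 13), (13, 14), (12, 14)]
    pairs = process_edge_path(drifted, stored, lambda p: True, lambda p, s: s[0])
    print(pairs)   # each drifted cell paired with its reference-path counterpart
```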
  • Further, after step S1 and before step S2, the method includes the following steps. Step S11: determining the grid coordinates corresponding to the position where the robot detects the obstacle. Step S12: determining the edge paths that the robot has stored within a preset time counted back from the current time. Step S13: determining whether the grid coordinates determined in step S11 are the same as, or adjacent to, the grid coordinates corresponding to the edge paths determined in step S12; if yes, proceed to step S14; if no, proceed to step S2. Step S14: the robot adjusts its walking angle, leaves the obstacle and continues walking, and when an obstacle is detected again, returns to step S11.
  • "Adjacent" in step S13 means that the grid cells corresponding to the two grid coordinates share a common edge or a common corner.
  • Further, step S2, walking along the edge of the obstacle and determining whether the path walked along the edge satisfies the condition for determining that the obstacle is an isolated object, includes the following steps:
  • Step S21: walking along the edge of the obstacle and recording the starting information of the starting position point;
  • Step S22: determining whether the amount of angle change detected by the robot since the starting position point reaches 360°; if yes, proceed to step S23; otherwise continue walking along the edge of the obstacle until the amount of angle change detected since the starting position point reaches 360°, then proceed to step S23;
  • Step S23: determining whether the robot has returned to the starting position point described in step S21; if yes, it is determined that the path the robot walks along the edge of the obstacle satisfies the condition for determining that the obstacle is an isolated object; otherwise proceed to step S24;
  • Step S24: continuing to walk along the edge of the obstacle, determining whether the robot returns to the starting position point described in step S21, and determining whether the amount of angle change detected since the starting position point reaches 450°; if the robot returns to the starting position point and the angle change does not exceed 450°, it is determined that the path the robot walks along the edge of the obstacle satisfies the condition for determining that the obstacle is an isolated object; if the robot returns to the starting position point but the angle change exceeds 450°, or the robot does not return to the starting position point and the angle change exceeds 450°, it is determined that the path the robot walks along the edge of the obstacle does not satisfy the condition that the obstacle is an isolated object.
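  • A minimal sketch of the isolated-object test of steps S21 to S24 follows, assuming the robot supplies a stream of (position, accumulated-angle) samples while following the edge. The sample data and the tolerance used to decide that the robot is back at the start are illustrative assumptions only.

```python
import math

def is_isolated_object(samples, start_tolerance=0.1):
    """Steps S21-S24: decide whether the followed obstacle is an isolated object.

    `samples` is an iterable of (x, y, accumulated_angle_deg) taken while the
    robot walks along the edge; the first sample is the starting position point.
    Returns True if the robot gets back to the start with a total angle change
    of at least 360 degrees but not more than 450 degrees, False otherwise.
    """
    samples = list(samples)
    if not samples:
        return False
    x0, y0, a0 = samples[0]                     # step S21: starting position point
    for x, y, angle in samples[1:]:
        turned = abs(angle - a0)
        back_at_start = math.hypot(x - x0, y - y0) <= start_tolerance
        if turned >= 360 and back_at_start:     # steps S22/S23: full circle, closed loop
            return True
        if turned > 450:                        # step S24: too much turning -> not isolated
            return False
    return False                                # edge following ended without closing the loop

# A 2 m x 2 m box followed clockwise closes after 360 degrees of turning.
box = [(0, 0, 0), (2, 0, 90), (2, 2, 180), (0, 2, 270), (0, 0, 360)]
print(is_isolated_object(box))   # True
```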
  • Further, after step S21 and before step S22, the method includes the following steps. Step S211: detecting the distance and the amount of angle change of the robot walking along the edge of the obstacle from the starting position point. Step S212: determining whether the distance the robot has walked from the starting position point reaches 1.5 meters; if yes, proceed to step S213; if no, the robot continues walking until the distance reaches 1.5 meters, then proceeds to step S213.
  • Step S213: determining whether the amount of angle change of the robot since the starting position point reaches 90°; if not, the robot adjusts its walking angle, leaves the obstacle and continues walking, and when an obstacle is detected again, returns to step S11; if yes, proceed to step S214.
  • Step S214: the robot continues walking along the edge of the obstacle, and it is determined whether the distance walked from the starting position point reaches 3 meters; if yes, proceed to step S215; if no, the robot continues walking until the distance reaches 3 meters, then proceeds to step S215.
  • Step S215: determining whether the amount of angle change of the robot since the starting position point reaches 180°; if not, the robot adjusts its walking angle, leaves the obstacle and continues walking, and when an obstacle is detected again, returns to step S11; if yes, proceed to step S216. Step S216: the robot continues walking along the edge of the obstacle, and it is determined whether the distance walked from the starting position point reaches 4.5 meters; if yes, proceed to step S217; if no, the robot continues walking until the distance reaches 4.5 meters, then proceeds to step S217.
  • Step S217: determining whether the amount of angle change of the robot since the starting position point reaches 270°; if not, the robot adjusts its walking angle, leaves the obstacle and continues walking, and when an obstacle is detected again, returns to step S11; if yes, proceed to step S22.
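  • The checkpoints of steps S211 to S217 (90° of turning by 1.5 m, 180° by 3 m, 270° by 4.5 m) can be expressed as a simple gate evaluated while the robot follows the edge. Below is a possible sketch, assuming the caller provides the accumulated distance and angle change; the same structure applies to the time-based variant described next, with minutes in place of metres.

```python
# Angle (degrees) that must already have been reached once the robot has
# covered the given distance (metres) along the obstacle's edge.
DISTANCE_CHECKPOINTS = [(1.5, 90.0), (3.0, 180.0), (4.5, 270.0)]

def edge_checkpoints_ok(distance_m: float, angle_deg: float) -> bool:
    """Steps S212-S217: reject obstacles whose edge stretches on without turning.

    Returns False as soon as a checkpoint distance has been covered while the
    accumulated angle change is still below the required value (the obstacle's
    footprint is then presumed too large to serve as a repositioning reference).
    """
    for required_distance, required_angle in DISTANCE_CHECKPOINTS:
        if distance_m >= required_distance and angle_deg < required_angle:
            return False
    return True

print(edge_checkpoints_ok(1.6, 95.0))   # True: turned 95 degrees within 1.6 m
print(edge_checkpoints_ok(3.2, 120.0))  # False: only 120 degrees after 3.2 m
```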
  • Further, in another embodiment, after step S21 and before step S22, the method includes the following steps. Step S211: detecting the time and the amount of angle change of the robot walking along the edge of the obstacle from the starting position point. Step S212: determining whether the time the robot has been walking since the starting position point reaches 1 minute; if yes, proceed to step S213; if not, the robot continues walking until the walking time reaches 1 minute, then proceeds to step S213.
  • Step S213: determining whether the amount of angle change of the robot since the starting position point reaches 90°; if not, the robot adjusts its walking angle, leaves the obstacle and continues walking, and when an obstacle is detected again, returns to step S11; if yes, proceed to step S214.
  • Step S214: the robot continues walking along the edge of the obstacle and determines whether the time since the starting position point reaches 2 minutes; if yes, proceed to step S215; if not, the robot continues walking until the walking time reaches 2 minutes, then proceeds to step S215. Step S215: determining whether the amount of angle change of the robot since the starting position point reaches 180°; if not, the robot adjusts its walking angle, leaves the obstacle and continues walking, and when an obstacle is detected again, returns to step S11; if yes, proceed to step S216. Step S216: the robot continues walking along the edge of the obstacle and determines whether the time since the starting position point reaches 3 minutes; if yes, proceed to step S217; if not, the robot continues walking until the walking time reaches 3 minutes, then proceeds to step S217.
  • Step S217: determining whether the amount of angle change of the robot since the starting position point reaches 270°; if not, the robot adjusts its walking angle, leaves the obstacle and continues walking, and when an obstacle is detected again, returns to step S11; if yes, proceed to step S218. Step S218: the robot continues walking along the edge of the obstacle and determines whether the time since the starting position point reaches 4 minutes; if yes, proceed to step S22; if not, the robot continues walking until the walking time reaches 4 minutes, then proceeds to step S22.
  • Further, before determining that the path the robot walks along the edge of the obstacle satisfies the condition for determining that the obstacle is an isolated object, the method further includes the following step: determining whether the area enclosed by the robot walking along the obstacle is greater than 0.3 square meters; if yes, the step of determining that the path the robot walks along the edge of the obstacle satisfies the condition for determining that the obstacle is an isolated object is entered; if not, the robot adjusts its walking angle, leaves the obstacle and continues walking, and when an obstacle is detected again, returns to step S11.
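  • A small sketch of the 0.3 m² footprint check, assuming the enclosed region is approximated by the set of grid cells surrounded by the edge path and that each grid cell is the 20 cm square mentioned later in the description.

```python
CELL_SIDE_M = 0.2            # 20 cm grid cells (see the description of the grid map)
MIN_AREA_M2 = 0.3            # minimum footprint for a usable isolated object

def enclosed_area_m2(enclosed_cells) -> float:
    """Area of the region circled by the robot, from the number of enclosed grid cells."""
    return len(set(enclosed_cells)) * CELL_SIDE_M * CELL_SIDE_M

def footprint_large_enough(enclosed_cells) -> bool:
    return enclosed_area_m2(enclosed_cells) > MIN_AREA_M2

# An 8-cell-by-1-cell region covers 0.32 m2, just above the threshold.
region = [(x, 0) for x in range(8)]
print(round(enclosed_area_m2(region), 2), footprint_large_enough(region))   # 0.32 True
```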
  • Further, determining whether the currently recorded edge path is similar to a previously stored edge path includes the following steps. Step S41: recording the grid coordinates of the grid cells corresponding to the current edge path, recording the grid area of the region enclosed by the current edge path, and recording the grid coordinates of the central grid cell of the region enclosed by the current edge path.
  • Step S42: determining whether the coordinate difference between the grid coordinates of the central grid cell enclosed by the current edge path and the grid coordinates of the central grid cell enclosed by the stored edge path is greater than a first preset coordinate difference; if yes, it is determined that the currently recorded edge path is not similar to the previously stored edge path; if not, proceed to step S43.
  • Step S43: determining whether the difference between the current grid area and the grid area of the region corresponding to the stored edge path is greater than a preset area difference; if yes, it is determined that the currently recorded edge path is not similar to the previously stored edge path; if not, proceed to step S44.
  • Step S44: overlapping the first partial map where the current edge path is located with the second partial map where the stored edge path is located, and determining whether the ratio of the number of grid cells where the current edge path and the stored edge path overlap to the number of grid cells in the stored edge path is greater than a preset ratio value; if yes, it is determined that the currently recorded edge path is similar to the previously stored edge path; if not, proceed to step S45.
  • Step S45: shifting the current edge path relative to the stored edge path by the distance of N grid cells in the up, down, left and right directions respectively, and determining whether the ratio of the number of grid cells where the current edge path and the stored edge path overlap to the number of grid cells in the stored edge path is greater than the preset ratio value. N is a natural number and 1 ≤ N ≤ 3.
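  • The following sketch puts steps S42 to S45 together: compare the central grid cells, compare the enclosed areas, and only then check the overlap ratio of the two paths, retrying the overlap with the current path shifted by up to N cells in the four directions. The thresholds ((3, 3), three cells of area, 90%, N = 3) are the example values given in the text; the centres and enclosed areas are passed in precomputed because step S41 records them together with the edge path.

```python
from typing import Set, Tuple

Cell = Tuple[int, int]

def overlap_ratio(current: Set[Cell], stored: Set[Cell], dx: int = 0, dy: int = 0) -> float:
    """Fraction of the stored path's cells also covered by the (shifted) current path."""
    shifted = {(x + dx, y + dy) for x, y in current}
    return len(shifted & stored) / len(stored)

def paths_similar(current: Set[Cell], stored: Set[Cell],
                  cur_center: Cell, sto_center: Cell,
                  cur_area_cells: int, sto_area_cells: int,
                  coord_diff: Cell = (3, 3), area_diff_cells: int = 3,
                  min_ratio: float = 0.9, n: int = 3) -> bool:
    """Steps S42-S45: centre check, area check, then overlap ratio with shifts."""
    if (abs(cur_center[0] - sto_center[0]) > coord_diff[0] or
            abs(cur_center[1] - sto_center[1]) > coord_diff[1]):
        return False                                    # step S42: centres too far apart
    if abs(cur_area_cells - sto_area_cells) > area_diff_cells:
        return False                                    # step S43: enclosed areas differ
    if overlap_ratio(current, stored) > min_ratio:
        return True                                     # step S44: direct overlap
    shifts = [(d, 0) for d in range(-n, n + 1)] + [(0, d) for d in range(-n, n + 1)]
    return any(overlap_ratio(current, stored, dx, dy) > min_ratio
               for dx, dy in shifts)                    # step S45: overlap after shifting

square = {(0, 0), (1, 0), (1, 1), (0, 1)}
print(paths_similar(square, square, (0, 0), (0, 0), 1, 1))   # True: identical paths
```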
  • Further, determining in step S44 or step S45 whether the ratio of overlapping grid cells is greater than the preset ratio value includes the following steps: based on the first partial map where the current edge path is located and the second partial map where the stored edge path is located, marking the grid cells corresponding to the current edge path and the grid cells corresponding to the stored edge path as 1 and marking the other grid cells as 0; performing an AND operation on the corresponding grid cells in the first partial map and the second partial map; and determining, after the AND operation, whether the ratio of the number of grid cells whose result is 1 to the number of grid cells corresponding to the stored edge path is greater than the preset ratio value.
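  • The AND-marking described above maps directly onto boolean raster operations. A minimal sketch using NumPy, assuming the two partial maps have already been cropped to the same shape; the 90% threshold is the example value given later in the description.

```python
import numpy as np

def overlap_ratio_by_and(first_map: np.ndarray, second_map: np.ndarray) -> float:
    """Mark path cells as 1, everything else as 0, AND the two maps, and divide
    the number of surviving 1-cells by the number of cells in the stored path."""
    overlap = np.logical_and(first_map == 1, second_map == 1)
    return overlap.sum() / (second_map == 1).sum()

# Two 5x5 partial maps: the current path (first) reproduces the stored path
# (second) except for one cell, giving an overlap ratio of 3/4.
first = np.zeros((5, 5), dtype=int)
second = np.zeros((5, 5), dtype=int)
second[1, 1:5] = 1          # stored edge path: four cells
first[1, 1:4] = 1           # current edge path overlaps three of them
print(overlap_ratio_by_and(first, second))   # 0.75
```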
  • Further, replacing the grid coordinates detected while the robot is located in a positioning cell with the grid coordinates of the corresponding grid cell in the reference positioning path includes the following steps. Step S61: determining whether there are M consecutively connected grid cells among the positioning cells such that the differences between the grid coordinates of the M consecutive grid cells currently recorded by the robot and the grid coordinates of the corresponding grid cells in the reference positioning path are all smaller than a second preset coordinate difference; if yes, proceed to step S62; if not, return to step S3. Step S62: the robot walks to any one of the M consecutively connected grid cells and proceeds to step S63. Step S63: replacing the currently detected grid coordinates with the grid coordinates of the corresponding grid cell in the reference positioning path. M is a natural number and 2 ≤ M ≤ 3.
  • Further, the robot walking in step S62 to any one of the M consecutively connected grid cells includes the following steps. Step S621: determining whether there is only one group of M consecutively connected grid cells; if yes, the robot walks directly to any one of the M consecutively connected grid cells and proceeds to step S63; if no, proceed to step S622. Step S622: determining the group of M consecutively connected grid cells with the earliest recording time; the robot walks to the grid cell in that group whose recording time is the earliest and proceeds to step S63.
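  • A sketch of steps S61, S62 and S621/S622: look for runs of M consecutive positioning cells whose recorded coordinates stay within the second preset coordinate difference of the reference coordinates, and, if several runs qualify, prefer the one recorded earliest. The record structure (tuples of recorded cell, reference cell and timestamp) is an assumption made for illustration.

```python
from typing import List, Optional, Tuple

Cell = Tuple[int, int]
Record = Tuple[Cell, Cell, float]   # (recorded cell, reference cell, time recorded)

def find_correction_run(records: List[Record], m: int = 3,
                        max_diff: Tuple[int, int] = (2, 2)) -> Optional[List[Record]]:
    """Step S61/S62: earliest run of m consecutive records that are all close
    to their reference cells; returns None when no such run exists (-> step S3)."""
    def close(rec: Record) -> bool:
        (rx, ry), (fx, fy), _ = rec
        return abs(rx - fx) < max_diff[0] and abs(ry - fy) < max_diff[1]

    runs = []
    for start in range(len(records) - m + 1):
        window = records[start:start + m]
        if all(close(r) for r in window):
            runs.append(window)
    if not runs:
        return None
    # Step S622: when several runs qualify, the one recorded earliest is preferred,
    # because less walking error has accumulated in earlier records.
    return min(runs, key=lambda w: w[0][2])

records = [((5, 5), (6, 6), 0.0), ((6, 5), (7, 6), 1.0),
           ((7, 5), (8, 6), 2.0), ((9, 9), (20, 20), 3.0)]
run = find_correction_run(records)
if run is not None:
    recorded, reference, _ = run[0]
    print(f"walk to {recorded}, then report position {reference}")   # step S63
```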
  • Further, after step S6, the method includes the following steps. Step S71: determining that the coordinate value detected while the robot in step S6 is located at the center point of the positioning cell is (x1, y1). Step S72: determining that the coordinate value of the center point of the corresponding grid cell in the reference positioning path described in step S6 is (x2, y2). Step S73: replacing the coordinate value (x1, y1) currently detected by the robot with (x2, y2).
  • The beneficial effects of the present invention include: by using the path the robot walks along the edge of an isolated object as a reference, the position deviation caused by excessive accumulation of walking errors can be corrected and repositioning achieved, thereby improving the positioning accuracy and walking efficiency of the robot during subsequent navigation.
  • FIG. 1 is a schematic flow chart of a method for repositioning a robot according to the present invention.
  • FIG. 2 is a first schematic diagram of the first partial map and the second partial map after they are overlapped.
  • FIG. 3 is a second schematic diagram of the first partial map and the second partial map after they are overlapped.
  • The robot according to the present invention is a smart cleaning robot (such as a sweeping robot or a mopping robot), and the robots mentioned in the following embodiments all refer to smart cleaning robots. These robots are able to walk autonomously in certain environments by virtue of a degree of artificial intelligence.
  • The body of the robot is equipped with various sensors that detect the walking distance, the walking angle (i.e., the direction of travel), the body state, obstacles, and so on.
  • The robot of the present invention comprises the following structure: a robot body capable of walking autonomously with a left driving wheel and a right driving wheel, a human-computer interaction interface arranged on the body, and an obstacle detecting unit arranged on the body.
  • An inertial sensor is disposed inside the body, and the inertial sensor includes an accelerometer and a gyroscope.
  • The two driving wheels are provided with odometers (generally code wheels) for detecting the walking distance of the driving wheels, and the robot further comprises a control module capable of processing the parameters of the related sensors and outputting control signals to the actuators.
  • A method for repositioning a robot, as shown in FIG. 1, includes the following steps. In step S1, the robot detects walking data while walking, and enters step S2 when a collision sensor or an infrared sensor at the front end of the robot detects an obstacle.
  • After proceeding to step S2, the robot walks along the edge of the obstacle and determines whether the path walked along the edge satisfies the condition for determining that the obstacle is an isolated object. The specific condition can be set according to different design requirements: for example, it may be judged from the relationship between the starting point at which the robot begins walking along the edge of the obstacle and the ending point of the walk along the edge, from the amount of change in the rotation angle within a certain period of time, or from the relationship with grid positions; these factors may also be combined for a comprehensive judgment, and so on.
  • If the condition is not satisfied, the process proceeds to step S3: the robot adjusts its walking angle, leaves the obstacle, and then continues to walk according to the path or manner planned by the system. When the collision sensor or the infrared sensor at the front end of the robot detects an obstacle again, the process returns to step S2 to continue determining whether that obstacle is an isolated object.
  • When the robot determines in step S2 that the path walked along the edge of the obstacle satisfies the condition that the obstacle is an isolated object, the process proceeds to step S4: the robot determines that the obstacle is an isolated object and records the edge path along the edge of the isolated object. The recorded information can be set according to specific design requirements, for example recording the grid coordinates corresponding to the edge path, recording the angle detected by the gyroscope while the robot walks on the grid cells corresponding to the edge path, and recording the time at which the robot starts the edge path and the time at which it ends the edge path, and so on.
  • Next, the robot judges whether the currently recorded edge path is similar to a previously stored edge path, and the manner of judging can also be set according to different design requirements. For example, the grid coordinates of the grid cells corresponding to the currently recorded edge path can be compared with the grid coordinates of the grid cells corresponding to a previously stored edge path, and if only a few grid cells differ, the two edge paths can be considered similar. The overall arrangement orientation of the grid cells corresponding to the current edge path can also be compared with that of a previously stored edge path, and if only a few points have different orientations, the two edge paths can be considered similar. It is also possible to judge by the angle change rate per unit time, the length of the edge path, and the total time of walking along the edge; if these parameters are the same, the two edge paths can be considered similar. Of course, these factors may also be combined for a comprehensive judgment.
  • If the currently recorded edge path is not similar to any previously stored edge path, this indicates that the isolated object currently detected by the robot has not been encountered before and that no recorded data of walking along its edge is available, so the isolated object cannot yet serve as a repositioning reference. The process therefore proceeds to step S5: the recorded edge path along the edge of the isolated object is stored as a stored edge path, so that the robot can be repositioned when it encounters this isolated object again later, and the process then returns to step S3. If the currently recorded edge path is judged to be similar to a previously stored edge path, this indicates that the isolated object currently detected has been encountered before and that the recorded data of the edge path along its edge is already stored, so the robot can be repositioned by this isolated object, and the process proceeds to step S6.
  • In step S6, the previously stored edge path along the edge of the isolated object is used as a reference positioning path, and the first partial map where the current edge path is located and the second partial map where the reference positioning path is located are then determined. That is, a partial map containing the current edge path is delineated in the grid map as the first partial map, and a partial map containing the reference positioning path is delineated as the second partial map. The size and shape of the first partial map and the second partial map are the same, and the specific size and shape can be set according to design requirements, for example setting the shape to a square or a rectangle, setting the maximum grid coordinates of the partial map to be larger than the maximum grid coordinates in the edge path by four grid cells, and setting the minimum grid coordinates of the partial map to be correspondingly smaller than the minimum grid coordinates in the edge path.
  • The first partial map and the second partial map, which have the same shape and size, are then overlapped. Since the current edge path and the reference positioning path are similar, the difference between the two lies mainly in the grid coordinates, while the overall shapes of the paths are almost the same and differ only at a few points. Therefore, by overlapping the first partial map and the second partial map, the portion where the current edge path and the reference positioning path overlap each other can be obtained. The overlapping portion indicates that the robot did not accumulate errors while walking along that part of the edge of the isolated object, so the robot can be repositioned by the grid coordinates corresponding to that part of the edge. The parts or points that do not overlap indicate that the robot produced walking errors in the corresponding edge segments or edge points, which are not suitable as references for repositioning.
  • FIG. 2 is a schematic diagram of the first partial map and the second partial map after overlapping. Each small square in the figure represents a grid cell; the path formed by connecting the squares marked with the letter P is the current edge path, the path marked with the letter B is the reference positioning path, and a square marked with both the letter P and the letter B indicates a portion where the current edge path overlaps with the reference positioning path.
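  • The partial maps can be delineated as fixed-size windows around each path. Below is one possible sketch, assuming a margin of a few grid cells around the path's bounding box in line with the example in the text; the window is clipped to a common size so the two maps can be overlaid cell by cell, as in FIG. 2.

```python
import numpy as np

def partial_map(path_cells, margin=4, size=None):
    """Binary window around an edge path: bounding box expanded by `margin` cells.

    Returns (window, (x_min, y_min)) where window[x - x_min, y - y_min] == 1 for
    every cell of the path. Passing the same `size` for two paths yields equally
    sized maps that can be overlaid directly.
    """
    xs = [x for x, _ in path_cells]
    ys = [y for _, y in path_cells]
    x_min, y_min = min(xs) - margin, min(ys) - margin
    if size is None:
        size = (max(xs) - x_min + margin + 1, max(ys) - y_min + margin + 1)
    window = np.zeros(size, dtype=int)
    for x, y in path_cells:
        window[x - x_min, y - y_min] = 1
    return window, (x_min, y_min)

current = [(12, 13), (13, 13), (13, 14), (12, 14)]
stored = [(10, 10), (11, 10), (11, 11), (10, 11)]
first, _ = partial_map(current, size=(10, 10))
second, _ = partial_map(stored, size=(10, 10))
overlap = np.logical_and(first, second)       # cells matching in the local frames
print(int(overlap.sum()))                     # 4: the two squares have the same shape
```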
  • As one of the embodiments, after step S1 and before step S2, the method further includes the following steps. In step S11, the grid cell in which the robot is located when it detects the obstacle, and the grid coordinates corresponding to that grid cell, are first determined from the data detected by the gyroscope and the odometer of the robot. The process then proceeds to step S12, in which the edge paths that the robot has stored within a preset time counted back from the current time are determined. The preset time may be set according to specific design requirements, for example to any value from 10 minutes to 30 minutes; in this embodiment it is set to 20 minutes, that is, the robot searches its memory for the edge paths stored within the 20 minutes preceding the current time.
  • In step S13, the robot compares the grid coordinates determined in step S11 with the grid coordinates corresponding to the edge paths determined in step S12. If the two grid coordinates are found to be the same or adjacent, it is considered that the current obstacle has already been encountered within the last 20 minutes and that the robot has already walked along its edge. If the robot frequently walks around the edge of the same obstacle within a short period of time, its walking efficiency is reduced; at the same time, the error accumulated by the robot within a short time interval is not very large and does not need to be corrected, and frequent repositioning would itself reduce the walking efficiency of the robot. Therefore, if it is determined in step S13 that the grid coordinates determined in step S11 are the same as or adjacent to the grid coordinates corresponding to the edge paths determined in step S12, the process proceeds to step S14: the robot adjusts its walking angle, leaves the obstacle, and continues to travel according to the path or manner planned by the system; when the collision sensor or the infrared sensor at the front end of the robot detects an obstacle again, the process returns to step S11 to determine again whether the newly detected obstacle meets the relevant requirements. If the grid coordinates are neither the same nor adjacent, the process proceeds to step S2 to determine whether the obstacle meets the condition that enables the robot to reposition.
  • "Adjacent" in step S13 means that the grid cells corresponding to the two grid coordinates share a common edge or a common corner point. The method described in this embodiment determines, based on the obstacles detected by the robot within a preset time, whether the robot needs to walk along the edge of the newly detected obstacle, thereby improving the walking efficiency of the robot and avoiding blind and repetitive actions.
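  • Below is a sketch of the pre-check of steps S11 to S13: before starting a new edge-following run, the robot looks through the edge paths stored in the last 20 minutes and skips the obstacle if the cell where it was detected is the same as, or adjacent to, a cell of one of those paths. The record layout is an assumption made for illustration.

```python
import time
from typing import Iterable, List, Tuple

Cell = Tuple[int, int]

def cells_adjacent(a: Cell, b: Cell) -> bool:
    """Same cell, or cells sharing an edge or a corner (step S13)."""
    return max(abs(a[0] - b[0]), abs(a[1] - b[1])) <= 1

def recently_followed(hit_cell: Cell,
                      stored: Iterable[Tuple[float, List[Cell]]],
                      now: float,
                      window_s: float = 20 * 60) -> bool:
    """True when the newly detected obstacle was already edge-followed recently,
    in which case the robot should simply leave it (step S14) instead of
    following its edge again."""
    for recorded_at, path in stored:
        if now - recorded_at > window_s:
            continue
        if any(cells_adjacent(hit_cell, cell) for cell in path):
            return True
    return False

now = time.time()
stored_paths = [(now - 5 * 60, [(10, 10), (11, 10), (11, 11)])]
print(recently_followed((11, 12), stored_paths, now))   # True: adjacent to (11, 11)
print(recently_followed((30, 30), stored_paths, now))   # False: never seen nearby
```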
  • As one of the embodiments, step S2, in which the robot walks along the edge of the obstacle and determines whether the path along the edge satisfies the condition for determining that the obstacle is an isolated object, includes the following steps.
  • In step S21, the robot walks along the edge of the obstacle and, based on the data detected by the gyroscope and the odometer, records the starting information of the starting position point at which it begins to walk along the edge. The starting information may include the point coordinate value of the starting position point, the grid coordinates of the grid cell corresponding to the starting position point, the walking direction, and the time at which walking started. Recording the starting information provides reference data for the subsequent determination of whether the obstacle is an isolated object, and may also provide a navigation basis for subsequently searching for the obstacle.
  • In step S22, based on the data detected by the gyroscope, it is determined whether the amount by which the body has rotated since the robot began walking along the edge of the obstacle from the starting position point has reached 360°, thereby preliminarily judging whether the robot has walked a full circle. If it is judged that it has walked a circle, the process proceeds directly to step S23 for further confirmation; if not, the robot continues to travel along the edge of the obstacle until the amount of angle change detected since the starting position point reaches 360°, and then proceeds to step S23 for further confirmation.
  • In step S23, the robot first determines whether it has returned to the starting position point described in step S21. The determination may be made by checking whether the point coordinate values are the same: if the point coordinate value of the current position point is the same as that of the starting position point, the robot is considered to have returned to the starting position point, so it can be determined that the robot has walked a full circle along the edge of the obstacle and that the path walked along the edge satisfies the condition for determining that the obstacle is an isolated object, and the process can then proceed to step S4 for the next operation. Otherwise, the process proceeds to step S24.
  • In step S24, the robot continues to walk along the edge of the obstacle while determining whether it returns to the starting position point described in step S21 and whether the amount of angle change detected since the starting position point reaches 450°. If the robot returns to the starting position point and the angle change does not exceed 450°, the obstacle is determined to be an isolated object, that is, the condition for determining that the obstacle is an isolated object is satisfied. If the robot returns to the starting position point but the angle change exceeds 450°, or the robot does not return to the starting position point and the angle change exceeds 450°, this indicates that the obstacle is not an isolated object, and it is determined that the path the robot walks along the edge of the obstacle does not satisfy the condition that the obstacle is an isolated object.
  • By combining the coordinate values and the rotation angle of the robot, the method described in this embodiment can accurately determine whether the robot has walked completely around the obstacle, and thus whether the path the robot walks along the edge of the obstacle satisfies the condition for determining that the obstacle is an isolated object, providing effective reference data for subsequent repositioning of the robot.
  • As one of the embodiments, in step S211, the distance walked and the amount of angle change of the robot since it began walking along the edge of the obstacle from the starting position point are detected based on the odometer and the gyroscope, and the process proceeds to step S212, in which it is determined whether the distance the robot has walked from the starting position point reaches 1.5 meters; if yes, the process proceeds to step S213, and if not, the robot continues walking until the distance reaches 1.5 meters and then proceeds to step S213. In step S213, it is determined whether the amount of angle change of the robot since the starting position point reaches 90°; if not, the robot adjusts its walking angle, leaves the obstacle and continues walking, and when an obstacle is detected again, the process returns to step S11; if yes, the process proceeds to step S214.
  • In step S214, the robot continues to walk along the edge of the obstacle, and it is determined whether the distance walked from the starting position point reaches 3 meters; if yes, the process proceeds to step S215, and if not, the robot continues walking until the distance reaches 3 meters. In step S215, it is determined whether the amount of angle change of the robot since the starting position point reaches 180°. If not, this indicates that although the edge line over the first 1.5 meters of the obstacle extended at an appropriate rate, the edge line from 1.5 meters to 3 meters extends over a relatively large span, so it is very likely that the footprint of the obstacle is too large and that the obstacle is not suitable as a reference object for repositioning; the robot therefore adjusts its walking angle, leaves the obstacle and continues walking, and when an obstacle is detected again, the process returns to step S11 to restart the judgment. If yes, the process proceeds to step S216.
  • In step S216, the robot continues walking along the edge of the obstacle, and it is determined whether the distance walked from the starting position point reaches 4.5 meters; if yes, the process proceeds to step S217, and if not, the robot continues walking until the distance reaches 4.5 meters and then proceeds to step S217. In step S217, it is determined whether the amount of angle change of the robot since the starting position point reaches 270°. If not, this indicates that although the edge line over the first 3 meters of the obstacle extended at an appropriate rate, the edge line from 3 meters to 4.5 meters extends over a relatively large span, so it is very likely that the footprint of the obstacle is too large and that the obstacle is not suitable as a reference object for repositioning; the robot therefore adjusts its walking angle, leaves the obstacle and continues walking, and when an obstacle is detected again, the process returns to step S11 to restart the judgment. If the robot has walked 4.5 meters and the amount of angle change reaches 270°, it can be concluded that the footprint of the obstacle is of a suitable size and that the obstacle is a good reference object for robot repositioning.
  • In this way, the size of the footprint of the obstacle can be estimated accurately, so that it can be effectively determined whether the obstacle is suitable as a reference object for robot repositioning, which in turn provides accurate reference data for subsequent repositioning of the robot.
  • In another embodiment, in step S211, the time and the amount of angle change since the robot began walking along the edge of the obstacle from the starting position point are detected based on the RTC timing module and the gyroscope of the robot, and the process proceeds to step S212. In step S212, the robot determines whether the time it has been walking since the starting position point reaches 1 minute; if yes, this indicates that the robot has traveled a certain distance along the edge of the obstacle and the process proceeds to step S213; if not, the robot continues walking until it has walked for 1 minute and then proceeds to step S213.
  • In step S213, it is determined whether the amount of angle change of the robot since the starting position point reaches 90°. If not, this indicates that the obstacle corresponding to the change of the robot's trajectory along its edge has a large footprint and is not suitable as a reference object for repositioning, so the robot adjusts its walking angle, leaves the obstacle and continues walking, and when an obstacle is detected again, the process returns to step S11 to restart the judgment. If the robot has been walking for 1 minute and the amount of angle change reaches 90°, it can be preliminarily judged that the size of the obstacle is suitable, and the process proceeds to step S214 for further judgment.
  • In step S214, the robot continues to walk along the edge of the obstacle and determines whether it has been walking for 2 minutes since the starting position point; if yes, indicating that the robot has walked a longer distance, the process proceeds to step S215; if not, the robot continues walking until it has walked for 2 minutes and then proceeds to step S215. In step S215, it is determined whether the amount of angle change of the robot since the starting position point reaches 180°. If not, this indicates that although the edge line walked during the first minute extended at an appropriate rate, the edge line walked during the second minute extends over a relatively large span, so it is very likely that the footprint of the obstacle is too large and the obstacle is not suitable as a reference object for repositioning; the robot then adjusts its walking angle, leaves the obstacle and continues walking, and when an obstacle is detected again, the process returns to step S11. If yes, the process proceeds to step S216.
  • In step S216, the robot continues to walk along the edge of the obstacle and determines whether it has been walking for 3 minutes since the starting position point; if yes, the process proceeds to step S217; if not, the robot continues walking until it has walked for 3 minutes and then proceeds to step S217. In step S217, it is determined whether the amount of angle change of the robot since the starting position point reaches 270°. If not, this indicates that although the edge line walked during the first and second minutes extended at an appropriate rate, the edge line walked during the third minute extends over a relatively large span, so it is very likely that the footprint of the obstacle is too large and the obstacle is not suitable as a reference object for repositioning; the robot then adjusts its walking angle, leaves the obstacle and continues walking, and when an obstacle is detected again, the process returns to step S11 to restart the judgment. If the robot has walked for 3 minutes and the amount of angle change reaches 270°, indicating that the rate at which the edge line extended during the first 3 minutes of walking along the edge of the obstacle is reasonable, the process proceeds to step S218: the robot continues to walk along the edge of the obstacle and determines whether it has been walking for 4 minutes since the starting position point; if yes, the process proceeds to step S22, in which it is determined whether the amount of angle change detected since the starting position point reaches 360°, from which it can be judged whether the robot has walked 360° around the edge of the obstacle and completed a full circle; if not, the robot continues walking until it has walked for 4 minutes and then proceeds to step S22.
  • In this way, the size of the footprint of the obstacle can be estimated from the walking time and the amount of angle change, so that it can be effectively determined whether the obstacle is suitable as a reference object for robot repositioning, which in turn provides accurate reference data for subsequent repositioning of the robot.
  • As one of the embodiments, before the robot determines in step S23 that it has returned to the starting position point described in step S21 and concludes that the path walked along the edge of the obstacle satisfies the condition for determining that the obstacle is an isolated object, or before the robot, as described in step S24, returns to the starting position point with an angle change that does not reach 450° and concludes that the path walked along the edge of the obstacle satisfies the condition for determining that the obstacle is an isolated object, the method further includes the following step: determining whether the area enclosed by the robot walking along the obstacle is greater than 0.3 square meters. If yes, this indicates that the size of the footprint of the obstacle is suitable and that the obstacle may be considered as a reference object for robot repositioning, and the step of concluding that the path the robot walks along the edge of the obstacle satisfies the condition for determining that the obstacle is an isolated object is entered; if not, the robot adjusts its walking angle, leaves the obstacle and continues walking, and when an obstacle is detected again, the process returns to step S11.
  • By using a footprint of more than 0.3 square meters as a limiting condition, the method described in this embodiment can select an ideal isolated object as a reference for the robot to reposition subsequently or to store new positioning parameters.
  • As one of the embodiments, in step S41, based on the data detected by the gyroscope and the odometer of the robot, the grid coordinates of the grid cells corresponding to the current edge path are recorded, the grid area of the region enclosed by the current edge path is recorded, and the grid coordinates of the central grid cell of the region enclosed by the current edge path are recorded. The grid coordinates can be converted from the coordinate values of the position points, and the grid area can be calculated from the number of grid cells. The grid coordinates of the central grid cell can be calculated from the topmost, bottommost, leftmost and rightmost grid coordinates of the region; the "center" is not restricted to the exact center of a regular shape, and grid coordinates calculated by the same method for an irregular region can likewise be used as the grid coordinates of the central grid cell of the region.
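  • A small sketch of the bookkeeping behind step S41, assuming the 20 cm grid cells described at the end of the document: a position point in metres is converted to the grid coordinates of its cell, and the central grid cell is taken from the extreme coordinates of the region, which also works for irregular regions.

```python
CELL_SIDE_M = 0.2   # 20 cm grid cells, as described for the grid map

def world_to_cell(x_m: float, y_m: float):
    """Grid coordinates of the cell containing a position point given in metres."""
    return int(x_m // CELL_SIDE_M), int(y_m // CELL_SIDE_M)

def center_cell(region_cells):
    """Central grid cell from the topmost, bottommost, leftmost and rightmost coordinates."""
    xs = [x for x, _ in region_cells]
    ys = [y for _, y in region_cells]
    return (min(xs) + max(xs)) // 2, (min(ys) + max(ys)) // 2

print(world_to_cell(1.23, 0.41))                       # (6, 2)
print(center_cell([(0, 0), (4, 1), (2, 5), (1, 2)]))   # (2, 2): works for irregular shapes
```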
  • In step S42, it is determined whether the coordinate difference between the grid coordinates of the central grid cell of the region enclosed by the current edge path and the grid coordinates of the central grid cell of the region enclosed by the stored edge path is greater than a first preset coordinate difference. The first preset coordinate difference may be set according to specific design requirements, for example to (2, 2) or (3, 3); that is, it is checked whether the difference of the X values of the compared grid coordinates is greater than 2 (or 3) and whether the difference of the Y values is greater than 2 (or 3); if yes, the difference is determined to be greater than the first preset coordinate difference, otherwise it is determined not to be greater. If the coordinate difference between the two central grid cells is greater than the first preset coordinate difference, this indicates that the extreme top, bottom, left and right grid coordinates of the two edge paths differ considerably, that is, the shapes of the two edge paths differ greatly, so it can be determined that the currently recorded edge path is not similar to the previously stored edge path. If not, the process proceeds to step S43.
  • In step S43, it is determined whether the difference between the current grid area and the grid area of the region corresponding to the stored edge path is greater than a preset area difference. The preset area difference may be set according to specific design requirements, for example to the area of 1 to 5 grid cells; because of the influence of walking errors, if the value is set too small it is difficult to find a matching object, and in this embodiment it is set to the area of three grid cells, which gives a good matching effect. The grid area can be obtained from the grid coordinate values of the grid cells corresponding to the edge path by summing the number of grid cells in each row (or in each column) and multiplying by the area of a single grid cell. If the difference between the current grid area and the grid area of the region corresponding to the stored edge path is greater than the preset area difference, this indicates that the shapes of the two regions differ considerably and hence that the shapes of the two edge paths also differ greatly, so it can be determined that the currently recorded edge path is not similar to the previously stored edge path. If the difference is less than or equal to the preset area difference, this does not necessarily mean that the two edge paths are similar, and the process needs to proceed to step S44 for further judgment.
  • In step S44, based on the first partial map where the current edge path is located and the second partial map where the stored edge path is located, the first partial map and the second partial map, which have the same shape and size, are overlapped, and it is determined whether the ratio of the number of grid cells where the current edge path and the stored edge path overlap to the number of grid cells in the stored edge path is greater than a preset ratio value. The preset ratio value can be set according to specific design requirements, and in this embodiment it is set to 90%.
  • FIG. 3 is a schematic diagram of the first partial map and the second partial map after overlapping, in which the selected first partial map and second partial map each span 15 grid cells in the grid map. Each small square in the figure represents a grid cell; the path marked with the letter H is the current edge path, the path marked with the letter Q is the previously stored edge path, and a square marked with both the letter H and the letter Q indicates a portion where the current edge path and the stored edge path overlap. In this example, the calculated ratio of the number of overlapping grid cells to the number of grid cells in the stored edge path is 90.6%, which is greater than the preset ratio of 90%. If the ratio is not greater than the preset ratio value, step S45 is required for further determination.
  • In step S45, the current edge path is shifted relative to the stored edge path by the distance of N grid cells in the up, down, left and right directions respectively, and after each shift it is again determined whether the ratio of the number of grid cells where the current edge path and the stored edge path overlap to the number of grid cells in the stored edge path is greater than the preset ratio value.
  • FIG. 4 is another schematic diagram of the first partial map and the second partial map after overlapping. Each small square in the figure represents a grid cell; the path formed by the squares marked with the letter H is the current edge path, the path marked with the letter Q is the previously stored edge path, and a square marked with both the letter H and the letter Q indicates a portion where the current edge path and the stored edge path overlap.
  • The method described in this embodiment determines whether the currently recorded edge path and the previously stored edge path are similar by overlapping the grid cells, so that a relatively accurate judgment result can be obtained, which is beneficial to improving the accuracy of subsequent repositioning of the robot.
  • As one of the embodiments, determining whether the ratio of the number of overlapping grid cells to the number of grid cells in the stored edge path is greater than the preset ratio value includes the following steps: based on the first partial map where the current edge path is located and the second partial map where the stored edge path is located, marking the grid cells corresponding to the current edge path and the grid cells corresponding to the stored edge path as 1 and marking the other grid cells as 0; performing an AND operation on the corresponding grid cells in the first partial map and the second partial map, that is, 1 AND 1 gives 1, 1 AND 0 gives 0, and 0 AND 0 also gives 0; and then determining whether the ratio of the number of grid cells whose result is 1 to the number of grid cells corresponding to the stored edge path is greater than the preset ratio value.
  • By binarizing the grid cells and then analyzing them by calculation, the number of grid cells where the two edge paths overlap can be obtained quickly and accurately, so that it can be quickly and accurately determined whether the ratio of the number of overlapping grid cells to the number of grid cells in the stored edge path is greater than the preset ratio value, providing accurate reference data for subsequent robot positioning.
  • The preset ratio value and the related calculation manner are the same as those in the foregoing embodiment, and are not described again here.
  • As one of the embodiments, replacing the grid coordinates detected while the robot is located in a positioning cell with the grid coordinates of the corresponding grid cell in the reference positioning path includes the following steps. In step S61, it is determined whether there are M consecutively connected grid cells among the positioning cells such that the differences between the grid coordinates of the M consecutive grid cells currently recorded by the robot and the grid coordinates of the corresponding grid cells in the reference positioning path are all smaller than a second preset coordinate difference.
  • The value of M may be set according to specific design requirements, where M is a natural number and 2 ≤ M ≤ 3; in this embodiment it is set to 3. The second preset coordinate difference may also be set according to specific design requirements, and in this embodiment it is set to (2, 2). If the differences between the grid coordinate values of the three consecutively detected grid cells and the grid coordinates of the corresponding grid cells in the reference positioning path are all smaller than the second preset coordinate difference (2, 2), that is, the difference between the X value of each recorded grid coordinate and the X value of the corresponding grid coordinate in the reference positioning path is less than 2 and the difference between the Y values is also less than 2, this indicates that the position difference between the two paths being compared is not very large and that they belong to an object at the same location, so the process may proceed to step S62 for subsequent positioning correction. Otherwise, it indicates that the position difference between the two paths being compared is large, that they may not belong to an object at the same location, and that the object cannot be used as a positioning reference.
  • In step S62, since the error of the above three consecutive grid cells is relatively small, the robot walks to any one of the three consecutively connected grid cells, and the process proceeds to step S63, in which the currently detected grid coordinates are replaced with the grid coordinates of the corresponding grid cell in the reference positioning path, thereby repositioning the robot. Since the preceding steps mainly judge whether the edge paths are similar, if two objects with the same footprint and shape are present in the home environment and the robot corrects its positioning data relying only on the similarity of edge paths, errors are likely to occur. The method in this embodiment therefore further determines, from the differences between the grid coordinates of the detected and recorded edge path and the grid coordinates in the reference positioning path, whether the current object and the object corresponding to the reference positioning path are at the same position, so as to obtain more accurate positioning data and make the positioning correction of the robot more accurate.
  • As one of the embodiments, the robot walking in step S62 to any one of the M consecutively connected grid cells includes the following steps. In step S621, it is determined whether there is only one group of M consecutively connected grid cells. Since the current object and the reference object are at the same position and have approximately the same shape, their grid coordinates are the same or approximate, so in general there are either several groups of M consecutively connected grid cells or only one group containing many grid cells. If there is only one group, the robot walks directly to any one of the M consecutively connected grid cells and proceeds to step S63, in which the currently detected grid coordinates are replaced with the grid coordinates of the corresponding grid cell in the reference positioning path to reposition the robot. If there is more than one group, the process proceeds to step S622: the group of M consecutively connected grid cells with the earliest recording time is determined, the robot walks to the grid cell in that group whose recording time is the earliest, and the process proceeds to step S63, in which the currently detected grid coordinates are replaced with the grid coordinates of the corresponding grid cell in the reference positioning path to reposition the robot. Because the longer the robot walks, the greater the error it accumulates, data recorded earlier are more accurate; selecting the grid cells with the earliest recording time to correct the positioning data therefore further ensures the accuracy of the positioning data and of the robot's positioning.
  • As one of the embodiments, after step S6, the method further includes the following steps. In step S71, the grid coordinates detected while the robot in step S6 is located in the positioning cell are determined to be (X1, Y1), and the process proceeds to step S72. In step S72, the grid coordinates of the corresponding grid cell in the reference positioning path described in step S6 are determined to be (X2, Y2), and the process proceeds to step S73. In step S73, the side length of a grid cell is determined to be L, and the process proceeds to step S74.
  • In another embodiment, after step S6, the method further includes the following steps. In step S71, based on the data detected by the gyroscope and the odometer, the coordinate value detected while the robot in step S6 is located at the center point of the positioning cell is determined to be (x1, y1), and the process proceeds to step S72. In step S72, based on the corresponding data in the stored path, the coordinate value of the center point of the corresponding grid cell in the reference positioning path described in step S6 is determined to be (x2, y2), and the process proceeds to step S73. In step S73, the coordinate value (x1, y1) currently detected by the robot is replaced with (x2, y2).
  • The grid cell described in the above embodiments refers to a virtual square with a side length of 20 cm, and a map of a certain length and width formed by a plurality of grid cells for indicating geographical environment information is a grid map.
  • From the data detected while walking, the robot can know which grid cell it is currently located in and can update the state of grid cells in real time, for example marking the state of a grid cell it has passed through smoothly as walked, marking the state of a grid cell where it hit an obstacle as an obstacle, marking the state of a grid cell where a cliff is detected as a cliff, marking the state of a grid cell that has not yet been reached as unknown, and so on.
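  • As an illustration, the grid map described here can be represented as a simple dictionary from grid coordinates to cell states, with 20 cm cells. This is only one possible in-memory layout, assumed for the example; the state names mirror the ones listed above.

```python
from enum import Enum

class CellState(Enum):
    UNKNOWN = 0     # not reached yet
    WALKED = 1      # passed through smoothly
    OBSTACLE = 2    # bumped into an obstacle here
    CLIFF = 3       # cliff (drop-off) detected here

CELL_SIDE_M = 0.2   # each grid cell is a 20 cm x 20 cm virtual square

class GridMap:
    """Sparse grid map: unvisited cells default to UNKNOWN."""
    def __init__(self):
        self._cells = {}

    def mark(self, cell, state: CellState) -> None:
        self._cells[cell] = state

    def state(self, cell) -> CellState:
        return self._cells.get(cell, CellState.UNKNOWN)

grid = GridMap()
grid.mark((10, 10), CellState.WALKED)
grid.mark((10, 11), CellState.OBSTACLE)
print(grid.state((10, 11)), grid.state((50, 50)))   # CellState.OBSTACLE CellState.UNKNOWN
```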
  • The isolated object described in the above embodiments refers to a separate object that does not stand against a wall or against an object leaning on the wall, and around whose edge the robot can walk a full circle.
  • An independent object is not necessarily a single object: a plurality of objects that are close together and form a continuous footprint also count as an independent object.
  • The previously stored edge path described in the above embodiments refers to an edge path already stored in the memory of the robot system, recorded while walking along the edge of another isolated object that meets certain conditions; the stored data include the grid coordinates of the grid cells on the edge path, the grid coordinates of the grid cells at the start and end points of the edge-following, and the start and end times of the edge-following. Data stored in memory are not deleted arbitrarily and can be used as reference data for repositioning the robot.
  • The currently recorded edge path refers to an edge path along the edge of the current obstacle that is temporarily held in a buffer of the robot system, likewise including the grid coordinates of the grid cells on the edge path, the grid coordinates of the grid cells at the start and end points of the edge-following, and the start and end times of the edge-following. If the buffered data meet the requirements for reference data for repositioning the robot, they are stored into memory and become a previously stored edge path as described above; otherwise they are overwritten by subsequently recorded data.
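The two kinds of edge path can be pictured as the same record kept in two places, a temporary buffer and permanent memory; the sketch below assumes a simple promotion step and uses illustrative names, leaving the qualification test itself to the steps defined above.

```python
from dataclasses import dataclass
from typing import List, Tuple

GridCoord = Tuple[int, int]

@dataclass
class EdgePathRecord:
    cells: List[GridCoord]    # grid cells traversed along the obstacle edge
    start_cell: GridCoord     # cell of the edge-following start point
    end_cell: GridCoord       # cell of the edge-following end point
    start_time: float         # time the edge-following started
    end_time: float           # time the edge-following ended

stored_paths: List[EdgePathRecord] = []   # kept in memory as repositioning references

def promote_if_qualified(buffered: EdgePathRecord, qualifies: bool) -> None:
    """A buffered record that meets the reference-data requirements is kept in
    memory (in effect, step S5); otherwise it is overwritten by newer data."""
    if qualifies:
        stored_paths.append(buffered)
```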

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Physics & Mathematics (AREA)
  • Remote Sensing (AREA)
  • General Physics & Mathematics (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Automation & Control Theory (AREA)
  • Electromagnetism (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Multimedia (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
  • Manipulator (AREA)
  • Numerical Control (AREA)

Abstract

A robot repositioning method. By using the path along which the robot walks around the edge of an isolated object as a reference, the position deviation caused by excessive accumulation of the robot's travel error can be corrected and the robot repositioned, thereby improving the positioning accuracy and travel efficiency of the robot in subsequent navigation.

Description

一种机器人重定位的方法
技术领域
本发明涉及机器人领域, 具体涉及一种机器人重新定位的方法。 背景技术
目前, 扫地机器人在清扫过程中, 会由于陀螺仪或码盘等器件自身的缺陷 或者地面打滑等原因, 产生行走误差, 并且该误差会逐渐累积, 长时间累积的 误差会导致机器人所构建的地图也存在很大误差。 现有解决这种误差的方法中, 比较有效的是加入摄像头进行视觉定位或者采用激光雷达进行测距定位, 但是 采用这些方式需要较高的硬件成本, 不适于扫地机器人的推广应用。 发明内容
本发明提供了一种机器人重新定位的方法, 不需要使用摄像头或者激光雷 达等昂贵器件, 也可以对机器人的位置进行重新确定, 避免机器人行走误差累 积过大而导致的定位不准确的问题, 提高了机器人定位的准确性。 本发明的具 体技术方案如下:
一种机器人重定位的方法, 包括如下步骤: 步骤 si : 机器人检测到障碍物, 并进入步骤 S2 ; 步骤 S2 : 沿所述障碍物的边缘行走, 并判断沿所述障碍物的边 缘行走的路径是否满足确定所述障碍物是孤立物体的条件, 如果否, 则进入步 骤 S3 , 如果是, 则进入步骤 S4; 步骤 S3 : 调整行走角度, 离开所述障碍物, 继 续行走, 再次检测到障碍物时, 则返回步骤 S2; 步骤 S4: 确定所述障碍物是孤 立物体, 记录沿所述孤立物体的边缘行走的沿边路径, 判断当前记录的沿边路 径和此前已存储的沿边路径是否相似, 如果否, 则进入步骤 S5 , 如果是, 则进 入 S6; 步骤 S5 : 把所记录的沿所述孤立物体的边缘行走的沿边路径作为已存储 的沿边路径, 并返回步骤 S3 ; 步骤 S6 : 把此前已存储的与当前记录的沿边路径 相似的沿边路径作为参考定位路径, 确定当前的沿边路径所在的第一局部地图 和参考定位路径所在的第二局部地图, 把形状和大小相同的第一局部地图和第 二局部地图进行重叠, 确定第一局部地图的当前的沿边路径中, 与第二局部地 图的参考定位路径重叠的部分所对应的栅格单元作为定位单元, 将机器人当前 位于所述定位单元时检测到的栅格坐标替换为对应的参考定位路径中的栅格单 元的栅格坐标。
进一步地,在步骤 S1之后,且在步骤 S2之前,还包括如下步骤:步骤 S11 : 确定机器人检测到障碍物时所对应的栅格坐标; 步骤 S12 : 确定从当前时间往前 推算的预设时间内机器人已存储的沿边路径; 步骤 S13 : 判断步骤 S11中所确定 的栅格坐标与步骤 S12 中所确定的沿边路径所对应的栅格坐标是否相同或者相 邻, 如果是, 则进入步骤 S14, 如果否, 则进入步骤 S2 ; 步骤 S14: 调整行走角 度, 离开所述障碍物, 继续行走, 再次检测到障碍物时, 则返回步骤 S11。其中, 步骤 S13 中所述的相邻是指两个栅格坐标所对应的栅格单元之间具有共同的一 条边或者一个角点。
进一步地, 步骤 S2所述的沿所述障碍物的边缘行走, 并判断沿所述障碍物 的边缘行走的路径是否满足确定所述障碍物是孤立物体的条件, 包括如下步骤: 步骤 S21 :沿所述障碍物的边缘行走, 并记录起始位置点的起始信息; 步骤 S22 : 判断机器人从所述起始位置点开始检测到的角度变化量是否达到 360 ° , 如果 是, 则进入步骤 S23 , 否则继续沿所述障碍物的边缘行走, 至机器人从所述起始 位置点开始检测到的角度变化量达到 360 ° , 进入步骤 S23 ; 步骤 S23 : 判断机 器人是否回到步骤 S21 中所述的起始位置点, 如果是, 则确定机器人沿所述障 碍物的边缘行走的路径满足确定所述障碍物是孤立物体的条件, 否则进入步骤 S24; 步骤 S24: 继续沿所述障碍物的边缘行走, 判断机器人是否回到步骤 S21 中所述的起始位置点, 以及判断机器人从所述起始位置点开始检测到的角度变 化量是否达到 450 ° ,如果机器人回到所述起始位置点且所述角度变化量没有达 到 450 ° ,则确定机器人沿所述障碍物的边缘行走的路径满足确定所述障碍物是 孤立物体的条件, 如果机器人回到所述起始位置点且所述角度变化量超过了 450 ° , 或者机器人没有回到所述起始位置点且所述角度变化量超过了 450 ° , 则确定机器人沿所述障碍物的边缘行走的路径不满足确定所述障碍物是孤立物 体的条件。
进一步地, 在步骤 S21之后, 且在步骤 S22之前, 还包括如下步骤: 步骤 S211: 检测机器人从所述起始位置点开始沿所述障碍物的边缘行走的距离和角 度变化量; 步骤 S212: 判断机器人从所述起始位置点开始行走的距离是否达到 1. 5米, 如果是, 则进入步骤 S213 , 如果否, 则机器人继续行走, 至机器人行 走的距离达到 1. 5米, 进入步骤 S213 ; 步骤 S213 : 判断机器人从所述起始位置 点开始行走的角度变化量是否达到 90 ° , 如果否, 则机器人调整行走角度, 离 开所述障碍物, 继续行走, 再次检测到障碍物时, 则返回步骤 S11, 如果是, 则 进入步骤 S214; 步骤 S214: 机器人继续沿所述障碍物的边缘行走, 并判断机器 人从所述起始位置点开始行走的距离是否达到 3米, 如果是, 则进入步骤 S215 , 如果否, 则机器人继续行走, 至机器人行走的距离达到 3米, 进入步骤 S215 ; 步骤 S215: 判断机器人从所述起始位置点开始行走的角度变化量是否达到 180 ° , 如果否, 则机器人调整行走角度, 离开所述障碍物, 继续行走, 再次检 测到障碍物时, 则返回步骤 S11, 如果是, 则进入步骤 S216 ; 步骤 S216: 机器 人继续沿所述障碍物的边缘行走, 并判断机器人从所述起始位置点开始行走的 距离是否达到 4. 5米, 如果是, 则进入步骤 S217 , 如果否, 则机器人继续行走, 至机器人行走的距离达到 4. 5米, 进入步骤 S217; 步骤 S217 : 判断机器人从所 述起始位置点开始行走的角度变化量是否达到 270 ° , 如果否, 则机器人调整行 走角度, 离开所述障碍物, 继续行走, 再次检测到障碍物时, 则返回步骤 S11, 如果是, 则进入步骤 S22。
进一步地, 在步骤 S21之后, 且在步骤 S22之前, 还包括如下步骤: 步骤 S211: 检测机器人从所述起始位置点开始沿所述障碍物的边缘行走的时间和角 度变化量; 步骤 S212: 判断机器人从所述起始位置点开始行走的时间是否达到 1 分钟, 如果是, 则进入步骤 S213 , 如果否, 则机器人继续行走, 至机器人行 走的时间达到 1分钟, 进入步骤 S213 ; 步骤 S213 : 判断机器人从所述起始位置 点开始行走的角度变化量是否达到 90 ° , 如果否, 则机器人调整行走角度, 离 开所述障碍物, 继续行走, 再次检测到障碍物时, 则返回步骤 S11, 如果是, 则 进入步骤 S214; 步骤 S214: 机器人继续沿所述障碍物的边缘行走, 并判断机器 人从所述起始位置点开始行走的时间是否达到 2分钟,如果是,则进入步骤 S215 , 如果否, 则机器人继续行走, 至机器人行走的时间达到 2分钟, 进入步骤 S215 ; 步骤 S215: 判断机器人从所述起始位置点开始行走的角度变化量是否达到 180 ° , 如果否, 则机器人调整行走角度, 离开所述障碍物, 继续行走, 再次检 测到障碍物时, 则返回步骤 S11, 如果是, 则进入步骤 S216 ; 步骤 S216: 机器 人继续沿所述障碍物的边缘行走, 并判断机器人从所述起始位置点开始行走的 时间是否达到 3分钟, 如果是, 则进入步骤 S217 , 如果否, 则机器人继续行走, 至机器人行走的时间达到 3分钟, 进入步骤 S217; 步骤 S217 : 判断机器人从所 述起始位置点开始行走的角度变化量是否达到 270 ° , 如果否, 则机器人调整行 走角度, 离开所述障碍物, 继续行走, 再次检测到障碍物时, 则返回步骤 S11, 如果是, 则进入步骤 S218 ; 步骤 S218 : 机器人继续沿所述障碍物的边缘行走, 并判断机器人从所述起始位置点开始行走的时间是否达到 4分钟, 如果是, 则 进入步骤 S22 , 如果否, 则机器人继续行走, 至机器人行走的时间达到 4分钟, 进入步骤 S22。
进一步地, 在步骤 S23 中所述的判断机器人回到步骤 S21 中所述的起始位 置点之后, 且在确定机器人沿所述障碍物的边缘行走的路径满足确定所述障碍 物是孤立物体的条件之前, 或者在步骤 S24 中所述的如果机器人回到所述起始 位置点且所述角度变化量没有达到 450 ° 之后,且在确定机器人沿所述障碍物的 边缘行走的路径满足确定所述障碍物是孤立物体的条件之前, 还包括如下步骤: 判断机器人沿所述障碍物行走一周所圈定的面积是否大于 0. 3平方米, 如果是, 进入确定机器人沿所述障碍物的边缘行走的路径满足确定所述障碍物是孤立物 体的条件的步骤, 如果否, 则机器人调整行走角度, 离开所述障碍物, 继续行 走, 再次检测到障碍物时, 则返回步骤 S11。
进一步地, 步骤 S4中所述的记录沿所述孤立物体的边缘行走的沿边路径, 判断当前记录的沿边路径和此前已存储的沿边路径是否相似, 包括如下步骤: 步骤 S41 : 记录当前的所述沿边路径所对应栅格单元的栅格坐标, 记录当前的所 述沿边路径所围成的区域的栅格面积, 记录当前的所述沿边路径所围成的区域 的中心栅格单元的栅格坐标; 步骤 S42 : 判断当前的沿边路径所围成区域的中心 栅格单元的栅格坐标与已存储的沿边路径所围成区域的中心栅格单元的栅格坐 标的坐标差值是否大于第一预设坐标差值, 如果是, 则确定当前记录的沿边路 径和此前已存储的沿边路径不相似, 如果否, 则进入步骤 S43 ; 步骤 S43 : 判断 当前的栅格面积与已存储的沿边路径所对应的区域的栅格面积的差值是否大于 预设面积差值, 如果是, 则确定当前记录的沿边路径和此前已存储的沿边路径 不相似, 如果否, 则进入步骤 S44; 步骤 S44: 基于当前的沿边路径所在的第一 局部地图和已存储的沿边路径所在的第二局部地图, 把形状和大小相同的第一 局部地图和第二局部地图进行重叠, 判断当前的沿边路径与已存储的沿边路径 相互重叠的栅格单元数量占已存储的沿边路径中的栅格单元数量的比例是否大 于预设比例值, 如果是, 则确定当前记录的沿边路径和此前已存储的沿边路径 相似, 如果否, 则进入步骤 S45 ; 步骤 S45 : 将当前的沿边路径相对于已存储的 沿边路径分别向上下左右四个方向平移 N个栅格单元的距离, 判断当前的沿边 路径与已存储的沿边路径相互重叠的栅格单元数量占已存储的沿边路径中的栅 格单元数量的比例是否大于预设比例值, 如果是, 则确定当前记录的沿边路径 和此前已存储的沿边路径相似, 如果否, 则确定当前记录的沿边路径和此前已 存储的沿边路径不相似。 其中, 所述 N为自然数, 且 1 <N<3。
进一步地, 步骤 S44或者步骤 S45 中所述的判断当前的沿边路径与已存储 的沿边路径相互重叠的栅格单元数量占已存储的沿边路径中的栅格单元数量的 比例是否大于预设比例值, 包括如下步骤: 基于当前的沿边路径所在的第一局 部地图和已存储的沿边路径所在的第二局部地图, 将当前的沿边路径和已存储 的沿边路径所对应的栅格单元都标示为 1, 其它的栅格单元标示为 0; 将所述第 一局部地图和所述第二局部地图中相对应的栅格单元进行与运算; 判断与运算 后, 所得到的标示为 1 的栅格单元的数量占已存储的沿边路径所对应的栅格单 元的数量的比例是否大于预设比例值。
进一步地, 步骤 S6中所述的将机器人当前位于所述定位单元时检测到的栅 格坐标替换为对应的参考定位路径中的栅格单元的栅格坐标, 包括如下步骤: 步骤 S61 : 判断所述定位单元中是否存在 M个串连的栅格单元, 且机器人当前记 录的这 M个串连的栅格单元的栅格坐标与所述参考定位路径中相应的栅格单元 的栅格坐标的差值小于第二预设坐标差值, 如果是, 则进入步骤 S62 , 如果否, 则返回步骤 S3 ; 步骤 S62: 机器人行走至所述 M个串连的栅格单元中的任意一 个栅格单元, 并进入步骤 S63 ; 步骤 S63 : 将当前检测到的栅格坐标替换为对应 的参考定位路径中的栅格单元的栅格坐标。其中,所述 M为自然数,且 2SMS3。
进一步地, 步骤 S62 中所述的机器人行走至所述 M个串连的栅格单元中的 任意一个栅格单元, 包括如下步骤: 步骤 S621 : 判断所述 M个串连的栅格单元 是否只有一组, 如果是, 则直接行走至所述 M个串连的栅格单元中的任意一个 栅格单元, 并进入步骤 S63 , 如果否, 则进入步骤 S622 ; 步骤 S622 : 确定记录 时间最早的一组所述 M个串连的栅格单元, 机器人行走至其中记录时间最早的 一个栅格单元, 并进入步骤 S63。
进一步地, 在步骤 S6之后, 还包括如下步骤: 步骤 S71 : 确定步骤 S6中所 述的机器人当前位于所述定位单元时检测到的栅格坐标为 (XI, Y1), 并进入步 骤 S72 ; 步骤 S72 : 确定步骤 S6中所述的对应的参考定位路径中的栅格单元的 栅格坐标为 (X2 , Y2), 并进入步骤 S73 ; 步骤 S73 : 确定栅格单元的边长为 L, 并进入步骤 S74; 步骤 S74: 将机器人在当前位置点检测到的坐标值 (xl, yl) 替换为 (x2, y2), x2=xl- (X1-X2) *L, y2=yl- (Y1-Y2) 礼。
进一步地, 在步骤 S6之后, 还包括如下步骤: 步骤 S71 : 确定步骤 S6中所 述的机器人当前位于所述定位单元中心点时检测到的坐标值为 (xl, yl); 步骤 S72: 确定步骤 S6 中所述的对应的参考定位路径中的栅格单元的中心点的坐标 值为 (x2 , y2); 步骤 S73 : 将机器人当前检测到的坐标值 ((xl, yl)) 替换为 (x2 , y2)。
本发明的有益效果包括: 通过利用机器人沿孤立物体的边缘行走的路径作 为参考, 可以对机器人行走误差累积过大而造成的位置偏差进行修正, 实现重 新定位, 从而提高机器人在后续导航行走时定位的准确性和行走效率。 附图说明
图 1为本发明所述的机器人重定位的方法的流程示意图。
图 2是第一局部地图和第二局部地图重叠后的示意图一。
图 3是第一局部地图和第二局部地图重叠后的示意图二。
图 4是第一局部地图和第二局部地图重叠后的示意图三。 具体实施方式 下面将结合本发明实施例中的附图, 对本发明实施例中的技术方案进行详 细描述。 应当理解, 下面所描述的具体实施例仅用于解释本发明, 并不用于限 定本发明。 本发明所述的机器人是一种智能清洁机器人 (比如扫地机器人或者拖地机 器人) , 下述实施例中提到的机器人都是指代智能清洁机器人。 这些机器人能 凭借一定的人工智能, 自动在某些场合自动进行行走。 机器人的机体上设有各 种传感器, 可检测行走距离、 行走角度 (即行进方向) 、 机体状态和障碍物等, 如碰到墙壁或其他障碍物, 会自行转弯, 并依不同的设定, 而走不同的路线, 有规划地行走, 还会根据行走过程中检测到的各种数据构建栅格地图, 比如把 检测到障碍物时所对应的栅格单元标示为障碍单元, 把检测到悬崖时所对应的 栅格单元标示悬崖单元, 把正常行走通过的栅格单元标示为已走过单元等。 本 发明所述的机器人包括如下结构: 带有左驱动轮和右驱动轮的能够自主行走的 机器人机体, 机体上设有人机交互界面, 机体上设有障碍检测单元。 机体内部 设置有惯性传感器, 所述惯性传感器包括加速度计和陀螺仪等, 两个驱动轮上 都设有用于检测驱动轮的行走距离的里程计 (一般是码盘) , 还设有能够处理 相关传感器的参数, 并能够输出控制信号到执行部件的控制模块。 如图 i所示的一种机器人重定位的方法, 包括如下步骤: 在步骤 si中, 机 器人一边行走一边进行行走数据的检测, 当机器人前端的碰撞式传感器或者红 外传感器检测到障碍物时, 进入步骤 S2。 进入步骤 S2后, 机器人沿所述障碍物 的边缘行走, 并判断沿所述障碍物的边缘行走的路径是否满足确定所述障碍物 是孤立物体的条件, 其中, 确定障碍物是孤立物体的条件, 其具体条件可以根 据不同的设计需求进行相应设置, 比如, 可以通过机器人开始沿障碍物的边缘 行走的起始位置点和结束沿障碍物边缘行走的结束位置点的关系来进行判断, 也可以通过在一定的时间内转动角度的变化量进行判断, 也可以通过转动角度 和栅格位置的关系进行判断, 还可以把这些因素结合起来进行综合判断, 等等。 当机器人确定沿所述障碍物的边缘行走的路径不满足所述条件, 则可以得知所 述障碍物不是孤立物体, 不能依据该障碍物进行重新定位, 也不能把与该障碍 物相关的数据作为定位参考依据, 所以, 进入步骤 S3 , 机器人调整行走角度, 朝远离障碍物的方向行走, 离开所述障碍物。 接着, 机器人按系统规划的路径 或者方式继续行走, 当机器人前端的碰撞式传感器或者红外传感器再次检测到 障碍物时, 则返回步骤 S2 , 继续判断该障碍物是不是孤立物体。 在步骤 S2中, 当机器人确定沿所述障碍物的边缘行走的路径满足确定所述障碍物是孤立物体 的条件时, 则进入步骤 S4, 机器人确定所述障碍物是孤立物体, 并记录沿所述 孤立物体的边缘行走的沿边路径, 所记录的信息可以根据具体的设计需求进行 相应设置, 比如记录沿边路径所对应的栅格坐标, 记录机器人行走在沿边路径 所对应的栅格单元时陀螺仪检测到的角度, 记录机器人在沿边路径的起点的时 间以及在沿边路径的终点的时间, 等等。 然后, 机器人判断当前记录的沿边路 径和此前已存储的沿边路径是否相似, 判断的方式也可以根据不同的设计需求 进行相应设置, 比如, 通过当前记录的沿边路径所对应的栅格单元的栅格坐标 与此前已存储的沿边路径所对应的栅格单元的栅格坐标进行对比, 如果仅有少 数几个栅格坐标不相同, 则可以认为两条沿边路径是相似的; 也可以通过当前 记录的沿边路径所对应的栅格单元的整体排布方位与此前已存储的沿边路径所 对应的栅格单元的整体排布方位的差异, 如果仅有少数几个点的排布方位有差 异, 则可以认为两条沿边路径是相似的; 还可以通过机器人沿边行走过程中单 位时间内的角度变化率、 沿边路径长度和沿边行走的总时间来进行判断, 如果 这几个参数都相同, 则可以认为两条沿边路径是相似的; 当然, 还可以把所述 这些因素相互结合起来进行综合判断。 如果判断得出当前记录的沿边路径和此 前已存储的沿边路径不相似, 表明机器人当前所检测到的孤立物体, 此前没有 碰到过, 没有沿该孤立物体的边缘行走的相关记录数据作为重新定位的参考, 无法利用该孤立物体进行重新定位, 所以, 进入步骤 S5 , 把所记录的沿所述孤 立物体的边缘行走的沿边路径作为已存储的沿边路径, 方便后续机器人再次碰 到该孤立物体时, 可以进行重新定位, 然后再返回步骤 S3。 如果判断得出当前 记录的沿边路径和此前已存储的沿边路径相似, 表明机器人当前所检测到的孤 立物体, 此前已经碰到过, 并且存储有沿该孤立物体的边缘行走过的沿边路径 的相关记录数据, 可以利用该孤立物体进行重新定位, 所以, 进入步骤 S6, 把 此前已存储的沿该孤立物体的边缘行走的沿边路径作为参考定位路径, 接着, 确定当前的沿边路径所在的第一局部地图和参考定位路径所在的第二局部地 图, 即在栅格地图中圈定一个包含当前的沿边路径的部分地图作为第一局部地 图, 圈定一个包含参考定位路径的部分地图作为第二局部地图, 所述第一局部 地图和所述第二局部地图的大小和形状是相同的, 但是具体的大小和形状可以 根据具体的设计需求进行相应设置, 比如把形状设置为圆心、 正方形或者长方 形, 把局部地图的最大栅格坐标设置为比沿边路径中的最大栅格坐标大四个栅 格单元, 把局部地图的最小栅格坐标设置为比沿边路径中的最小栅格坐标小四 个栅格单元, 等等。 紧接着, 把形状和大小相同的第一局部地图和第二局部地 图进行重叠, 由于当前的沿边路径和参考定位路径是相似的, 两者之间主要是 栅格坐标的不同, 而路径的整体形状是几乎相同的, 仅有少数几个区别点, 所 以, 通过将第一局部地图和第二局部地图进行重叠, 就可以得出当前的沿边路 径和参考定位路径相互重叠的部分, 这些重叠的部分表明机器人沿着孤立物体 的这段边缘行走时没有产生误差, 可以采用这段边缘所对应的栅格坐标进行重 新定位, 而有一些不重叠的部分或者点, 则表明机器人在对应的边缘段或者边 缘点产生了行走误差, 不适于作为重新定位的参考。 因此, 最后再确定第一局 部地图的当前的沿边路径中, 与第二局部地图的参考定位路径重叠的部分所对 应的栅格单元作为定位单元, 将机器人当前位于所述定位单元时检测到的栅格 坐标替换为对应的参考定位路径中的栅格单元的栅格坐标, 从而实现机器人的 重新定位。 如图 2所示, 图 2是第一局部地图和第二局部地图重叠后的示意图, 图中的每个小方格表示一个栅格单元, 标示有字母 P 的方格连成的路径为所述 的当前的沿边路径, 标示有字母 B 的方格连成的路径为所述的参考定位路径, 在一个方格中即标示有字母 P, 又标示有字母 B, 则表示这是所述的当前的沿边 路径和所述参考定位路径重叠的部分。 由图 2所示可知, 这两条沿边路径只有 左上角的两三个点出现一些差异, 大部分路径是重叠的, 所以, 可以得出这两 条路径是相似的, 同时, 机器人在任意一个同时含有 B和 P的方格所对应的位 置, 将当前检测到的栅格坐标, 替换成对应的 B方格中已存储的栅格坐标, 即 可实现机器人的重新定位。 本实施例所述的方法, 通过利用机器人沿孤立物体 的边缘行走的路径作为参考, 可以对机器人行走误差累积过大而造成的位置偏 差进行修正, 实现重新定位, 从而提高机器人在后续导航行走时定位的准确性 和行走效率。
作为其中一种实施方式, 在步骤 S1之后, 且在步骤 S2之前, 还包括如下 步骤: 步骤 S11, 先通过机器人的陀螺仪和里程计所检测的数据, 来确定机器人 检测到障碍物时所处于的栅格单元以及该栅格单元所对应的栅格坐标。 然后进 入步骤 S12 , 确定从当前时间往前推算的预设时间内机器人已存储的沿边路径, 其中, 所述预设时间可以根据具体的设计需求进行相应设置, 优选的, 可设置 为 10分钟至 30分钟中的任意一值, 本实施例设置为 20分钟, 即从当前时间往 前推算 20分钟, 在这 20分钟时间里, 机器人已存储的沿边路径有哪些, 机器 人会对内存中的数据进行搜索, 确定好相关的数据后, 再进入步骤 S13。 在步骤 S13中,机器人会把步骤 S11中所确定的栅格坐标与步骤 S12中所确定的沿边路 径所对应的栅格坐标逐一进行对比分析, 如果发现所对比的两个栅格坐标是相 同或者相邻的, 则认为当前障碍物已经在 20分钟内碰到过, 并且已经沿着它的 边缘进行了沿边行走, 如果机器人在较短的时间内频繁地绕同样的障碍物的边 缘行走, 会降低机器人的行走效率, 同时, 在较短的时间间隔内, 机器人所累 积的误差还不是很大, 不需要进行重新定位, 频繁的重新定位也会导致机器人 的行走效率降低, 所以, 为了使机器人达到最佳的行走效率, 在步骤 S13 中确 定了步骤 S11 中所确定的栅格坐标与步骤 S12中所确定的沿边路径所对应的栅 格坐标相同或者相邻时, 则进入步骤 S14, 机器人调整行走角度, 朝远离障碍物 的方向行走, 离开所述障碍物。 接着, 机器人按系统规划的路径或者方式继续 行走, 当机器人前端的碰撞式传感器或者红外传感器再次检测到障碍物时, 则 返回步骤 S11, 重新开始判断再次检测到的障碍物是否符合相关要求。如果在栅 格坐标的对比分析中, 发现所对比的两个栅格坐标不相同, 也不相邻, 则认为 在此前的 20分钟内机器人没有碰到过该障碍物, 并且, 机器人已经行走了较长 的时间, 累积的误差也比较大了, 所以, 接着进入步骤 S2 , 进一步判断该障碍 物是不是符合能够使机器人进行重新定位的条件。 其中, 步骤 S13 中所述的相 邻是指两个栅格坐标所对应的栅格单元之间具有共同的一条边或者一个角点。 本实施例所述的方法, 通过机器人在预设时间内所检测到的障碍物的情况, 来 判断机器人是否需要沿这个障碍物的边缘行走, 从而提高机器人行走的效率, 避免机器人行动的盲目性和重复性。
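A minimal sketch of the screening in steps S11 to S13 of this embodiment, assuming stored edge paths carry their recording time; the helper names are illustrative, while the 20-minute window and the shared-edge-or-corner notion of adjacency follow the text above.

```python
from typing import Iterable, List, Tuple

GridCoord = Tuple[int, int]

def same_or_adjacent(a: GridCoord, b: GridCoord) -> bool:
    """Cells are adjacent when they share an edge or a corner point."""
    return abs(a[0] - b[0]) <= 1 and abs(a[1] - b[1]) <= 1

def followed_recently(obstacle_cell: GridCoord,
                      stored_paths: Iterable[Tuple[float, List[GridCoord]]],
                      now_s: float,
                      window_s: float = 20 * 60) -> bool:
    """Steps S11-S13: True if the newly detected obstacle cell is the same as,
    or adjacent to, a cell of an edge path stored within the preset time
    window, in which case the robot leaves the obstacle (step S14)."""
    for recorded_at, cells in stored_paths:
        if now_s - recorded_at > window_s:
            continue
        if any(same_or_adjacent(obstacle_cell, c) for c in cells):
            return True
    return False
```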
作为其中一种实施方式, 步骤 S2所述的沿所述障碍物的边缘行走, 并判断 沿所述障碍物的边缘行走的路径是否满足确定所述障碍物是孤立物体的条件, 包括如下步骤: 在步骤 S21 中, 机器人沿所述障碍物的边缘行走, 并基于陀螺 仪和里程计检测到的数据, 记录机器人开始沿边行走时的起始位置点的起始信 息, 所述起始信息可以包括起始位置点的点坐标值、 起始位置点所对应的栅格 单元的栅格坐标、 行走的方向和开始行走的时间等信息。 记录起始信息可以为 了后续判断障碍物是不是孤立物体提供参考数据, 也可以为后续寻找该障碍物 时提供导航依据。记录好起始信息后,进入步骤 S22 ,基于陀螺仪检测到的数据, 判断机器人从所述起始位置点开始沿障碍物的边缘行走, 机体转动的角度变化 量是否达到 360 ° , 由此可以初步判断机器人是否行走了一圈, 如果初步判断已 经行走了一圈, 则直接进入步骤 S23 , 作进一步的确认。 如果初步判断没有行走 一圈, 则继续沿所述障碍物的边缘行走, 至机器人从所述起始位置点开始检测 到的角度变化量达到 360 ° ,才进入步骤 S23 , 作进一步的确认。在步骤 S23中, 机器人先判断是否回到步骤 S21 中所述的起始位置点, 判断方式可以采用点坐 标值是否相同的方式, 如果检测到当前位置点的点坐标值与起始位置点的点坐 标值相同, 则认为机器人回到了起始位置点, 从而可以确定机器人沿所述障碍 物的边缘行走了一圈, 通过沿边行走的路径可以得出所述障碍物是孤立物体, 即满足了确定所述障碍物是孤立物体的条件, 然后可以进入步骤 S4, 作下一步 操作。 由于机器人行走误差的影响, 在陀螺仪检测到机体已经转动 360 ° 时, 机 器人可能还没有绕障碍物的边缘行走完一圈, 没能回到起始位置点。 当判断机 器人从起始位置点开始检测到的角度变化量达到 360 ° , 但是, 机器人又没有回 到的起始位置点时, 进入步骤 S24, 继续沿所述障碍物的边缘行走, 判断机器人 是否回到步骤 S21 中所述的起始位置点, 以及判断机器人从所述起始位置点开 始检测到的角度变化量是否达到 450 ° 。如果机器人回到所述起始位置点且所述 角度变化量没有达到 450 ° ,则表明机器人此前的确是受误差的影响,而在 450 ° 范围内还能够回到起始位置点, 说明所产生的误差在可接受的范围内, 可以确 定所述障碍物是孤立物体, 即满足了确定所述障碍物是孤立物体的条件。 如果 机器人回到所述起始位置点, 但是所述角度变化量超过了 450 ° , 或者机器人没 有回到所述起始位置点, 且所述角度变化量超过了 450 ° , 则表明所述障碍物不 是孤立物体, 确定机器人沿所述障碍物的边缘行走的路径不满足确定所述障碍 物是孤立物体的条件。 本实施例所述的方法, 通过结合坐标值和机器人的转动 角度, 可以准确地判断机器人是否能够完整地绕障碍物行走一圈, 进而准确得 出机器人沿所述障碍物的边缘行走的路径是否满足确定所述障碍物是孤立物体 的条件的结论, 可以为后续机器人进行重新定位提供有效参考数据。
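A rough sketch of the angle-and-return test of steps S22 to S24, assuming the accumulated gyroscope angle and the coordinate values are sampled as the robot follows the edge; the function name and the position tolerance are assumptions, and the 0.3 m² area requirement of a later embodiment is noted only in a comment.

```python
from math import hypot
from typing import Optional, Tuple

Point = Tuple[float, float]

def isolated_object_check(angle_deg: float, pos: Point, start: Point,
                          pos_tol_cm: float = 5.0) -> Optional[bool]:
    """Steps S22-S24, evaluated repeatedly while edge-following.
    Returns True when the obstacle qualifies as isolated, False when it is
    ruled out, and None while the robot should keep following the edge.
    (A further embodiment also requires the enclosed area to exceed 0.3 m².)"""
    back_at_start = hypot(pos[0] - start[0], pos[1] - start[1]) <= pos_tol_cm
    if angle_deg < 360.0:
        return None                      # step S22: keep walking the edge
    if back_at_start and angle_deg <= 450.0:
        return True                      # steps S23/S24: loop closed in time
    if angle_deg > 450.0:
        return False                     # step S24: too much turning, give up
    return None                          # between 360 and 450 degrees, keep walking
```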
作为其中一种实施方式, 在步骤 S21之后, 且在步骤 S22之前, 还包括如 下步骤: 步骤 S211: 基于里程计和陀螺仪, 检测机器人从所述起始位置点开始 沿所述障碍物的边缘行走的距离和角度变化量。 然后进入步骤 S212 , 判断机器 人从所述起始位置点开始行走的距离是否达到 1. 5米,如果是,则进入步骤 S213 , 如果否, 则机器人继续行走, 至机器人行走的距离达到 1. 5米, 进入步骤 S213。 在步骤 S213中, 判断机器人从所述起始位置点开始行走的角度变化量是否达到 90 ° , 如果否, 表明所述障碍物占地面积比较大, 边缘线路比较长, 不适于作 为重新定位的参考物体, 所以, 机器人调整行走角度, 离开所述障碍物, 继续 行走, 再次检测到障碍物时, 则返回步骤 S11, 重新开始判断。 如果机器人行走 了 1. 5米, 且转动的角度达到了 90 ° , 则可以初步判断该障碍物的占地面积大 小适合, 需进入步骤 S214, 作进一步判断。 在步骤 S214中, 机器人继续沿所述 障碍物的边缘行走,并判断机器人从所述起始位置点开始行走的距离是否达到 3 米, 如果是, 则进入步骤 S215 , 如果否, 则机器人继续行走, 至机器人行走的 距离达到 3米, 进入步骤 S215。 在步骤 S215中, 判断机器人从所述起始位置点 开始行走的角度变化量是否达到 180 ° , 如果否, 表明所述障碍物的前 1. 5米的 边缘线路延伸角度比较合适, 而 1. 5米至 3米这段边缘线路的延伸角度比较大, 很有可能障碍物的占地面积也随着变大, 不适于作为重新定位的参考物体, 所 以, 机器人调整行走角度, 离开所述障碍物, 继续行走, 再次检测到障碍物时, 则返回步骤 S11, 重新开始判断。 如果机器人行走了 3米, 且转动的角度达到了 180 ° , 则可以大致得出该障碍物的占地面积大小适合, 但是, 由于障碍物形状 的不确定性, 还需进入步骤 S216, 作最后的确认。 在步骤 S216中, 机器人继续 沿所述障碍物的边缘行走, 并判断机器人从所述起始位置点开始行走的距离是 否达到 4. 5米, 如果是, 则进入步骤 S217 , 如果否, 则机器人继续行走, 至机 器人行走的距离达到 4. 5米, 进入步骤 S217。 在步骤 S217中, 判断机器人从所 述起始位置点开始行走的角度变化量是否达到 270 ° , 如果否, 表明所述障碍物 的前 3米的边缘线路延伸角度比较合适, 而 3米至 4. 5米这段边缘线路的延伸 角度比较大, 很有可能障碍物的占地面积也随着变大, 不适于作为重新定位的 参考物体, 所以, 机器人调整行走角度, 离开所述障碍物, 继续行走, 再次检 测到障碍物时, 则返回步骤 S11, 重新开始判断。 如果机器人行走了 4. 5米, 且 转动的角度达到了 270 ° , 则可以得出该障碍物的占地面积大小适合, 即使剩余 边缘线路变化比较大, 但也只有 90 ° 的变化范围, 所以, 对障碍物的整体大小 影响不大, 所以, 可以最终确认该障碍物的占地面积大小适合, 是作为机器人 进行重新定位的最佳参考物体。 本实施例所述的方法, 通过结合沿障碍物边缘 行走的路径的距离和角度变化关系, 可以准确地判断所述障碍物的占地面积的 大小情况, 从而有效确定该障碍物是否适合作为机器人进行重新定位的参考物 体, 进而可以为后续机器人的重新定位提供准确的参考数据。
作为其中一种实施方式, 在步骤 S21之后, 且在步骤 S22之前, 还包括如 下步骤: 步骤 S211 : 基于机器人中的 RTC时钟计时模块和陀螺仪, 检测机器人 从所述起始位置点开始沿所述障碍物的边缘行走的时间和角度变化量, 然后进 入步骤 S212。在步骤 S212中, 机器人判断从所述起始位置点开始行走的时间是 否达到 1 分钟, 如果是, 表明机器人沿障碍物的边缘行走了一定距离, 则进入 步骤 S213 , 如果否, 则机器人继续行走, 至机器人行走的时间达到 1分钟, 进 入步骤 S213。在步骤 S213中, 判断机器人从所述起始位置点开始行走的角度变 化量是否达到 90 ° , 如果否, 表明机器人沿障碍物边缘的行走轨迹的变化所对 应的障碍物的占地面积比较大, 不适于作为重新定位的参考物体, 所以, 机器 人调整行走角度, 离开所述障碍物, 继续行走, 再次检测到障碍物时, 则返回 步骤 S11, 重新开始判断。 如果机器人已经行走了 1分钟, 并且转动的角度变化 量达到 90 ° , 可以初步判断该障碍物的占地面积大小适合, 需进入步骤 S214, 作进一步判断。 在步骤 S214中, 机器人继续沿所述障碍物的边缘行走, 并判断 机器人从所述起始位置点开始行走的时间是否达到 2 分钟, 如果是, 表明机器 人行走了更长的一段距离, 则进入步骤 S215 , 如果否, 则机器人继续行走, 至 机器人行走的时间达到 2分钟, 进入步骤 S215。 在步骤 S215中, 判断机器人从 所述起始位置点开始行走的角度变化量是否达到 180 ° , 如果否, 表明所述机器 人在开始的第 1 分钟沿障碍物边缘行走的边缘线路延伸角度比较合适, 而第 2 分钟这段时间沿障碍物的边缘行走的边缘线路的延伸角度比较大, 很有可能障 碍物的占地面积也随着变大, 不适于作为重新定位的参考物体, 所以, 机器人 调整行走角度, 离开所述障碍物, 继续行走, 再次检测到障碍物时, 则返回步 骤 S11, 重新开始判断。如果机器人行走了 2分钟, 且转动的角度达到了 180 ° , 则可以大致得出该障碍物的占地面积大小适合, 但是, 由于障碍物形状的不确 定性, 还需进入步骤 S216 , 作进一步的确认。 在步骤 S216中, 机器人继续沿所 述障碍物的边缘行走, 并判断机器人从所述起始位置点开始行走的时间是否达 到 3分钟, 如果是, 则进入步骤 S217 , 如果否, 则机器人继续行走, 至机器人 行走的时间达到 3分钟, 才进入步骤 S217。在步骤 S217中, 判断机器人从所述 起始位置点开始行走的角度变化量是否达到 270 ° , 如果否, 表明所述机器人开 始的第 1分钟和第 2分钟沿障碍物边缘行走的边缘线路延伸角度比较合适, 而 第 3 分钟这段时间沿障碍物的边缘行走的边缘线路的延伸角度比较大, 很有可 能障碍物的占地面积也随着变大, 不适于作为重新定位的参考物体, 所以, 机 器人调整行走角度, 离开所述障碍物, 继续行走, 再次检测到障碍物时, 则返 回步骤 S11, 重新开始判断。 如果机器人行走了 3 分钟, 且转动的角度达到了 270 ° , 表明机器人前 3分钟沿障碍物边缘行走的边缘线路的延伸角度也比较合 理, 则进入步骤 S218 , 机器人继续沿所述障碍物的边缘行走, 并判断机器人从 所述起始位置点开始行走的时间是否达到 4分钟, 如果是, 则进入步骤 S22 , 判 断机器人从所述起始位置点开始检测到的角度变化量是否达到 360 ° ,由此可以 判断机器人是否沿障碍物的边缘行走了 360 ° , 实现沿边行走一圈。 如果否, 则 机器人继续行走, 至机器人行走的时间达到 4分钟, 才进入步骤 S22进行 360 ° 的判断。 本实施例所述的方法, 通过逐个时间段地分析机器人的角度变化量, 可以准确地判断所述障碍物的占地面积的大小情况, 从而有效确定该障碍物是 否适合作为机器人进行重新定位的参考物体, 进而可以为后续机器人的重新定 位提供准确的参考数据。 作为其中一种实施方式, 在步骤 S23 中所述的判断机器人回到步骤 S21 中 所述的起始位置点之后, 且在确定机器人沿所述障碍物的边缘行走的路径满足 确定所述障碍物是孤立物体的条件之前, 或者在步骤 S24 中所述的如果机器人 回到所述起始位置点且所述角度变化量没有达到 450 ° 之后,且在确定机器人沿 所述障碍物的边缘行走的路径满足确定所述障碍物是孤立物体的条件之前, 还 包括如下步骤: 判断机器人沿所述障碍物行走一周所圈定的面积是否大于 0. 3 平方米, 如果是, 表明所述障碍物的占地面积大小比较合适, 可以考虑作为机 器人进行重新定位的参考物体, 然后进入确定机器人沿所述障碍物的边缘行走 的路径满足确定所述障碍物是孤立物体的条件的步骤, 如果否, 则机器人调整 行走角度,离开所述障碍物, 继续行走,再次检测到障碍物时, 则返回步骤 S11, 重新开始判断。 由于作为机器人定位参考的物体不宜过大, 也不宜过小, 如果 过大, 则定位耗时较多, 效率较低, 如果过小, 则定位不准确, 效果不理想。 所以, 本实施例所述的方法, 通过把占地面积大于 0. 3 平方米作为限制条件, 可以选出很理想的孤立物体作为后续机器人进行重新定位或者存储新的定位参 数的参考。
作为其中一种实施方式, 步骤 S4中所述的记录沿所述孤立物体的边缘行走 的沿边路径, 判断当前记录的沿边路径和此前已存储的沿边路径是否相似, 包 括如下步骤: 步骤 S41 : 基于机器人的陀螺仪和里程计所检测到的数据, 记录当 前的所述沿边路径所对应栅格单元的栅格坐标, 记录当前的所述沿边路径所围 成的区域的栅格面积, 记录当前的所述沿边路径所围成的区域的中心栅格单元 的栅格坐标, 其中, 所述栅格坐标可以通过位置点的坐标值换算出来, 所述栅 格面积则可以通过栅格单元的数量计算出来。 中心栅格单元的栅格坐标则可以 通过该区域中的上下左右的最值栅格坐标计算出来, 比如, 该区域最上端的栅 格坐标为(20, 30),最下端的栅格坐标为(8, 10), 最左端的栅格坐标为(6, 12), 最右端的栅格坐标为 (28 , 22), 则计算得出中心栅格单元的栅格坐标为 (6+ (28-6) /2, 10+ (30-10) /2) = (17, 20)。 所述的中心并不仅仅是指规则形 状下的正中心, 当区域的形状不规则时, 通过同样方式计算出来的栅格坐标, 也可以作为该区域的中心栅格单元的栅格坐标。 在记录好相关数据后, 进入步 骤 S42 ,判断当前的沿边路径所围成区域的中心栅格单元的栅格坐标与已存储的 沿边路径所围成区域的中心栅格单元的栅格坐标的坐标差值是否大于第一预设 坐标差值, 所述第一预设坐标差值可以根据具体的设计需求进行相应设置, 可 以设置为 ( 2, 2 ) 或者 ( 3, 3 ) , 即判断所对比的栅格坐标中的 X值是否大于 2或 3 , Y值是否大于 2或 3 , 如果都是, 则确定为大于第一预设坐标差值, 否则确 定为不大于第一预设坐标差值。 当判断得出当前的沿边路径所围成区域的中心 栅格单元的栅格坐标与已存储的沿边路径所围成区域的中心栅格单元的栅格坐 标的坐标差值大于第一预设坐标差值, 表明所对比的两条沿边路径所围成的区 域的上下左右的最值栅格坐标的差异比较大, 即两条沿边路径的形状有较大区 别, 所以, 可以确定当前记录的沿边路径和此前已存储的沿边路径不相似。 当 判断得出当前的沿边路径所围成区域的中心栅格单元的栅格坐标与已存储的沿 边路径所围成区域的中心栅格单元的栅格坐标的坐标差值小于或等于第一预设 坐标差值,并不一定表明这两条栅格路径相似,需进入步骤 S43 ,作进一步判断。 在步骤 S43 中, 判断当前的栅格面积与已存储的沿边路径所对应的区域的栅格 面积的差值是否大于预设面积差值, 所述预设面积差值可以根据具体的设计需 求进行相应设置, 可以设置为 1至 5个栅格单元的面积, 由于行走误差的影响, 如果设置的值过小, 很难找到相适配的对象, 如果设置的值过大, 找到的对象 的准确性又比较低, 所以, 本实施例设置为 3个栅格单元的面积, 如此可以达 到最优匹配效果。 而栅格面积可以基于沿边路径所对应的栅格单元的栅格坐标 值, 把每行栅格单元的个数或者每列栅格单元的个数相加起来, 然后乘以每个 栅格单元的面积, 即可得到所述栅格面积。 如果判断得出当前的栅格面积与已 存储的沿边路径所对应的区域的栅格面积的差值大于预设面积差值, 则表明两 个区域的形状相差比较大, 两条沿边路径的形状相差也比较大, 所以, 可以确 定当前记录的沿边路径和此前已存储的沿边路径不相似。 如果判断得出当前的 栅格面积与已存储的沿边路径所对应的区域的栅格面积的差值小于或等于预设 面积差值, 并不一定表明这两条栅格路径相似, 需要进入步骤 S44作进一步判 断。 在步骤 S44 中, 基于当前的沿边路径所在的第一局部地图和已存储的沿边 路径所在的第二局部地图, 把形状和大小相同的第一局部地图和第二局部地图 进行重叠, 判断当前的沿边路径与已存储的沿边路径相互重叠的栅格单元数量 占已存储的沿边路径中的栅格单元数量的比例是否大于预设比例值, 所述预设 比例值可以根据具体的设计需求进行相应设置, 本实施例设置为 90%。如图 3所 示, 图 3 是第一局部地图和第二局部地图重叠后的示意图, 所选取的第一局部 地图和第二局部地图都是栅格地图中长为 15个栅格单元距离, 宽为 13个栅格 单元距离的矩形局部地图, 这两个局部地图都把沿边路径所对应的栅格单元包 含在其中。 图中的每个小方格表示一个栅格单元, 标示有字母 H 的方格连成的 路径为所述的当前的沿边路径, 标示有字母 Q 的方格连成的路径为此前已存储 的沿边路径, 在一个方格中即标示有字母 H, 又标示有字母 Q, 则表示这是所述 的当前的沿边路径和已存储的沿边路径重叠的部分。 由图 3 所示可知, 重叠的 栅格单元有 29个, 标示有 Q的栅格单元有 32个, 求得当前的沿边路径与已存 储的沿边路径相互重叠的栅格单元数量占已存储的沿边路径中的栅格单元数量 的比例为 90. 6%, 大于预设比例值 90%。 表明这两条沿边路径大部分路径是重叠 的, 只有右上角的三个点出现一些差异。 所以, 经过综合判断, 如果当前的沿 边路径与已存储的沿边路径相互重叠的栅格单元数量占已存储的沿边路径中的 栅格单元数量的比例大于预设比例值, 可以最终确定当前记录的沿边路径和此 前已存储的沿边路径相似。 如果当前的沿边路径与已存储的沿边路径相互重叠 的栅格单元数量占已存储的沿边路径中的栅格单元数量的比例小于或等于预设 比例值, 则需要进入步骤 S45 , 作进一步判断。 在步骤 S45中, 将当前的沿边路 径相对于已存储的沿边路径分别向上下左右四个方向平移 N个栅格单元的距离, 判断当前的沿边路径与已存储的沿边路径相互重叠的栅格单元数量占已存储的 沿边路径中的栅格单元数量的比例是否大于预设比例值。 如图 4所示, 图 4是 第一局部地图和第二局部地图重叠后的另一种示意图, 图中的每个小方格表示 一个栅格单元, 标示有字母 H 的方格连成的路径为所述的当前的沿边路径, 标 示有字母 Q 的方格连成的路径为此前已存储的沿边路径, 在一个方格中即标示 有字母 H, 又标示有字母 Q, 则表示这是所述的当前的沿边路径和已存储的沿边 路径重叠的部分。 从图中可知, 重叠的栅格单元仅有 16个, 占已存储的沿边路 径中的栅格单元数量的比例为 50%, 远小于预设比例值 90%。 此时, 需要将标示 有 H的沿边路径整体向上平移一个栅格单元的距离, 得出如图 3所示的重叠效 果, 通过如上所述的对图 3 的分析, 可以最终确定当前记录的沿边路径和此前 已存储的沿边路径相似。 假设向上平移一个栅格单元的距离, 得出的比例依然 小于预设比例值, 则继续向上平移一个栅格单元的距离, 再次分析判断所得比 例与预设比例值的关系, 至平移 N个栅格单元的距离后, 也无法得出所述比例 值大于预设比例值的结果, 则回到原图 4所示的状态, 依次向下平移 N个栅格 单元的距离, 依然无法得出所述比例值大于预设比例值的结果, 则再次回到原 图 4所示的状态, 依次类推, 分别向左和右依次平移 N个栅格单元的距离, 如 果最终也无法得出所述比例值大于预设比例值的结果, 则可以确定当前记录的 沿边路径和此前已存储的沿边路径不相似。 在上述平移的过程中, 只要有一次 计算得出所述比例值大于预设比例值的结果, 则可以确定当前记录的沿边路径 和此前已存储的沿边路径相似。 其中, 所述 N可以根据具体的设计需求进行相 应设置, 如果设置的值过小, 则很难得到适配的对象, 如果设置的值过大, 则 所得到的对象可能会存在较大的误差, 不适于作为机器人重新定位的对象。 优 选的, N为自然数, 且 1 <N<3, 本实施例设置 N=2。 本实施例所述的方法, 通 过栅格单元重叠的方式来判断当前记录的沿边路径和此前已存储的沿边路径是 否相似, 可以得出比较准确的判断结果, 有利于提高后续机器人进行重新定位 的准确性。
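A small sketch of the overlap test of steps S44 and S45, treating each local map as the set of grid cells its edge path occupies, so that the 1/0 marking and AND operation of the following embodiment reduce to a set intersection; the 90% threshold and N = 2 are the values used in this embodiment, and the function names are illustrative.

```python
from typing import Set, Tuple

GridCoord = Tuple[int, int]

def overlap_ratio(current: Set[GridCoord], stored: Set[GridCoord]) -> float:
    """Cells on a path are marked 1, all others 0; ANDing the two local maps
    and counting the 1s is equivalent to taking the set intersection, measured
    against the number of cells in the stored edge path."""
    return len(current & stored) / len(stored) if stored else 0.0

def paths_similar(current: Set[GridCoord], stored: Set[GridCoord],
                  n: int = 2, threshold: float = 0.9) -> bool:
    """Steps S44-S45: accept if the direct overlap exceeds the threshold;
    otherwise retry with the current path shifted up, down, left and right
    by 1..n grid cells relative to the stored path."""
    if overlap_ratio(current, stored) > threshold:
        return True
    for dx, dy in ((0, 1), (0, -1), (-1, 0), (1, 0)):
        for step in range(1, n + 1):
            shifted = {(x + dx * step, y + dy * step) for (x, y) in current}
            if overlap_ratio(shifted, stored) > threshold:
                return True
    return False
```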
作为其中一种实施方式, 步骤 S44或者步骤 S45中所述的判断当前的沿边 路径与已存储的沿边路径相互重叠的栅格单元数量占已存储的沿边路径中的栅 格单元数量的比例是否大于预设比例值, 包括如下步骤: 基于当前的沿边路径 所在的第一局部地图和已存储的沿边路径所在的第二局部地图, 将当前的沿边 路径和已存储的沿边路径所对应的栅格单元都标示为 1,其它的栅格单元标示为 0。 将所述第一局部地图和所述第二局部地图中相对应的栅格单元进行与运算, 即 1与 1的结果为 1, 1与 0的结果为 0, 0与 0的结果也为 0。判断与运算后, 所 得到的标示为 1 的栅格单元的数量占已存储的沿边路径所对应的栅格单元的数 量的比例是否大于预设比例值。 本实施例所述的方法, 把栅格单元进行二值化, 再通过与运算的方式进行分析, 可以快速准确地得出两条沿边路径中相互重叠 的栅格单元的数量, 从而可以快速准确地得出所述重叠的栅格单元的数量占已 存储的沿边路径所对应的栅格单元的数量的比例是否大于预设比例值的结果, 更高效地为后续机器人的定位提供准确的参考数据。 其中, 所述预设比例值和 相关计算方式与上述的实施例相同, 在此不再赘述。 作为其中一种实施方式, 步骤 S6中所述的将机器人当前位于所述定位单元 时检测到的栅格坐标替换为对应的参考定位路径中的栅格单元的栅格坐标, 包 括如下步骤: 步骤 S61 : 判断所述定位单元中是否存在 M个串连的栅格单元, 且 机器人当前记录的这 M个串连的栅格单元的栅格坐标与所述参考定位路径中相 应的栅格单元的栅格坐标的差值小于第二预设坐标差值。 其中, 所述 M 的值可 以根据具体的设计需求进行相应设置, M为自然数, 且 2SMS3 , 本实施例设置 为 3。所述第二预设坐标差值也可以根据具体的设计需求进行相应设置, 本实施 例设置为 ( 2, 2 )。 当机器人所记录的连续检测到的 3个栅格单元的栅格坐标值 与所述参考定位路径中相应的栅格单元的栅格坐标的差值都小于第二预设坐标 差值 ( 2, 2 ) , 即所记录的栅格单元的栅格坐标的 X值与参考定位路径中对应的 栅格单元的栅格坐标的 X值之间的差值小于 2, 且 Y值之间的差值也小于 2 , 则 表明所对比的两条路径之间的位置差别不是很大, 属于同一方位的物体, 可以 进入步骤 S62 , 进行后续的定位修正。 否则, 表明所对比的两条路径之间的位置 差别较大, 可能不属于同一方位的物体, 不能将此物体作为定位参考, 需要返 回步骤 S3 , 离开当前障碍物, 寻找下一个可以作为参考的物体。 在步骤 S62中, 因为上述的连续的 3个栅格单元的误差比较小, 所以, 机器人行走至这 3个串 连的栅格单元中的任意一个栅格单元, 并进入步骤 S63 , 将当前检测到的栅格坐 标替换为对应的参考定位路径中的栅格单元的栅格坐标, 从而实现机器人的重 新定位。 由于机器人在前述的步骤中主要是进行沿边路径是不是相似的判断, 如果家庭环境中出现占地面积和形状相同的两个物体, 机器人仅依靠沿边路径 相似就进行定位数据的修正, 则很可能会出现差错。 所以, 本实施例所述的方 法, 通过检测并记录的沿边路径的栅格坐标与参考定位路径中栅格坐标的差值, 来进一步判断当前物体与参考定位路径所对应的物体是否在同一位置, 从而得 出更准确的定位数据, 机器人的定位修正更准确。
作为其中一种实施方式, 步骤 S62中所述的机器人行走至所述 M个串连的 栅格单元中的任意一个栅格单元, 包括如下步骤: 步骤 S621 : 判断所述 M个串 连的栅格单元是否只有一组。 由于当前物体和参考物体的位置相同且形状近似 时, 栅格坐标相同或近似的数量会比较多, 所以, 一般会出现多组所述 M个串 连的栅格单元, 或者只有一组, 但是该组的栅格单元非常多。 如果只有一组, 则直接行走至所述 M个串连的栅格单元中的任意一个栅格单元,并进入步骤 S63 , 将当前检测到的栅格坐标替换为对应的参考定位路径中的栅格单元的栅格坐 标, 从而实现机器人的重新定位。 如果有多组, 则进入步骤 S622 , 确定记录时 间最早的一组所述 M个串连的栅格单元, 机器人行走至其中记录时间最早的一 个栅格单元, 并进入步骤 S63 , 将当前检测到的栅格坐标替换为对应的参考定位 路径中的栅格单元的栅格坐标, 从而实现机器人的重新定位。 因为机器人行走 的时间越久, 产生的误差越大, 所以, 越早记录的数据越准确。 本实施例所述 的方法, 在机器人进行定位时, 选择时间最早的栅格单元进行定位数据的修正, 更进一步地保证定位数据的准确性, 更确保了机器人定位的准确性。
作为其中一种实施方式, 在步骤 S6之后, 还包括如下步骤: 步骤 S71 : 确 定步骤 S6中所述的机器人当前位于所述定位单元时检测到的栅格坐标为 (XI, Y1), 并进入步骤 S72。 在步骤 S72中, 确定步骤 S6中所述的对应的参考定位路 径中的栅格单元的栅格坐标为 (X2, Y2), 并进入步骤 S73。 在步骤 S73中, 确 定栅格单元的边长为 L, 并进入步骤 S74。 在步骤 S74中, 将机器人在当前位置 点检测到的坐标值 (xl, yl) 替换为 (x2, y2), x2=xl- (X1-X2) 礼, y2=yl- (Y1-Y2) 礼。 假设 (Xl=10, Yl=20), (X2=12, Y2=21), L=20cm, 机器人检 测到当前位置点的坐标值为 (xl=208, yl=412), 求得 (x2=248, y2=432), 替 换后即得机器人当前位置点的坐标值为 (248, 432)。 由于机器人进行导航行走 时, 是先搜索出栅格路径, 然后沿着栅格路径, 逐个位置点的行走。 所以, 本 实施例所述的方法, 机器人在进行栅格坐标的定位修正之后, 还需要进一步地 进行具体位置点的坐标值的修正, 可以保证机器人导航行走的准确性和高效性。
作为其中一种实施方式, 在步骤 S6之后, 还包括如下步骤: 步骤 S71 : 基 于陀螺仪和里程计检测到的数据, 确定步骤 S6中所述的机器人当前位于所述定 位单元中心点时检测到的坐标值为(xl, yl), 并进入步骤 S72。 在步骤 S72中, 基于所保存的已存储路径中对应的数据, 确定步骤 S6中所述的对应的参考定位 路径中的栅格单元的中心点的坐标值为(x2 , y2), 并进入步骤 S73。在步骤 S73 中, 将机器人当前检测到的坐标值 ((xl, yl)) 替换为 (x2 , y2)。 通过采用已 存储数据直接替换的方法, 更高效地进行具体位置点的坐标值的修正, 保证了 机器人导航行走的准确性和高效性。 以上这些实施例所述的栅格单元是指边长为 20厘米的虚拟方格, 由很多栅 格单元连续排布所形成的具有一定长度和宽度的用于表示地理环境信息的地图 就是栅格地图。 根据栅格地图, 机器人可以由一边行走一边检测到的数据得知 当前所处的栅格单元位置, 并可以实时更新栅格单元的状态, 比如把顺利走过 的栅格单元的状态标示为已走过, 把碰撞到障碍物的栅格单元的状态标示为障 碍物, 把检测到悬崖的栅格单元的状态标示为悬崖, 把没有到过的栅格单元的 状态标示为未知, 等等。 此外, 以上这些实施例所述的孤立物体是指不挨着墙 壁或者不挨着靠墙物体的独立物体, 且机器人能够沿着该独立物体的边缘行走 一圈。 其中, 独立物体并不仅仅是指单一的物体, 靠拢在一起并且能够形成连 续的占地面积的多个物体, 也属于所述的独立物体。 此外, 以上这些实施例所 述的此前已存储的沿边路径是指机器人系统中已经在内存里存储好的沿其它符 合一定条件的孤立物体的边缘行走时的沿边路径, 包括存储沿边路径所对应的 栅格单元的栅格坐标、 沿边起始位置点所对应的栅格单元的栅格坐标、 沿边结 束位置点所对应的栅格单元的栅格坐标、 沿边起始的时间、 沿边结束的时间等 等。 这些在内存中存储的数据是不能随意删除的, 可以用作机器人进行重新定 位的参考数据。 而所述当前记录的沿边路径, 是指机器人系统中在缓存区暂时 保存的沿当前障碍物的边缘行走的沿边路径, 包括保存沿边路径所对应的栅格 单元的栅格坐标、 沿边起始位置点所对应的栅格单元的栅格坐标、 沿边结束位 置点所对应的栅格单元的栅格坐标、 沿边起始的时间、 沿边结束的时间等等。 如果这些在缓存区保存的数据符合作为机器人进行重新定位的参考数据的要 求, 则会被存储进内存中, 成为上述的已存储的沿边路径; 如果不符合要求, 则会被后续新记录的数据所覆盖。 本领域普通技术人员可以理解: 实现上述各方法实施例的全部或部分步骤 可以通过程序指令相关的硬件来完成。 这些程序可以存储于计算机可读取存储 介质 (比如 ROM、 RAM、 磁碟或者光盘等各种可以存储程序代码的介质) 中。 该 程序在执行时, 执行包括上述各方法实施例的步骤。
最后应说明的是: 本说明书中各个实施例采用递进的方式描述, 每个实施 例重点说明的都是与其它实施例的不同之处, 各个实施例之间相同或相似部分 互相参见即可, 各实施例之间的技术方案是可以相互结合的。 以上各实施例仅 用于说明本发明的技术方案, 而非对其限制, 尽管参照前述各实施例对本发明 进行了详细的说明, 本领域的普通技术人员依然可以对前述各实施例所记载的 技术方案进行修改, 或者对其中部分或者全部技术特征进行等同替换; 而这些 修改或者替换, 并不使相应技术方案的本质脱离本发明各实施例技术方案的范 围。

Claims

权利要求书
[权利要求 1] 一种机器人重定位的方法, 其特征在于, 包括如下步骤:
步骤 S1 : 机器人检测到障碍物, 并进入步骤 S2;
步骤 S2: 沿所述障碍物的边缘行走, 并判断沿所述障碍物的边缘行走 的路径是否满足确定所述障碍物是孤立物体的条件, 如果否, 则进入 步骤 S3 , 如果是, 则进入步骤 S4;
步骤 S3: 调整行走角度, 离开所述障碍物, 继续行走, 再次检测到障 碍物时, 则返回步骤 S2;
步骤 S4: 确定所述障碍物是孤立物体, 记录沿所述孤立物体的边缘行 走的沿边路径, 判断当前记录的沿边路径和此前已存储的沿边路径是 否相似, 如果否, 则进入步骤 S5 , 如果是, 则进入 S6;
步骤 S5: 把所记录的沿所述孤立物体的边缘行走的沿边路径作为已存 储的沿边路径, 并返回步骤 S3 ;
步骤 S6 : 把此前已存储的与当前记录的沿边路径相似的沿边路径作为 参考定位路径, 确定当前的沿边路径所在的第一局部地图和参考定位 路径所在的第二局部地图, 把形状和大小相同的第一局部地图和第二 局部地图进行重叠, 确定第一局部地图的当前的沿边路径中, 与第二 局部地图的参考定位路径重叠的部分所对应的栅格单元作为定位单元 , 将机器人当前位于所述定位单元时检测到的栅格坐标替换为对应的 参考定位路径中的栅格单元的栅格坐标。
[权利要求 2] 根据权利要求 1所述的方法, 其特征在于, 在步骤 S1之后, 且在步骤
S2之前, 还包括如下步骤:
步骤 S 11 : 确定机器人检测到障碍物时所对应的栅格坐标;
步骤 S12: 确定从当前时间往前推算的预设时间内机器人已存储的沿 边路径;
步骤 S13: 判断步骤 S11中所确定的栅格坐标与步骤 S12中所确定的沿边路径所对应的栅格坐标是否相同或者相邻, 如果是, 则进入步骤 S14, 如果否, 则进入步骤 S2;
步骤 S14: 调整行走角度, 离开所述障碍物, 继续行走, 再次检测到障碍物时, 则返回步骤 S11 ;
其中, 步骤 S13中所述的相邻是指两个栅格坐标所对应的栅格单元之 间具有共同的一条边或者一个角点。
[权利要求 3] 根据权利要求 2所述的方法, 其特征在于, 步骤 S2所述的沿所述障碍 物的边缘行走, 并判断沿所述障碍物的边缘行走的路径是否满足确定 所述障碍物是孤立物体的条件, 包括如下步骤:
步骤 S21 : 沿所述障碍物的边缘行走, 并记录起始位置点的起始信息 步骤 S22: 判断机器人从所述起始位置点开始检测到的角度变化量是 否达到 360°, 如果是, 则进入步骤 S23 , 否则继续沿所述障碍物的边 缘行走, 至机器人从所述起始位置点开始检测到的角度变化量达到 36 0° , 进入步骤 S23 ;
步骤 S23: 判断机器人是否回到步骤 S21中所述的起始位置点, 如果 是, 则确定机器人沿所述障碍物的边缘行走的路径满足确定所述障碍 物是孤立物体的条件, 否则进入步骤 S24;
步骤 S24: 继续沿所述障碍物的边缘行走, 判断机器人是否回到步骤 S21中所述的起始位置点, 以及判断机器人从所述起始位置点开始检 测到的角度变化量是否达到 450°, 如果机器人回到所述起始位置点且 所述角度变化量没有达到 450°, 则确定机器人沿所述障碍物的边缘行 走的路径满足确定所述障碍物是孤立物体的条件, 如果机器人回到所 述起始位置点且所述角度变化量超过了 450°, 或者机器人没有回到所 述起始位置点且所述角度变化量超过了 450°, 则确定机器人沿所述障 碍物的边缘行走的路径不满足确定所述障碍物是孤立物体的条件。
[权利要求 4] 根据权利要求 3所述的方法, 其特征在于, 在步骤 S21之后, 且在步骤
S22之前, 还包括如下步骤:
步骤 S211 : 检测机器人从所述起始位置点开始沿所述障碍物的边缘行 走的距离和角度变化量; 步骤 S212: 判断机器人从所述起始位置点开始行走的距离是否达到 1.
5米, 如果是, 则进入步骤 S213 , 如果否, 则机器人继续行走, 至机 器人行走的距离达到 1.5米, 进入步骤 S213 ;
步骤 S213: 判断机器人从所述起始位置点开始行走的角度变化量是否 达到 90°, 如果否, 则机器人调整行走角度, 离开所述障碍物, 继续 行走, 再次检测到障碍物时, 则返回步骤 S11, 如果是, 则进入步骤 S214;
步骤 S214: 机器人继续沿所述障碍物的边缘行走, 并判断机器人从所 述起始位置点开始行走的距离是否达到 3米, 如果是, 则进入步骤 S21 5 , 如果否, 则机器人继续行走, 至机器人行走的距离达到 3米, 进入 步骤 S215 ;
步骤 S215: 判断机器人从所述起始位置点开始行走的角度变化量是否 达到 180°, 如果否, 则机器人调整行走角度, 离开所述障碍物, 继续 行走, 再次检测到障碍物时, 则返回步骤 S11, 如果是, 则进入步骤 S216;
步骤 S216: 机器人继续沿所述障碍物的边缘行走, 并判断机器人从所 述起始位置点开始行走的距离是否达到 4.5米, 如果是, 则进入步骤 S 217 , 如果否, 则机器人继续行走, 至机器人行走的距离达到 4.5米, 进入步骤 S217 ;
步骤 S217: 判断机器人从所述起始位置点开始行走的角度变化量是否 达到 270°, 如果否, 则机器人调整行走角度, 离开所述障碍物, 继续 行走, 再次检测到障碍物时, 则返回步骤 S11, 如果是, 则进入步骤 S22。
[权利要求 5] 根据权利要求 3所述的方法, 其特征在于, 在步骤 S21之后, 且在步骤
S22之前, 还包括如下步骤:
步骤 S211 : 检测机器人从所述起始位置点开始沿所述障碍物的边缘行 走的时间和角度变化量;
步骤 S212: 判断机器人从所述起始位置点开始行走的时间是否达到 1 分钟, 如果是, 则进入步骤 S213 , 如果否, 则机器人继续行走, 至机 器人行走的时间达到 1分钟, 进入步骤 S213 ;
步骤 S213: 判断机器人从所述起始位置点开始行走的角度变化量是否 达到 90°, 如果否, 则机器人调整行走角度, 离开所述障碍物, 继续 行走, 再次检测到障碍物时, 则返回步骤 S11, 如果是, 则进入步骤 S214;
步骤 S214: 机器人继续沿所述障碍物的边缘行走, 并判断机器人从所 述起始位置点开始行走的时间是否达到 2分钟, 如果是, 则进入步骤 S 215 , 如果否, 则机器人继续行走, 至机器人行走的时间达到 2分钟, 进入步骤 S215 ;
步骤 S215: 判断机器人从所述起始位置点开始行走的角度变化量是否 达到 180°, 如果否, 则机器人调整行走角度, 离开所述障碍物, 继续 行走, 再次检测到障碍物时, 则返回步骤 S11, 如果是, 则进入步骤 S216;
步骤 S216: 机器人继续沿所述障碍物的边缘行走, 并判断机器人从所 述起始位置点开始行走的时间是否达到 3分钟, 如果是, 则进入步骤 S 217 , 如果否, 则机器人继续行走, 至机器人行走的时间达到 3分钟, 进入步骤 S217 ;
步骤 S217: 判断机器人从所述起始位置点开始行走的角度变化量是否 达到 270°, 如果否, 则机器人调整行走角度, 离开所述障碍物, 继续 行走, 再次检测到障碍物时, 则返回步骤 S11, 如果是, 则进入步骤 S218;
步骤 S218: 机器人继续沿所述障碍物的边缘行走, 并判断机器人从所 述起始位置点开始行走的时间是否达到 4分钟, 如果是, 则进入步骤 S 22, 如果否, 则机器人继续行走, 至机器人行走的时间达到 4分钟, 进入步骤 S22。
[权利要求 6] 根据权利要求 3所述的方法, 其特征在于, 在步骤 S23中所述的判断机 器人回到步骤 S21中所述的起始位置点之后, 且在确定机器人沿所述 障碍物的边缘行走的路径满足确定所述障碍物是孤立物体的条件之前 , 或者在步骤 S24中所述的如果机器人回到所述起始位置点且所述角 度变化量没有达到 450°之后, 且在确定机器人沿所述障碍物的边缘行 走的路径满足确定所述障碍物是孤立物体的条件之前, 还包括如下步 骤:
判断机器人沿所述障碍物行走一周所圈定的面积是否大于 0.3平方米 , 如果是, 进入确定机器人沿所述障碍物的边缘行走的路径满足确定 所述障碍物是孤立物体的条件的步骤, 如果否, 则机器人调整行走角 度, 离开所述障碍物, 继续行走, 再次检测到障碍物时, 则返回步骤
S11。
[权利要求 7] 根据权利要求 1所述的方法, 其特征在于, 步骤 S4中所述的记录沿所 述孤立物体的边缘行走的沿边路径, 判断当前记录的沿边路径和此前 已存储的沿边路径是否相似, 包括如下步骤:
步骤 S41 : 记录当前的所述沿边路径所对应栅格单元的栅格坐标, 记 录当前的所述沿边路径所围成的区域的栅格面积, 记录当前的所述沿 边路径所围成的区域的中心栅格单元的栅格坐标; 步骤 S42: 判断当前的沿边路径所围成区域的中心栅格单元的栅格坐 标与已存储的沿边路径所围成区域的中心栅格单元的栅格坐标的坐标 差值是否大于第一预设坐标差值, 如果是, 则确定当前记录的沿边路 径和此前已存储的沿边路径不相似, 如果否, 则进入步骤 S43 ;
步骤 S43 : 判断当前的栅格面积与已存储的沿边路径所对应的区域的 栅格面积的差值是否大于预设面积差值, 如果是, 则确定当前记录的 沿边路径和此前已存储的沿边路径不相似, 如果否, 则进入步骤 S44 步骤 S44 : 基于当前的沿边路径所在的第一局部地图和已存储的沿边 路径所在的第二局部地图, 把形状和大小相同的第一局部地图和第二 局部地图进行重叠, 判断当前的沿边路径与已存储的沿边路径相互重 叠的栅格单元数量占已存储的沿边路径中的栅格单元数量的比例是否 大于预设比例值, 如果是, 则确定当前记录的沿边路径和此前已存储 的沿边路径相似, 如果否, 则进入步骤 S45 ;
步骤 S45 : 将当前的沿边路径相对于已存储的沿边路径分别向上下左 右四个方向平移 N个栅格单元的距离, 判断当前的沿边路径与已存储 的沿边路径相互重叠的栅格单元数量占已存储的沿边路径中的栅格单 元数量的比例是否大于预设比例值, 如果是, 则确定当前记录的沿边 路径和此前已存储的沿边路径相似, 如果否, 则确定当前记录的沿边 路径和此前已存储的沿边路径不相似;
其中, 所述 N为自然数, 且 1≤N≤3。
[权利要求 8] 根据权利要求 7所述的方法, 其特征在于, 步骤 S44或者步骤 S45中所 述的判断当前的沿边路径与已存储的沿边路径相互重叠的栅格单元数 量占已存储的沿边路径中的栅格单元数量的比例是否大于预设比例值 , 包括如下步骤:
基于当前的沿边路径所在的第一局部地图和已存储的沿边路径所在的 第二局部地图, 将当前的沿边路径和已存储的沿边路径所对应的栅格 单元都标示为 1, 其它的栅格单元标示为 0;
将所述第一局部地图和所述第二局部地图中相对应的栅格单元进行与 运算;
判断与运算后, 所得到的标示为 1的栅格单元的数量占已存储的沿边 路径所对应的栅格单元的数量的比例是否大于预设比例值。
[权利要求 9] 根据权利要求 7或 8所述的方法, 其特征在于, 步骤 S6中所述的将机器 人当前位于所述定位单元时检测到的栅格坐标替换为对应的参考定位 路径中的栅格单元的栅格坐标, 包括如下步骤: 步骤 S61 : 判断所述定位单元中是否存在 M个串连的栅格单元, 且机 器人当前记录的这 M个串连的栅格单元的栅格坐标与所述参考定位路 径中相应的栅格单元的栅格坐标的差值小于第二预设坐标差值, 如果 是, 则进入步骤 S62, 如果否, 则返回步骤 S3 ;
步骤 S62: 机器人行走至所述 M个串连的栅格单元中的任意一个栅格 单元, 并进入步骤 S63 ;
步骤 S63: 将当前检测到的栅格坐标替换为对应的参考定位路径中的 栅格单元的栅格坐标;
其中, 所述 M为自然数, 且 2≤M≤3。
[权利要求 10] 根据权利要求 9所述的方法, 其特征在于, 步骤 S62中所述的机器人行 走至所述 M个串连的栅格单元中的任意一个栅格单元, 包括如下步骤 步骤 S621 : 判断所述 M个串连的栅格单元是否只有一组, 如果是, 则 直接行走至所述 M个串连的栅格单元中的任意一个栅格单元, 并进入 步骤 S63, 如果否, 则进入步骤 S622;
步骤 S622: 确定记录时间最早的一组所述 M个串连的栅格单元, 机器 人行走至其中记录时间最早的一个栅格单元, 并进入步骤 S63。
[权利要求 11] 根据权利要求 1所述的方法, 其特征在于, 在步骤 S6之后, 还包括如 下步骤:
步骤 S71 : 确定步骤 S6中所述的机器人当前位于所述定位单元时检测到的栅格坐标为 (X1, Y1) , 并进入步骤 S72;
步骤 S72: 确定步骤 S6中所述的对应的参考定位路径中的栅格单元的 栅格坐标为 (X2, Y2) , 并进入步骤 S73 ;
步骤 S73: 确定栅格单元的边长为 L, 并进入步骤 S74;
步骤 S74: 将机器人在当前位置点检测到的坐标值 (x1, y1) 替换为 (x2, y2) , x2=x1- (X1-X2) *L, y2=y1- (Y1-Y2) *L。
[权利要求 12] 根据权利要求 1所述的方法, 其特征在于, 在步骤 S6之后, 还包括如 下步骤:
步骤 S71 : 确定步骤 S6中所述的机器人当前位于所述定位单元中心点时检测到的坐标值为 (x1, y1) ;
步骤 S72: 确定步骤 S6中所述的对应的参考定位路径中的栅格单元的 中心点的坐标值为 (x2, y2) ;
步骤 S73: 将机器人当前检测到的坐标值 (x1, y1) 替换为 (x2, y2) 。
PCT/CN2018/120198 2018-03-19 2019-01-10 在此处键入发明名称 WO2019179176A1 (zh)

Priority Applications (4)

Application Number Priority Date Filing Date Title
EP18910525.7A EP3770711A4 (en) 2018-03-19 2019-01-10 METHOD OF REPOSITIONING A ROBOT
US16/982,068 US11537142B2 (en) 2018-03-19 2019-01-10 Method for robot repositioning
JP2020550731A JP7085296B2 (ja) 2018-03-19 2019-01-10 ロボットの再測位方法
KR1020207028134A KR102333984B1 (ko) 2018-03-19 2019-01-10 로봇의 리포지셔닝 방법

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201810226498.1 2018-03-19
CN201810226498.1A CN108508891B (zh) 2018-03-19 2018-03-19 一种机器人重定位的方法

Publications (2)

Publication Number Publication Date
WO2019179176A1 true WO2019179176A1 (zh) 2019-09-26
WO2019179176A8 WO2019179176A8 (zh) 2020-09-17

Family

ID=63377748

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2018/120198 WO2019179176A1 (zh) 2018-03-19 2019-01-10 在此处键入发明名称

Country Status (6)

Country Link
US (1) US11537142B2 (zh)
EP (1) EP3770711A4 (zh)
JP (1) JP7085296B2 (zh)
KR (1) KR102333984B1 (zh)
CN (1) CN108508891B (zh)
WO (1) WO2019179176A1 (zh)

Families Citing this family (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108508891B (zh) * 2018-03-19 2019-08-09 珠海市一微半导体有限公司 一种机器人重定位的方法
CN109528095B (zh) * 2018-12-28 2020-11-17 深圳市愚公科技有限公司 扫地记录图的校准方法、扫地机器人及移动终端
CN109965797B (zh) * 2019-03-07 2021-08-24 深圳市愚公科技有限公司 扫地机器人地图的生成方法、扫地机器人控制方法及终端
CN109916393B (zh) * 2019-03-29 2023-03-31 电子科技大学 一种基于机器人位姿的多重栅格值导航方法及其应用
CN111837587B (zh) * 2019-04-29 2024-04-19 苏州科瓴精密机械科技有限公司 自动割草机及其控制方法
CN111941418B (zh) * 2019-05-15 2024-03-08 苏州科瓴精密机械科技有限公司 自移动机器人的控制方法及自移动机器人系统
CN113343739B (zh) * 2020-03-02 2022-07-22 杭州萤石软件有限公司 可移动设备的重定位方法和可移动设备
CN111407188A (zh) * 2020-03-27 2020-07-14 深圳市银星智能科技股份有限公司 移动机器人重定位方法、装置及移动机器人
CN111938513B (zh) * 2020-06-30 2021-11-09 珠海市一微半导体有限公司 一种机器人越障的沿边路径选择方法、芯片及机器人
CN111897336A (zh) * 2020-08-02 2020-11-06 珠海市一微半导体有限公司 一种机器人沿边行为结束的判断方法、芯片及机器人
US11945469B2 (en) * 2020-11-25 2024-04-02 Zoox, Inc. Object uncertainty models
CN112558616B (zh) * 2020-12-28 2023-11-21 南京苏美达智能技术有限公司 一种智能自行走设备及控制方法
CN113344263B (zh) * 2021-05-28 2022-12-27 深圳市无限动力发展有限公司 沿边行走过程的轨迹闭合判断方法、装置及计算机设备


Family Cites Families (23)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB0126497D0 (en) 2001-11-03 2002-01-02 Dyson Ltd An autonomous machine
JP2004275468A (ja) * 2003-03-17 2004-10-07 Hitachi Home & Life Solutions Inc 自走式掃除機およびその運転方法
JP4533659B2 (ja) * 2004-05-12 2010-09-01 株式会社日立製作所 レーザー計測により地図画像を生成する装置及び方法
WO2007051972A1 (en) * 2005-10-31 2007-05-10 Qinetiq Limited Navigation system
JP2007323402A (ja) * 2006-06-01 2007-12-13 Matsushita Electric Ind Co Ltd 自走式機器およびそのプログラム
KR100791386B1 (ko) * 2006-08-18 2008-01-07 삼성전자주식회사 이동 로봇의 영역 분리 방법 및 장치
JP5086942B2 (ja) * 2008-09-02 2012-11-28 トヨタ自動車株式会社 経路探索装置、経路探索方法、及び経路探索プログラム
JP6162955B2 (ja) 2009-11-06 2017-07-12 アイロボット コーポレイション 自律ロボットにより表面を完全にカバーする方法およびシステム
CN102138769B (zh) * 2010-01-28 2014-12-24 深圳先进技术研究院 清洁机器人及其清扫方法
JP2012194860A (ja) * 2011-03-17 2012-10-11 Murata Mach Ltd 走行車
US8798840B2 (en) * 2011-09-30 2014-08-05 Irobot Corporation Adaptive mapping with spatial summaries of sensor data
TWI505801B (zh) * 2014-05-09 2015-11-01 Kinpo Elect Inc 室內機器人與其定位方法
KR102527645B1 (ko) * 2014-08-20 2023-05-03 삼성전자주식회사 청소 로봇 및 그 제어 방법
GB201419883D0 (en) * 2014-11-07 2014-12-24 F Robotics Acquisitions Ltd Domestic robotic system and method
FR3034410B1 (fr) * 2015-04-02 2020-10-23 Gebo Packaging Solutions France Dispositif de convoyage par chariot autonome
CN104731101B (zh) * 2015-04-10 2017-08-04 河海大学常州校区 清洁机器人室内场景地图建模方法及机器人
CN106610665A (zh) * 2015-10-22 2017-05-03 沈阳新松机器人自动化股份有限公司 一种基于gps的自主行进机器人
JP2017102705A (ja) * 2015-12-02 2017-06-08 株式会社リコー 自律移動装置及び自律移動装置システム
CN107041718B (zh) * 2016-02-05 2021-06-01 北京小米移动软件有限公司 清洁机器人及其控制方法
JP2018022215A (ja) * 2016-08-01 2018-02-08 村田機械株式会社 移動教示装置、及び、移動教示方法
KR101930870B1 (ko) * 2016-08-03 2018-12-20 엘지전자 주식회사 이동 로봇 및 그 제어방법
KR20180023302A (ko) * 2016-08-25 2018-03-07 엘지전자 주식회사 이동 로봇 및 그 제어방법
CN107368071B (zh) * 2017-07-17 2020-11-06 纳恩博(北京)科技有限公司 一种异常恢复方法及电子设备

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107037806A (zh) * 2016-02-04 2017-08-11 科沃斯机器人股份有限公司 自移动机器人重新定位方法及采用该方法的自移动机器人
US20170225891A1 (en) * 2016-02-05 2017-08-10 inVia Robotics, LLC Robotic Navigation and Mapping
CN106066179A (zh) * 2016-07-27 2016-11-02 湖南晖龙股份有限公司 一种基于ros操作系统的机器人位置丢失找回方法和控制系统
CN106092104A (zh) * 2016-08-26 2016-11-09 深圳微服机器人科技有限公司 一种室内机器人的重定位方法及装置
CN106323273A (zh) * 2016-08-26 2017-01-11 深圳微服机器人科技有限公司 一种机器人重定位方法及装置
CN108508891A (zh) * 2018-03-19 2018-09-07 珠海市微半导体有限公司 一种机器人重定位的方法

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP3770711A4 *

Also Published As

Publication number Publication date
US11537142B2 (en) 2022-12-27
CN108508891A (zh) 2018-09-07
KR20200127019A (ko) 2020-11-09
CN108508891B (zh) 2019-08-09
EP3770711A4 (en) 2021-09-29
US20210096580A1 (en) 2021-04-01
EP3770711A1 (en) 2021-01-27
WO2019179176A8 (zh) 2020-09-17
JP2021516403A (ja) 2021-07-01
KR102333984B1 (ko) 2021-12-02
JP7085296B2 (ja) 2022-06-16

Similar Documents

Publication Publication Date Title
WO2019179176A1 (zh) 在此处键入发明名称
EP3764186B1 (en) Method for controlling autonomous mobile robot to travel along edge
CN107943025B (zh) 机器人脱困的处理方法
JP6915209B2 (ja) 移動ロボットの地図作成方法および当該地図に基づく経路計画方法
CN107544517B (zh) 智能清洁机器人的控制方法
CN108628324B (zh) 基于矢量地图的无人车导航方法、装置、设备及存储介质
EP3018603B1 (en) Adaptive mapping with spatial summaries of sensor data
CN106527423B (zh) 清洁机器人及其控制方法
TWI742554B (zh) 定位方法、路徑確定方法、機器人及儲存介質
CN109407675B (zh) 机器人回座的避障方法和芯片以及自主移动机器人
CN107368079A (zh) 机器人清扫路径的规划方法及芯片
CN104536445A (zh) 移动导航方法和系统
CN108415432A (zh) 机器人基于直边的定位方法
US20110125358A1 (en) Control method for a robot vehicle, and robot vehicle
JP2007213236A (ja) 自律走行ロボットの経路計画方法及び自律走行ロボット
CN107678429B (zh) 一种机器人的控制方法及芯片
JP2020134528A (ja) 定常横方向偏差の除去方法、装置、記憶媒体、及びプログラム
CN108873889A (zh) 智能移动设备及其路径控制方法、计算机可读存储介质
CN113475977B (zh) 机器人路径规划方法、装置及机器人
CN113494917A (zh) 地图构建方法及系统、制定导航策略的方法以及存储介质
JP5569099B2 (ja) リンク情報生成装置及びリンク情報生成プログラム
KR102521940B1 (ko) 로봇 청소기 및 그 로봇 청소기의 제어 방법
CN112346446A (zh) 自动导引运输车脱码恢复方法、装置及电子设备
CN115041483A (zh) 一种清扫机器人及其控制方法
CN115981323A (zh) 一种多传感融合智能清洁车自动避障方法及智能清洁车

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 18910525; Country of ref document: EP; Kind code of ref document: A1)
ENP Entry into the national phase (Ref document number: 2020550731; Country of ref document: JP; Kind code of ref document: A)
NENP Non-entry into the national phase (Ref country code: DE)
ENP Entry into the national phase (Ref document number: 20207028134; Country of ref document: KR; Kind code of ref document: A)
ENP Entry into the national phase (Ref document number: 2018910525; Country of ref document: EP; Effective date: 20201019)