WO2019179176A1 - [Enter invention title here] - Google Patents
- Publication number
- WO2019179176A1 (PCT/CN2018/120198, priority CN2018120198W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- robot
- edge
- obstacle
- path
- grid
- Prior art date
Classifications
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0268—Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means
- G05D1/0274—Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means using mapping information stored in a memory device
- G01S17/931—Lidar systems specially adapted for anti-collision purposes of land vehicles (under G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems)
- G05D1/0214—Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory in accordance with safety or protection criteria, e.g. avoiding hazardous areas
- G05D1/0238—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using obstacle or wall sensors
- G05D1/0251—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means extracting 3D information from a plurality of images taken from different locations, e.g. stereo vision
- G05D1/027—Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means comprising inertial navigation means, e.g. azimuth detector
Definitions
- The invention relates to the field of robots, and in particular to a method for repositioning a robot.

Background Art
- A sweeping robot may produce walking errors due to defects in the gyroscope or code wheel themselves, or due to the ground slipping, and these errors gradually accumulate.
- Over a long period, the accumulated error may cause the map constructed by the robot to deviate from the actual environment.
- The invention provides a method for repositioning a robot which, without requiring an expensive device such as a camera or laser radar, can re-determine the position of the robot, avoiding the inaccurate positioning caused by excessive accumulation of the robot's walking errors and improving the accuracy of robot positioning.
- The specific technical solution of the present invention is as follows:
- A method for repositioning a robot comprises the following steps. Step S1: the robot detects an obstacle and proceeds to step S2. Step S2: the robot walks along the edge of the obstacle and determines whether the path along the edge satisfies the condition for judging the obstacle to be an isolated object; if not, proceed to step S3; if yes, proceed to step S4. Step S3: adjust the walking angle, leave the obstacle, continue walking, and when an obstacle is detected again, return to step S2. Step S4: determine that the obstacle is an isolated object, record the edge path along the edge of the isolated object, and determine whether the currently recorded edge path is similar to a previously stored edge path; if not, proceed to step S5; if yes, proceed to step S6.
- Step S5: take the recorded edge path along the edge of the isolated object as a stored edge path, and return to step S3.
- Step S6: take the previously stored edge path that is similar to the currently recorded edge path as the reference positioning path, and determine the first partial map where the current edge path is located and the second partial map where the reference positioning path is located, the two partial maps having the same shape and size.
- The two partial maps are overlapped, and each grid cell where the current edge path in the first partial map overlaps the reference positioning path in the second partial map is taken as a positioning unit; when the robot is located in a positioning unit, the grid coordinates it currently detects are replaced by the grid coordinates of the corresponding grid cell in the reference positioning path.
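The branching among steps S2 through S6 can be sketched as a small decision function. Python is used for illustration only; the state labels and parameter names below are hypothetical and not from the patent.

```python
def next_step(is_isolated, similar_stored_path):
    """Decide the step following an edge-following pass (steps S2-S6).

    is_isolated: whether the edge path satisfied the isolated-object
    condition; similar_stored_path: a previously stored edge path judged
    similar to the current one, or None. All names are illustrative.
    """
    if not is_isolated:
        return "S3: leave obstacle, keep walking"     # re-enter S2 on next obstacle
    if similar_stored_path is None:
        return "S5: store edge path, then S3"         # first encounter, no reference
    return "S6: relocalize against stored path"       # known isolated object
```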
- Step S11: determine the grid coordinates corresponding to the position at which the robot detects the obstacle.
- Step S12: determine the edge paths the robot has stored within a preset time counted back from the current time.
- Step S13: determine whether the grid coordinates determined in step S11 are the same as or adjacent to the grid coordinates corresponding to the edge paths determined in step S12; if yes, proceed to step S14; if no, proceed to step S2.
- Step S14: adjust the walking angle, leave the obstacle, and continue walking; when an obstacle is detected again, return to step S11.
- "Adjacent" in step S13 means that the grid cells corresponding to the two grid coordinates share a common edge or a common corner.
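Under that definition, two cells are the same or adjacent exactly when each grid coordinate differs by at most one cell. A minimal sketch (Python for illustration; the function name is not from the patent):

```python
def same_or_adjacent(cell_a, cell_b):
    # Cells are the same or adjacent (sharing a common edge or a common
    # corner) exactly when both coordinate differences are at most one.
    return abs(cell_a[0] - cell_b[0]) <= 1 and abs(cell_a[1] - cell_b[1]) <= 1
```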
- Step S2, walking along the edge of the obstacle and determining whether the path along the edge satisfies the condition for judging the obstacle to be an isolated object, includes the following steps:
- Step S21: walk along the edge of the obstacle and record the start information of the starting position point.
- Step S22: determine whether the angle change detected by the robot since the starting position point has reached 360°; if yes, proceed to step S23; otherwise, continue walking along the edge of the obstacle until the angle change detected since the starting position point reaches 360°, then proceed to step S23.
- Step S23: determine whether the robot has returned to the starting position point described in step S21; if yes, determine that the path the robot walks along the edge of the obstacle satisfies the condition that the obstacle is an isolated object; otherwise, proceed to step S24.
- Step S24: continue walking along the edge of the obstacle while determining whether the robot returns to the starting position point described in step S21. If the robot returns to the starting position point before the angle change exceeds 450°, the path satisfies the condition that the obstacle is an isolated object; if the robot returns to the starting position point only after the angle change has exceeded 450°, or does not return while the angle change exceeds 450°, it is determined that the path the robot walks along the edge of the obstacle does not satisfy the condition that the obstacle is an isolated object.
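The 360°/450° decision of steps S23-S24 can be sketched as follows. This is an illustrative reading of the text, not a definitive implementation; in particular, treating exactly 450° as still acceptable is an assumption.

```python
def edge_path_verdict(returned_to_start, angle_change_deg):
    """Verdict once at least 360 degrees of turning has accumulated.

    Returns True when the path satisfies the isolated-object condition,
    False when it cannot, and None while edge-following should continue.
    """
    if returned_to_start and angle_change_deg <= 450:
        return True      # closed the loop within one-and-a-quarter turns
    if angle_change_deg > 450:
        return False     # too much turning: not an isolated object
    return None          # keep walking along the edge and re-check
```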
- After step S21, the method further includes the following steps. Step S211: detect the distance and the angle change of the robot walking along the edge of the obstacle from the starting position point. Step S212: determine whether the distance the robot has walked from the starting position point reaches 1.5 meters; if yes, proceed to step S213; if no, the robot continues to walk until the distance it has walked reaches 1.5 meters, then proceeds to step S213.
- Step S213: determine whether the angle change of the robot since the starting position point has reached 90°; if not, the robot adjusts the walking angle, leaves the obstacle, and continues walking, and when an obstacle is detected again, returns to step S11; if yes, proceed to step S214.
- Step S214: the robot continues along the edge of the obstacle, and it is determined whether the distance walked from the starting position point reaches 3 meters; if yes, proceed to step S215; if not, the robot continues to walk until the distance reaches 3 meters, then proceeds to step S215.
- Step S215: determine whether the angle change of the robot since the starting position point has reached 180°; if not, the robot adjusts the walking angle, leaves the obstacle, and continues walking, and when an obstacle is detected again, returns to step S11; if yes, proceed to step S216. Step S216: the robot continues along the edge of the obstacle, and it is determined whether the distance walked from the starting position point reaches 4.5 meters; if yes, proceed to step S217; if not, the robot continues to walk until the distance reaches 4.5 meters, then proceeds to step S217.
- Step S217: determine whether the angle change of the robot since the starting position point has reached 270°; if not, the robot adjusts the walking angle, leaves the obstacle, and continues walking, and when an obstacle is detected again, the process returns to step S11; if yes, proceed to step S22.
- Step S211: detect the time and the angle change of the robot walking along the edge of the obstacle from the starting position point.
- Step S212: determine whether the time the robot has walked from the starting position point reaches 1 minute; if yes, proceed to step S213; if not, the robot continues to walk until its walking time reaches 1 minute, then proceeds to step S213.
- Step S213: determine whether the angle change of the robot since the starting position point has reached 90°; if not, the robot adjusts the walking angle, leaves the obstacle, and continues walking, and when an obstacle is detected again, returns to step S11; if yes, proceed to step S214.
- Step S214: the robot continues to walk along the edge of the obstacle, and it is determined whether its walking time from the starting position point reaches 2 minutes; if yes, proceed to step S215; if no, the robot continues to walk until the time reaches 2 minutes, then proceeds to step S215.
- Step S217: determine whether the angle change of the robot since the starting position point has reached 270°; if not, the robot adjusts the walking angle, leaves the obstacle, and continues walking, and when an obstacle is detected again, the process returns to step S11; if yes, proceed to step S218. Step S218: the robot continues to walk along the edge of the obstacle, and it is determined whether its walking time from the starting position point reaches 4 minutes; if yes, proceed to step S22; if not, the robot continues to walk until the time reaches 4 minutes, then proceeds to step S22.
- The method further includes the following step: determine whether the area enclosed by the robot walking around the obstacle is greater than 0.3 square meters; if yes, enter the step of determining whether the path the robot walks along the edge of the obstacle satisfies the condition that the obstacle is an isolated object; if not, the robot adjusts the walking angle, leaves the obstacle, and continues walking, and when an obstacle is detected again, returns to step S11.
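The enclosed area can be computed from the recorded edge-path vertices with the shoelace formula; a sketch, assuming vertex coordinates in metres (Python for illustration; the text does not specify how the area is computed):

```python
def enclosed_area_m2(vertices):
    # Shoelace formula over the closed edge path; loops whose area does
    # not exceed 0.3 square metres would be rejected by the check above.
    area = 0.0
    n = len(vertices)
    for i in range(n):
        x1, y1 = vertices[i]
        x2, y2 = vertices[(i + 1) % n]   # wrap around to close the loop
        area += x1 * y2 - x2 * y1
    return abs(area) / 2.0
```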
- Step S41: record the grid coordinates of the grid cells corresponding to the current edge path, record the grid area of the region enclosed by the current edge path, and record the grid coordinates of the central grid cell of the region enclosed by the current edge path.
- Step S42: determine whether the coordinate difference between the grid coordinates of the central grid cell of the region enclosed by the current edge path and those of the central grid cell of the region enclosed by the stored edge path is greater than a first preset coordinate difference; if yes, determine that the currently recorded edge path is not similar to the previously stored edge path; if not, proceed to step S43.
- Step S43: determine whether the difference between the current grid area and the grid area of the region corresponding to the stored edge path is greater than a preset area difference; if yes, determine that the currently recorded edge path is not similar to the previously stored edge path; if not, proceed to step S44.
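The two screening tests of steps S42 and S43 can be sketched together. The thresholds below are illustrative placeholders, since the text leaves the first preset coordinate difference and the preset area difference to the designer:

```python
def passes_center_and_area_screen(center_a, center_b, area_a, area_b,
                                  max_center_diff=2, max_area_diff=4):
    # Step S42: the centre grid cells must not differ by more than the
    # first preset coordinate difference in either axis.
    if abs(center_a[0] - center_b[0]) > max_center_diff:
        return False
    if abs(center_a[1] - center_b[1]) > max_center_diff:
        return False
    # Step S43: the enclosed grid areas must not differ by more than the
    # preset area difference. Passing both screens only admits the pair
    # to the overlap test of steps S44/S45; it is not yet similarity.
    return abs(area_a - area_b) <= max_area_diff
```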
- Step S45: shift the current edge path relative to the stored edge path by the distance of N grid cells in each of the up, down, left, and right directions, and determine the proportion of the stored edge path occupied by the grid cells where the current edge path and the stored edge path overlap, where N is a natural number and 1 ≤ N ≤ 3.
- Step S44 or step S45 includes the following steps: in the first partial map where the current edge path is located and the second partial map where the stored edge path is located, mark the grid cells corresponding to the current edge path and to the stored edge path as 1 and mark the other grid cells as 0; perform an AND operation on the corresponding grid cells of the first partial map and the second partial map; after the AND operation, the grid cells whose result is 1 are obtained.
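Marking the path cells as 1 and ANDing the two partial maps can be sketched as follows (Python for illustration; both maps are assumed to be equal-sized binary grids already delineated as described):

```python
def overlap_cells(first_map, second_map):
    # Cells whose AND is 1 are where the current edge path coincides with
    # the stored edge path; these become the candidate positioning units.
    return [(r, c)
            for r, row in enumerate(first_map)
            for c, v in enumerate(row)
            if v and second_map[r][c]]
```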
- Replacing the grid coordinates detected when the robot is located in a positioning unit with the grid coordinates of the corresponding grid cell in the reference positioning path includes the following steps. Step S61: determine whether there are M consecutively connected grid cells among the positioning units such that the difference between the grid coordinates of the M consecutive grid cells currently recorded by the robot and the grid coordinates of the corresponding grid cells in the reference positioning path is less than a second preset coordinate difference; if yes, proceed to step S62; if not, return to step S3. Step S62: the robot walks to any one of the M consecutively connected grid cells and proceeds to step S63. Step S63: replace the currently detected grid coordinates with the grid coordinates of the corresponding grid cell in the reference positioning path.
- M is a natural number and 2 ≤ M ≤ 3.
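The search of step S61 for M consecutively recorded positioning cells that stay close to the reference path can be sketched as follows. This is illustrative only: the second preset coordinate difference (`max_diff`) is a placeholder value, and processing the cells in recording order is an assumption.

```python
def find_reliable_run(recorded, reference, m=2, max_diff=1):
    """Return the first run of m consecutive cells whose recorded grid
    coordinates each stay within max_diff of the corresponding reference
    coordinates; None means no usable run (the flow returns to step S3)."""
    run = []
    for cur, ref in zip(recorded, reference):
        close = (abs(cur[0] - ref[0]) <= max_diff
                 and abs(cur[1] - ref[1]) <= max_diff)
        run = run + [cur] if close else []   # reset the run on any outlier
        if len(run) == m:
            return run
    return None
```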
- Step S62, walking to any one of the M consecutively connected grid cells, includes the following steps. Step S621: determine whether there is only one group of M consecutively connected grid cells; if so, walk directly to any one grid cell of that group and proceed to step S63; if not, proceed to step S622. Step S622: determine the group of M consecutively connected grid cells with the earliest recording time; the robot walks to a grid cell of that earliest-recorded group and proceeds to step S63.
- After step S6, the method further includes the following steps. Step S71: determine that the coordinate value detected when the robot in step S6 is located at the center point of the positioning unit is (x1, y1). Step S72: determine that the coordinate value of the center point of the grid cell in the corresponding reference positioning path described in step S6 is (x2, y2). Step S73: replace the coordinate value (x1, y1) currently detected by the robot with (x2, y2).
- The beneficial effects of the present invention include: by using the path the robot walks along the edge of an isolated object as a reference, the position deviation caused by excessive accumulation of walking errors can be corrected to achieve repositioning, thereby improving the positioning accuracy and walking efficiency of the robot during subsequent navigation.
- FIG. 1 is a schematic flow chart of a method for repositioning a robot according to the present invention.
- FIG. 2 is a first schematic diagram of the first partial map and the second partial map after overlapping.
- FIG. 3 is a second schematic diagram of the first partial map and the second partial map after overlapping.
- The robot according to the present invention is a smart cleaning robot (such as a sweeping robot or a mopping robot), and the robots mentioned in the following embodiments all refer to smart cleaning robots, which can walk autonomously in certain settings by virtue of built-in artificial intelligence.
- The body of the robot is equipped with various sensors that detect the walking distance, the walking angle (i.e., the direction of travel), the body state, obstacles, and so on.
- The robot of the present invention comprises the following structure: a robot body capable of autonomous walking with a left driving wheel and a right driving wheel, a human-computer interaction interface arranged on the body, and an obstacle detecting unit arranged on the body.
- An inertial sensor is disposed inside the body; the inertial sensor includes an accelerometer and a gyroscope.
- The two driving wheels are provided with an odometer (generally a code wheel) for detecting their walking distance, and the robot further includes a control module capable of processing the parameters of the related sensors and outputting control signals to the actuators.
- In the method for repositioning a robot shown in FIG. 1, in step S1 the robot performs walking-data detection while walking, and proceeds to step S2 when the collision sensor or infrared sensor at the front end of the robot detects an obstacle.
- After proceeding to step S2, the robot walks along the edge of the obstacle and determines whether the path along the edge satisfies the condition for judging the obstacle to be an isolated object.
- The specific condition can be set according to different design requirements: for example, it can be judged from the relationship between the point where the robot starts walking along the edge of the obstacle and the point where it stops; from the amount of rotation within a certain period of time; from the relationship with grid positions; or by combining these factors for a comprehensive judgment.
- If the condition is not satisfied, the process proceeds to step S3: the robot adjusts its walking angle, walks away from the obstacle, and then continues to walk according to the path or manner planned by the system.
- When the collision sensor or infrared sensor at the front end of the robot detects an obstacle again, the process returns to step S2 to continue determining whether the obstacle is an isolated object.
- In step S2, when the robot determines that the path along the edge of the obstacle satisfies the condition that the obstacle is an isolated object, the process proceeds to step S4: the robot determines that the obstacle is an isolated object and records the edge path along the edge of the isolated object.
- The recorded information can be set according to specific design requirements, for example recording the grid coordinates corresponding to the edge path, recording the angle detected by the gyroscope while the robot walks on the grid cells corresponding to the edge path, recording the time at the start of the edge path and the time at its end, and so on.
- The robot then judges whether the currently recorded edge path is similar to a previously stored edge path; the manner of judging can also be set according to different design requirements. For example, the grid coordinates of the grid cells corresponding to the current edge path can be compared with those of the grid cells corresponding to a previously stored edge path, and if the coordinates largely coincide, the two edge paths can be considered similar. Alternatively, the overall arrangement orientation of the grid cells corresponding to the current edge path can be compared with that of the grid cells corresponding to a previously stored edge path, and if only a few points differ in orientation, the two edge paths are considered similar. It can also be judged from the angle change rate per unit time while walking along the edge, the length of the edge path, and the total time of walking along the edge: if these parameters are the same, the two edge paths can be considered similar. Of course, these factors can also be combined for a comprehensive judgment.
- If the currently recorded edge path is not similar to any previously stored edge path, the isolated object currently detected by the robot has not been encountered before, and no recorded data of walking along its edge exists to serve as a repositioning reference; the robot therefore cannot be repositioned by this isolated object. The process proceeds to step S5, where the recorded edge path along the edge of the isolated object is taken as a stored edge path, so that the robot can be repositioned when it encounters this isolated object again; the process then returns to step S3. If the currently recorded edge path is judged similar to a previously stored edge path, the isolated object currently detected by the robot has been encountered before, and the recorded data of walking along its edge has been stored, so the robot can be repositioned by this isolated object; the process therefore proceeds to step S6.
- In step S6, the previously stored edge path along the edge of the isolated object is used as the reference positioning path, and then the first partial map where the current edge path is located and the second partial map where the reference positioning path is located are determined: a partial map containing the current edge path is delineated in the raster map as the first partial map, and a partial map containing the reference positioning path is delineated as the second partial map. The size and shape of the first partial map and the second partial map are the same, but the specific size and shape can be set according to design requirements, for example a square or a rectangle centered on the path, with the maximum grid coordinates of the partial map set four grid cells larger than the maximum grid coordinates in the edge path and the minimum grid coordinates set four grid cells smaller than the minimum grid coordinates in the edge path.
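The delineation of a rectangular partial map around an edge path, using the four-cell margin from the example above, can be sketched as follows (Python for illustration; the function and its return convention are not from the patent):

```python
def partial_map_bounds(path_cells, margin=4):
    # Bounding box of the edge path's grid cells, expanded by `margin`
    # grid cells on every side, giving the extent (min_x, min_y, max_x,
    # max_y) of the first or second partial map.
    xs = [x for x, _ in path_cells]
    ys = [y for _, y in path_cells]
    return (min(xs) - margin, min(ys) - margin,
            max(xs) + margin, max(ys) + margin)
```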
- The first partial map and the second partial map, having the same shape and size, are overlapped. Since the current edge path and the reference positioning path are similar, the difference between them lies mainly in the grid coordinates, while the overall shapes of the paths are almost the same, with only a few differences. Therefore, by overlapping the first partial map and the second partial map, the portion where the current edge path and the reference positioning path overlap can be obtained.
- The overlapping portion indicates that the robot produced no error when walking along that edge of the isolated object, and repositioning can rely on the corresponding grid coordinates; the parts or points that do not overlap indicate that the robot produced a walking error at the corresponding edge segment or edge point, which is not suitable as a repositioning reference.
- FIG. 2 is a schematic diagram of the first partial map and the second partial map after overlapping. Each small square in the figure represents a grid cell; the connected path marked with the letter P is the current edge path, the path marked with the letter B is the reference positioning path, and a square marked with both the letter P and the letter B indicates a portion where the current edge path overlaps the reference positioning path.
- Before step S2, the method further includes the following steps. In step S11, the grid cell in which the robot detects the obstacle, and the grid coordinates corresponding to that grid cell, are first determined from the data detected by the robot's gyroscope and odometer. The process then proceeds to step S12, in which the edge paths the robot has stored within a preset time counted back from the current time are determined.
- The preset time may be set according to specific design requirements.
- The time may be set to any value from 10 to 30 minutes; this embodiment sets it to 20 minutes, i.e. the robot searches the data in its memory for the edge paths stored within the 20 minutes before the current time.
- In step S13, the robot compares the grid coordinates determined in step S11 with the grid coordinates corresponding to the edge paths determined in step S12. If the two grid coordinates are found to be the same or adjacent, the current obstacle is considered to have been encountered within the last 20 minutes, with the robot having already walked along its edge. If the robot frequently walked around the edge of the same obstacle within a short period, its walking efficiency would be reduced; at the same time, the error accumulated by the robot within a short interval is not large, so repositioning is unnecessary, and frequent repositioning would likewise reduce the robot's walking efficiency.
- Therefore, if it is determined in step S13 that the grid coordinates determined in step S11 are the same as or adjacent to the grid coordinates corresponding to the edge paths determined in step S12, the process proceeds to step S14: the robot adjusts its walking angle, walks away from the obstacle, and then continues to travel according to the path or manner planned by the system.
- When the collision sensor or infrared sensor at the front end of the robot detects an obstacle again, the process returns to step S11 to restart the determination of whether the newly detected obstacle meets the relevant requirements.
- Otherwise, the process proceeds to step S2 to determine whether the obstacle meets the condition that enables the robot to reposition.
- "Adjacent" in step S13 means that the grid cells corresponding to the two grid coordinates share a common edge or a common corner. The method described in this embodiment determines, from the obstacles detected by the robot within a preset time, whether the robot needs to walk along the edge of an obstacle, thereby improving the robot's walking efficiency and avoiding blind, repetitive action.
- Step S2, walking along the edge of the obstacle and determining whether the path along the edge satisfies the condition that the obstacle is an isolated object, includes the following steps:
- In step S21, the robot walks along the edge of the obstacle and, based on the data detected by the gyroscope and the odometer, records the start information of the starting position point at which it begins walking along the edge. The start information may include the point coordinate value of the starting position point, the grid coordinates of the grid cell corresponding to the starting position point, the walking direction, and the time at which walking began. Recording the start information provides reference data for the subsequent determination of whether the obstacle is an isolated object, and may also provide a navigation basis for subsequently finding the obstacle again.
- In step S22, based on the data detected by the gyroscope, it is determined whether the angle through which the body has rotated since the robot began walking along the edge of the obstacle from the starting position point has reached 360°, thereby preliminarily judging whether the robot has completed a full lap. If it is judged to have completed a lap, the process proceeds directly to step S23 for further confirmation; if not, the robot continues along the edge of the obstacle until the angle change detected since the starting position reaches 360°, then proceeds to step S23 for further confirmation.
- in step S23 the robot first determines whether it has returned to the starting position described in step S21; this can be done by checking whether the point coordinate values are the same. If the point coordinates of the current position equal those of the starting position, the robot is considered to have returned to the starting position, so it can be determined that the robot has walked a full circle along the edge of the obstacle. The edge path then satisfies the condition that the obstacle is an isolated object, and the process can proceed to step S4 for the next operation.
- in step S24 the robot continues walking along the edge of the obstacle while determining whether it has returned to the starting position described in step S21 and whether the angle change detected since the starting position has reached 450°. If the robot returns to the starting position before the angle change exceeds 450°, the obstacle is determined to be an isolated object, that is, the condition that the obstacle is an isolated object is satisfied. If the robot returns to the starting position only after the angle change has exceeded 450°, or does not return to the starting position while the angle change exceeds 450°, the obstacle is not an isolated object, and it is determined that the robot's path along the edge of the obstacle does not satisfy the isolated-object condition.
- by combining the robot's coordinate values with its rotation angle, the method of this embodiment accurately determines whether the robot has walked completely around the obstacle, and hence whether the robot's path along the edge of the obstacle satisfies the condition that the obstacle is an isolated object, providing effective reference data for the subsequent repositioning of the robot.
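- The combined 360°/450° and return-to-start test of steps S22 to S24 can be sketched as a single decision function. The function name and tri-state return value are illustrative assumptions, not the patent's notation:

```python
def loop_closed(angle_change_deg, returned_to_start):
    """Decide whether the edge path encircles an isolated object.

    Implements the 360°/450° rule of steps S22 to S24: the path counts as a
    closed loop around an isolated object only if the robot is back at its
    starting position and the accumulated heading change is in [360°, 450°].
    Returns True (isolated object), False (not isolated), or None (keep
    edge-following; no decision yet).
    """
    if angle_change_deg < 360:
        return None            # not yet a full turn: keep edge-following
    if returned_to_start and angle_change_deg <= 450:
        return True            # isolated-object condition satisfied
    if angle_change_deg > 450:
        return False           # too much turning: not an isolated object
    return None                # full turn but not home yet: keep going
```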
- in step S211, based on the odometer and the gyroscope, the distance travelled and the accumulated angle change since the robot left the starting position along the edge of the obstacle are detected. The process then proceeds to step S212, which determines whether the distance travelled from the starting position has reached 1.5 meters; if so, the process proceeds to step S213, and if not, the robot continues walking until the distance reaches 1.5 meters and then proceeds to step S213. Step S213 determines whether the angle change since the starting position has reached 90°; if so, the process proceeds to step S214, and if not, the obstacle's footprint is too large to serve as a repositioning reference, so the robot adjusts its walking angle, leaves the obstacle, and when an obstacle is detected again the process returns to step S11.
- in step S214 the robot continues walking along the edge of the obstacle and determines whether the distance travelled from the starting position has reached 3 meters; if so, the process proceeds to step S215, and if not, the robot continues walking until the distance reaches 3 meters.
- step S215 determines whether the angle change since the starting position has reached 180°. If not, this indicates that although the extension angle of the edge line over the first 1.5 meters of the obstacle was suitable, the edge line from 1.5 to 3 meters extends through a comparatively large angle, so the obstacle's footprint is very likely also large and it is unsuitable as a reference object for repositioning. The robot therefore adjusts its walking angle and leaves the obstacle; if it continues walking and detects an obstacle again, the process returns to step S11 to restart the judgment. If 180° has been reached, the process proceeds to step S216.
- in step S216 the robot continues walking along the edge of the obstacle and determines whether the distance travelled from the starting position has reached 4.5 meters. If so, the process proceeds to step S217; if not, the robot continues walking until the distance reaches 4.5 meters, and then the process proceeds to step S217.
- step S217 determines whether the angle change since the starting position has reached 270°. If not, this indicates that the extension angle of the edge line over the first 3 meters of the obstacle was suitable, but the edge line from 3 to 4.5 meters extends through a comparatively large angle, so the obstacle's footprint is very likely also large and it is unsuitable as a reference object for repositioning. The robot therefore adjusts its walking angle and leaves the obstacle; when it continues walking and detects an obstacle again, the process returns to step S11 to restart the judgment. If the robot has walked 4.5 meters and its angle of rotation has reached 270°, it can be concluded that the obstacle's footprint is of a suitable size.
- such an obstacle, with a footprint of suitable size, is an ideal reference object for robot repositioning. In this way the size of the obstacle's footprint can be accurately determined, so that it can be effectively judged whether the obstacle is suitable as a reference object for robot repositioning, which in turn provides accurate reference data for the subsequent repositioning of the robot.
- in another embodiment, step S211 detects, based on the RTC timing module and the gyroscope in the robot, the time elapsed and the angle change since the robot began walking along the edge of the obstacle from the starting position, and then proceeds to step S212.
- step S212 determines whether the robot has been walking from the starting position for 1 minute. If so, the robot has travelled a certain distance along the edge of the obstacle and the process proceeds to step S213; if not, the robot continues walking until it has walked for 1 minute, and then the process proceeds to step S213.
- step S213 determines whether the angle change since the starting position has reached 90°. If not, the small change in the robot's trajectory while edge-following indicates that the obstacle has a large footprint and is unsuitable as a reference object for repositioning; the robot therefore adjusts its walking angle, leaves the obstacle, continues walking, and when it detects an obstacle again the process returns to step S11 to restart the judgment. If the robot has walked for 1 minute and the angle change has reached 90°, it can be preliminarily judged that the size of the obstacle is suitable, and the process proceeds to step S214 for further judgment.
- in step S214 the robot continues walking along the edge of the obstacle and determines whether it has been walking from the starting position for 2 minutes. If so, the robot has travelled a longer distance and the process proceeds to step S215; if not, the robot continues walking until it has walked for 2 minutes, and then the process proceeds to step S215.
- step S215 determines whether the angle change since the starting position has reached 180°. If not, this indicates that although the extension angle of the edge line during the robot's first minute of edge-following was suitable, the edge line during the second minute extends through a comparatively large angle, so the obstacle is unsuitable as a repositioning reference; the robot adjusts its walking angle, leaves the obstacle, and when an obstacle is detected again the process returns to step S11. If 180° has been reached, the process proceeds to step S216.
- in step S216 the robot continues walking along the edge of the obstacle and determines whether it has been walking from the starting position for 3 minutes; if so, the process proceeds to step S217, and if not, the robot continues walking until it has walked for 3 minutes.
- step S217 determines whether the angle change since the starting position has reached 270°. If not, this indicates that the extension angle of the edge line during the robot's first and second minutes of edge-following was suitable, but the edge line during the third minute extends through a comparatively large angle; the obstacle's footprint is then very likely also large, making it unsuitable as a reference object for repositioning. The robot adjusts its walking angle, leaves the obstacle, continues walking, and when it detects an obstacle again the process returns to step S11 to restart the judgment. If the robot has walked for 3 minutes and its angle of rotation has reached 270°, the extension angle of the edge line over the first 3 minutes is also reasonable, and the process proceeds to step S218: the robot continues walking along the edge of the obstacle and determines whether it has been walking from the starting position for 4 minutes. If so, the process proceeds to step S22, which determines whether the angle change detected since the starting position has reached 360°; from this the robot can judge whether it has turned 360° along the edge of the obstacle and completed a lap. If not, the robot continues walking until it has walked for 4 minutes, and then proceeds to step S22 for the 360° determination.
- in this way the size of the obstacle's footprint can be accurately determined, so that it can be effectively judged whether the obstacle is suitable as a reference object for robot repositioning, in turn providing accurate reference data for the subsequent repositioning of the robot.
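- Both the distance-based embodiment (1.5, 3, and 4.5 meters) and the time-based embodiment (1, 2, and 3 minutes) apply the same pattern: at each progress checkpoint the accumulated heading change must have reached 90°, 180°, and 270° respectively. A sketch of that shared pattern follows; function and variable names are assumptions, not the patent's notation:

```python
# Checkpoints: (progress threshold, required accumulated angle in degrees).
# The distance variant measures progress in meters, the time variant in minutes.
DIST_CHECKPOINTS = [(1.5, 90), (3.0, 180), (4.5, 270)]
TIME_CHECKPOINTS = [(1.0, 90), (2.0, 180), (3.0, 270)]

def footprint_suitable(samples, checkpoints):
    """samples: (progress, accumulated_angle_deg) pairs, monotone in progress.

    Returns False as soon as a checkpoint's angle requirement is missed
    (footprint too large to serve as a relocation reference), True when all
    checkpoints are met, and None when progress is insufficient to decide.
    """
    for threshold, required in checkpoints:
        # first sample at or beyond this checkpoint
        reached = [a for p, a in samples if p >= threshold]
        if not reached:
            return None        # not enough progress yet to decide
        if reached[0] < required:
            return False       # edge line extends too far: unsuitable
    return True                # all checkpoints met: suitable reference
```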
- before the determination in step S23 that the robot has returned to the starting position of step S21 and that its path along the edge of the obstacle satisfies the condition that the obstacle is an isolated object, or before the determination in step S24 that the robot has returned to the starting position with an angle change not exceeding 450° and that its path satisfies that condition, the method further includes the following step: determine whether the area enclosed by the robot's circuit of the obstacle is greater than 0.3 square meters. If so, the obstacle's footprint is of a suitable size and it can be considered as a reference object for robot repositioning, and it is then determined that the robot's path along the edge of the obstacle satisfies the condition that the obstacle is an isolated object. If not, the robot adjusts its walking angle, leaves the obstacle, continues walking, and when an obstacle is detected again the process returns to step S11.
- by using a footprint of more than 0.3 square meters as a limiting condition, the method of this embodiment selects an ideal isolated object as the reference for subsequent repositioning or for storing new positioning parameters.
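- One way to realize the 0.3-square-meter footprint test, offered as a sketch rather than the patent's prescribed calculation, is to apply the shoelace formula to the point coordinates recorded along the closed edge path:

```python
def enclosed_area(points):
    """Shoelace area (in square meters) of the closed polygon traced by the
    recorded edge-path points. The polygon closes back to the first point."""
    n = len(points)
    s = 0.0
    for i in range(n):
        x1, y1 = points[i]
        x2, y2 = points[(i + 1) % n]
        s += x1 * y2 - x2 * y1
    return abs(s) / 2.0

MIN_FOOTPRINT = 0.3  # square meters, per the description

def is_viable_reference(points):
    """True if the enclosed footprint is large enough to reposition against."""
    return enclosed_area(points) > MIN_FOOTPRINT
```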
- in step S41, based on the data detected by the robot's gyroscope and odometer, the grid coordinates of the grid cells corresponding to the current edge path are recorded, along with the grid area of the region enclosed by the current edge path and the grid coordinates of the central grid cell of that region. The grid coordinates can be converted from the coordinate values of position points, and the grid area can be calculated from the number of grid cells.
- the grid coordinates of the central grid cell can be calculated from the topmost, bottommost, leftmost, and rightmost grid coordinates of the region. The center need not be the exact geometric center of a regular shape: for an irregular region, the grid coordinates calculated in the same way still serve as the grid coordinates of the region's central grid cell.
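- The center-cell rule described above (midpoint of the leftmost, rightmost, topmost, and bottommost grid coordinates) can be sketched as follows; the function name is illustrative:

```python
def center_cell(cells):
    """Grid coordinates of the central cell of the region bounded by an edge
    path, taken as the midpoint of the left/right and top/bottom extremes,
    so an irregular region still yields a well-defined center."""
    xs = [c[0] for c in cells]
    ys = [c[1] for c in cells]
    return ((min(xs) + max(xs)) // 2, (min(ys) + max(ys)) // 2)
```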
- step S42 determines whether the coordinate difference between the grid coordinates of the central grid cell of the region enclosed by the current edge path and those of the central grid cell enclosed by a stored edge path is greater than a first preset coordinate difference. The first preset coordinate difference can be set according to specific design requirements, for example to (2, 2) or (3, 3), i.e. it is checked whether the X difference between the compared grid coordinates exceeds 2 or 3 and whether the Y difference exceeds 2 or 3. If so, the difference is determined to be greater than the first preset coordinate difference; otherwise it is not.
- if the coordinate difference between the grid coordinates of the central grid cell enclosed by the current edge path and those of the central grid cell enclosed by the stored edge path is greater than the first preset coordinate difference, the extreme top, bottom, left, and right grid coordinates of the two edge paths differ considerably; that is, the shapes of the two edge paths differ greatly, so it can be determined that the currently recorded edge path is not similar to the previously stored edge path.
- step S43 determines whether the difference between the current grid area and the grid area of the region corresponding to the stored edge path is greater than a preset area difference. The preset area difference can be set according to specific design requirements, for example to the area of 1 to 5 grid cells; because of walking error, too small a value makes it difficult to find a matching object. In this embodiment it is set to the area of three grid cells, which achieves the best matching effect.
- the grid area can be obtained from the grid coordinate values of the cells corresponding to the edge path by summing the number of grid cells in each row (or each column) and multiplying by the area of a single grid cell.
- if the difference between the current grid area and the grid area of the region corresponding to the stored edge path is greater than the preset area difference, the two regions differ considerably in shape, and so do the two edge paths, so it can be determined that the currently recorded edge path is not similar to the previously stored edge path. If the difference is less than or equal to the preset area difference, that does not by itself establish that the two paths are similar, and the process must proceed to step S44 for further judgment.
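- The two pre-checks of steps S42 and S43 reduce to simple threshold comparisons. A sketch, assuming a first preset coordinate difference of (2, 2) and a preset area difference of three grid cells as in this embodiment (names are illustrative):

```python
FIRST_COORD_DIFF = (2, 2)   # first preset coordinate difference
AREA_DIFF = 3               # preset area difference, in grid cells

def centers_differ(c1, c2, limit=FIRST_COORD_DIFF):
    """True if the two central cells differ by more than the preset limit on
    either axis, in which case the paths cannot be similar (step S42)."""
    return abs(c1[0] - c2[0]) > limit[0] or abs(c1[1] - c2[1]) > limit[1]

def areas_differ(a1, a2, limit=AREA_DIFF):
    """True if the enclosed areas (counted in grid cells) differ by more than
    the preset area difference, ruling out similarity (step S43)."""
    return abs(a1 - a2) > limit
```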
- in step S44, based on the first partial map containing the current edge path and the second partial map containing the stored edge path, the two partial maps (which have the same shape and size) are overlaid, and it is determined whether the ratio of the number of grid cells where the current edge path and the stored edge path overlap to the number of grid cells in the stored edge path is greater than a preset ratio value. The preset ratio value can be set according to specific design requirements; in this embodiment it is set to 90%.
- FIG. 3 is a schematic diagram of the first partial map and the second partial map overlaid; the selected first and second partial maps each span 15 grid cells in the grid map.
- Each small square in the figure represents a grid cell; the path through the squares marked with the letter H is the current edge path, and the path through the squares marked with the letter Q is the previously stored edge path. A square marked with both the letter H and the letter Q indicates a portion where the current edge path and the stored edge path overlap.
- the number of grid cells where the current edge path and the stored edge path overlap is counted; the ratio of that count to the number of grid cells in the stored edge path is 90.6%, which is greater than the preset ratio of 90%, and the process proceeds to step S45 for further determination.
- in step S45 the current edge path and the stored edge path are shifted relative to one another by N grid cells in each of the up, down, left, and right directions, and for each shift it is determined whether the ratio of the number of grid cells where the two paths overlap to the number of grid cells in the stored edge path is greater than the preset ratio value.
- FIG. 4 is another schematic diagram of the first partial map and the second partial map overlaid. Each small square represents a grid cell; the path through the squares marked with the letter H is the current edge path, and the path through the squares marked with the letter Q is the previously stored edge path. A square marked with both the letter H and the letter Q indicates a portion where the current edge path and the stored edge path overlap.
- by overlapping grid cells in this way, the method of this embodiment determines whether the currently recorded edge path is similar to a previously stored edge path, obtaining a relatively accurate judgment and helping to improve the accuracy of the subsequent repositioning of the robot.
- determining whether the ratio exceeds the preset ratio value includes the following steps: based on the first partial map containing the current edge path and the second partial map containing the stored edge path, mark the grid cells belonging to either edge path as 1 and all other grid cells as 0; perform an AND operation between corresponding cells of the first and second partial maps (1 AND 1 gives 1, 1 AND 0 gives 0, and 0 AND 0 also gives 0); then determine whether the ratio of the number of cells whose result is 1 to the number of grid cells corresponding to the stored edge path is greater than the preset ratio value.
- binarizing the grid cells and then analyzing them by this AND calculation quickly and accurately yields the number of grid cells where the two edge paths overlap, so that it can be quickly and accurately determined whether the ratio of overlapping cells to the number of cells in the stored edge path is greater than the preset ratio value, providing accurate reference data for subsequent robot positioning.
- the preset ratio value and the related calculations are the same as in the foregoing embodiment and are not repeated here.
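- The binarize-and-AND overlap test of step S44, together with the N-cell shifts of step S45, can be sketched as follows; the maps are plain nested lists of 0/1 values and all names are illustrative assumptions:

```python
def overlap_ratio(current, stored):
    """Binarized partial maps (1 = cell on the edge path, 0 = elsewhere) are
    ANDed cell by cell; return overlap count / stored-path cell count."""
    flat_c = [c for row in current for c in row]
    flat_s = [c for row in stored for c in row]
    hits = sum(a & b for a, b in zip(flat_c, flat_s))
    total = sum(flat_s)
    return hits / total if total else 0.0

def shifted(grid, dx, dy):
    """Translate a binary grid by (dx, dy) cells, zero-filling the edges."""
    h, w = len(grid), len(grid[0])
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            ny, nx = y + dy, x + dx
            if 0 <= ny < h and 0 <= nx < w:
                out[ny][nx] = grid[y][x]
    return out

def matches(current, stored, n=1, threshold=0.9):
    """True if the un-shifted maps, or any up/down/left/right shift of up to
    n cells (step S45), overlap in more than `threshold` of the stored
    path's cells."""
    shifts = [(0, 0)] \
        + [(d * s, 0) for d in (1, -1) for s in range(1, n + 1)] \
        + [(0, d * s) for d in (1, -1) for s in range(1, n + 1)]
    return any(overlap_ratio(shifted(current, dx, dy), stored) > threshold
               for dx, dy in shifts)
```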
- replacing the grid coordinates detected while the robot is currently located in the positioning cell with the grid coordinates of the corresponding grid cell in the reference positioning path includes the following steps. Step S61: determine whether there are M consecutive grid cells whose currently recorded grid coordinates each differ from those of the corresponding grid cells in the reference positioning path by less than a second preset coordinate difference.
- the value of M may be set according to specific design requirements, where M is a natural number with 2 ≤ M ≤ 3; in this embodiment M is set to 3. The second preset coordinate difference may likewise be set according to specific design requirements and is set to (2, 2) in this embodiment.
- if the differences between the grid coordinate values of three consecutively detected grid cells and the grid coordinates of the corresponding cells in the reference positioning path are all smaller than the second preset coordinate difference (2, 2), that is, for each recorded cell the difference between its X value and the X value of the corresponding reference cell is less than 2 and the difference between the Y values is less than 2, then the positional difference between the two compared paths is small and they may belong to an object in the same position, so the process proceeds to step S62 for subsequent positioning correction. Otherwise, the positional difference between the two compared paths is large; they may not belong to an object in the same position and cannot be used as a positioning reference.
- in step S62, since the error over the above three consecutive grid cells is relatively small, the robot walks to any one of the three consecutive cells and proceeds to step S63, where the currently detected grid coordinates are replaced with the grid coordinates of the corresponding grid cell in the reference positioning path, thereby repositioning the robot. Because the preceding steps mainly judge whether edge paths are similar, if two objects with the same footprint and shape exist in the home environment, correcting the positioning data on the basis of a similar edge path alone is likely to introduce errors.
- the method of this embodiment therefore further determines, from the difference between the recorded grid coordinates of the edge path and the grid coordinates in the reference positioning path, whether the current object and the object corresponding to the reference positioning path are in the same position, so that more accurate positioning data are obtained and the robot's positioning correction is more precise.
- walking to any one of the M consecutive grid cells in step S62 includes the following steps. Step S621: determine whether there is only one group of M consecutive grid cells. Since the current object and the reference object have the same position and approximately the same shape, their grid coordinates are the same or similar; generally there are several groups of M consecutive grid cells, or only one group containing many cells. If there is only one group, the robot walks directly to any one of its M cells and proceeds to step S63, replacing the currently detected grid coordinates with the grid coordinates of the corresponding cell in the reference positioning path to reposition the robot.
- otherwise, step S622 determines the group of M consecutive grid cells with the earliest recording time; the robot walks to a grid cell in that earliest-recorded group and proceeds to step S63, where the currently detected grid coordinates are replaced with the grid coordinates of the corresponding cell in the reference positioning path, thereby repositioning the robot. Because the error grows the longer the robot walks, data recorded earlier are more accurate; selecting the earliest-recorded grid cells to correct the positioning data further ensures the accuracy of the positioning data and of the robot's position.
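- The search for M consecutive grid cells within the second preset coordinate difference (step S61), returning candidate runs in recording order so that the earliest group preferred by step S622 comes first, might look like this (names are assumptions):

```python
M = 3                        # number of consecutive cells required
SECOND_COORD_DIFF = (2, 2)   # second preset coordinate difference

def find_anchor_runs(current_cells, reference_cells, m=M,
                     limit=SECOND_COORD_DIFF):
    """Return start indices of every run of m consecutive cells whose recorded
    grid coordinates each deviate from the corresponding reference cells by
    less than the preset difference on both axes. Runs are listed in
    recording order, so the earliest-recorded group comes first."""
    ok = [abs(c[0] - r[0]) < limit[0] and abs(c[1] - r[1]) < limit[1]
          for c, r in zip(current_cells, reference_cells)]
    runs = []
    i = 0
    while i + m <= len(ok):
        if all(ok[i:i + m]):
            runs.append(i)
            # skip the rest of this run so overlapping starts are not listed
            while i < len(ok) and ok[i]:
                i += 1
        else:
            i += 1
    return runs
```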
- after step S6 the method further includes the following steps. Step S71: determine that the grid coordinate detected while the robot of step S6 is currently located in the positioning cell is (X1, Y1), and proceed to step S72. Step S72: determine that the grid coordinate of the corresponding grid cell in the reference positioning path described in step S6 is (X2, Y2), and proceed to step S73. Step S73: determine that the side length of a grid cell is L, and proceed to step S74.
- in another embodiment, after step S6 the method further includes the following steps. Step S71: based on the data detected by the gyroscope and the odometer, determine that the coordinate value of the center point of the positioning cell in which the robot of step S6 is currently located is (x1, y1), and proceed to step S72. Step S72: based on the corresponding data in the stored path, determine that the coordinate value of the center point of the corresponding grid cell in the reference positioning path described in step S6 is (x2, y2), and proceed to step S73. Step S73: replace the robot's currently detected coordinate value (x1, y1) with (x2, y2).
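- The excerpt stops short of spelling out step S74, so the following sketch is only a natural reading of steps S71 to S73: with cells of side L (20 cm per the description below), the robot's drifting point coordinate is replaced by the position implied by the reference grid cell. All names are illustrative:

```python
L_SIDE = 0.2  # grid-cell side length in meters (20 cm, per the description)

def cell_center(grid_xy, cell=L_SIDE):
    """Point coordinate of a grid cell's center, assuming the cell with grid
    coordinate (X, Y) spans [X*cell, (X+1)*cell) on each axis."""
    return ((grid_xy[0] + 0.5) * cell, (grid_xy[1] + 0.5) * cell)

def reposition(current_point, current_cell, reference_cell, cell=L_SIDE):
    """Replace the robot's drifting point estimate with the position implied
    by the reference cell, preserving its offset within the current cell
    (an assumed reading; the excerpt omits the exact step S74 formula)."""
    ox = current_point[0] - current_cell[0] * cell
    oy = current_point[1] - current_cell[1] * cell
    return (reference_cell[0] * cell + ox, reference_cell[1] * cell + oy)
```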
- the grid cell described in the above embodiments is a virtual square with a side length of 20 cm; a map of a certain length and width composed of such grid cells and representing geographic environment information is a grid map.
- from the data detected while walking, the robot can determine which grid cell it currently occupies and can update the cell's state in real time, for example marking cells it has traversed smoothly as walked, cells where it hit an obstacle as obstacle, cells where a cliff was detected as cliff, cells not yet reached as unknown, and so on.
- the isolated object described in the above embodiments is an independent object that is not part of a wall and does not lean against a wall object, so that the robot can walk a full circle along its edge. An independent object is not necessarily a single object: several objects placed close together so as to form one continuous footprint also constitute an independent object.
- the previously stored edge path described in the above embodiments is an edge path, already saved in the memory of the robot system, along the edge of another isolated object that satisfied the required conditions, including the grid coordinates of the grid cells corresponding to that path. The currently recorded edge path is the edge path along the edge of the current obstacle temporarily held in a buffer area of the robot system, including the grid coordinates of its corresponding grid cells and the other data recorded while edge-following.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP18910525.7A EP3770711A4 (en) | 2018-03-19 | 2019-01-10 | METHOD OF REPOSITIONING A ROBOT |
US16/982,068 US11537142B2 (en) | 2018-03-19 | 2019-01-10 | Method for robot repositioning |
JP2020550731A JP7085296B2 (ja) | 2018-03-19 | 2019-01-10 | ロボットの再測位方法 |
KR1020207028134A KR102333984B1 (ko) | 2018-03-19 | 2019-01-10 | 로봇의 리포지셔닝 방법 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201810226498.1 | 2018-03-19 | ||
CN201810226498.1A CN108508891B (zh) | 2018-03-19 | 2018-03-19 | 一种机器人重定位的方法 |
Publications (2)
Publication Number | Publication Date |
---|---|
WO2019179176A1 true WO2019179176A1 (zh) | 2019-09-26 |
WO2019179176A8 WO2019179176A8 (zh) | 2020-09-17 |
Family
ID=63377748
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2018/120198 WO2019179176A1 (zh) | 2018-03-19 | 2019-01-10 | 在此处键入发明名称 |
Country Status (6)
Country | Link |
---|---|
US (1) | US11537142B2 (zh) |
EP (1) | EP3770711A4 (zh) |
JP (1) | JP7085296B2 (zh) |
KR (1) | KR102333984B1 (zh) |
CN (1) | CN108508891B (zh) |
WO (1) | WO2019179176A1 (zh) |
Families Citing this family (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108508891B (zh) * | 2018-03-19 | 2019-08-09 | 珠海市一微半导体有限公司 | 一种机器人重定位的方法 |
CN109528095B (zh) * | 2018-12-28 | 2020-11-17 | 深圳市愚公科技有限公司 | 扫地记录图的校准方法、扫地机器人及移动终端 |
CN109965797B (zh) * | 2019-03-07 | 2021-08-24 | 深圳市愚公科技有限公司 | 扫地机器人地图的生成方法、扫地机器人控制方法及终端 |
CN109916393B (zh) * | 2019-03-29 | 2023-03-31 | 电子科技大学 | 一种基于机器人位姿的多重栅格值导航方法及其应用 |
CN111837587B (zh) * | 2019-04-29 | 2024-04-19 | 苏州科瓴精密机械科技有限公司 | 自动割草机及其控制方法 |
CN111941418B (zh) * | 2019-05-15 | 2024-03-08 | 苏州科瓴精密机械科技有限公司 | 自移动机器人的控制方法及自移动机器人系统 |
CN113343739B (zh) * | 2020-03-02 | 2022-07-22 | 杭州萤石软件有限公司 | 可移动设备的重定位方法和可移动设备 |
CN111407188A (zh) * | 2020-03-27 | 2020-07-14 | 深圳市银星智能科技股份有限公司 | 移动机器人重定位方法、装置及移动机器人 |
CN111938513B (zh) * | 2020-06-30 | 2021-11-09 | 珠海市一微半导体有限公司 | 一种机器人越障的沿边路径选择方法、芯片及机器人 |
CN111897336A (zh) * | 2020-08-02 | 2020-11-06 | 珠海市一微半导体有限公司 | 一种机器人沿边行为结束的判断方法、芯片及机器人 |
US11945469B2 (en) * | 2020-11-25 | 2024-04-02 | Zoox, Inc. | Object uncertainty models |
CN112558616B (zh) * | 2020-12-28 | 2023-11-21 | 南京苏美达智能技术有限公司 | 一种智能自行走设备及控制方法 |
CN113344263B (zh) * | 2021-05-28 | 2022-12-27 | 深圳市无限动力发展有限公司 | 沿边行走过程的轨迹闭合判断方法、装置及计算机设备 |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN106066179A (zh) * | 2016-07-27 | 2016-11-02 | 湖南晖龙股份有限公司 | 一种基于ros操作系统的机器人位置丢失找回方法和控制系统 |
CN106092104A (zh) * | 2016-08-26 | 2016-11-09 | 深圳微服机器人科技有限公司 | 一种室内机器人的重定位方法及装置 |
CN106323273A (zh) * | 2016-08-26 | 2017-01-11 | 深圳微服机器人科技有限公司 | 一种机器人重定位方法及装置 |
US20170225891A1 (en) * | 2016-02-05 | 2017-08-10 | inVia Robotics, LLC | Robotic Navigation and Mapping |
CN107037806A (zh) * | 2016-02-04 | 2017-08-11 | 科沃斯机器人股份有限公司 | 自移动机器人重新定位方法及采用该方法的自移动机器人 |
CN108508891A (zh) * | 2018-03-19 | 2018-09-07 | 珠海市微半导体有限公司 | 一种机器人重定位的方法 |
Family Cites Families (23)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
GB0126497D0 (en) | 2001-11-03 | 2002-01-02 | Dyson Ltd | An autonomous machine |
JP2004275468A (ja) * | 2003-03-17 | 2004-10-07 | Hitachi Home & Life Solutions Inc | 自走式掃除機およびその運転方法 |
JP4533659B2 (ja) * | 2004-05-12 | 2010-09-01 | 株式会社日立製作所 | レーザー計測により地図画像を生成する装置及び方法 |
WO2007051972A1 (en) * | 2005-10-31 | 2007-05-10 | Qinetiq Limited | Navigation system |
JP2007323402A (ja) * | 2006-06-01 | 2007-12-13 | Matsushita Electric Ind Co Ltd | 自走式機器およびそのプログラム |
KR100791386B1 (ko) * | 2006-08-18 | 2008-01-07 | 삼성전자주식회사 | 이동 로봇의 영역 분리 방법 및 장치 |
JP5086942B2 (ja) * | 2008-09-02 | 2012-11-28 | トヨタ自動車株式会社 | 経路探索装置、経路探索方法、及び経路探索プログラム |
JP6162955B2 (ja) | 2009-11-06 | 2017-07-12 | アイロボット コーポレイション | 自律ロボットにより表面を完全にカバーする方法およびシステム |
CN102138769B (zh) * | 2010-01-28 | 2014-12-24 | 深圳先进技术研究院 | 清洁机器人及其清扫方法 |
JP2012194860A (ja) * | 2011-03-17 | 2012-10-11 | Murata Mach Ltd | 走行車 |
US8798840B2 (en) * | 2011-09-30 | 2014-08-05 | Irobot Corporation | Adaptive mapping with spatial summaries of sensor data |
TWI505801B (zh) * | 2014-05-09 | 2015-11-01 | Kinpo Elect Inc | 室內機器人與其定位方法 |
KR102527645B1 (ko) * | 2014-08-20 | 2023-05-03 | 삼성전자주식회사 | 청소 로봇 및 그 제어 방법 |
GB201419883D0 (en) * | 2014-11-07 | 2014-12-24 | F Robotics Acquisitions Ltd | Domestic robotic system and method |
FR3034410B1 (fr) * | 2015-04-02 | 2020-10-23 | Gebo Packaging Solutions France | Dispositif de convoyage par chariot autonome |
CN104731101B (zh) * | 2015-04-10 | 2017-08-04 | 河海大学常州校区 | 清洁机器人室内场景地图建模方法及机器人 |
CN106610665A (zh) * | 2015-10-22 | 2017-05-03 | 沈阳新松机器人自动化股份有限公司 | 一种基于gps的自主行进机器人 |
JP2017102705A (ja) * | 2015-12-02 | 2017-06-08 | 株式会社リコー | 自律移動装置及び自律移動装置システム |
CN107041718B (zh) * | 2016-02-05 | 2021-06-01 | 北京小米移动软件有限公司 | 清洁机器人及其控制方法 |
JP2018022215A (ja) * | 2016-08-01 | 2018-02-08 | 村田機械株式会社 | 移動教示装置、及び、移動教示方法 |
KR101930870B1 (ko) * | 2016-08-03 | 2018-12-20 | 엘지전자 주식회사 | 이동 로봇 및 그 제어방법 |
KR20180023302A (ko) * | 2016-08-25 | 2018-03-07 | 엘지전자 주식회사 | 이동 로봇 및 그 제어방법 |
CN107368071B (zh) * | 2017-07-17 | 2020-11-06 | 纳恩博(北京)科技有限公司 | 一种异常恢复方法及电子设备 |
- 2018
  - 2018-03-19 CN CN201810226498.1A patent/CN108508891B/zh active Active
- 2019
  - 2019-01-10 JP JP2020550731A patent/JP7085296B2/ja active Active
  - 2019-01-10 KR KR1020207028134A patent/KR102333984B1/ko active IP Right Grant
  - 2019-01-10 WO PCT/CN2018/120198 patent/WO2019179176A1/zh unknown
  - 2019-01-10 US US16/982,068 patent/US11537142B2/en active Active
  - 2019-01-10 EP EP18910525.7A patent/EP3770711A4/en active Pending
Patent Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107037806A (zh) * | 2016-02-04 | 2017-08-11 | Ecovacs Robotics Co., Ltd. | Self-moving robot repositioning method and self-moving robot using the same |
US20170225891A1 (en) * | 2016-02-05 | 2017-08-10 | inVia Robotics, LLC | Robotic Navigation and Mapping |
CN106066179A (zh) * | 2016-07-27 | 2016-11-02 | Hunan Huilong Co., Ltd. | ROS-based method and control system for recovering a robot's lost position |
CN106092104A (zh) * | 2016-08-26 | 2016-11-09 | Shenzhen Weifu Robot Technology Co., Ltd. | Repositioning method and device for an indoor robot |
CN106323273A (zh) * | 2016-08-26 | 2017-01-11 | Shenzhen Weifu Robot Technology Co., Ltd. | Robot repositioning method and device |
CN108508891A (zh) * | 2018-03-19 | 2018-09-07 | Zhuhai Amicro Semiconductor Co., Ltd. | Robot repositioning method |
Non-Patent Citations (1)
Title |
---|
See also references of EP3770711A4 * |
Also Published As
Publication number | Publication date |
---|---|
US11537142B2 (en) | 2022-12-27 |
CN108508891A (zh) | 2018-09-07 |
KR20200127019A (ko) | 2020-11-09 |
CN108508891B (zh) | 2019-08-09 |
EP3770711A4 (en) | 2021-09-29 |
US20210096580A1 (en) | 2021-04-01 |
EP3770711A1 (en) | 2021-01-27 |
WO2019179176A8 (zh) | 2020-09-17 |
JP2021516403A (ja) | 2021-07-01 |
KR102333984B1 (ko) | 2021-12-02 |
JP7085296B2 (ja) | 2022-06-16 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2019179176A1 (zh) | Type the invention name here | |
EP3764186B1 (en) | Method for controlling autonomous mobile robot to travel along edge | |
CN107943025B (zh) | Processing method for getting a robot out of a stuck state |
JP6915209B2 (ja) | Map creation method for a mobile robot and path planning method based on the map |
CN107544517B (zh) | Control method of an intelligent cleaning robot |
CN108628324B (zh) | Vector-map-based unmanned vehicle navigation method, apparatus, device, and storage medium |
EP3018603B1 (en) | Adaptive mapping with spatial summaries of sensor data |
CN106527423B (zh) | Cleaning robot and control method thereof |
TWI742554B (zh) | Positioning method, path determination method, robot, and storage medium |
CN109407675B (zh) | Obstacle avoidance method and chip for a robot returning to its base, and autonomous mobile robot |
CN107368079A (zh) | Planning method and chip for a robot's sweeping path |
CN104536445A (zh) | Mobile navigation method and system |
CN108415432A (zh) | Straight-edge-based positioning method for a robot |
US20110125358A1 (en) | Control method for a robot vehicle, and robot vehicle |
JP2007213236A (ja) | Path planning method for an autonomous traveling robot, and autonomous traveling robot |
CN107678429B (zh) | Robot control method and chip |
JP2020134528A (ja) | Method, apparatus, storage medium, and program for removing steady-state lateral deviation |
CN108873889A (zh) | Intelligent mobile device, path control method thereof, and computer-readable storage medium |
CN113475977B (zh) | Robot path planning method and apparatus, and robot |
CN113494917A (zh) | Map construction method and system, navigation strategy formulation method, and storage medium |
JP5569099B2 (ja) | Link information generation device and link information generation program |
KR102521940B1 (ko) | Robot cleaner and control method thereof |
CN112346446A (zh) | Code-loss recovery method and apparatus for an automated guided vehicle, and electronic device |
CN115041483A (zh) | Sweeping robot and control method thereof |
CN115981323A (zh) | Multi-sensor-fusion automatic obstacle avoidance method for an intelligent cleaning vehicle, and intelligent cleaning vehicle |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 18910525; Country of ref document: EP; Kind code of ref document: A1 |
| ENP | Entry into the national phase | Ref document number: 2020550731; Country of ref document: JP; Kind code of ref document: A |
| NENP | Non-entry into the national phase | Ref country code: DE |
| ENP | Entry into the national phase | Ref document number: 20207028134; Country of ref document: KR; Kind code of ref document: A |
| ENP | Entry into the national phase | Ref document number: 2018910525; Country of ref document: EP; Effective date: 20201019 |