CN114115212A - Cleaning robot positioning method and cleaning robot adopting same - Google Patents
Cleaning robot positioning method and cleaning robot adopting same
- Publication number
- CN114115212A (application number CN202010869281.XA)
- Authority
- CN
- China
- Prior art keywords
- room
- ceiling
- real
- cleaning robot
- image
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Links
- 238000004140 cleaning Methods 0.000 title claims abstract description 53
- 238000000034 method Methods 0.000 title claims abstract description 37
- 238000000605 extraction Methods 0.000 claims description 11
- 238000001514 detection method Methods 0.000 claims description 8
- 230000002093 peripheral effect Effects 0.000 claims description 6
- 238000004364 calculation method Methods 0.000 abstract description 4
- 238000010586 diagram Methods 0.000 description 4
- 238000013507 mapping Methods 0.000 description 2
- 238000006073 displacement reaction Methods 0.000 description 1
Images
Classifications
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0231—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
- G05D1/0246—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means
-
- A—HUMAN NECESSITIES
- A47—FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
- A47L—DOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
- A47L11/00—Machines for cleaning floors, carpets, furniture, walls, or wall coverings
- A47L11/24—Floor-sweeping machines, motor-driven
-
- A—HUMAN NECESSITIES
- A47—FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
- A47L—DOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
- A47L11/00—Machines for cleaning floors, carpets, furniture, walls, or wall coverings
- A47L11/40—Parts or details of machines not provided for in groups A47L11/02 - A47L11/38, or not restricted to one of these groups, e.g. handles, arrangements of switches, skirts, buffers, levers
- A47L11/4061—Steering means; Means for avoiding obstacles; Details related to the place where the driver is accommodated
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
- G06T7/73—Determining position or orientation of objects or cameras using feature-based methods
- G06T7/74—Determining position or orientation of objects or cameras using feature-based methods involving reference images or patches
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30204—Marker
Abstract
The invention relates to a positioning method for a cleaning robot, characterized by comprising the following steps: step 1, obtaining a prior map of the room ceiling; step 2, during actual cleaning, the cleaning robot determines its position in the room ceiling prior map from the ceiling image shot at its current position: the room ceiling image shot at the current position is taken as the real-time image, the intersection point of the two diagonals of the real-time image is taken as the real-time position point, the boundary of the room ceiling in the real-time image is obtained, the distances from the real-time position point to the four straight lines of that boundary are calculated, and the specific position of the real-time position point in the prior map of the whole room ceiling is then obtained from the room ceiling prior map. Compared with the prior art, the invention has the advantages that the positioning accuracy of the cleaning robot can be improved and errors caused by the calculation of positioning points can be reduced.
Description
Technical Field
The invention relates to a cleaning robot and a positioning method thereof.
Background
To achieve high coverage and to carry out path planning and navigation tasks efficiently, a cleaning robot needs high positioning accuracy during cleaning. However, because rooms contain many obstacles, positioning and mapping carry certain errors, and many cleaning robots therefore localize themselves with a centrally mounted camera that shoots the ceiling vertically. For example, Chinese patent No. ZL201410149081.1, "Robot positioning and mapping system based on ceiling image information", and Chinese patent application No. CN201610725330.6, "Ceiling-based indoor mobile robot vision positioning method", provide a camera capable of taking ceiling pictures on the cleaning robot; by continuously comparing characteristic straight lines and feature points between two ceiling pictures, the comparison result is converted into the rotation angle and displacement increment of the mobile robot in the world coordinate system, thereby realizing self-localization of the mobile robot. Taking the ceiling as the reference target has advantages over other objects: the ceiling is not easily occluded and the elements in the picture are simple, which makes it convenient for the image processing unit to extract contours and analyse the picture.
However, special marks on the ceiling are few, and feature extraction and matching often produce matching errors because the features are similar, which affects positioning accuracy. In addition, the position of the robot is calculated by comparing feature points between two image frames; the feature extraction, matching and calculation steps each carry some error, and as the cleaning robot moves these positioning errors accumulate continuously, increasing the navigation and mapping errors. The prior art therefore needs further improvement.
Disclosure of Invention
The primary technical problem to be solved by the invention is to provide, in view of the prior art, a positioning method for a cleaning robot with small error and accurate positioning.
A further technical problem to be solved by the invention is to provide a cleaning robot with small error and accurate positioning.
The technical scheme adopted by the invention to solve the primary technical problem is as follows: a positioning method of a cleaning robot, characterized by comprising the following steps:
step 1, obtaining a room ceiling prior map, wherein the specific method comprises the following steps:
shooting a room ceiling image with a camera arranged on the cleaning robot, extracting points in the room ceiling image whose gray values differ obviously from the surrounding gray values as identification points, and fitting the extracted identification points into four straight lines by a fitting method, thereby obtaining the boundary of the whole room ceiling and hence the room ceiling prior map;
step 2, in the actual cleaning process, the cleaning robot obtains its position in the room ceiling prior map from the room ceiling image shot at its current position; the specific method is as follows:
step 2-1, the room ceiling image shot at the current position is called the real-time image; the intersection point of the two diagonals of the real-time image is taken as the position of the cleaning robot in the real-time image at that moment, and this point is recorded as the real-time position point;
step 2-2, points in the real-time image whose ceiling gray values differ obviously from the surrounding gray values are extracted as identification points, the extracted identification points are fitted into straight lines by a fitting method, and the fitted straight lines are matched with the boundary in the room ceiling prior map, thereby obtaining the boundary of the room ceiling in the real-time image;
step 2-3, the distances from the real-time position point to the four straight lines of the room ceiling boundary in the real-time image are calculated, and the specific position of the real-time position point in the prior map of the whole room ceiling is then obtained from the room ceiling prior map.
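The geometry of step 2-3 can be sketched as follows. This is a minimal illustration under an assumed representation (each fitted boundary as a line a·x + b·y + c = 0); the function names are ours, not from the patent.

```python
import math

# Sketch of the step 2-3 geometry: each fitted ceiling boundary is a line
# a*x + b*y + c = 0, and the distance from the real-time position point
# (x0, y0) to it is |a*x0 + b*y0 + c| / sqrt(a^2 + b^2).
# Assumed representation for illustration, not the patent's implementation.

def point_line_distance(point, line):
    x0, y0 = point
    a, b, c = line
    return abs(a * x0 + b * y0 + c) / math.hypot(a, b)

def distances_to_boundary(point, boundary_lines):
    """Distances from the real-time position point to the four boundary lines."""
    return [point_line_distance(point, line) for line in boundary_lines]
```

For an axis-aligned ceiling bounded by x = 0, x = 10, y = 0 and y = 6, the point (3, 2) lies at distances 3, 7, 2 and 4 from the four boundary lines.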
As an improvement, in step 1 the cleaning robot moves within a certain range of the room to be cleaned in an autonomous exploration mode to obtain complete ceiling feature information for the whole room, thereby obtaining the room ceiling prior map.
As a further improvement, in step 1 the extraction method used when extracting the identification points in the room ceiling image includes, but is not limited to, the FAST feature detection method, the SURF feature detection method, the SIFT feature extraction method and line feature extraction methods.
As a further improvement, in step 1, after the room ceiling prior map is obtained, the length and width information of the whole room is obtained.
The technical scheme adopted by the invention to solve the further technical problem is as follows: a cleaning robot comprises a machine body, with a camera at the top of the machine body and a travelling mechanism on the machine body, characterized in that: a controller electrically connected with the travelling mechanism and the camera is provided in the machine body, and the controller positions the cleaning robot using the positioning method described above.
Compared with the prior art, the invention has the advantages that:
1. the positioning precision of the cleaning robot can be improved, and errors caused by calculation of positioning points are reduced;
2. the accumulated positioning error generated after cleaning for a period of time can be reduced;
3. errors caused by image matching can be reduced.
Drawings
Fig. 1 is a flowchart of a positioning method of a cleaning robot according to an embodiment of the present invention.
Fig. 2 is a diagram of a method for acquiring a real-time location point from a real-time image taken by the cleaning robot.
FIG. 3 is a schematic diagram of a real-time image taken by the cleaning robot to fit a boundary line.
FIG. 4 is a schematic diagram of a real-time image captured by the cleaning robot in which the fitted straight lines are extended to form the ceiling boundary.
Fig. 5 is a schematic diagram illustrating a calculation method of a distance between a real-time position point and a ceiling boundary in a real-time image captured by the cleaning robot.
Detailed Description
The invention is described in detail below with reference to the accompanying drawings.
As shown in fig. 1, the present invention provides a positioning method of a cleaning robot, characterized by comprising the steps of:
step 1, obtaining a room ceiling prior map, wherein the specific method comprises the following steps:
the cleaning robot moves within a certain range of the room to be cleaned in an autonomous exploration mode, and a wide-angle camera or a 360-degree fisheye camera arranged at the top of the cleaning robot shoots room ceiling images; points whose gray values differ obviously from the surrounding gray values are extracted from the room ceiling image as identification points, using the FAST (features from accelerated segment test) feature detection method, the SURF (speeded-up robust features) feature detection method, the SIFT (scale-invariant feature transform) feature extraction method or a line feature extraction method; the extracted identification points are then fitted into four straight lines by a fitting method, thereby obtaining the boundary of the whole room ceiling, and hence the room ceiling prior map together with the length and width information of the four boundaries of the room ceiling;
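As a toy stand-in for the detectors named above (FAST, SURF, SIFT), identification points whose gray values differ obviously from the surrounding gray values can be found with a simple neighbourhood test. This sketch is our simplification for illustration, not the patented detector; the function name and threshold are assumptions.

```python
# Hypothetical sketch: mark as an "identification point" any pixel whose gray
# value differs from the mean of its 8 neighbours by more than a threshold.
# This mimics "gray value obviously different from the surrounding gray
# values"; real detectors (FAST/SURF/SIFT) are far more sophisticated.

def identification_points(img, thresh=40):
    """img: 2D list of gray values; returns (row, col) identification points."""
    h, w = len(img), len(img[0])
    points = []
    for r in range(1, h - 1):
        for c in range(1, w - 1):
            neighbours = [img[r + dr][c + dc]
                          for dr in (-1, 0, 1) for dc in (-1, 0, 1)
                          if (dr, dc) != (0, 0)]
            if abs(img[r][c] - sum(neighbours) / 8) > thresh:
                points.append((r, c))
    return points
```

On a uniform gray patch with one bright pixel, only that pixel is reported, which is the behaviour the boundary-fitting step relies on.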
step 2, in the actual cleaning process, the cleaning robot obtains its position in the room ceiling prior map from the room ceiling image shot at its current position; the specific method is as follows:
step 2-1, the room ceiling image shot at the current position is called the real-time image; the intersection point of the two diagonals of the real-time image is taken as the position of the cleaning robot in the real-time image at that moment, and this point is recorded as the real-time position point; taking a real-time image shot by the cleaning robot as an example, the intersection point of the diagonals used as the real-time position point is shown in fig. 2;
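Step 2-1 can be illustrated with a general two-line intersection; for a rectangular image the diagonals meet at the image centre. The helper names below are our own illustration, not from the patent.

```python
# Sketch: intersection of the two image diagonals via Cramer's rule.
# For a W x H image the diagonals run (0,0)-(W,H) and (W,0)-(0,H), so the
# real-time position point is simply the image centre (W/2, H/2).

def line_intersection(p1, p2, p3, p4):
    """Intersection of the infinite lines through p1-p2 and p3-p4."""
    x1, y1 = p1; x2, y2 = p2; x3, y3 = p3; x4, y4 = p4
    d = (x1 - x2) * (y3 - y4) - (y1 - y2) * (x3 - x4)
    if d == 0:
        raise ValueError("lines are parallel")
    a = x1 * y2 - y1 * x2
    b = x3 * y4 - y3 * x4
    return ((a * (x3 - x4) - (x1 - x2) * b) / d,
            (a * (y3 - y4) - (y1 - y2) * b) / d)

def realtime_position_point(width, height):
    """Real-time position point: intersection of the image diagonals."""
    return line_intersection((0, 0), (width, height), (width, 0), (0, height))
```

For a 640 x 480 real-time image this yields (320, 240), i.e. the optical centre under the assumption that the camera looks vertically upward.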
step 2-2, points in the real-time image whose ceiling gray values differ obviously from the surrounding gray values are extracted as identification points using the FAST feature detection method, the SURF feature detection method, the SIFT feature extraction method or a line feature extraction method, and the extracted identification points are fitted into straight lines by a fitting method; again taking a real-time image shot by the cleaning robot as an example, the fitted straight lines are shown in fig. 3; if the fitted straight lines cannot form a complete rectangle or square, a complete rectangle or square is obtained by drawing virtual extension lines on the basis of the existing straight lines, as shown in fig. 4; the fitted boundary straight lines are then matched, according to the length and width information of the room, with the length information of the room in the room ceiling prior map, thereby obtaining the boundary of the room ceiling in the real-time image;
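The patent only says "a fitting method" without naming one; an ordinary least-squares fit of y = m·x + b is one plausible sketch (RANSAC would be a robust alternative when the identification points contain outliers, and a near-vertical boundary would need the x = m·y + b parameterisation instead). The function name is assumed.

```python
# Hedged sketch: ordinary least-squares fit of a line y = m*x + b through
# the identification points. Assumes a non-vertical line; the patent does
# not specify which fitting method is used.

def fit_line(points):
    """points: list of (x, y); returns slope m and intercept b."""
    n = len(points)
    sx = sum(x for x, _ in points)
    sy = sum(y for _, y in points)
    sxx = sum(x * x for x, _ in points)
    sxy = sum(x * y for x, y in points)
    m = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    b = (sy - m * sx) / n
    return m, b
```

Fitting each of the four clusters of boundary identification points separately would yield the four straight lines that step 2-2 matches against the prior map.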
step 2-3, the distances from the real-time position point to the four straight lines of the room ceiling boundary in the real-time image are finally calculated, as shown in fig. 5, and the specific position of the real-time position point in the prior map of the whole room ceiling is then obtained from the room ceiling prior map.
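One way to read the final mapping — an assumed interpretation, since the patent does not give the formula — is that the ratios of the distances to opposite boundaries locate the point fractionally along each axis, which the prior map's length and width then scale into room coordinates:

```python
# Assumed interpretation of step 2-3 (not stated explicitly in the patent):
# the fraction d_left / (d_left + d_right) locates the point along the room
# length, d_top / (d_top + d_bottom) along the room width, and the prior
# map's dimensions scale these fractions into room coordinates.

def position_in_prior_map(d_left, d_right, d_top, d_bottom,
                          room_length, room_width):
    """Map image-space distances to the four boundary lines into prior-map coordinates."""
    x = d_left / (d_left + d_right) * room_length
    y = d_top / (d_top + d_bottom) * room_width
    return (x, y)
```

Because only ratios are used, this reading is insensitive to the camera's image scale, which is consistent with matching the fitted boundary against the room's known length and width.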
The invention also provides a cleaning robot comprising a machine body, with a camera at the top of the machine body and a travelling mechanism on the machine body; a controller electrically connected with the travelling mechanism and the camera is provided in the machine body, and the controller positions the cleaning robot using the positioning method described above.
Claims (5)
1. A positioning method of a cleaning robot is characterized by comprising the following steps:
step 1, obtaining a room ceiling prior map, wherein the specific method comprises the following steps:
shooting a room ceiling image with a camera arranged on the cleaning robot, extracting points in the room ceiling image whose gray values differ obviously from the surrounding gray values as identification points, and fitting the extracted identification points into four straight lines by a fitting method, thereby obtaining the boundary of the whole room ceiling and hence the room ceiling prior map;
step 2, in the actual cleaning process, the cleaning robot obtains its position in the room ceiling prior map from the room ceiling image shot at its current position; the specific method is as follows:
step 2-1, the room ceiling image shot at the current position is called the real-time image; the intersection point of the two diagonals of the real-time image is taken as the position of the cleaning robot in the real-time image at that moment, and this point is recorded as the real-time position point;
step 2-2, points in the real-time image whose ceiling gray values differ obviously from the surrounding gray values are extracted as identification points, the extracted identification points are fitted into straight lines by a fitting method, and the fitted straight lines are matched with the boundary in the room ceiling prior map, thereby obtaining the boundary of the room ceiling in the real-time image;
step 2-3, the distances from the real-time position point to the four straight lines of the room ceiling boundary in the real-time image are calculated, and the specific position of the real-time position point in the prior map of the whole room ceiling is then obtained from the room ceiling prior map.
2. The positioning method of a cleaning robot according to claim 1, characterized in that: in step 1, the cleaning robot moves within a certain range of the room to be cleaned in an autonomous exploration mode to obtain complete ceiling feature information for the whole room, thereby obtaining the room ceiling prior map.
3. The positioning method of a cleaning robot according to claim 1, characterized in that: in step 1, the extraction method used when extracting the identification points in the room ceiling image includes, but is not limited to, the FAST feature detection method, the SURF feature detection method, the SIFT feature extraction method and line feature extraction methods.
4. The positioning method of a cleaning robot according to claim 1, characterized in that: in step 1, after the room ceiling prior map is obtained, the length and width information of the whole room is obtained; in step 2-2, the fitted straight lines are matched, according to the length and width information of the room, with the length information of the room in the room ceiling prior map, thereby obtaining the boundary of the room ceiling in the real-time image.
5. A cleaning robot comprising a machine body, with a camera at the top of the machine body and a travelling mechanism on the machine body, characterized in that: a controller electrically connected with the travelling mechanism and the camera is provided in the machine body, and the controller positions the cleaning robot according to the positioning method of claim 1.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010869281.XA CN114115212A (en) | 2020-08-26 | 2020-08-26 | Cleaning robot positioning method and cleaning robot adopting same |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010869281.XA CN114115212A (en) | 2020-08-26 | 2020-08-26 | Cleaning robot positioning method and cleaning robot adopting same |
Publications (1)
Publication Number | Publication Date |
---|---|
CN114115212A true CN114115212A (en) | 2022-03-01 |
Family
ID=80373948
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202010869281.XA Pending CN114115212A (en) | 2020-08-26 | 2020-08-26 | Cleaning robot positioning method and cleaning robot adopting same |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN114115212A (en) |
Citations (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2005283541A (en) * | 2004-03-31 | 2005-10-13 | Hitachi Plant Eng & Constr Co Ltd | Tunnel ceiling surface marking method and apparatus |
KR20120060060A (en) * | 2010-12-01 | 2012-06-11 | 한국기술교육대학교 산학협력단 | Self localization system of robot using landmark |
CN106338287A (en) * | 2016-08-24 | 2017-01-18 | 杭州国辰牵星科技有限公司 | Ceiling-based indoor moving robot vision positioning method |
CN206074001U (en) * | 2016-09-21 | 2017-04-05 | 旗瀚科技有限公司 | A kind of robot indoor locating system based on 3D video cameras |
CN106651990A (en) * | 2016-12-23 | 2017-05-10 | 芜湖哈特机器人产业技术研究院有限公司 | Indoor map construction method and indoor map-based indoor locating method |
CN108036786A (en) * | 2017-12-01 | 2018-05-15 | 安徽优思天成智能科技有限公司 | Position and posture detection method, device and computer-readable recording medium based on auxiliary line |
CN109074084A (en) * | 2017-08-02 | 2018-12-21 | 珊口(深圳)智能科技有限公司 | Control method, device, system and the robot being applicable in of robot |
CN109828280A (en) * | 2018-11-29 | 2019-05-31 | 亿嘉和科技股份有限公司 | A kind of localization method and autonomous charging of robots method based on three-dimensional laser grid |
CN110363738A (en) * | 2018-04-08 | 2019-10-22 | 中南大学 | A kind of retinal images method for registering and its device with affine-invariant features |
CN110554700A (en) * | 2019-09-03 | 2019-12-10 | 韦云智 | method for identifying room and door of mobile robot |
- 2020-08-26: CN application CN202010869281.XA filed; published as CN114115212A (status: pending)
Patent Citations (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2005283541A (en) * | 2004-03-31 | 2005-10-13 | Hitachi Plant Eng & Constr Co Ltd | Tunnel ceiling surface marking method and apparatus |
KR20120060060A (en) * | 2010-12-01 | 2012-06-11 | 한국기술교육대학교 산학협력단 | Self localization system of robot using landmark |
CN106338287A (en) * | 2016-08-24 | 2017-01-18 | 杭州国辰牵星科技有限公司 | Ceiling-based indoor moving robot vision positioning method |
CN206074001U (en) * | 2016-09-21 | 2017-04-05 | 旗瀚科技有限公司 | A kind of robot indoor locating system based on 3D video cameras |
CN106651990A (en) * | 2016-12-23 | 2017-05-10 | 芜湖哈特机器人产业技术研究院有限公司 | Indoor map construction method and indoor map-based indoor locating method |
CN109074084A (en) * | 2017-08-02 | 2018-12-21 | 珊口(深圳)智能科技有限公司 | Control method, device, system and the robot being applicable in of robot |
CN108036786A (en) * | 2017-12-01 | 2018-05-15 | 安徽优思天成智能科技有限公司 | Position and posture detection method, device and computer-readable recording medium based on auxiliary line |
CN110363738A (en) * | 2018-04-08 | 2019-10-22 | 中南大学 | A kind of retinal images method for registering and its device with affine-invariant features |
CN109828280A (en) * | 2018-11-29 | 2019-05-31 | 亿嘉和科技股份有限公司 | A kind of localization method and autonomous charging of robots method based on three-dimensional laser grid |
CN110554700A (en) * | 2019-09-03 | 2019-12-10 | 韦云智 | method for identifying room and door of mobile robot |
Non-Patent Citations (3)
Title |
---|
夏泽邑 et al., "High-precision image feature extraction algorithm in vision model calibration" [in Chinese], Journal of Computer-Aided Design & Computer Graphics, vol. 17, no. 04, 30 April 2005 (2005-04-30), pages 819-824 *
谭定忠 et al., "Research on localization methods for indoor mobile robots" [in Chinese], Machinery & Electronics, no. 9, 30 September 2005 (2005-09-30), pages 46-47 *
韩立伟; 徐德, "Visual dead-reckoning localization method for mobile robots based on straight lines and a single feature point" [in Chinese], Robot, vol. 30, no. 01, 31 January 2008 (2008-01-31), pages 79-84 *
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2021196294A1 (en) | Cross-video person location tracking method and system, and device | |
CN109074083B (en) | Movement control method, mobile robot, and computer storage medium | |
US9990726B2 (en) | Method of determining a position and orientation of a device associated with a capturing device for capturing at least one image | |
US8265425B2 (en) | Rectangular table detection using hybrid RGB and depth camera sensors | |
CN106989683B (en) | A kind of shield tail clearance of shield machine vision measuring method | |
CN110458161B (en) | Mobile robot doorplate positioning method combined with deep learning | |
CN108171715B (en) | Image segmentation method and device | |
CN110827353B (en) | Robot positioning method based on monocular camera assistance | |
CN104023228A (en) | Self-adaptive indoor vision positioning method based on global motion estimation | |
Momeni-k et al. | Height estimation from a single camera view | |
JPWO2016016955A1 (en) | Autonomous mobile device and self-position estimation method | |
CN113223050B (en) | Robot motion track real-time acquisition method based on Aruco code | |
CN108544494A (en) | A kind of positioning device, method and robot based on inertia and visual signature | |
JP6410231B2 (en) | Alignment apparatus, alignment method, and computer program for alignment | |
Betge-Brezetz et al. | Object-based modelling and localization in natural environments | |
CN115205825B (en) | Traffic sign detection and identification method based on improved YOLOV5 driving video sequence image | |
Strasdat et al. | Multi-cue localization for soccer playing humanoid robots | |
CN114115212A (en) | Cleaning robot positioning method and cleaning robot adopting same | |
CN111239761B (en) | Method for indoor real-time establishment of two-dimensional map | |
CN110031014B (en) | Visual positioning method based on pattern recognition | |
CN203077301U (en) | Real-time detection device for positions and angles of wheel type motion robot | |
CN113689365A (en) | Target tracking and positioning method based on Azure Kinect | |
Lee et al. | Visual odometry for absolute position estimation using template matching on known environment | |
CN112419409A (en) | Pose estimation method based on real-time video | |
KR101725685B1 (en) | Method and apparatus for detecting localization of mobile robot by ceiling outline detection |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication ||
SE01 | Entry into force of request for substantive examination ||