CN112683266A - Robot and navigation method thereof - Google Patents


Publication number
CN112683266A
Authority
CN
China
Prior art keywords
robot
image
included angle
working scene
scene
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201910989314.1A
Other languages
Chinese (zh)
Inventor
宋庆祥
朱永康
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Ecovacs Robotics Suzhou Co Ltd
Original Assignee
Ecovacs Robotics Suzhou Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Ecovacs Robotics Suzhou Co Ltd filed Critical Ecovacs Robotics Suzhou Co Ltd
Priority to CN201910989314.1A priority Critical patent/CN112683266A/en
Publication of CN112683266A publication Critical patent/CN112683266A/en
Pending legal-status Critical Current

Abstract

The invention discloses a robot and a navigation method thereof. The navigation method comprises the following steps: collecting an image of the working scene through a camera; calculating a reference direction of the working scene from the collected image; correcting the directions of the coordinate axes in the robot's cruise coordinate system according to the reference direction; and navigating the robot using the corrected cruise coordinate system. When the robot is started or while it is working, the group of mutually parallel lines to which the largest number of parallel or perpendicular lines belongs is searched for in the collected image and selected as the reference lines. The direction of the reference lines is taken as the reference direction, and the directions of the coordinate axes in the robot's cruise coordinate system are corrected with it, so that the robot works along a scientific and reasonable navigation path and its working efficiency is improved.

Description

Robot and navigation method thereof
Technical Field
The invention relates to the technical field of artificial intelligence, in particular to a robot and a navigation method thereof.
Background
Artificial intelligence (AI) has been a very active field in recent years; it is a technical science that studies and develops theories, methods, techniques and application systems for simulating, extending and expanding human intelligence.
A robot is a machine that performs work automatically. It can act according to behavior patterns established with artificial intelligence, assisting or replacing humans in certain kinds of work, such as production and construction.
Disclosure of Invention
The technical problem solved by the invention is: how to make a robot work along a scientific and reasonable navigation path.
In order to solve the above technical problem, an embodiment of the present invention provides a navigation method for a robot, where the robot is provided with a camera, and the navigation method includes:
collecting an image of a working scene through the camera;
calculating to obtain a reference direction of the working scene according to the collected image of the working scene;
correcting the directions of coordinate axes in a cruise coordinate system of the robot according to the reference direction;
and navigating the robot by adopting the corrected cruise coordinate system.
Optionally, the calculating the reference direction of the working scene according to the acquired image of the working scene includes:
performing homography transformation on the acquired image of the working scene to obtain a transformed image, wherein the image plane of the transformed image is parallel to the ceiling of the working scene;
and searching in the transformed image to identify line segments on the ceiling of the working scene.
Optionally, the calculating the reference direction of the working scene according to the acquired image of the working scene includes:
and searching the collected image of the working scene, and identifying line segments on the ceiling of the working scene.
Optionally, the calculating the reference direction of the working scene according to the acquired image of the working scene includes:
and selecting the line segments meeting preset conditions from the searched line segments as reference lines, and determining the reference direction according to the direction of the reference lines.
Optionally, the preset condition is: the line segment is parallel to other searched line segments.
Optionally, the preset condition is: the line segment is perpendicular to other searched line segments.
Optionally, the correcting the directions of the coordinate axes in the cruise coordinate system of the robot by using the reference direction includes:
calculating a first included angle, wherein the first included angle is the included angle between the principal axis direction of the image and the direction of a coordinate axis in the cruise coordinate system of the robot before correction;
calculating a second included angle, wherein the second included angle is the included angle between the reference direction and the principal axis direction of the image;
calculating a third included angle according to the first included angle and the second included angle, wherein the third included angle is the included angle between the reference direction and the direction of a coordinate axis in the cruise coordinate system of the robot before correction;
and correcting the direction of the coordinate axis in the cruise coordinate system of the robot according to the third included angle.
Optionally, the camera acquires the image of the working scene when the robot is started or while the robot is working.
Optionally, the robot is controlled to rotate and/or move so that the shooting area of the robot's camera sweeps through one full circle, and the camera acquires images of the working scene at preset time intervals.
In order to solve the above technical problem, an embodiment of the present invention further provides a robot, including: the device comprises an image acquisition unit, a calculation unit, a coordinate correction unit and a navigation unit; wherein:
the image acquisition unit acquires an image of a working scene through the camera;
the calculation unit is used for calculating the reference direction of the working scene according to the collected image of the working scene;
the coordinate correction unit is used for correcting the directions of the coordinate axes in the cruise coordinate system of the robot according to the reference direction;
and the navigation unit is used for navigating the robot by adopting the corrected cruise coordinate system.
Optionally, the calculating unit calculates the reference direction of the working scene according to the acquired image of the working scene, and includes:
performing homography transformation on the acquired image of the working scene to obtain a transformed image, wherein the image plane of the transformed image is parallel to the ceiling of the working scene;
and searching in the transformed image to identify line segments on the ceiling of the working scene.
Optionally, the calculating unit calculates the reference direction of the working scene according to the acquired image of the working scene, and includes:
and searching the collected image of the working scene, and identifying line segments on the ceiling of the working scene.
Optionally, the calculating unit calculates the reference direction of the working scene according to the collected image of the working scene, and includes:
and selecting the line segments meeting preset conditions from the searched line segments as reference lines, and determining the reference direction according to the direction of the reference lines.
Optionally, the preset conditions are: parallel to other searched line segments.
Optionally, the preset conditions are: perpendicular to other searched line segments.
Optionally, the coordinate correction unit corrects the directions of the coordinate axes in the cruise coordinate system of the robot according to the reference direction by:
calculating a first included angle, wherein the first included angle is the included angle between the principal axis direction of the image and the direction of a coordinate axis in the cruise coordinate system of the robot before correction;
calculating a second included angle, wherein the second included angle is the included angle between the reference direction and the principal axis direction of the image;
calculating a third included angle according to the first included angle and the second included angle, wherein the third included angle is the included angle between the reference direction and the direction of a coordinate axis in the cruise coordinate system of the robot before correction;
and correcting the direction of the coordinate axis in the cruise coordinate system of the robot according to the third included angle.
Optionally, the image acquisition unit acquires the image of the working scene through the camera when the robot is started or while the robot is working.
Optionally, the robot is controlled to rotate and/or move so that the shooting area of the robot's camera sweeps through one full circle, and the image acquisition unit acquires images of the working scene through the camera at preset time intervals.
Compared with the prior art, the technical scheme of the invention has the following beneficial effects:
When the robot is started or while it is working, an image of the working scene is collected; the group of mutually parallel lines to which the largest number of parallel or perpendicular lines belongs is searched for in the collected image and selected as the reference lines; the direction of the reference lines is taken as the reference direction; the directions of the coordinate axes in the robot's cruise coordinate system are corrected according to the reference direction; and the corrected cruise coordinate system is used to navigate the robot. The robot therefore works along a scientific and reasonable navigation path, and its working efficiency is improved.
Furthermore, when the image plane of the camera is parallel to the ceiling of the working scene, the line structure of the ceiling is not distorted in the collected image: the parallel and perpendicular relations between lines are preserved. To exploit this property, the image plane is brought parallel to the ceiling of the working scene, either physically (a top-looking camera) or by a homography transformation (for example, in the forward-looking case). This saves computation on image transformation, so the calibration of the robot's cruise coordinate system can be completed more efficiently.
Drawings
FIG. 1 is a flowchart of a method for navigating a robot according to an embodiment of the present invention;
FIG. 2 is a flow chart illustrating sub-steps of a method for navigating a robot according to an embodiment of the present invention;
FIG. 3 is a flow chart illustrating another sub-step of a method for navigating a robot according to an embodiment of the present invention;
FIG. 4 is a flow chart illustrating another sub-step of a method for navigating a robot in accordance with an embodiment of the present invention;
fig. 5 is a schematic view of a sweeping robot according to an embodiment of the present invention.
Detailed Description
The invention relates to a floor sweeping robot (also called an automatic cleaner, intelligent vacuum, robot vacuum cleaner, floor cleaner, etc.), one of the intelligent household appliances. With a certain degree of artificial intelligence, it can automatically complete the floor-sweeping work in a room after being started.
In the process of executing cleaning work, how scientifically and reasonably the cleaning path is planned is an important factor that directly influences the robot's working efficiency.
Early sweeping robots relied on naive collision response: they could only turn around after colliding with an obstacle, and their degree of intelligence was low.
Some advanced sweeping robots can sense surrounding environments through cameras, lasers, magnetic induction and the like, and plan reasonable sweeping paths through a certain algorithm, so that missing sweeping and repeated sweeping are avoided as much as possible.
A more scientific cleaning path is the "bow"-shaped (boustrophedon) path: the robot's navigation system establishes a cruise coordinate system, cleans in straight lines along the coordinate axes (x-axis and y-axis) or directions parallel to them, and makes a right-angle turn when a turn is required.
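The "bow"-shaped pattern described above can be sketched as a waypoint generator. This is a hypothetical illustration, not code from the patent: the function name, the rectangular work area, and the lane-spacing parameter are all assumptions made for the example.

```python
# Hypothetical sketch: generate "bow"-shaped (boustrophedon) waypoints for a
# rectangular area aligned with the cruise coordinate axes. Lanes run parallel
# to the x-axis; right-angle turns connect adjacent lanes.

def bow_path(width, height, lane_spacing):
    """Return (x, y) waypoints sweeping the rectangle [0, width] x [0, height]."""
    waypoints = []
    y = 0.0
    left_to_right = True
    while y <= height:
        if left_to_right:
            waypoints.extend([(0.0, y), (width, y)])
        else:
            waypoints.extend([(width, y), (0.0, y)])
        left_to_right = not left_to_right  # reverse direction for the next lane
        y += lane_spacing
    return waypoints

path = bow_path(4.0, 1.0, 0.5)
```

Because the lanes are parallel to a coordinate axis, any error in the axis direction tilts every lane, which is why the patent corrects the axis direction before navigating.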
In the prior art, sweeping robots use various algorithms and sensors to establish a cruise coordinate system. For example, a magnetic sensor may be used to identify an origin magnetic nail, and the coordinate system may be established from a single frame image acquired at the initial position.
Specifically, the sweeping robot is started at the position of the origin magnetic nail, identifies the magnetic nail through its magnetic sensor, and takes the nail's position as the coordinate origin of the cruise coordinate system.
The prior art scheme is adopted to help to plan a better navigation path, but still has further room for improvement.
Research has shown that the coordinate-axis directions of the cruise coordinate system also have an obvious influence on the robot's working efficiency. In practice, a sweeping robot usually works in an indoor environment, and sweeping along the direction of the walls effectively avoids missed areas. The invention therefore provides a vision-based navigation method (and device) that enables the robot to clean along the wall direction as much as possible.
When the robot is started or while it is working, an image of the working scene is collected; the group of mutually parallel lines to which the largest number of parallel or perpendicular lines belongs is searched for in the collected image and selected as the reference lines; the direction of the reference lines is taken as the reference direction; the directions of the coordinate axes in the robot's cruise coordinate system are corrected with it; and the corrected cruise coordinate system is used to navigate the robot, so that the robot works along a scientific and reasonable navigation path and its working efficiency is improved.
In order that those skilled in the art will better understand and realize the present invention, the following detailed description is given by way of specific embodiments with reference to the accompanying drawings.
Example one
As described below, embodiments of the present invention provide a navigation method for a robot.
First, the implementation of the navigation method of the robot in this embodiment requires that the robot has a function of visual perception, and may be a robot based on VSLAM visual positioning, for example.
Specifically, a camera may be mounted on the robot.
In the subsequent steps, the cruise coordinate system of the robot navigation system needs to be corrected by means of the images acquired by the camera.
In addition, the robot needs to have a certain computing power, and may have a built-in processor, for example.
The invention is suitable for various household robot products with vision sensors, such as sweeping robots and the like.
Referring to a flow chart of a navigation method of the robot shown in fig. 1, the following detailed description is made through specific steps:
and S101, acquiring an image of a working scene through the camera.
The camera here is the camera mounted on the robot.
In some embodiments, the robot may be powered on at a position close to a wall. In this case, most or all of the image collected by the camera is wall information, and the image contains very little information about line segments on the ceiling. The robot therefore needs to rotate one circle or walk a certain distance (i.e. rotate and/or move) while collecting images, so as to obtain a useful picture of the ceiling (as the image of the working scene) from which useful line segments can then be searched for and extracted.
In some embodiments, referring to the sub-step flowchart shown in fig. 2, the step of acquiring an image of a work scene by the camera may specifically include the following sub-steps:
and S1011, adjusting the camera to be in a top view state.
The difference between using the top view and using the front view is as follows:
When the robot's camera looks straight up, the image plane is parallel to the ceiling. The line structure of the ceiling is then not distorted in the image, and the parallel and perpendicular relations between lines are preserved. Using this property, mutually parallel and perpendicular line segments are searched for in the image plane, and the included angle between the line direction and the principal axis of the image is calculated as the rotation angle the machine must turn through.
When the robot's view is a front view, there is an included angle between the image plane and the ceiling. Because of the camera's perspective projection, the line structure of the ceiling is distorted in the image, and the parallel and perpendicular relations between lines are hard to maintain. Images shot from different view angles can be converted with a homography transformation to obtain a transformed image whose image plane is parallel to the ceiling of the working scene (two images of the same plane taken by the same camera in different poses are related by a homography). For example, a homography matrix can convert a front-view image into a top-view image, recovering the parallel and perpendicular relations between the ceiling lines.
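The front-view to top-view conversion can be sketched with the standard pure-rotation homography H = K R K⁻¹, which relates two images taken by the same camera rotated about its center. This is a minimal numpy illustration, not the patent's implementation: the intrinsic matrix K and the 60-degree tilt angle are made-up values.

```python
import numpy as np

# Hypothetical sketch: homography rectifying a forward-looking image into a
# top (ceiling-parallel) view, assuming the two views differ by a pure
# rotation about the camera center. K and the tilt angle are illustrative.

K = np.array([[500.0,   0.0, 320.0],
              [  0.0, 500.0, 240.0],
              [  0.0,   0.0,   1.0]])

theta = np.deg2rad(60.0)  # tilt between the forward view and the top view
R = np.array([[1.0, 0.0,            0.0],
              [0.0, np.cos(theta), -np.sin(theta)],
              [0.0, np.sin(theta),  np.cos(theta)]])

H = K @ R @ np.linalg.inv(K)  # maps forward-view pixels to top-view pixels

def warp_point(H, u, v):
    """Map pixel (u, v) through the homography H (homogeneous division)."""
    p = H @ np.array([u, v, 1.0])
    return p[0] / p[2], p[1] / p[2]

# Round-tripping through H and its inverse recovers the original pixel.
u2, v2 = warp_point(H, 100.0, 120.0)
u3, v3 = warp_point(np.linalg.inv(H), u2, v2)
```

In practice the whole image would be resampled with this H (e.g. a perspective warp), after which ceiling lines regain their parallel and perpendicular relations.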
And S1012, acquiring an image of the ceiling of the working scene through the camera.
The above description of the technical solution shows that in this embodiment, when the robot is started or while it is working, an image of the working scene is collected; the group of mutually parallel lines to which the largest number of parallel or perpendicular lines belongs is searched for in the collected image and selected as the reference lines; the direction of the reference lines is taken as the reference direction; the directions of the coordinate axes in the robot's cruise coordinate system are corrected with it; and the corrected cruise coordinate system is used to navigate the robot, so that the robot works along a scientific and reasonable navigation path and its working efficiency is improved.
In some embodiments, the timing for acquiring the image of the working scene through the camera may be when the robot is powered on, or may be during the working process of the robot.
For example, after the robot is turned on, the robot is controlled to rotate and/or move so that the shooting area of its camera sweeps through one full circle (a full circle avoids the situation in which no ceiling line structure can be extracted because the view from the initial position is limited), and images of the working scene are acquired at preset time intervals.
For another example, in the working process of the robot, the camera acquires images of a working scene at preset time intervals, and subsequent steps are executed on the basis of the images, so that the cruise coordinate system is corrected in real time in the working process of the robot.
Wherein the predetermined time interval may be 200 ms.
In the method provided by the embodiment, before the method is started, the robot can be in any orientation.
And S102, calculating to obtain the reference direction of the working scene according to the collected image of the working scene.
After the image of the working scene is obtained, the reference direction of the working scene is calculated according to the collected image of the working scene.
In some embodiments, vector expressions of the reference-line directions in the robot's initial coordinate system may be calculated from the robot pose data. The mutually perpendicular and parallel directions across all images are then counted, the direction of the reference lines with the largest count is taken as the direction of the room's walls (i.e. the reference direction of the working scene), and the cruise coordinate system is corrected to be consistent with it.
In some embodiments, referring to the sub-step flowchart shown in fig. 3, the calculating the reference direction of the working scene according to the acquired image of the working scene may specifically include the following sub-steps:
and S1021, searching the collected image of the working scene, and identifying line segments on the ceiling of the working scene.
In another embodiment, the previous step performs a homography transformation on the acquired image of the working scene to obtain a transformed image whose image plane is parallel to the ceiling of the working scene; line segments on the ceiling are then searched for and identified in the transformed image.
Specifically, a length threshold may be preset; line segments shorter than the threshold are ignored, and only line segments whose length reaches the threshold are searched for on the ceiling of the working scene.
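The length filter described above can be sketched in a few lines. This is a hypothetical illustration: the segment representation (endpoint tuples) and the threshold value are assumptions, and in practice the segments would come from a line detector run on the ceiling image.

```python
import math

# Hypothetical sketch: discard line segments shorter than a preset length
# threshold. Segments are (x1, y1, x2, y2) endpoint tuples in pixels;
# the threshold value is illustrative.

LENGTH_THRESHOLD = 30.0  # pixels

def seg_length(seg):
    x1, y1, x2, y2 = seg
    return math.hypot(x2 - x1, y2 - y1)

def filter_segments(segments, threshold=LENGTH_THRESHOLD):
    return [s for s in segments if seg_length(s) >= threshold]

segments = [(0, 0, 100, 0),   # long horizontal segment: kept
            (0, 0, 10, 10),   # short segment: ignored
            (5, 5, 5, 80)]    # long vertical segment: kept
kept = filter_segments(segments)
```

Short segments are usually noise (lamp edges, texture), so filtering them first makes the later parallel/perpendicular voting more stable.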
S1022, selecting the line segment meeting the preset condition from the searched line segments as a reference line, and determining the reference direction according to the direction of the reference line.
In some embodiments, the preset conditions are: parallel to other searched line segments.
In some embodiments, the preset conditions are: perpendicular to other searched line segments.
In some embodiments, the preset condition is: the line segment has the highest reference score.
The reference score of a line segment is the weighted total score of the other line segments parallel or perpendicular to it, where a longer parallel or perpendicular segment carries a greater weight.
For example, among the searched line segments, 4 line segments are parallel or perpendicular to line segment A, and according to their lengths the weights of these 4 segments are 5, 3, 2 and 2 respectively (the longer the segment, the larger the weight);
the weighted total score of segment A is 5 + 3 + 2 + 2 = 12;
3 line segments are parallel or perpendicular to line segment B, and according to their lengths the weights of these 3 segments are 7, 6 and 4 respectively;
the weighted total score of segment B is 7 + 6 + 4 = 17;
comparing the weighted total scores, the score of segment A is less than that of segment B (segments C, D, etc. are compared in the same way and are not repeated here), so segment B is selected as the reference line.
In some embodiments, the direction vectors of the reference lines in the initial world frame can be calculated, and whether two lines are parallel or perpendicular can be determined from their direction vectors.
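The reference-score selection above can be sketched as follows. This is a hypothetical illustration, not the patent's code: the dot-product test, the angular tolerance, and the choice of raw segment length as the weight are all assumptions (the patent only says longer segments weigh more).

```python
import math

# Hypothetical sketch: score each segment by the length-based weights of the
# other segments parallel or perpendicular to it (dot-product test on unit
# direction vectors), then pick the highest-scoring segment as reference line.

ANGLE_TOL = math.radians(3.0)  # illustrative angular tolerance

def direction(seg):
    x1, y1, x2, y2 = seg
    n = math.hypot(x2 - x1, y2 - y1)
    return ((x2 - x1) / n, (y2 - y1) / n)

def parallel_or_perpendicular(a, b):
    da, db = direction(a), direction(b)
    dot = abs(da[0] * db[0] + da[1] * db[1])
    # |dot| near 1 -> parallel, |dot| near 0 -> perpendicular
    return dot > math.cos(ANGLE_TOL) or dot < math.sin(ANGLE_TOL)

def seg_length(seg):
    x1, y1, x2, y2 = seg
    return math.hypot(x2 - x1, y2 - y1)

def reference_line(segments):
    def score(s):
        return sum(seg_length(o) for o in segments
                   if o is not s and parallel_or_perpendicular(s, o))
    return max(segments, key=score)

segments = [(0, 0, 10, 0),    # horizontal
            (0, 0, 0, 20),    # vertical: perpendicular to the first
            (0, 0, 7, 7)]     # diagonal: aligned with nothing
ref = reference_line(segments)
```

Here the horizontal segment wins because the long vertical segment is perpendicular to it, while the diagonal segment supports nothing; wall-aligned directions accumulate support in exactly this way.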
And S103, correcting the directions of the coordinate axes in the cruise coordinate system of the robot by using the reference direction.
In some embodiments, referring to the sub-step flowchart shown in fig. 4, the step of correcting the directions of the coordinate axes in the cruise coordinate system of the robot by the reference direction may specifically include the sub-steps of:
and S1031, calculating the first included angle.
The first included angle is the included angle between the principal axis direction of the image and the direction of a coordinate axis in the cruise coordinate system of the robot before correction.
In some embodiments, the principal axis direction of the image is the same as the direction of the coordinate axes in the cruise coordinate system of the robot before correction (e.g. the direction of the coordinate axes in the initial cruise coordinate system), as in the top-view case; the first included angle is then 0.
In other embodiments, for example in the front-view case, the principal axis direction of the image may differ from the direction of the coordinate axes in the cruise coordinate system of the robot before correction.
And S1032, calculating a second included angle.
The second included angle is the included angle between the reference direction and the principal axis direction of the image.
And S1033, calculating a third included angle according to the first included angle and the second included angle.
And the third included angle is an included angle between the reference direction and the direction of a coordinate axis in the cruise coordinate system of the robot before correction.
S1034, correcting the direction of the coordinate axis in the cruise coordinate system of the robot according to the third included angle.
The orientation of the robot may then be rotated according to the third angle.
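The angle bookkeeping of steps S1031 to S1034 can be sketched as below. This is a hypothetical illustration: the patent only states that the third angle is computed from the first two, so the simple sum and the wrap into (-45°, 45°] (exploiting the 90-degree symmetry of a wall-aligned grid) are assumptions made for the example.

```python
# Hypothetical sketch: compose the third included angle from the first two and
# reduce it to the smallest equivalent correction. Angles are in degrees,
# measured counterclockwise; the (-45, 45] wrap is an assumption, since
# rotating a rectangular grid by a multiple of 90 degrees leaves it unchanged.

def third_angle(first_deg, second_deg):
    """Angle between the reference direction and the pre-correction axis."""
    angle = first_deg + second_deg
    while angle > 45.0:
        angle -= 90.0
    while angle <= -45.0:
        angle += 90.0
    return angle

# Top-view case: the first included angle is 0, so the correction equals the
# second included angle (reference direction vs. image principal axis).
corr = third_angle(0.0, 12.0)
```

The cruise coordinate axes (and, if desired, the robot's heading) are then rotated by this correction so that cleaning lanes run along the wall direction.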
And S104, navigating the robot by adopting the corrected cruise coordinate system.
In some embodiments, the robot may be configured to perform cleaning and the like along a "bow" shaped path.
The above description of the technical solution shows that in this embodiment, when the robot is started or while it is working, an image of the working scene is collected; the group of mutually parallel lines to which the largest number of parallel or perpendicular lines belongs is searched for in the collected image and selected as the reference lines; the direction of the reference lines is taken as the reference direction; the directions of the coordinate axes in the robot's cruise coordinate system are corrected with it; and the corrected cruise coordinate system is used to navigate the robot, so that the robot works along a scientific and reasonable navigation path and its working efficiency is improved.
In addition, the invention incurs no extra hardware cost: no new hardware sensor needs to be added, and the calibration of the robot's cruise coordinate system is solved by an algorithm alone. The method is simple and convenient and does not increase cost.
Example two
As described below, embodiments of the present invention provide a robot.
In some embodiments, as shown in fig. 5, the robot may be a sweeping robot, or other home robot product with vision sensors.
The robot includes: the device comprises an image acquisition unit, a calculation unit, a coordinate correction unit and a navigation unit; the main functions of each unit are as follows:
the image acquisition unit acquires an image of a working scene through the camera;
the calculation unit is used for calculating the reference direction of the working scene according to the collected image of the working scene;
the coordinate correction unit is used for correcting the directions of the coordinate axes in the cruise coordinate system of the robot according to the reference direction;
and the navigation unit is used for navigating the robot by adopting the corrected cruise coordinate system.
The above description of the technical solution shows that in this embodiment, when the robot is started or while it is working, an image of the working scene is collected; the group of mutually parallel lines to which the largest number of parallel or perpendicular lines belongs is searched for in the collected image and selected as the reference lines; the direction of the reference lines is taken as the reference direction; the directions of the coordinate axes in the robot's cruise coordinate system are corrected with it; and the corrected cruise coordinate system is used to navigate the robot, so that the robot works along a scientific and reasonable navigation path and its working efficiency is improved.
In some embodiments, as shown in fig. 2 and 3, the calculating unit calculates the reference direction of the working scene according to the acquired image of the working scene, including:
performing homography transformation on the acquired image of the working scene to obtain a transformed image, wherein the image plane of the transformed image is parallel to the ceiling of the working scene;
and searching in the transformed image to identify line segments on the ceiling of the working scene.
The above description of the technical solution shows that in this embodiment, when the robot is started or while it is working, an image of the working scene is collected; the group of mutually parallel lines to which the largest number of parallel or perpendicular lines belongs is searched for in the collected image and selected as the reference lines; the direction of the reference lines is taken as the reference direction; the directions of the coordinate axes in the robot's cruise coordinate system are corrected with it; and the corrected cruise coordinate system is used to navigate the robot, so that the robot works along a scientific and reasonable navigation path and its working efficiency is improved.
In some embodiments, as shown in fig. 3, the calculating unit calculates the reference direction of the working scene according to the acquired image of the working scene, including:
and searching the collected image of the working scene, and identifying line segments on the ceiling of the working scene.
In some embodiments, the calculating unit calculates the reference direction of the working scene according to the acquired image of the working scene, including:
and selecting the line segments meeting preset conditions from the searched line segments as reference lines, and determining the reference direction according to the direction of the reference lines.
In some embodiments, the preset conditions are: parallel to other searched line segments.
In some embodiments, the preset conditions are: perpendicular to other searched line segments.
In some embodiments, as shown in fig. 4, the coordinate correction unit corrects the directions of the coordinate axes in the cruise coordinate system of the robot according to the reference direction by:
calculating a first included angle, wherein the first included angle is the included angle between the principal axis direction of the image and the direction of a coordinate axis in the cruise coordinate system of the robot before correction;
calculating a second included angle, wherein the second included angle is the included angle between the reference direction and the principal axis direction of the image;
calculating a third included angle according to the first included angle and the second included angle, wherein the third included angle is the included angle between the reference direction and the direction of a coordinate axis in the cruise coordinate system of the robot before correction;
and correcting the direction of the coordinate axis in the cruise coordinate system of the robot according to the third included angle.
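The three-angle computation above amounts to an addition followed by a normalization. A minimal sketch, assuming angles in degrees; the snap into (-45, 45] is a convention of this sketch (reasonable because the reference lines repeat every 90°), not something stated in the patent:

```python
def third_included_angle(first_angle, second_angle):
    """Angle between the reference direction and the pre-correction axis:
    the sum of the first angle (image principal axis vs. pre-correction
    axis) and the second angle (reference direction vs. image principal
    axis), normalized into (-45, 45] degrees because the reference
    directions repeat every 90 degrees."""
    third = (first_angle + second_angle) % 90.0
    if third > 45.0:
        third -= 90.0
    return third

def corrected_axis_heading(axis_heading, third_angle):
    """Rotate the pre-correction axis heading by the third included angle
    so the axis aligns with the reference direction."""
    return (axis_heading + third_angle) % 360.0
```

So a first angle of 30° and a second angle of 20° yield a correction of -40° rather than +50°, rotating the axis to the nearest equivalent reference direction.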
In some embodiments, the camera of the image acquisition unit acquires the image of the working scene when the robot is powered on or during the operation of the robot.
In some embodiments, the robot is controlled to turn and/or move so that the shooting area of its camera sweeps a full circle, while the image acquisition unit acquires images of the working scene at preset time intervals.
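The full-circle capture schedule can be checked with straightforward arithmetic. A hedged sketch: the field-of-view and rotation-speed parameters below are illustrative assumptions that do not appear in the patent; the point is only that the preset interval must be short enough that consecutive shots overlap while the camera sweeps 360°.

```python
import math

def shots_per_revolution(fov_deg, rotation_speed_deg_s, interval_s):
    """Number of images captured during one full sweep, given that the
    robot turns at rotation_speed_deg_s and the image acquisition unit
    fires every interval_s seconds. Raises if consecutive shots would
    leave an uncovered gap (angular step larger than the field of view)."""
    step_deg = rotation_speed_deg_s * interval_s  # degrees turned between shots
    if step_deg > fov_deg:
        raise ValueError("interval too long: shots leave gaps in the sweep")
    return math.ceil(360.0 / step_deg)
```

With an assumed 60° field of view and a 30°/s turn rate, a one-second interval yields twelve overlapping shots per revolution.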
Those skilled in the art will understand that all or part of the steps in the methods of the above embodiments may be completed by program instructions executed on related hardware, and the program may be stored in a computer-readable storage medium, which may include a ROM, a RAM, a magnetic disk, an optical disc, and the like.
Although the present invention is disclosed above, it is not limited thereto. Those skilled in the art may make various changes and modifications without departing from the spirit and scope of the invention, and therefore the protection scope of the invention shall be subject to the appended claims.

Claims (18)

1. A navigation method of a robot is characterized in that a camera is installed on the robot, and the navigation method comprises the following steps:
collecting an image of a working scene through the camera;
calculating a reference direction of the working scene according to the collected image of the working scene;
correcting the directions of coordinate axes in a cruise coordinate system of the robot according to the reference direction;
and navigating the robot by adopting the corrected cruise coordinate system.
2. The method of claim 1, wherein the calculating the reference direction of the work scene from the captured image of the work scene comprises:
performing homography transformation on the acquired image of the working scene to obtain a transformed image, wherein the image plane of the transformed image is parallel to the ceiling of the working scene;
and searching in the transformed image to identify line segments on the ceiling of the working scene.
3. The method of claim 1, wherein the calculating a reference direction of the work scene from the captured image of the work scene comprises:
and searching the collected image of the working scene, and identifying line segments on the ceiling of the working scene.
4. The method for navigating a robot according to claim 2 or 3, wherein the calculating a reference direction of a work scene from the captured image of the work scene comprises:
and selecting the line segments meeting preset conditions from the searched line segments as reference lines, and determining the reference direction according to the direction of the reference lines.
5. The navigation method of a robot according to claim 4, wherein the preset condition is: being parallel to the other searched line segments.
6. The navigation method of a robot according to claim 4, wherein the preset condition is: being perpendicular to the other searched line segments.
7. The navigation method of a robot according to claim 1, wherein the correcting the direction of the coordinate axis in the cruise coordinate system of the robot with the reference direction comprises:
calculating a first included angle, wherein the first included angle is the angle between the principal axis direction of the image and the direction of a coordinate axis in the cruise coordinate system of the robot before correction;
calculating a second included angle, wherein the second included angle is the angle between the reference direction and the principal axis direction of the image;
calculating a third included angle according to the first included angle and the second included angle, wherein the third included angle is the angle between the reference direction and the direction of the coordinate axis in the cruise coordinate system of the robot before correction;
and correcting the direction of the coordinate axis in the cruise coordinate system of the robot according to the third included angle.
8. The navigation method of a robot according to claim 1, wherein the camera captures the image of the working scene when the robot is powered on or during the operation of the robot.
9. The navigation method of a robot according to claim 1, wherein the robot is controlled to turn and/or move so that the shooting area of the camera of the robot sweeps a full circle, the camera capturing images of the working scene at predetermined time intervals.
10. A robot, comprising: the device comprises an image acquisition unit, a calculation unit, a coordinate correction unit and a navigation unit; wherein:
the image acquisition unit is used for acquiring an image of a working scene through the camera;
the calculation unit is used for calculating the reference direction of the working scene according to the acquired image of the working scene;
the coordinate correction unit is used for correcting the directions of the coordinate axes in the cruise coordinate system of the robot according to the reference direction;
and the navigation unit is used for navigating the robot by adopting the corrected cruise coordinate system.
11. The robot of claim 10, wherein the calculation unit calculating the reference direction of the work scene from the captured image of the work scene comprises:
performing homography transformation on the acquired image of the working scene to obtain a transformed image, wherein the image plane of the transformed image is parallel to the ceiling of the working scene;
and searching in the transformed image to identify line segments on the ceiling of the working scene.
12. The robot of claim 10, wherein the calculation unit calculating the reference direction of the work scene from the captured image of the work scene comprises:
and searching the collected image of the working scene, and identifying line segments on the ceiling of the working scene.
13. The robot of claim 11 or 12, wherein the calculation unit calculating the reference direction of the working scene according to the collected image of the working scene comprises:
and selecting the line segments meeting preset conditions from the searched line segments as reference lines, and determining the reference direction according to the direction of the reference lines.
14. The robot of claim 13, wherein the preset condition is: being parallel to the other searched line segments.
15. The robot of claim 13, wherein the preset condition is: being perpendicular to the other searched line segments.
16. The robot according to claim 10, wherein the coordinate correction unit correcting the directions of the coordinate axes in the cruise coordinate system of the robot according to the reference direction comprises:
calculating a first included angle, wherein the first included angle is the angle between the principal axis direction of the image and the direction of a coordinate axis in the cruise coordinate system of the robot before correction;
calculating a second included angle, wherein the second included angle is the angle between the reference direction and the principal axis direction of the image;
calculating a third included angle according to the first included angle and the second included angle, wherein the third included angle is the angle between the reference direction and the direction of the coordinate axis in the cruise coordinate system of the robot before correction;
and correcting the direction of the coordinate axis in the cruise coordinate system of the robot according to the third included angle.
17. The robot according to claim 10, wherein the camera of the image acquisition unit captures the image of the working scene when the robot is powered on or during the operation of the robot.
18. The robot according to claim 10, wherein the robot is controlled to turn and/or move so that the shooting area of the camera of the robot sweeps a full circle, the image acquisition unit capturing images of the working scene through the camera at predetermined time intervals.
CN201910989314.1A 2019-10-17 2019-10-17 Robot and navigation method thereof Pending CN112683266A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910989314.1A CN112683266A (en) 2019-10-17 2019-10-17 Robot and navigation method thereof

Publications (1)

Publication Number Publication Date
CN112683266A true CN112683266A (en) 2021-04-20

Family

ID=75444537

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910989314.1A Pending CN112683266A (en) 2019-10-17 2019-10-17 Robot and navigation method thereof

Country Status (1)

Country Link
CN (1) CN112683266A (en)

Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103886107A (en) * 2014-04-14 2014-06-25 苏州市华天雄信息科技有限公司 Robot locating and map building system based on ceiling image information
CN104848858A (en) * 2015-06-01 2015-08-19 北京极智嘉科技有限公司 Two-dimensional code and vision-inert combined navigation system and method for robot
CN105573310A (en) * 2014-10-11 2016-05-11 北京自动化控制设备研究所 Method for positioning and environment modeling of coal mine tunnel robot
CN106197427A (en) * 2016-07-04 2016-12-07 上海思依暄机器人科技股份有限公司 Method, device and the robot of a kind of indoor positioning navigation
CN106338991A (en) * 2016-08-26 2017-01-18 南京理工大学 Robot based on inertial navigation and two-dimensional code and positioning and navigation method thereof
CN106959695A (en) * 2017-04-24 2017-07-18 广东宝乐机器人股份有限公司 Angle modification method and mobile robot of the mobile robot in working region
CN107976194A (en) * 2017-11-24 2018-05-01 北京奇虎科技有限公司 The method of adjustment and device of environmental map
CN108198216A (en) * 2017-12-12 2018-06-22 深圳市神州云海智能科技有限公司 A kind of robot and its position and orientation estimation method and device based on marker
CN108733039A (en) * 2017-04-18 2018-11-02 广东工业大学 The method and apparatus of navigator fix in a kind of robot chamber
US20190146521A1 (en) * 2017-08-02 2019-05-16 Ankobot (Shenzhen) Smart Technologies Co., Ltd. Control method, device and system of robot and robot using the same
CN109916408A (en) * 2019-02-28 2019-06-21 深圳市鑫益嘉科技股份有限公司 Robot indoor positioning and air navigation aid, device, equipment and storage medium

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111813984A (en) * 2020-06-23 2020-10-23 北京邮电大学 Method and device for realizing indoor positioning by using homography matrix and electronic equipment
CN111813984B (en) * 2020-06-23 2022-09-30 北京邮电大学 Method and device for realizing indoor positioning by using homography matrix and electronic equipment
CN114569004A (en) * 2022-02-22 2022-06-03 杭州萤石软件有限公司 Traveling direction adjustment method, mobile robot system, and electronic device
WO2023160305A1 (en) * 2022-02-22 2023-08-31 杭州萤石软件有限公司 Travelling direction adjusting method, mobile robot system and electronic device
CN114569004B (en) * 2022-02-22 2023-12-01 杭州萤石软件有限公司 Travel direction adjustment method, mobile robot system and electronic device

Similar Documents

Publication Publication Date Title
US10518414B1 (en) Navigation method, navigation system, movement control system and mobile robot
CN106780608B (en) Pose information estimation method and device and movable equipment
CN109074083B (en) Movement control method, mobile robot, and computer storage medium
CN105654512B (en) A kind of method for tracking target and device
KR101618030B1 (en) Method for Recognizing Position and Controlling Movement of a Mobile Robot, and the Mobile Robot Using the same
US8879787B2 (en) Information processing device and information processing method
CN107341442B (en) Motion control method, motion control device, computer equipment and service robot
CN103198488B (en) PTZ surveillance camera realtime posture rapid estimation
CN110134117B (en) Mobile robot repositioning method, mobile robot and electronic equipment
US10296002B2 (en) Autonomous movement device, autonomous movement method, and non-transitory recording medium
WO2021190321A1 (en) Image processing method and device
CN107969995B (en) Visual floor sweeping robot and repositioning method thereof
EP3127586B1 (en) Interactive system, remote controller and operating method thereof
CN112683266A (en) Robot and navigation method thereof
Mount et al. 2d visual place recognition for domestic service robots at night
CN107507133B (en) Real-time image splicing method based on circular tube working robot
Sheng et al. Mobile robot localization and map building based on laser ranging and PTAM
CN115862074B (en) Human body pointing determination and screen control method and device and related equipment
CN111832542A (en) Three-eye visual identification and positioning method and device
CN111144341A (en) Body-building action error correction method and system based on mobile platform
CN116448118A (en) Working path optimization method and device of sweeping robot
CN109213154A (en) One kind being based on Slam localization method, device, electronic equipment and computer storage medium
CN113221729B (en) Unmanned aerial vehicle cluster control method and system based on gesture human-computer interaction
CN111724438B (en) Data processing method and device
Wu et al. Monocular vision SLAM based on key feature points selection

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination