CN117289689A - Robot positioning method based on line segment matching in window - Google Patents

Robot positioning method based on line segment matching in window

Info

Publication number
CN117289689A
Authority
CN
China
Prior art keywords
line
line segment
fitting
subgraph
robot
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210697792.7A
Other languages
Chinese (zh)
Inventor
李永勇
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhuhai Amicro Semiconductor Co Ltd
Original Assignee
Zhuhai Amicro Semiconductor Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhuhai Amicro Semiconductor Co Ltd filed Critical Zhuhai Amicro Semiconductor Co Ltd
Priority to CN202210697792.7A priority Critical patent/CN117289689A/en
Publication of CN117289689A publication Critical patent/CN117289689A/en
Pending legal-status Critical Current

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 16/00: Information retrieval; Database structures therefor; File system structures therefor
    • G06F 16/20: Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
    • G06F 16/29: Geographical information databases
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 21/00: Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C 21/005: Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 with correlation of navigation data from several sources, e.g. map or contour matching

Landscapes

  • Engineering & Computer Science (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Databases & Information Systems (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • General Engineering & Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • Manipulator (AREA)

Abstract

The invention discloses a robot positioning method based on line segment matching in a window, comprising the following steps: the robot collects point cloud data through a single-point ranging sensor and constructs a current window; the robot rotates in turn at different rotation center positions in the current window, the point cloud data collected while rotating at each rotation center position is fitted to obtain a corresponding group of fitted line segments, and each group of fitted line segments forms a line feature subgraph; the robot performs similarity matching between the fitted line segments in each line feature subgraph and the fitted line segments in the other line feature subgraphs; when the matching success rate of the fitted line segments in two line feature subgraphs reaches a first preset success rate, an error coordinate offset is obtained; when the matching success rate between the line feature subgraphs in the current window reaches a second preset success rate, all error coordinate offsets are averaged to obtain a positioning coordinate compensation amount, and the current position coordinates of the robot are corrected with the positioning coordinate compensation amount.

Description

Robot positioning method based on line segment matching in window
Technical Field
The invention relates to the technical field of robot navigation positioning, in particular to a robot positioning method based on line segment matching in a window.
Background
SLAM is a technology for simultaneous localization and map reconstruction: by querying and fusing data from multiple sensors, a mobile robot gains the ability to sense its own position and the surrounding environment. Machines on the market with a positioning function basically rely on a rotating lidar or on vision-assisted positioning. In particular, they use a lidar and a multi-camera vision system to observe the motion trajectory and sense the surroundings, establish various error models for feature extraction, matching and representation, build a correlation model from laser point clouds carrying depth information and visual features, and tightly couple a visual odometer. Even so, a large amount of information must be stored to meet the positioning accuracy requirements of a local area, so such systems inevitably demand large storage space and correspondingly high cost, call on more chip resources, must filter and process large amounts of data during positioning tasks with high computational complexity and heavy consumption of computing resources, are difficult to tune, easily suffer positioning failure from accumulated error after long operation, and further require the assistance of relatively expensive sensors.
Disclosure of Invention
In order to overcome these technical defects, the invention discloses a robot positioning method based on line segment matching in a window, in which the robot can realize a local-area positioning function using only one single-point ranging sensor together with gyroscope-based inertial navigation for angle measurement. The specific technical scheme is as follows:
the robot positioning method based on line segment matching in the window is characterized in that its execution subject is a robot fixedly equipped with a single-point ranging sensor and a gyroscope; the single-point ranging sensor is used to collect point cloud data of the environment in which the robot is located and mark it in a map, and the gyroscope is used to measure the rotation angle of the robot. The robot positioning method comprises the following steps: step 1, the robot collects point cloud data through the single-point ranging sensor and constructs a current window in the map according to the maximum ranging distance of the single-point ranging sensor; step 2, the robot rotates in turn at different rotation center positions in the current window, the point cloud data collected while rotating at each rotation center position is fitted to obtain a corresponding group of fitted line segments in the current window, and each group of fitted line segments forms a line feature subgraph, so that the robot obtains a plurality of line feature subgraphs in the current window; each time the robot performs fitting at one rotation center position, it records one group of fitted line segments in the current window; step 3, within the current window, the robot performs similarity matching between the fitted line segments in each line feature subgraph and the fitted line segments in the other line feature subgraphs; step 4, within the current window, when the matching success rate of the fitted line segments in two line feature subgraphs reaches a first preset success rate, it is determined that the two line feature subgraphs are successfully matched, and an error coordinate offset is obtained; step 5, when the matching success rate between the line feature subgraphs in the current window reaches a second preset success rate, all the obtained error coordinate offsets are averaged to obtain a positioning coordinate compensation amount, and the current position coordinates of the robot are corrected with the positioning coordinate compensation amount, thereby completing one positioning in the current window.
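Step 5 reduces to an average over the collected error coordinate offsets. A minimal Python sketch (the function names and the (dx, dy) tuple representation are illustrative assumptions, not part of the disclosure):

```python
def positioning_compensation(error_offsets):
    """Average all error coordinate offsets (step 5) to obtain the
    positioning coordinate compensation amount as a (dx, dy) pair."""
    n = len(error_offsets)
    dx = sum(off[0] for off in error_offsets) / n
    dy = sum(off[1] for off in error_offsets) / n
    return dx, dy

def correct_position(current_xy, compensation):
    # Correct the robot's current position coordinates with the compensation.
    return (current_xy[0] + compensation[0], current_xy[1] + compensation[1])
```

For example, offsets (1, 0) and (3, 2) average to a compensation of (2, 1), which is added to the current pose.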
Further, the robot positioning method also comprises: when the robot selects a window adjacent to the current window as the next window, the robot moves into the next window, updates the next window to become the current window, and repeats steps 2 to 5 until the robot has traversed a first preset number of windows; in the map where the current window is located, the robot constructs windows in the neighborhood of the current window, and the area covered by the first preset number of windows includes all windows constructed in the neighborhood of the current window. A window is a rectangular area framed in the map to delimit the coverage of a line feature subgraph.
Further, in step 1, the method for constructing the current window in the map according to the maximum ranging distance of the single-point ranging sensor comprises: taking the position of the robot when executing step 1 as the center of the current window, extending a first extension distance horizontally to the left of the robot and a second extension distance horizontally to the right of the robot to form the lateral side of the current window, wherein the first extension distance and the second extension distance are both equal to half of the maximum ranging distance of the single-point ranging sensor, so that the lateral side length of the current window equals the maximum ranging distance of the single-point ranging sensor; the current window is rectangular in shape.
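The window construction of step 1 can be sketched as an axis-aligned rectangle centered on the robot. The dictionary layout, names, and the caller-supplied longitudinal side length are hypothetical conveniences:

```python
def build_current_window(robot_x, robot_y, max_range, longitudinal_len):
    """Construct the current window: lateral side length equals the sensor's
    maximum ranging distance (half of it extends to each side of the robot)."""
    half = max_range / 2.0  # first and second extension distances
    return {
        "x_min": robot_x - half,
        "x_max": robot_x + half,
        "y_min": robot_y - longitudinal_len / 2.0,
        "y_max": robot_y + longitudinal_len / 2.0,
    }
```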
Further, the windows constructed in the neighborhood of the current window comprise a window adjacent to the upper side of the current window, a window adjacent to the lower side, a window adjacent to the left side, and a window adjacent to the right side; each window constructed in the neighborhood of the current window has the same shape and the same size as the current window. The abscissa of each point of the upper adjacent window equals the abscissa of the corresponding point of the current window, and the ordinate of each of its vertices exceeds the ordinate of the vertex of the current window in the same positional relationship by the longitudinal side length of the current window; the abscissa of each point of the lower adjacent window equals the abscissa of the corresponding point of the current window, and the ordinate of each vertex of the current window exceeds the ordinate of the vertex of the lower adjacent window in the same positional relationship by the longitudinal side length of the current window; the abscissa of each vertex of the right adjacent window exceeds the abscissa of the vertex of the current window in the same positional relationship by the lateral side length of the current window; and the abscissa of each vertex of the current window exceeds the abscissa of the vertex of the left adjacent window in the same positional relationship by the lateral side length of the current window. Two vertices are in the same positional relationship when their relative positions with respect to the centers of the windows in which they are located are the same.
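The four neighbor windows are simply the current rectangle shifted by one side length along each axis. A sketch under the same hypothetical window representation used above:

```python
def neighbor_windows(win):
    """Return the four same-size windows adjacent to win (up/down shifted by
    the longitudinal side length, left/right by the lateral side length)."""
    w = win["x_max"] - win["x_min"]  # lateral side length
    h = win["y_max"] - win["y_min"]  # longitudinal side length

    def shifted(dx, dy):
        return {"x_min": win["x_min"] + dx, "x_max": win["x_max"] + dx,
                "y_min": win["y_min"] + dy, "y_max": win["y_max"] + dy}

    return {"up": shifted(0, h), "down": shifted(0, -h),
            "left": shifted(-w, 0), "right": shifted(w, 0)}
```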
Further, in step 4, the matching success rate of the fitted line segments of two line feature subgraphs is the ratio of the number of successfully matched fitted line segment pairs to the number of fitted line segment pairs participating in similarity matching; a fitted line segment pair participating in similarity matching consists of one fitted line segment from each of two line feature subgraphs in the current window, where the position of one fitted line segment relative to the rotation center position of its own line feature subgraph is equivalent to the position of the other fitted line segment relative to the rotation center position of the other line feature subgraph. In step 5, the matching success rate between the line feature subgraphs in the current window is the ratio of the number of successfully matched line feature subgraph pairs to the number of line feature subgraph pairs participating in matching; a line feature subgraph pair consists of two different line feature subgraphs in the current window, and the two line feature subgraphs in which two fitted line segments participating in similarity matching are respectively located form a line feature subgraph pair participating in matching.
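Both success rates are plain ratios; if every subgraph is matched against every other subgraph in the window, the number of candidate pairs is the usual pair count. A small sketch (names assumed):

```python
def match_success_rate(num_matched_pairs, num_participating_pairs):
    """Ratio used in steps 4 and 5: matched pairs over participating pairs."""
    return num_matched_pairs / num_participating_pairs

def subgraph_pair_count(n_subgraphs):
    # Number of unordered pairs of distinct line feature subgraphs in a window.
    return n_subgraphs * (n_subgraphs - 1) // 2
```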
Further, in step 2, each time the robot rotates one full circle at a rotation center position, the single-point ranging sensor is controlled to collect point cloud data during the rotation; the position points within each angle range are fitted into a line segment with the corresponding trend, the part of each such line segment that lies inside the current window is taken into a group of fitted line segments in the current window, and this group of fitted line segments then forms one line feature subgraph, completing the construction of one line feature subgraph in the current window. The point cloud data comprises the coordinate information and the angle information of each position point scanned by the single-point ranging sensor. Each time a line feature subgraph is formed, the robot also records its own coordinate information and initial angle information at the rotation center position.
Further, the shape of the current window is rectangular; the lateral side length of the current window equals the maximum ranging distance of the single-point ranging sensor, and the longitudinal side length of the current window is greater than or equal to the diameter of the robot body; half of the maximum ranging distance of the single-point ranging sensor equals the first extension distance. The robot sets the center of the current window as the first rotation center position; the robot sets the longitudinal central axis of the current window as a baseline, with the longitudinal direction of the current window parallel to the advancing direction of the robot; the robot sets the two position points separated from the center of the current window by half of the first extension distance, in the two symmetric directions perpendicular to the baseline, as the second and third rotation center positions respectively; and the robot sets the two position points separated from the center of the current window by the first extension distance, in the two symmetric directions perpendicular to the baseline, as the fourth and fifth rotation center positions respectively. The robot then traverses each rotation center position in the current window in turn, without repetition, so as to construct a plurality of line feature subgraphs in the current window.
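Reading the paragraph above as five rotation centers spaced along the axis perpendicular to the baseline, the positions can be enumerated as follows (a sketch; the assumption that the perpendicular axis is the x-axis follows from the baseline being the longitudinal central axis):

```python
def rotation_centers(center_x, center_y, max_range):
    """Five rotation center positions in the current window: the center,
    plus points at half the first extension distance and at the full first
    extension distance on both sides, perpendicular to the baseline."""
    d1 = max_range / 2.0           # first extension distance
    offsets = [0.0, -d1 / 2, d1 / 2, -d1, d1]
    return [(center_x + dx, center_y) for dx in offsets]
```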
Further, in step 2 the following cases arise: when a line segment fitted with the corresponding trend lies entirely inside the current window, if the robot detects that its length is greater than a preset fitting length threshold, the robot sets it as a fitted line segment, marks it in the map, records its start point coordinates, end point coordinates and inclination angle, and adds it to the corresponding group of fitted line segments in the current window; when a line segment fitted with the corresponding trend extends from inside the current window to outside it, or from outside the current window to inside it, if the robot detects that the length of the portion intercepted by the current window is greater than the preset fitting length threshold, the robot sets that intercepted portion as a fitted line segment, marks it in the map, records its start point coordinates, end point coordinates and inclination angle, and adds it to the corresponding group of fitted line segments in the current window. The inclination angle is the angle of the included angle between the line segment fitted with the corresponding trend and a coordinate axis.
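The "intercepted portion" above is the clip of the fitted segment against the window rectangle; one standard way to compute it is Liang-Barsky parametric clipping. A sketch under the window representation assumed earlier (names and length-threshold signature are illustrative):

```python
import math

def clip_segment_to_window(p0, p1, win):
    """Liang-Barsky clipping of segment p0->p1 against the window rectangle.
    Returns the clipped endpoints, or None if the segment lies outside."""
    x0, y0 = p0
    x1, y1 = p1
    dx, dy = x1 - x0, y1 - y0
    t0, t1 = 0.0, 1.0
    for p, q in ((-dx, x0 - win["x_min"]), (dx, win["x_max"] - x0),
                 (-dy, y0 - win["y_min"]), (dy, win["y_max"] - y0)):
        if p == 0:
            if q < 0:          # parallel to this edge and outside it
                return None
        else:
            t = q / p
            if p < 0:
                t0 = max(t0, t)
            else:
                t1 = min(t1, t)
    if t0 > t1:
        return None
    return ((x0 + t0 * dx, y0 + t0 * dy), (x0 + t1 * dx, y0 + t1 * dy))

def accept_fitted_segment(p0, p1, win, min_len):
    """Keep the in-window portion only if it exceeds the fitting length
    threshold, as required before a segment joins the window's group."""
    clipped = clip_segment_to_window(p0, p1, win)
    if clipped is None:
        return None
    (a, b), (c, d) = clipped
    return clipped if math.hypot(c - a, d - b) > min_len else None
```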
Further, while the TOF sensor of the robot (the single-point ranging sensor) detects the boundary of the same wall, each time the robot fits a line segment with a new trend from the wall boundary, it calculates the angle of the included angle between the currently fitted line segment with the new trend and the previously recorded fitted line segment with the same trend; if this included angle is greater than a preset fitting angle threshold, the inclination angle of the currently fitted line segment and the inclination angle of the previously recorded fitted line segment of the same trend are combined by weighted averaging to obtain a calibrated inclination angle, and the calibrated inclination angle is updated as the inclination angle of the previously recorded fitted line segment of the same trend. The line segment fitted by the robot from the wall boundary is determined by a target linear equation obtained by fitting, with the least squares method, the point cloud data collected within the corresponding angle range.
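The least-squares fit and the angle calibration above can be sketched as follows. The equal-weight default in `calibrate_angle` is an assumption, since the patent does not specify the weights:

```python
import math

def fit_inclination(points):
    """Least-squares line fit of scanned (x, y) position points; returns the
    inclination angle (radians) of the fitted line against the x-axis."""
    n = len(points)
    mx = sum(p[0] for p in points) / n
    my = sum(p[1] for p in points) / n
    sxx = sum((p[0] - mx) ** 2 for p in points)
    sxy = sum((p[0] - mx) * (p[1] - my) for p in points)
    return math.atan2(sxy, sxx)  # slope sxy/sxx expressed as an angle

def calibrate_angle(new_angle, recorded_angle, w_new=0.5):
    # Weighted average of the newly fitted and previously recorded angles.
    return w_new * new_angle + (1 - w_new) * recorded_angle
```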
Further, in step 3 the robot sets one line feature subgraph of a line feature subgraph pair participating in matching as the reference line feature subgraph, and the other as the line feature subgraph to be matched; it then translates the line feature subgraph to be matched so that it completely coincides with, or completely covers, the reference line feature subgraph. The translated line feature subgraph to be matched is set as the target matching line feature subgraph, in which every fitted line segment is translated relative to the corresponding fitted line segment in the line feature subgraph to be matched. The robot then performs similarity matching between the fitted line segments in the target matching line feature subgraph and the fitted line segments at the corresponding positions in the reference line feature subgraph.
Further, the method by which the robot performs similarity matching between a fitted line segment in the target matching line feature subgraph and the fitted line segment at the corresponding position in the reference line feature subgraph comprises: step 31, calculating the ratio of the absolute difference between the length of the fitted line segment in the target matching line feature subgraph and the length of the fitted line segment at the corresponding position in the reference line feature subgraph to the length of the fitted line segment in the reference line feature subgraph, recorded as the difference rate in the line segment length dimension; calculating the ratio of the absolute difference between the inclination angle of the fitted line segment in the target matching line feature subgraph and the inclination angle of the fitted line segment at the corresponding position in the reference line feature subgraph to the inclination angle of the fitted line segment in the reference line feature subgraph, recorded as the difference rate in the inclination angle dimension; calculating the ratio of the number of coordinate positions through which the two fitted line segments pass together to the number of coordinate positions through which the fitted line segment in the reference line feature subgraph passes, recorded as the coincidence rate in the position dimension; step 32, for a fitted line segment in the target matching line feature subgraph and the fitted line segment at the corresponding position in the reference line feature subgraph, when the difference rate in the line segment length dimension is less than or equal to a preset difference rate, the difference rate in the inclination angle dimension is less than or equal to the preset difference rate, and the coincidence rate in the position dimension is greater than or equal to a preset coincidence rate, determining that the similarity matching of these two fitted line segments is successful. The coordinate offset of the target matching line feature subgraph relative to the line feature subgraph to be matched equals the coordinate offset of the rotation center position corresponding to the target matching line feature subgraph relative to the rotation center position corresponding to the line feature subgraph to be matched. The sum of the preset difference rate and the preset coincidence rate equals 100 percent.
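Steps 31 and 32 can be sketched directly. The segment representation (length, inclination angle, set of traversed map cells) and the function name are assumptions for illustration:

```python
def segments_match(seg_target, seg_ref, max_diff_rate, min_coincidence):
    """Similarity test per steps 31-32. Each segment is a tuple
    (length, inclination_angle, set_of_traversed_cells); seg_ref is the
    reference-subgraph segment that all ratios are taken against."""
    len_t, ang_t, cells_t = seg_target
    len_r, ang_r, cells_r = seg_ref
    len_diff = abs(len_t - len_r) / len_r            # difference rate, length
    ang_diff = abs(ang_t - ang_r) / ang_r            # difference rate, angle
    coincide = len(cells_t & cells_r) / len(cells_r)  # coincidence, position
    return (len_diff <= max_diff_rate
            and ang_diff <= max_diff_rate
            and coincide >= min_coincidence)
```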
Further, the method by which the robot performs similarity matching between a fitted line segment in the target matching line feature subgraph and the fitted line segment at the corresponding position in the reference line feature subgraph may alternatively comprise: step 31, calculating the ratio of the absolute difference between the lengths of the two fitted line segments to the length of the longer of the two, recording this ratio as a first ratio, and recording the difference between 1 and the first ratio as the coincidence rate in the line segment length dimension; calculating the ratio of the absolute difference between the inclination angles of the two fitted line segments to the larger of the two inclination angles, recording this ratio as a second ratio, and recording the difference between 1 and the second ratio as the coincidence rate in the inclination angle dimension; calculating the ratio of the number of coordinate positions through which the two fitted line segments pass together to the larger of the numbers of coordinate positions through which each of them passes, recorded as the coincidence rate in the position dimension; step 32, for a fitted line segment in the target matching line feature subgraph and the fitted line segment at the corresponding position in the reference line feature subgraph, when the sum of the coincidence rate in the line segment length dimension, the coincidence rate in the inclination angle dimension and the coincidence rate in the position dimension is greater than or equal to the preset coincidence rate, determining that the similarity matching of these two fitted line segments is successful.
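The alternative criterion sums three coincidence rates, each normalized by the larger of the two values so the rate stays in [0, 1]. A sketch with the same assumed segment representation as before:

```python
def combined_coincidence(seg_a, seg_b):
    """Sum of the three coincidence rates (length, inclination angle,
    position) for the alternative matching criterion; each segment is a
    tuple (length, inclination_angle, set_of_traversed_cells)."""
    len_a, ang_a, cells_a = seg_a
    len_b, ang_b, cells_b = seg_b
    len_co = 1 - abs(len_a - len_b) / max(len_a, len_b)
    ang_co = 1 - abs(ang_a - ang_b) / max(ang_a, ang_b)
    pos_co = len(cells_a & cells_b) / max(len(cells_a), len(cells_b))
    return len_co + ang_co + pos_co

def segments_match_alt(seg_a, seg_b, preset_coincidence):
    # Match succeeds when the summed coincidence reaches the preset value.
    return combined_coincidence(seg_a, seg_b) >= preset_coincidence
```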
Further, before the robot has finished matching every fitted line segment in the target matching line feature subgraph with the fitted line segment at the corresponding position in the reference line feature subgraph, or when it detects that the ratio of the number of successfully matched fitted line segment pairs to the number of fitted line segment pairs participating in similarity matching is smaller than the first preset success rate, the rotation center position corresponding to the target matching line feature subgraph is set as the offset starting point position; the target matching line feature subgraph is then translated from the offset starting point position along a given coordinate axis direction by a preset translation step. While the coordinate offset by which the target matching line feature subgraph has been translated in the same coordinate axis direction from the offset starting point position has not reached the maximum preset offset, each time the target matching line feature subgraph is translated by one preset translation step, the robot judges whether the ratio of the number of successfully matched fitted line segment pairs to the number of fitted line segment pairs participating in similarity matching is greater than or equal to the first preset success rate; if so, the target matching line feature subgraph and the reference line feature subgraph are successfully matched, the coordinate offset by which the target matching line feature subgraph has been translated in the latest translation direction from the offset starting point position is set as an error coordinate offset, and the translation is stopped; otherwise the robot changes the given coordinate axis direction to the opposite or a perpendicular direction, updates that opposite or perpendicular direction as the given coordinate axis direction, and translates the target matching line feature subgraph by the preset translation step along the new given coordinate axis direction from the offset starting point position. If the coordinate offset translated in the same coordinate axis direction from the offset starting point position has reached the maximum preset offset while the robot judges that this ratio is still smaller than the first preset success rate, the robot likewise changes the given coordinate axis direction to the opposite or a perpendicular direction, updates it as the given coordinate axis direction, and then translates the target matching line feature subgraph by the preset translation step along that direction from the offset starting point position. Before the coordinate offset translated in the given coordinate axis direction reaches the maximum preset offset, each time the same target matching line feature subgraph is translated by the preset translation step along the given coordinate axis direction, the translated subgraph is updated as the target matching line feature subgraph, and then step 31 and step 32 are executed; and each time the robot has matched every fitted line segment in the target matching line feature subgraph with the fitted line segment at the corresponding position in the reference line feature subgraph, it judges whether the ratio of the number of successfully matched fitted line segment pairs to the number of fitted line segment pairs participating in similarity matching is greater than or equal to the first preset success rate.
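The translation search above amounts to stepping the subgraph along each axis direction in turn, bounded by the maximum preset offset, and stopping at the first offset whose segment match rate reaches the first preset success rate. A sketch, with the match-rate evaluation abstracted into a caller-supplied callback (an assumption, since the patent evaluates it via steps 31-32):

```python
def search_error_offset(match_rate_at, step, max_offset, min_rate):
    """Translate the target subgraph from the offset starting point along
    +x, -x, +y, -y, one preset step at a time, until the fitted-segment
    match rate reaches min_rate. match_rate_at(dx, dy) returns the success
    rate at translation (dx, dy). Returns the error coordinate offset, or
    None when every direction is exhausted (subgraph pair fails to match)."""
    for ux, uy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
        k = 1
        while k * step <= max_offset:      # bounded by the max preset offset
            dx, dy = ux * k * step, uy * k * step
            if match_rate_at(dx, dy) >= min_rate:
                return dx, dy              # error coordinate offset found
            k += 1
    return None
```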
Further, after the coordinate offsets by which the target matching line feature subgraph has been translated from the offset starting point position along all coordinate axis directions have reached the maximum preset offset, if the robot judges that the ratio of the number of successfully matched fitted line segment pairs to the number of fitted line segment pairs participating in similarity matching is smaller than the first preset success rate, it determines that the matching of the target matching line feature subgraph and the reference line feature subgraph has failed. A preset maximum positioning error is associated with the maximum preset offset, and the maximum preset offset comprises a maximum preset offset amount in the horizontal axis direction and a maximum preset offset amount in the vertical axis direction.
Further, in step 3, the robot sets one line feature subgraph of the line feature subgraph pair participating in matching as a reference line feature subgraph, and sets the other line feature subgraph of the pair as the line feature subgraph to be matched; the rotation center position corresponding to the line feature subgraph to be matched is set as the offset starting point position; the robot then controls the line feature subgraph to be matched to translate from the offset starting point position along a given coordinate axis direction by a preset translation step length. While the coordinate offset accumulated by translating the line feature subgraph to be matched in the same coordinate axis direction from the offset starting point position has not reached the maximum preset offset, each time the line feature subgraph to be matched is translated by one preset translation step length, the robot judges whether the ratio of the number of all successfully matched fitting line segment pairs to the number of all fitting line segment pairs participating in similarity matching is greater than or equal to a first preset success rate; if so, the line feature subgraph to be matched and the reference line feature subgraph are successfully matched, the coordinate offset accumulated by translating the line feature subgraph to be matched from the offset starting point position along the latest translation direction is set as an error coordinate offset, and the line feature subgraph to be matched is then controlled to stop translating; otherwise, the robot adjusts the given coordinate axis direction to the direction opposite or perpendicular to it, updates that opposite or perpendicular direction as the new given coordinate axis direction, and controls the line feature subgraph to be matched to translate from the offset starting point position along the given coordinate axis direction by the preset translation step length. If the coordinate offset accumulated by translating the line feature subgraph to be matched in the same coordinate axis direction from the offset starting point position reaches the maximum preset offset while the robot judges that the ratio of the number of all successfully matched fitting line segment pairs to the number of all fitting line segment pairs participating in similarity matching is smaller than the first preset success rate, the robot likewise adjusts the given coordinate axis direction to the direction opposite or perpendicular to it, updates that direction as the new given coordinate axis direction, and then controls the line feature subgraph to be matched to translate from the offset starting point position along the given coordinate axis direction by the preset translation step length. Before the coordinate offset translated along the given coordinate axis direction reaches the maximum preset offset, each time the same line feature subgraph to be matched translates by one preset translation step length along the given coordinate axis direction, the translated line feature subgraph to be matched is updated, and each fitting line segment in the line feature subgraph to be matched is then controlled to perform similarity matching with the fitting line segment at the corresponding position in the reference line feature subgraph; each time the robot finishes matching every fitting line segment in the line feature subgraph to be matched with the fitting line segment at the corresponding position in the reference line feature subgraph, the robot judges whether the ratio of the number of all successfully matched fitting line segment pairs to the number of all fitting line segment pairs participating in similarity matching is greater than or equal to the first preset success rate. After the coordinate offsets accumulated by translating the line feature subgraph to be matched from the offset starting point position along all coordinate axis directions have reached the maximum preset offset, if the robot judges that the ratio of the number of all successfully matched fitting line segment pairs to the number of all fitting line segment pairs participating in similarity matching is still smaller than the first preset success rate, it is determined that the matching of the line feature subgraph to be matched and the reference line feature subgraph fails. The preset maximum positioning error is associated with the maximum preset offset, and the maximum preset offset comprises a maximum preset offset component in the horizontal axis direction and a maximum preset offset component in the vertical axis direction.
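The translation search described above can be sketched as follows. This is an illustrative sketch only: subgraphs are assumed to be lists of endpoint pairs, a caller-supplied `match_pair` predicate stands in for the per-segment similarity test of steps 31 and 32, and the four search directions are simply tried in turn rather than alternated on failure as the patent describes (all names here are illustrative, not from the patent).

```python
def translation_search(to_match, reference, match_pair, step, max_offset, first_success_rate):
    """Translate the subgraph to be matched away from its offset starting
    point, one preset translation step at a time, until the fraction of
    successfully matched fitting line segment pairs reaches the first
    preset success rate; return the error coordinate offset, or None on
    failure once every axis direction has reached the maximum offset."""
    directions = [(step, 0.0), (-step, 0.0), (0.0, step), (0.0, -step)]
    n_steps = round(max_offset / step)
    for dx, dy in directions:
        for k in range(1, n_steps + 1):
            ox, oy = dx * k, dy * k
            shifted = [((x1 + ox, y1 + oy), (x2 + ox, y2 + oy))
                       for (x1, y1), (x2, y2) in to_match]
            pairs = list(zip(shifted, reference))     # corresponding positions
            matched = sum(1 for a, b in pairs if match_pair(a, b))
            if pairs and matched / len(pairs) >= first_success_rate:
                return (ox, oy)        # error coordinate offset
    return None                        # matching of the two subgraphs fails
```

The returned offset is exactly the error coordinate offset that step 5 later averages into the positioning coordinate compensation amount.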
Further, the method by which the robot performs similarity matching between a fitting line segment in the line feature subgraph to be matched and the fitting line segment at the corresponding position in the reference line feature subgraph comprises the following steps. Step 31: calculate the ratio of the absolute value of the difference between the length of one fitting line segment in the line feature subgraph to be matched and the length of the fitting line segment at the corresponding position in the reference line feature subgraph to the length of the fitting line segment at the corresponding position in the reference line feature subgraph, and record this ratio as the difference rate in the line segment length dimension; calculate the ratio of the absolute value of the difference between the inclination angle of the fitting line segment in the line feature subgraph to be matched and the inclination angle of the fitting line segment at the corresponding position in the reference line feature subgraph to the inclination angle of the fitting line segment at the corresponding position in the reference line feature subgraph, and record this ratio as the difference rate in the inclination angle dimension; calculate the ratio of the number of coordinate positions through which the fitting line segment in the line feature subgraph to be matched and the fitting line segment at the corresponding position in the reference line feature subgraph pass together to the number of coordinate positions through which the fitting line segment at the corresponding position in the reference line feature subgraph passes, and record this ratio as the coincidence rate in the position dimension. Step 32: for a fitting line segment in the line feature subgraph to be matched and the fitting line segment at the corresponding position in the reference line feature subgraph, when the difference rate in the line segment length dimension is smaller than or equal to a preset difference rate, the difference rate in the inclination angle dimension is smaller than or equal to the preset difference rate, and the coincidence rate in the position dimension is greater than or equal to a preset coincidence rate, determine that the overall coincidence rate of the two fitting line segments is greater than or equal to the preset coincidence rate, and further determine that the fitting line segment in the line feature subgraph to be matched and the fitting line segment at the corresponding position in the reference line feature subgraph are successfully matched in similarity matching; the sum of the preset difference rate and the preset coincidence rate is equal to 100%.
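A minimal sketch of this first similarity test, assuming each fitting line segment is represented by its length, its inclination angle, and the set of grid coordinates it passes through. Note that the patent's ratios divide by the reference segment's values, so a zero reference length or angle is assumed not to occur here; the 0.2/0.8 thresholds in the usage reflect the "difference rate plus coincidence rate equals 100%" relation.

```python
def match_by_difference_rates(seg_a, seg_b, max_diff_rate, min_overlap_rate):
    """seg_a: fitting line segment in the subgraph to be matched; seg_b:
    the segment at the corresponding position in the reference subgraph.
    Each segment is (length, incline_angle, cells), where cells is the set
    of coordinate positions the segment passes through."""
    len_a, ang_a, cells_a = seg_a
    len_b, ang_b, cells_b = seg_b
    length_diff_rate = abs(len_a - len_b) / len_b          # step 31, length dimension
    angle_diff_rate = abs(ang_a - ang_b) / ang_b           # step 31, angle dimension
    overlap_rate = len(cells_a & cells_b) / len(cells_b)   # step 31, position dimension
    return (length_diff_rate <= max_diff_rate and          # step 32 decision
            angle_diff_rate <= max_diff_rate and
            overlap_rate >= min_overlap_rate)
```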
Further, the method by which the robot performs similarity matching between a fitting line segment in the line feature subgraph to be matched and the fitting line segment at the corresponding position in the reference line feature subgraph may also comprise the following steps. Step 31: calculate the ratio of the absolute value of the difference between the lengths of the two fitting line segments to the longer of the two lengths, record this ratio as a first ratio, and record the difference between the value 1 and the first ratio as the coincidence rate in the line segment length dimension; calculate the ratio of the absolute value of the difference between the inclination angles of the two fitting line segments to the larger of the two inclination angles, record this ratio as a second ratio, and record the difference between the value 1 and the second ratio as the coincidence rate in the inclination angle dimension; calculate the ratio of the number of coordinate positions through which the two fitting line segments pass together to the number of coordinate positions of whichever fitting line segment passes through more coordinate positions, and record this ratio as the coincidence rate in the position dimension. Step 32: for a fitting line segment in the line feature subgraph to be matched and the fitting line segment at the corresponding position in the reference line feature subgraph, when the sum of the coincidence rate in the line segment length dimension, the coincidence rate in the inclination angle dimension, and the coincidence rate in the position dimension is greater than or equal to the preset coincidence rate, determine that the two fitting line segments are successfully matched in similarity matching.
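The second variant can be sketched similarly. Normalising by the larger of the two values keeps each per-dimension coincidence rate within [0, 1], so here the preset coincidence rate is a threshold on the sum of the three rates; the value 2.4 used below (0.8 per dimension on average) is an illustrative choice, not a figure from the patent.

```python
def match_by_coincidence_rates(seg_a, seg_b, preset_coincidence_rate):
    """Each segment is (length, incline_angle, cells); each rate is one
    minus the normalised absolute difference (step 31), and matching
    succeeds when the three rates sum past the threshold (step 32)."""
    len_a, ang_a, cells_a = seg_a
    len_b, ang_b, cells_b = seg_b
    length_rate = 1.0 - abs(len_a - len_b) / max(len_a, len_b)
    angle_rate = 1.0 - abs(ang_a - ang_b) / max(ang_a, ang_b)
    position_rate = len(cells_a & cells_b) / max(len(cells_a), len(cells_b))
    return length_rate + angle_rate + position_rate >= preset_coincidence_rate
```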
Further, in step 5, when the ratio of the number of all successfully matched line feature subgraph pairs to the number of all line feature subgraph pairs participating in matching is greater than or equal to a second preset success rate, the robot obtains a plurality of error coordinate offsets; if the robot judges that the number of error coordinate offsets currently obtained is greater than a second preset number threshold, the error coordinate offset with the maximum abscissa value, the error coordinate offset with the minimum abscissa value, the error coordinate offset with the maximum ordinate value, and the error coordinate offset with the minimum ordinate value are all removed from the error coordinate offsets currently obtained; the horizontal axis error coordinate offsets among the remaining error coordinate offsets are averaged, and the vertical axis error coordinate offsets among the remaining error coordinate offsets are averaged, to obtain an average coordinate offset; the average coordinate offset is set as the positioning coordinate compensation amount, and the robot adds the positioning coordinate compensation amount to its current position coordinate to obtain the corrected robot position coordinate; each error coordinate offset includes a horizontal axis error coordinate offset and a vertical axis error coordinate offset.
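A sketch of this outlier-trimmed averaging. The four extreme offsets are located first and each is removed at most once, since a single offset may hold two extremes at the same time, an edge case the patent does not spell out; handling it with a membership guard is an assumption made here.

```python
def positioning_compensation(offsets, second_count_threshold):
    """offsets: list of (x_err, y_err) error coordinate offsets from step 5.
    When more than the threshold are available, drop the offsets holding the
    extreme abscissa and ordinate values, then average what remains."""
    kept = list(offsets)
    if len(kept) > second_count_threshold:
        extremes = (max(kept, key=lambda o: o[0]), min(kept, key=lambda o: o[0]),
                    max(kept, key=lambda o: o[1]), min(kept, key=lambda o: o[1]))
        for extreme in extremes:
            if extreme in kept:        # an offset may be extreme on both axes
                kept.remove(extreme)
    xs = [o[0] for o in kept]
    ys = [o[1] for o in kept]
    return (sum(xs) / len(xs), sum(ys) / len(ys))   # average coordinate offset
```

The corrected position is then the robot's current position coordinate plus the returned compensation amount.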
The invention has the beneficial technical effects that the robot drives the single-point ranging sensor to scan the surrounding environment through the rotation of the robot body, collects point cloud data within the area framed by the window and fits line segments to it; the fitted line segments then form one line feature subgraph at each rotation center position within the window, and these line feature subgraphs serve as the basis for matching two different line feature subgraphs in the same window.
For the matching of line feature subgraphs, the matching judgment uses the degree of coincidence between fitting line segments and the number of fitting line segments that satisfy the corresponding coincidence rate, which reduces computational complexity and improves calculation speed. To handle the position errors present in the line feature subgraphs participating in matching, the line feature subgraphs are translated along each coordinate axis direction, and the matching judgment is repeated using the coincidence degree between fitting line segments and the number of fitting line segments satisfying the corresponding coincidence rate, until the number of successfully matched fitting line segments satisfies the preset matching success rate; this overcomes the interference of coordinate offset errors and extracts a number of error amounts, which can conveniently be processed afterwards into the compensation amount for the robot's position coordinates. Specifically, an error coordinate offset for correcting or repositioning the robot position is obtained each time two target subgraphs are successfully matched, and a reasonable upper limit is set on the number of successfully matched line feature subgraphs in each window, which reduces the amount of calculation spent on retrieving target subgraphs and matching for positioning.
Therefore, pose positioning in a local area can be realized by relying only on the single-point ranging sensor, without specially modifying the surrounding environment; the method is convenient to operate, highly adaptable, and satisfies the real-time and robustness requirements of the positioning process.
Drawings
Fig. 1 is a flow chart of a robot positioning method based on line segment matching within a window according to an embodiment of the present invention.
Detailed Description
The following describes the technical solutions in the embodiments of the present invention in detail with reference to the accompanying drawings. To further illustrate the various embodiments, the invention is provided with accompanying drawings, which are incorporated in and constitute a part of this disclosure; they illustrate embodiments and, together with the description, serve to explain the principles of the embodiments. With reference to these, one of ordinary skill in the art will understand other possible embodiments and advantages of the present invention. A process or method may be depicted as a flowchart; although a flowchart depicts steps as a sequential process, many of the steps may be implemented in parallel, concurrently, or together with other steps, and the order of the steps may be rearranged. A process may be terminated when its operations are completed, but may also have additional steps not included in the figure; a process may correspond to a method, a function, a procedure, a subroutine, and the like.
It should be noted that the terms "comprises" and "comprising," and any variations thereof, in the description and claims of the present application and in the foregoing figures, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed or inherent to such process, method, article, or apparatus.
It should be noted that: like reference numerals and letters denote like items in the following figures, and thus once an item is defined in one figure, no further definition or explanation thereof is necessary in the following figures.
The term "and/or" is used herein to describe only one relationship, meaning that there may be three relationships, e.g., a and/or B, which may mean: a exists alone, A and B exist together, and B exists alone. In addition, the term "at least one" herein means any one of a plurality or any combination of at least two of a plurality, for example, including at least one of A, B, C, and may mean including any one or more elements selected from the group consisting of A, B and C.
As an embodiment, the invention discloses a robot positioning method based on line segment matching in a window, in which the robot can realize a local area positioning function using only one single-point ranging sensor together with inertial navigation in which a gyroscope measures the angle; the execution body of the robot positioning method is a robot fixedly equipped with the single-point ranging sensor and the gyroscope. The single-point ranging sensor is used to collect point cloud data of the environment where the robot is located and to mark the point cloud data in a map. Since the single-point ranging sensor has no rotating mechanism like that of a laser radar, the sensor fixedly arranged on the robot body must be driven to scan the surrounding environment by the rotation of the robot body (generally a 360-degree rotation), so as to obtain discrete contour points (which can be regarded as position points) reflecting the environment and the point cloud image formed by these points; that is, the environmental features around the robot are collected to obtain point cloud data, the point cloud data are processed to obtain the relevant straight lines, and the environment image (belonging to the map) is constructed by means of these straight lines.
The gyroscope is used to collect the rotation angle of the robot; when the robot rotates, the gyroscope and the single-point ranging sensor collect data synchronously, and the map constructed by the robot must also be converted according to the angle collected by the gyroscope, for example by converting the pose information contained in the point cloud data into a common coordinate system. In some embodiments, the gyroscope accumulates drift errors and the constructed map easily deviates; this embodiment therefore performs fusion processing with the point cloud data collected by the single-point ranging sensor, so that the robot can obtain positioning information with a certain precision. Preferably, the single-point ranging sensor may be a TOF sensor fixedly installed on either of the two sides of the robot body, the left side or the right side, as long as the distance to an obstacle or wall on the side where the sensor is located can be measured accurately while the robot walks; this distance is limited by the maximum ranging distance of the TOF sensor, that is, the effective ranging distance is smaller than or equal to the maximum ranging distance while scanning surrounding obstacles, so a certain point cloud error or precision tolerance is allowed, and the pose information during the robot's movement is acquired and recorded by sampling at a certain configured frequency.
Note that TOF stands for time of flight. A TOF sensor typically measures with a specific artificial signal source, that is, it calculates the distance between an emitter and a reflector by measuring the "time of flight" of an ultrasonic, microwave, or light signal between the two. TOF sensors are widely used, and many of them measure distance by infrared or laser light.
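The ranging principle reduces to distance = signal speed × time of flight / 2, since the signal travels from the emitter to the reflecting surface and back; a trivial illustration for a light-based TOF sensor:

```python
SPEED_OF_LIGHT = 299_792_458.0  # m/s

def tof_distance(time_of_flight_s, signal_speed=SPEED_OF_LIGHT):
    """Distance to the reflecting surface: the signal covers the path twice,
    so the one-way distance is half of speed times round-trip time."""
    return signal_speed * time_of_flight_s / 2.0
```

For an ultrasonic TOF sensor the same formula applies with the speed of sound substituted for the speed of light.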
As an embodiment, as shown in fig. 1, the robot positioning method includes:
In step S1, the robot collects point cloud data through the single-point ranging sensor. In this embodiment, the robot drives the single-point ranging sensor fixedly arranged on its body to scan the surrounding environment by rotating in place, obtaining discrete points (which can be regarded as position points) on the contour of the surrounding environment and forming a point cloud subgraph in a local area; that is, the environmental features around the robot are collected to obtain the point cloud data. Since most of the points are discrete, the relevant straight-line segments obtained by subsequently processing the point cloud data are not necessarily continuous straight-line segments. The robot also constructs the current window in the map according to the maximum ranging distance of the single-point ranging sensor. The robot starts moving and then executes step S1: in some embodiments, the robot turns in place after being powered on and started, or rotates in place after leaving the charging pile; the robot then moves forward by one room width in the current working area, or moves until it collides with a wall, and then rotates to change its current advancing direction; it then constructs the current window with the straight line along the latest advancing direction as a baseline, making the current window symmetric about the baseline, so that a larger working area is covered at the boundary position and, in particular, more point cloud subgraphs can be enclosed. Step S2 is then performed.
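Window construction can be sketched as follows. An axis-aligned rectangle centred on the robot with half-width equal to the sensor's maximum ranging distance is an assumption made here for simplicity; the patent only requires that the window be symmetric about the baseline and sized from the maximum ranging distance.

```python
def build_window(robot_pos, max_range):
    """Return an axis-aligned window (x_min, y_min, x_max, y_max) centred
    on the robot, sized from the sensor's maximum ranging distance."""
    x, y = robot_pos
    return (x - max_range, y - max_range, x + max_range, y + max_range)

def in_window(window, point):
    """True when a point cloud position falls inside the window frame."""
    x_min, y_min, x_max, y_max = window
    return x_min <= point[0] <= x_max and y_min <= point[1] <= y_max
```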
S2, the robot rotates in sequence at different rotation center positions in the current window and performs fitting processing on the point cloud data collected by the rotation at each rotation center position to obtain the corresponding group of fitting line segments in the current window; each group of fitting line segments forms one line feature subgraph, so the robot obtains a plurality of line feature subgraphs in the current window; step S3 is then performed. In step S2, each time the robot moves to a rotation center position, it scans a plurality of fitting line segments at that position (equivalent to observation lines constructed from local point cloud information, specifically the point cloud information acquired by the TOF sensor fixed on the robot), and all of these fitting line segments then form one line feature subgraph, which corresponds to a set of fitting line segments in a local area, so that the corresponding fitting line segments can be recorded in the map. In step S2, the rotation center positions may be arranged symmetrically in the current window, in particular symmetrically about the aforementioned baseline; the robot performs fitting processing at one rotation center position and records one group of fitting line segments in the current window, so one rotation center position corresponds to one group of fitting line segments and thus to the scanning of one line feature subgraph. Therefore, when the robot forms a line feature subgraph, besides recording the line feature subgraph itself, it also needs to record the rotation center position (comprising the coordinate information and the initial angle information of the robot, i.e. the angle at the moment it has just moved to the rotation center position) and the corresponding creation time.
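The fitting at one rotation center position can be sketched in two stages: converting the rotating single-point range readings into map coordinates, then fitting one segment to a cluster of contour points by projecting them onto the cluster's principal direction (a total-least-squares fit). Splitting the scan into clusters is omitted, and this is an illustrative sketch rather than the patent's prescribed fitting algorithm.

```python
import math

def scan_to_points(center, readings):
    """Convert (heading_angle_rad, range_m) readings taken while the robot
    rotates at a rotation center position into map coordinates."""
    cx, cy = center
    return [(cx + r * math.cos(a), cy + r * math.sin(a)) for a, r in readings]

def fit_segment(points):
    """Fit one line segment to a cluster of contour points; returns the two
    endpoints and the inclination angle of the fitted segment."""
    n = len(points)
    mx = sum(p[0] for p in points) / n
    my = sum(p[1] for p in points) / n
    sxx = sum((p[0] - mx) ** 2 for p in points)
    syy = sum((p[1] - my) ** 2 for p in points)
    sxy = sum((p[0] - mx) * (p[1] - my) for p in points)
    theta = 0.5 * math.atan2(2.0 * sxy, sxx - syy)   # principal direction
    ts = [(p[0] - mx) * math.cos(theta) + (p[1] - my) * math.sin(theta)
          for p in points]                           # projections onto the line
    t0, t1 = min(ts), max(ts)
    a = (mx + t0 * math.cos(theta), my + t0 * math.sin(theta))
    b = (mx + t1 * math.cos(theta), my + t1 * math.sin(theta))
    return a, b, theta
```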
In some embodiments, the robot also creates the line feature subgraph when walking along the planned I-shaped path and records the subgraph in the map.
S3, in the current window, the robot controls the fitting line segments in each line feature subgraph to be similarity-matched with the fitting line segments in the other line feature subgraphs; step S4 is then performed. Specifically, within a window established in advance, the robot controls each fitting line segment in each line feature subgraph to be similarity-matched with the fitting line segment at the corresponding position in another line feature subgraph, using the one-to-one correspondence of fitting line segments in two different line feature subgraphs to perform similarity matching on the positional relationships of the line segment attributes, specifically the degree of coincidence (or similarity) between two line segments. Here, the relative positional relationships (on which side of a point the line lies, the distance between the line and the point, and so on) of the two fitting line segments participating in similarity matching with respect to the center of the window (the origin of the coordinate system to which the window belongs) are equivalent; or the relative positional relationships of the two fitting line segments with respect to the rotation center position of the line feature subgraph in which each fitting line segment lies are equivalent; or the two fitting line segments participating in similarity matching are parallel fitting line segments belonging respectively to the two line feature subgraphs within the current window.
To pursue more comprehensive matching, the two fitting line segments participating in similarity matching may also be set as any fitting line segment in one line feature subgraph and any fitting line segment in another line feature subgraph within the same window; similarity matching of the positional relationships of line segment attributes is carried out one by one between the fitting line segments of two different line feature subgraphs in the same window, building a one-to-one matching relationship of fitting line segments between any two line feature subgraphs, which facilitates the similarity matching of the fitting line segments in each line feature subgraph in the current window with those in the other line feature subgraphs; on this basis, fitting line segments with low similarity can be eliminated in the local area.
When necessary, to make the matching of the fitting line segments easier, the line feature subgraph in which one of the fitting line segments participating in similarity matching lies may also be translated, so that that fitting line segment can be converted into the coordinate system corresponding to the other line feature subgraph (the subgraph in which the other fitting line segment participating in similarity matching lies); specifically, the point cloud information can be converted into a line segment in the coordinate system corresponding to the other line feature subgraph to form an observation line segment, which is then matched against the other fitting line segment participating in similarity matching; each observation line segment generated by the translation can be similarity-matched one by one with the original fitting line segments in the other line feature subgraph.
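The translation transformation described above can be sketched as follows. Only a pure translation is applied here, on the assumption (consistent with the patent's translation-only search) that the gyroscope supplies a shared heading reference, so no rotation between the two coordinate systems needs to be handled.

```python
def to_observation_segments(subgraph, offset):
    """Translate every fitting line segment of a line feature subgraph into
    the coordinate system of the other subgraph, yielding the observation
    segments that are then similarity-matched one by one."""
    dx, dy = offset
    return [((x1 + dx, y1 + dy), (x2 + dx, y2 + dy))
            for (x1, y1), (x2, y2) in subgraph]
```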
And S4, in the current window, when the matching success rate of the fitting line segments of two line feature subgraphs reaches a first preset success rate, the two line feature subgraphs are successfully matched and an error coordinate offset is obtained; step S5 is then executed. Specifically, after the robot completes the similarity matching of each fitting line segment in each line feature subgraph with the fitting line segments at the corresponding positions in the other line feature subgraphs in the current window, it determines that all fitting line segments in the current window have been matched. The first preset success rate is preferably 80%: if the ratio of the number of successfully matched fitting line segment pairs to the total number of fitting line segment pairs participating in similarity matching is greater than or equal to 80%, the two line feature subgraphs are determined to be successfully matched. To overcome the position errors marked in the map, one of the two line feature subgraphs is translated before its fitting line segments are similarity-matched against those of the other, so that the coordinate translation amount produced by that line feature subgraph when the two sets of fitting line segments reach a certain degree of coincidence can be calculated; this translation amount may be a single-step error or an accumulated error. The matching between two line feature subgraphs can be understood as the similarity matching between the fitting line segments in the two subgraphs, or directly as the similarity matching between the two subgraphs themselves; at this point the number of successfully matched fitting line segments can be recorded, and the fitting line segments belonging to the two line feature subgraphs are controlled to stop similarity matching, that is, each fitting line segment in one of the two line feature subgraphs stops similarity matching with any fitting line segment in the other.
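The step S4 decision reduces to a ratio test over the fitting line segment pairs that took part in similarity matching; a minimal sketch using the preferred 80% first preset success rate as the default:

```python
def subgraphs_match(pair_results, first_success_rate=0.8):
    """pair_results: one boolean per fitting line segment pair that took
    part in similarity matching between two line feature subgraphs; the
    subgraphs match when the success ratio reaches the first preset
    success rate (80% is the patent's preferred value)."""
    if not pair_results:
        return False
    return sum(pair_results) / len(pair_results) >= first_success_rate
```

Step 5 applies the same kind of ratio test one level up, over line feature subgraph pairs, against the second preset success rate.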
Step S5, when the matching success rate between the line feature subgraphs in the current window reaches a second preset success rate, obtaining a plurality of error coordinate offsets, and determining that the current window is traversed, which can be understood as that any two different line feature subgraphs in the same window are completely matched in an exhaustive manner, that is, each line feature subgraph is matched with any one of the other line feature subgraphs, and then, preferably, the accumulated matching success rate between the same line feature subgraph and the other line feature subgraphs is recorded as the matching success rate (accumulation result) of the line feature subgraphs; or, in order to reduce the calculation amount, setting all the line feature subgraphs participating in matching as one line feature subgraph fixed in the current window and any one line feature subgraph of the rest, and matching the same line feature subgraphs or the same pair of fitting line segments only once, thereby completing the matching traversal of all the line feature subgraphs in the current window. 
Then, all the obtained error coordinate offsets are averaged: specifically, all the error coordinate offsets, or the representative ones among them, are summed and averaged to obtain the positioning coordinate compensation amount as the optimal error amount; the current position coordinate of the robot is corrected with this positioning coordinate compensation amount, where the compensation amount may be added to the robot's given current position coordinate (which may be the initial position of the robot before executing this positioning method) to obtain the optimal position information, thereby completing one positioning within the current window, which amounts to one local positioning of the robot pose; the current window is a local area that can be updated and changed. In this embodiment, the matching success rate between line feature subgraphs in the current window increases with the success rate of matching each pair of line feature subgraphs in the current window, where the success rate of matching two line feature subgraphs is determined by step S4. The second preset success rate is preferably 50%: since the line feature subgraphs in the same window are not numerous, the robot can traverse all line feature subgraphs in the window for matching comparison, and if half or more of the pairs are matched successfully (for example, if 6 pairs of line feature subgraphs participate in matching in the current window and the fitting line segments in 3 of those pairs reach a matching success rate of more than 80%, the first preset success rate), the local positioning of the robot pose can be performed through step S5.
Alternatively, if the current window has 6 line feature subgraphs, and the fitting line segments in 3 of these line feature subgraphs can each reach a matching success rate of more than 80% (the first preset success rate) with the fitting line segments in one fixed line feature subgraph, the local positioning of the robot pose can likewise be performed through step S5.
The robot controls the single-point ranging sensor to scan the surrounding environment through the rotation of the robot body, the point cloud data is collected in the area defined by the window frame and the line segments are fitted, the fitted line segments form line feature subgraphs in the window to form line feature subgraphs at each rotation center position, and the line feature subgraphs serve as the basis of matching of two different line feature subgraphs in the same window. Therefore, the pose positioning under the local area can be realized by relying on the single-point ranging sensor without specially modifying the surrounding environment, and the method has the advantages of convenience in operation, strong adaptability and capability of meeting the requirements of strong instantaneity and strong robustness in the positioning process.
On the basis of the above embodiment, after the robot has performed step S5, it also needs to perform the following: when the robot selects one window adjacent to the current window as the next window, it first moves to the next window and updates it to be the current window; to avoid repeated traversal, the next window is updated to the current window only on the premise that it is detected as not yet traversed. Steps S2 to S5 are then executed repeatedly until the robot has traversed a first preset number of windows, i.e. the robot has matched all line feature subgraphs in the first preset number of windows according to steps S2 to S5 without repetition; positioning of the robot is then completed, and positioning of the global area is completed by traversing one local area after another. The area covered by the first preset number of windows includes all windows that can be constructed in the neighborhood of the current window; a neighborhood of the current window may be a window area of the same size as the current window, and two adjacent windows have equal coverage areas, with a certain interval allowed between them.
In the present invention, all the windows are rectangular areas framed in the map and are used to divide the global working area into regions and to limit the coverage of the line feature subgraphs scanned by the robot within the corresponding region. Whether the current window or the next window, each divides the global working area into corresponding subareas, one window corresponding to one subarea, so that the robot can conveniently traverse the global working area region by region in a preset traversal order. Each window may be a concrete border frame associated with the actual position of the robot, or a rectangular area centered on the position of the robot in each working cycle; it can be reflected in the map as a marker of a region within the global working area, and point cloud information can specifically be searched within the framed area. The size of the window is related to the maximum ranging distance of the single-point ranging sensor; the window conveniently frames the point cloud data acquired by the TOF sensor and reflects the contour information of the environment, and one window contains a plurality of line feature subgraphs.
As an embodiment, in step S1, a method of constructing a current window in a map according to a maximum ranging distance of a single-point ranging sensor includes: taking the position of the robot when the step S1 is executed as the center of the current window, extending a first extending distance along the horizontal left side of the robot (corresponding to the left side of the advancing direction of the robot), and extending a second extending distance along the horizontal right side of the robot (corresponding to the right side of the advancing direction of the robot) to extend and form the transverse side length of the current window, wherein the longitudinal side length of the current window is preferably equal to the transverse side length of the current window, or the longitudinal side length of the current window is preferably greater than or equal to the diameter of the body of the robot, so that a local area is defined in a map to facilitate the local positioning of the robot; the map is a grid map pre-constructed by the robot, and can be formed by converting point cloud data acquired by a TOF sensor and angle information acquired by a gyroscope; the first extension distance and the second extension distance are equal to half of the maximum ranging distance of the single-point ranging sensor, and the lateral side length of the current window is equal to the maximum ranging distance of the single-point ranging sensor; wherein the current window is rectangular in shape. Preferably, the maximum distance measurement distance of the single-point distance measurement sensor is 4 meters, the lateral side length of the current window is equal to 4 meters, the distance between the longitudinal side of the current window and the current position of the robot in the lateral direction is equal to 2 meters, and the window is convenient to frame point cloud data acquired by the TOF sensor and reflect the contour information of the environment.
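Under the preferred values above (maximum ranging distance 4 meters, first and second extension distances of 2 meters each), the window construction can be sketched as follows; the tuple layout and function name are assumptions:

```python
def build_current_window(robot_xy, max_range):
    """Rectangle centered on the robot's position when step S1 is executed.

    First extension distance == second extension distance == max_range / 2,
    so the lateral side length equals the maximum ranging distance; the
    longitudinal side is chosen equal to the lateral side in this sketch.
    """
    ext = max_range / 2.0
    x, y = robot_xy
    return (x - ext, y - ext, x + ext, y + ext)  # (xmin, ymin, xmax, ymax)
```

For a robot at the origin with a 4-meter sensor, this yields a 4 m by 4 m square whose longitudinal sides are each 2 meters from the robot, matching the preferred embodiment.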
As one embodiment, the windows constructed in the neighborhood of the current window include the window adjacent to its upper side, the window adjacent to its lower side, the window adjacent to its left side, and the window adjacent to its right side; each window constructed in the neighborhood of the current window has the same shape and the same size as the current window, so that the windows constructed in the neighborhood of the current window are equivalent to the four-neighborhood of one window on the map.
Specifically, the abscissa of each point of the window adjacent to the upper side of the current window is equal to the abscissa of each point of the current window, and the difference between the ordinate of each vertex of the window adjacent to the upper side of the current window and the ordinate of the vertex at the same positional relationship of the current window is the longitudinal side length of the current window. The abscissa of each point of the window adjacent to the lower side of the current window is equal to the abscissa of each point of the current window, and the difference between the ordinate of each vertex of the current window and the ordinate of the vertex at the same positional relationship of the window adjacent to the lower side of the current window is the longitudinal side length of the current window. The difference between the abscissa of each vertex of the window adjacent to the right side of the current window and the abscissa of the vertex at the same positional relationship of the current window is the lateral side length of the current window. The difference between the abscissa of each vertex of the current window and the abscissa of the vertex at the same positional relationship of the window adjacent to the left side of the current window is the lateral side length of the current window. Wherein the same positional relationship is indicative of the relative positional relationship of two vertices with respect to the center of the window in which they are located being the same, including direction and distance. 
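The vertex relations above amount to shifting the current window by one full side length in each of the four directions; a sketch, assuming the rectangle encoding (xmin, ymin, xmax, ymax):

```python
def neighbor_windows(window):
    """Four neighbours of a window: shifted up/down by the longitudinal side
    length and left/right by the lateral side length, so every vertex differs
    from the vertex in the same positional relationship by exactly one side."""
    xmin, ymin, xmax, ymax = window
    w, h = xmax - xmin, ymax - ymin
    return {
        "up":    (xmin, ymin + h, xmax, ymax + h),
        "down":  (xmin, ymin - h, xmax, ymax - h),
        "left":  (xmin - w, ymin, xmax - w, ymax),
        "right": (xmin + w, ymin, xmax + w, ymax),
    }
```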
Each window, when serving as the next window, may be traversed repeatedly by the robot; each time the robot starts to re-traverse a window, it can reuse the association information of the line feature subgraphs in that window, where the association information includes the similarity matching results between the fitted line segments of each line feature subgraph and those of the other line feature subgraphs involved in step S3, as well as the matching success rate of the fitted line segments of two line feature subgraphs and the error coordinate offset in step S4, so that the positioning of the robot in the corresponding window can be corrected and obtained directly.
As an embodiment, in step S4, the matching success rate of the fitted line segments of two line feature subgraphs is the ratio of the number of successfully matched fitted line segment pairs to the number of all fitted line segment pairs participating in the similarity matching. A fitted line segment pair participating in the similarity matching is composed of one fitted line segment from each of two line feature subgraphs in the current window, where the relative positional relationship of the one fitted line segment within its own line feature subgraph is equivalent to the relative positional relationship of the other fitted line segment within the other line feature subgraph; preferably, the fitted line segment in the one line feature subgraph and the fitted line segment in the other line feature subgraph are parallel.
In step S5, the matching success rate between the line feature subgraphs in the current window is the ratio of the number of successfully matched line feature subgraph pairs to the number of all line feature subgraph pairs participating in matching; a line feature subgraph pair consists of two different line feature subgraphs in the current window, so the robot can interpret the matching success rate described in step S5 as the matching success rate of line feature subgraph pairs. When the line feature subgraph pairs are formed from any two different line feature subgraphs in the same window, i.e. each line feature subgraph is matched against every one of the rest, the matching success rate between the line feature subgraphs in the current window can be set as the matching success rate between any two different line feature subgraphs. When the line feature subgraph pairs are formed from one and the same line feature subgraph and any one of the remaining line feature subgraphs in the current window, the matching success rate between the line feature subgraphs in the current window can be set as the accumulated value of the matching success rates of that line feature subgraph with each of the rest, recorded as the matching success rate of that line feature subgraph (the statistical sum of the matching success rates over a plurality of line feature subgraph pairs), where that line feature subgraph can be any one line feature subgraph in the current window. 
In order to reduce the amount of calculation, the robot sets all the line feature subgraph pairs participating in matching to be one fixed line feature subgraph in the current window paired with any one of the rest, and matches each pair of line feature subgraphs, and each pair of fitted line segments, only once, thereby completing the matching traversal of all the line feature subgraphs in the current window; the matching success rate between the line feature subgraphs in the current window is then set as the matching success rate between the fixed line feature subgraph and the remaining line feature subgraphs, so that the number of all line feature subgraph pairs participating in matching is equal to the number of line feature subgraphs in the current window minus 1.
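Under this fixed-subgraph scheme, the number of participating pairs is the subgraph count minus one; a one-line sketch with an illustrative name:

```python
def fixed_subgraph_match_rate(n_subgraphs, n_successful_pairs):
    """One fixed subgraph matched against each of the others: the number of
    participating line feature subgraph pairs is n_subgraphs - 1."""
    return n_successful_pairs / (n_subgraphs - 1)
```

With 6 subgraphs in the window and 3 successful pairs, the rate is 3/5 = 60%, which clears the preferred 50% second preset success rate.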
As an embodiment, in step S2, each time the robot rotates one turn at a rotation center position (the robot rotates in place by 360 degrees), the single-point ranging sensor is controlled to collect point cloud data during the rotation, where the point cloud data includes the coordinate information and the angle information of the position points scanned by the single-point ranging sensor. The position points can be points representing the contour of obstacles in the surrounding environment of the robot, i.e. points in the acquired point cloud. The robot fits the position points within each angle range to a line segment of the corresponding trend, forming multiple straight line segments discretely distributed within the window; each angle range is determined according to the fitting model. For example, when fitting with the least square method, each angle range is determined according to the parameters of the target straight-line fitting function, and a line segment of the corresponding trend is formed; if the Hough transform is used to represent a straight line, the angle range of the included angle between the polar line and the x axis of the coordinate system is set in advance, so that the line segment to be fitted in each coordinate quadrant is determined and a line segment of the corresponding trend is formed. 
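As a hedged illustration of fitting the position points of one angle range by the least square method (a plain y = a*x + b fit; the patent does not fix the exact parametrization, and near-vertical segments would need the polar form instead):

```python
import math

def fit_segment(points):
    """Least-squares fit y = a*x + b through the points of one angle range,
    then clip to the x-extent of those points to get a fitted line segment.
    Returns (start, end, length, tilt_degrees)."""
    n = len(points)
    sx = sum(p[0] for p in points)
    sy = sum(p[1] for p in points)
    sxx = sum(p[0] * p[0] for p in points)
    sxy = sum(p[0] * p[1] for p in points)
    # Normal equations; assumes the points are not all at the same x.
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    b = (sy - a * sx) / n
    x0, x1 = min(p[0] for p in points), max(p[0] for p in points)
    start, end = (x0, a * x0 + b), (x1, a * x1 + b)
    length = math.hypot(end[0] - start[0], end[1] - start[1])
    tilt = math.degrees(math.atan2(end[1] - start[1], end[0] - start[0]))
    return start, end, length, tilt
```

The returned start point, end point, and inclination angle correspond to the information the patent records for each fitted line segment.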
Then the robot sets the part of the fitted line segments of the corresponding trend that lies within the current window as a group of fitted line segments in the current window; the group includes a plurality of fitted line segments, which may be those whose length and slope meet the requirements. This group of fitted line segments constitutes a line feature subgraph, i.e. a set of fitted line segments formed in the map, completing the construction of a line feature subgraph in the current window. In this embodiment, each time the robot composes one line feature subgraph in the map, it also records its coordinate information at the rotation center position and its initial angle information (the angle measured by the gyroscope before the robot starts to rotate one turn at the rotation center position), so that the position information of the robot can later be compensated in combination with the error coordinate offset.
Preferably, the shape of the current window is rectangular; the lateral side length of the current window is equal to the maximum ranging distance of the single-point ranging sensor; the longitudinal side length of the current window is greater than or equal to the diameter of the robot body; and half of the maximum ranging distance of the single-point ranging sensor is equal to the first extension distance. The robot sets the center of the current window as the first rotation center position; the robot sets the central axis of the current window in the longitudinal direction as a base line, and the longitudinal direction of the current window is set parallel to the advancing direction of the robot. The robot sets the two position points located symmetrically on either side of the base line, each separated from the center of the current window by half of the first extension distance, as the second-first rotation center position and the second-second rotation center position, which are located to the left and right of the first rotation center position respectively. The robot sets the two position points located symmetrically on either side of the base line, each separated from the center of the current window by the first extension distance, as the third-first rotation center position and the third-second rotation center position; these are located to the left and right of the first rotation center position respectively and may lie on the longitudinal edges of the current window, so that only half of each line feature subgraph scanned at the third-first and third-second rotation center positions lies within the current window, the other half being negligible. In this embodiment, only the fitted line segments within the current window are matched, which is to be understood as matching only the line feature subgraphs within the current window. It should be noted that the first rotation center position, the second-first and second-second rotation center positions, and the third-first and third-second rotation center positions all belong to the rotation center positions; the robot traverses each rotation center position in the current window sequentially and without repetition, so as to construct a plurality of line feature subgraphs in the current window, and only within the current window.
Preferably, the distance between the positions (rotation center positions) at which the robot makes two consecutive rotations within the current window (in-place rotations used to scan out line feature subgraphs) is limited by the effective ranging distance of the single-point ranging sensor: the distance between two consecutive rotation positions is less than or equal to the maximum ranging distance of the single-point ranging sensor. The distances between rotation center positions include the distance between the first rotation center position and a second rotation center position, the distance between the first rotation center position and a third rotation center position, and the distance between the two third rotation center positions, which are equal, in turn, to one quarter, one half, and the whole of the effective ranging distance of the single-point ranging sensor.
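Under these preferred spacings, and assuming the effective ranging distance equals the maximum ranging distance, the five rotation center positions can be sketched as follows (the key names are illustrative):

```python
def rotation_center_positions(window_center, effective_range):
    """Sketch of the preferred layout: the first rotation center at the window
    center, the second pair at 1/4 and the third pair at 1/2 of the effective
    ranging distance to its left and right along the lateral direction, so the
    stated distances (1/4, 1/2, and the whole effective range) all hold."""
    cx, cy = window_center
    return {
        "first":        (cx, cy),
        "second_left":  (cx - effective_range / 4.0, cy),
        "second_right": (cx + effective_range / 4.0, cy),
        "third_left":   (cx - effective_range / 2.0, cy),
        "third_right":  (cx + effective_range / 2.0, cy),
    }
```

With a 4-meter effective range, the third pair lands on the longitudinal edges of a 4 m wide window, consistent with the half-in-window subgraphs described above.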
Preferably, for the same detection direction, the effective ranging position point farthest from the single-point ranging sensor is marked as a ranging end point of the single-point ranging sensor, which corresponds to the farthest obstacle contour point that can be detected. While the robot rotates one turn at the same position, in particular when it rotates in place at a preset rotation speed, the higher the acquisition frame rate of the single-point ranging sensor, the smaller the distance between two adjacent ranging end points and the denser the acquired point cloud data, so the obtained obstacle information is more accurate and the positioning precision and accuracy can be guaranteed. Conversely, the lower the acquisition frame rate of the single-point ranging sensor, the larger the distance between two adjacent ranging end points and the sparser the acquired point cloud data. To meet the requirements of straight-line fitting and map matching, in this embodiment the distance between two adjacent ranging end points may be set equal to the arc length through which a ranging end point moves when the robot rotates in place by one degree, which is related to the body diameter of the robot, the rotation speed, and/or the acquisition frame rate of the single-point ranging sensor.
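The one-degree spacing mentioned above is simply an arc length; a sketch (4 meters being the preferred maximum ranging distance):

```python
import math

def end_point_spacing(ranging_distance_m, step_deg=1.0):
    """Arc length swept by a ranging end point when the robot rotates in place
    by step_deg degrees: s = d * step_deg * pi / 180."""
    return ranging_distance_m * math.radians(step_deg)
```

At a 4-meter ranging distance this gives roughly 0.07 m between adjacent ranging end points per degree of rotation.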
As an example of the robot correcting the measurement error of the relevant sensors, in step S2 the fitted line segment of the corresponding trend is also processed as follows. When the line segment of the corresponding trend fitted by the robot lies within the current window, if the robot detects that its length is greater than the preset fitting length threshold, the line segment of the corresponding trend is set as a fitted line segment and marked in the map, and its start-point coordinates, end-point coordinates, and inclination angle (the angle formed with the transverse axis of the coordinate system) are recorded, so that a fitted line segment is determined from the recorded information. Although a whole line (the corresponding straight-line fitting equation) is fitted, only the segment of the corresponding length within the current window is intercepted and taken as the fitted line segment, which is then added to the corresponding group of fitted line segments in the current window. 
When the fitted line segment of the corresponding trend extends from inside the current window to outside it, or from outside the current window to inside it (a longer fitted line segment passing through the current window), if the robot detects that the length of the part intercepted within the current window is greater than the preset fitting length threshold, that intercepted part is set as a fitted line segment and marked in the map, and its start-point coordinates, end-point coordinates, and inclination angle (the angle formed with the transverse axis of the coordinate system) are recorded, so that a fitted line segment is determined from the recorded information and then added to the corresponding group of fitted line segments in the current window. Specifically, the inclination angle is expressed as the included angle formed by the fitted line segment of the corresponding trend and the coordinate axis. 
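Both cases above (a segment entirely inside the window, and a segment crossing its border) reduce to clipping the fitted segment against the rectangle and then applying the fitting length threshold. A sketch using the standard Liang-Barsky parametric clip (the 1-meter default follows the preferred fitting length threshold; the function name is illustrative):

```python
def clip_to_window(p0, p1, window, fit_length_threshold=1.0):
    """Liang-Barsky clip of a fitted segment to the rectangular window; keep it
    only if the part inside the window exceeds the fitting length threshold."""
    xmin, ymin, xmax, ymax = window
    dx, dy = p1[0] - p0[0], p1[1] - p0[1]
    t0, t1 = 0.0, 1.0
    for p, q in ((-dx, p0[0] - xmin), (dx, xmax - p0[0]),
                 (-dy, p0[1] - ymin), (dy, ymax - p0[1])):
        if p == 0:
            if q < 0:
                return None          # parallel to this edge and fully outside
        else:
            t = q / p
            if p < 0:
                t0 = max(t0, t)
            else:
                t1 = min(t1, t)
    if t0 > t1:
        return None                  # no part of the segment inside the window
    a = (p0[0] + t0 * dx, p0[1] + t0 * dy)
    b = (p0[0] + t1 * dx, p0[1] + t1 * dy)
    length = ((b[0] - a[0]) ** 2 + (b[1] - a[1]) ** 2) ** 0.5
    return (a, b) if length > fit_length_threshold else None
```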
The preset fitting length threshold is preferably 1 meter. During robot walking, a single-point ranging sensor on the side (such as a TOF sensor) can measure the distance to a wall on that side well, so a line segment with a characteristic trend is easily fitted in step S2. During fitting, position points far from the target straight line need to be filtered out to reduce the fitting error; because the single-point ranging sensor measures distance directly, position points beyond the maximum ranging distance are easily removed. After a straight-line equation is obtained, the length of the fitted straight line segment is recorded and marked on the corresponding grid map, forming a fitted line segment in the current window, so that shorter and more distant line segments are eliminated by their length before matching and the subsequent matching precision is improved.
As an embodiment of the robot correcting the angle error of the gyroscope, for step S2: while the robot moves along the boundary of the same wall or walks along an i-shaped path, its TOF sensor can detect the same wall continuously. In that case the robot does not repeatedly record the same straight line segment in the map; instead, whenever it detects the boundary (straight line) of the same wall, it compares the currently detected straight line with the pre-recorded straight line segment, and if the included angle is found to be greater than 2 degrees, it determines that the angle measured by the gyroscope carries an angle error and that angle correction needs to be started. During robot walking, once a straight line is fitted, a preset database is searched for the fitted line segment of the corresponding trend so as to calculate the included-angle information between the two. In this embodiment, the criteria for judging two line segments to be different include different start-point coordinates, different end-point coordinates, or different inclination angles; the robot uses the least square method to determine the target straight-line equation, fitted from the point cloud data collected in the corresponding angle range, that represents the line segment of the corresponding trend.
Therefore, each time the robot fits a new line segment from the boundary of the wall (new relative to the previously fitted line of the corresponding trend), it calculates the angle between the currently fitted line segment of the new trend and the pre-recorded fitted line segment of the same trend (obtained by retrieving the line data of the corresponding trend from the preset database; the previously recorded fitted line segment may carry an angle error). If the angle between the currently fitted line segment of the new trend and the pre-recorded fitted line segment of the same trend is greater than the preset fitting angle threshold, a weighted average of the inclination angle of the currently fitted line segment and the inclination angle of the pre-recorded fitted line segment is computed to obtain a calibrated inclination angle, which is then updated into the record as the inclination angle of the fitted line segment of that trend. Preferably, to obtain the calibrated inclination angle, the robot applies a weight of 70% to the inclination angle of the currently fitted line segment of the new trend and a weight of 30% to the previously recorded inclination angle of the fitted line segment of the same trend; since the previously recorded inclination angle is the erroneous angle to be corrected, the weight applied to the inclination angle of the currently fitted line segment is made relatively large, so that the direction of the subsequently recorded fitted line segment is closer to the direction of the detected contour line of the actual environment.
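The 70%/30% weighted correction can be sketched as follows (threshold and weights as stated above; the function name is illustrative):

```python
def calibrate_tilt(new_tilt_deg, recorded_tilt_deg,
                   fit_angle_threshold_deg=2.0, new_weight=0.7):
    """If the newly fitted segment deviates from the recorded one by more than
    the fitting-angle threshold, blend the two tilts (70% new, 30% recorded);
    otherwise keep the recorded inclination angle unchanged."""
    if abs(new_tilt_deg - recorded_tilt_deg) <= fit_angle_threshold_deg:
        return recorded_tilt_deg
    return new_weight * new_tilt_deg + (1.0 - new_weight) * recorded_tilt_deg
```

For example, a newly fitted tilt of 10 degrees against a recorded 0 degrees exceeds the 2-degree threshold and is calibrated to 7 degrees, pulling the record toward the freshly observed wall direction.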
As an embodiment, in the foregoing steps S2 to S5, for the two line feature subgraphs participating in matching (one line feature subgraph pair in the same window), the robot sets one of them as the reference line feature subgraph and the other as the line feature subgraph to be matched; preferably, the line feature subgraph to be matched is a line feature subgraph other than the reference line feature subgraph within the current window. The robot then translates the line feature subgraph to be matched so that it coincides with, or completely covers, the reference line feature subgraph, and sets the translated line feature subgraph to be matched as the target match line feature subgraph; all the fitted line segments in the target match line feature subgraph are translated relative to the corresponding fitted line segments in the line feature subgraph to be matched, and the translation direction is not limited to being parallel to a coordinate axis or forming a preset included angle (such as 45 degrees or 30 degrees) with a coordinate axis. A linear mapping of the line feature subgraph to be matched is thus realized: the mapping of the line feature subgraph to be matched onto the reference line feature subgraph is completed by translation, and the coincidence of the rotation center position corresponding to the line feature subgraph to be matched with the rotation center position corresponding to the reference line feature subgraph (that is, the coincidence of the rotation center position corresponding to the target match line feature subgraph with that corresponding to the reference line feature subgraph) can be realized.
Corresponding to step S3, when the overall coincidence ratio of a fitted line segment in the target match line feature subgraph and the fitted line segment at the corresponding position in the reference line feature subgraph is greater than or equal to the preset coincidence ratio, the similarity matching of these two fitted line segments is determined to be successful, and they are recorded as one successfully matched fitted line segment pair. It should be noted that the robot needs to calculate, one by one, the overall coincidence ratio of each fitted line segment in the target match line feature subgraph with the fitted line segment at the corresponding position in the reference line feature subgraph, including the degree of coincidence in the segment-length dimension, the degree of coincidence in the inclination-angle dimension, and the coincidence ratio in the position dimension. In order to expand the scope of the search area, the fitted line segment at the corresponding position in the reference line feature subgraph can be any fitted line segment in the reference line feature subgraph; alternatively, in order to reduce the amount of calculation, it may be parallel to the fitted line segment of the target match line feature subgraph participating in the similarity matching, or have the same relative positional relationship (including angle and distance) with respect to the center of the current window.
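A hedged sketch of the three-dimension coincidence check: the patent specifies the dimensions (segment length, inclination angle, position) but not how they are combined, so the per-dimension similarity formulas and the equal-weight average below are assumptions, as is the 0.8 default gate:

```python
def segments_match(seg_a, seg_b, preset_ratio=0.8):
    """seg = (length, tilt_deg, grid_cells:set). Combine the coincidence in
    the length, inclination-angle, and position (shared grid cells) dimensions
    into one overall ratio and compare it against the preset coincidence ratio.
    The combination rule here is illustrative, not the patent's."""
    len_sim = 1.0 - abs(seg_a[0] - seg_b[0]) / max(seg_a[0], seg_b[0])
    ang_sim = 1.0 - abs(seg_a[1] - seg_b[1]) / 180.0
    pos_sim = len(seg_a[2] & seg_b[2]) / max(len(seg_a[2]), len(seg_b[2]))
    overall = (len_sim + ang_sim + pos_sim) / 3.0
    return overall >= preset_ratio
```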
After the robot completes the matching of each fitted line segment in the target match line feature subgraph with the fitted line segment at the corresponding position in the reference line feature subgraph according to step S3, it proceeds to step S4; when the ratio of the number of successfully matched fitted line segment pairs to the number of all fitted line segment pairs participating in the similarity matching is greater than or equal to the first preset success rate, the target match line feature subgraph is determined to match the reference line feature subgraph successfully, and they are recorded as one successfully matched line feature subgraph pair. In some embodiments, the relative positional relationships of the two fitted line segments participating in similarity matching with respect to the center of the window (the origin of the coordinate system to which the window belongs) are equivalent (information such as on which side of the point the line lies and the distance from the line to the point); or, the relative positional relationship of one fitted line segment participating in similarity matching with respect to the rotation center position of its target match line feature subgraph is equivalent to the relative positional relationship of the other fitted line segment with respect to the rotation center position of its reference line feature subgraph; alternatively, the two fitted line segments participating in the similarity matching are parallel fitted line segments belonging to the target match line feature subgraph and the reference line feature subgraph respectively. 
The number of all pairs of fitted line segments participating in the similarity matching may be equal to the number of fitted line segments within the target match line feature sub-graph or equal to the number of fitted line segments within the reference line feature sub-graph.
Corresponding to step S5, after the robot has traversed and matched all the line feature subgraphs in the current window, when the ratio of the number of successfully matched line feature subgraph pairs to the number of all line feature subgraph pairs participating in matching is greater than or equal to the second preset success rate, it is determined that the robot has completed the traversal of all the fitted line segments in all the line feature subgraphs within the current window and valid line feature subgraph pairs are obtained; the similarity matching between the fitted line segments in the target match line feature subgraphs and the fitted line segments at the corresponding positions in the reference line feature subgraphs then stops within the current window, and the successfully matched line feature subgraph pairs are used to locally position the robot. All the line feature subgraph pairs participating in matching are either any two line feature subgraphs in the current window, or one fixed line feature subgraph in the current window paired with any one of the rest. The coordinate offset of the target match line feature subgraph relative to the line feature subgraph to be matched is equal to the coordinate offset of the rotation center position corresponding to the target match line feature subgraph relative to the rotation center position corresponding to the line feature subgraph to be matched.
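The step-S5 gate reduces to a ratio test against the second preset success rate (preferably 50%); a one-line sketch with illustrative names:

```python
def window_localization_ready(n_pairs_matched, n_pairs_total,
                              second_preset_rate=0.5):
    """Step S5 gate: proceed to local positioning once the ratio of
    successfully matched line feature subgraph pairs to all participating
    pairs reaches the second preset success rate."""
    return n_pairs_matched / n_pairs_total >= second_preset_rate
```

With 6 participating pairs, 3 successes reach exactly 50% and the robot may proceed to the local positioning of its pose, while 2 successes do not.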
On the basis of the above embodiment, the method for controlling the robot to perform similarity matching between a fitted line segment in the target match line feature subgraph and the fitted line segment at the corresponding position in the reference line feature subgraph includes the following steps. Step 31: calculate the ratio of the absolute value of the difference between the length of one fitted line segment in the target match line feature subgraph and the length of the fitted line segment at the corresponding position in the reference line feature subgraph to the length of the fitted line segment in the reference line feature subgraph, and record this ratio as the difference rate in the segment length dimension; in this way the length difference between the two fitted line segments is described as a ratio. Calculate the ratio of the absolute value of the difference between the inclination angle of the fitted line segment in the target match line feature subgraph and the inclination angle of the fitted line segment at the corresponding position in the reference line feature subgraph to the inclination angle of the fitted line segment in the reference line feature subgraph, and record this ratio as the difference rate in the inclination angle dimension; since line segments are directional, the difference in inclination angle, or the degree of similarity in orientation, between the two fitted line segments is described as an angle ratio. Calculate the ratio of the number of coordinate positions (which may be the same grid cells) through which the fitted line segment in the target match line feature subgraph and the fitted line segment at the corresponding position in the reference line feature subgraph both pass to the number of coordinate positions through which the fitted line segment in the reference line feature subgraph passes, and record this ratio as the coincidence rate in the position dimension; this ratio describes the proportion of the overlapping region between the two fitted line segments. Then execute step 32. Step 32: for a fitted line segment in the target match line feature subgraph and the fitted line segment at the corresponding position in the reference line feature subgraph, when the difference rate in the segment length dimension is less than or equal to a preset difference rate, the difference rate in the inclination angle dimension is less than or equal to the preset difference rate, and the coincidence rate in the position dimension is greater than or equal to a preset coincidence rate, determine that the overall coincidence rate of the two fitted line segments is greater than or equal to the preset coincidence rate, where the sum of the preset difference rate and the preset coincidence rate is equal to 100%; preferably, the preset coincidence rate is set to 80% and the preset difference rate is set to 20%. It is further determined that the similarity matching between the fitted line segment in the target match line feature subgraph and the fitted line segment at the corresponding position in the reference line feature subgraph is successful.
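The first variant of steps 31 and 32 can be sketched as below. A fitted line segment is modelled as a (length, inclination angle, set of grid cells) tuple; that representation, and the assumption that the reference inclination angle is nonzero, are illustrative choices, not details from the patent.

```python
def similarity_match_v1(target_seg, ref_seg,
                        preset_difference_rate=0.20,
                        preset_coincidence_rate=0.80):
    """Steps 31-32, first variant: difference rates in the length and
    inclination dimensions plus a coincidence rate in the position
    dimension, each normalised by the reference segment. The two preset
    thresholds sum to 100%."""
    t_len, t_angle, t_cells = target_seg
    r_len, r_angle, r_cells = ref_seg
    length_diff_rate = abs(t_len - r_len) / r_len
    angle_diff_rate = abs(t_angle - r_angle) / r_angle  # nonzero angle assumed
    position_overlap = len(t_cells & r_cells) / len(r_cells)
    return (length_diff_rate <= preset_difference_rate
            and angle_diff_rate <= preset_difference_rate
            and position_overlap >= preset_coincidence_rate)
```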
Alternatively, the method for controlling the robot to perform similarity matching between a fitted line segment in the target match line feature subgraph and the fitted line segment at the corresponding position in the reference line feature subgraph includes the following steps. Step 31: calculate the ratio of the absolute value of the difference between the length of the fitted line segment in the target match line feature subgraph and the length of the fitted line segment at the corresponding position in the reference line feature subgraph to the length of whichever of the two fitted line segments participating in similarity matching is longer, record this ratio as a first ratio, and record the difference between 1 and the first ratio as the coincidence rate in the segment length dimension. Calculate the ratio of the absolute value of the difference between the inclination angle of the fitted line segment in the target match line feature subgraph and the inclination angle of the fitted line segment at the corresponding position in the reference line feature subgraph to the larger of the two inclination angles, record this ratio as a second ratio, and record the difference between 1 and the second ratio as the coincidence rate in the inclination angle dimension. Calculate the ratio of the number of coordinate positions through which both fitted line segments pass to the number of coordinate positions (position points) passed through by whichever of the two fitted line segments passes through more coordinate positions, and record this ratio as the coincidence rate in the position dimension; this ratio describes the proportion of the overlapping region between the two fitted line segments. The fitted line segment in the target match line feature subgraph and the fitted line segment at the corresponding position in the reference line feature subgraph are the two fitted line segments participating in similarity matching. Then execute step 32. Step 32: for a fitted line segment in the target match line feature subgraph and the fitted line segment at the corresponding position in the reference line feature subgraph, when the sum of the coincidence rate in the segment length dimension, the coincidence rate in the inclination angle dimension, and the coincidence rate in the position dimension is greater than or equal to a preset coincidence rate, determine that the overall coincidence rate of the two fitted line segments is greater than or equal to the preset coincidence rate, where the sum of the preset difference rate and the preset coincidence rate is equal to 100%; preferably, the preset coincidence rate is set to 80% and the preset difference rate is set to 20%. It is further determined that the similarity matching between the fitted line segment in the target match line feature subgraph and the fitted line segment at the corresponding position in the reference line feature subgraph is successful.
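The second variant normalises each dimension by the larger of the two values. The patent's wording compares the sum of the three coincidence rates against the preset coincidence rate; since each rate lies in [0, 1], this sketch interprets the comparison as an average of the three rates so that the 80% threshold stays meaningful, which is an interpretive assumption on our part.

```python
def similarity_match_v2(target_seg, ref_seg, preset_coincidence_rate=0.80):
    """Steps 31-32, second variant: coincidence rates normalised by the
    larger length, the larger inclination angle, and the larger cell
    count; the overall coincidence rate (here: the average of the three)
    must reach the preset coincidence rate."""
    t_len, t_angle, t_cells = target_seg
    r_len, r_angle, r_cells = ref_seg
    length_rate = 1.0 - abs(t_len - r_len) / max(t_len, r_len)
    angle_rate = 1.0 - abs(t_angle - r_angle) / max(t_angle, r_angle)
    position_rate = len(t_cells & r_cells) / max(len(t_cells), len(r_cells))
    overall = (length_rate + angle_rate + position_rate) / 3.0
    return overall >= preset_coincidence_rate
```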
As an embodiment, in the process of repeatedly executing step 31 and step 32, after the robot completes matching each fitted line segment in the target match line feature subgraph with the fitted line segment at the corresponding position in the reference line feature subgraph, the ratio of the number of successfully matched fitted line segment pairs to the number of all fitted line segment pairs participating in similarity matching is, owing to position errors, easily detected to be smaller than the first preset success rate; the robot therefore next sets the rotation center position corresponding to the target match line feature subgraph as the offset start position. In some embodiments, before the robot performs similarity matching of segment lengths between each fitted line segment in the target match line feature subgraph and the fitted line segment at the corresponding position in the reference line feature subgraph, the robot sets the rotation center position corresponding to the target match line feature subgraph as the offset start position; here the similarity matching of segment lengths is preceded by translation of the fitted line segments in the target match line feature subgraph, but not by translation of the corresponding rotation center position. The robot then controls the target match line feature subgraph to translate from the offset start position along a given coordinate axis direction by a preset translation step, updates the translated subgraph as the new target match line feature subgraph, and executes step 31 and step 32 again, thereby performing the similarity matching between the fitted line segments in the target match line feature subgraph and the fitted line segments at the corresponding positions in the reference line feature subgraph once more.
In this embodiment, the robot controls the target match line feature subgraph to translate from the offset start position along the given coordinate axis direction by the preset translation step, and the translation is confined to the current window. With each translation by the preset step, the corresponding fitted line segments in the translated target match line feature subgraph are translated with it, and the translation direction is not limited to the positive or negative direction of a coordinate axis. Mapping from the target match line feature subgraph to the reference line feature subgraph is accomplished by translation along coordinate axis directions, which overcomes the influence of noise errors in the sensor data. Each time the same target match line feature subgraph translates by the preset step along the given coordinate axis direction, the translated subgraph is updated as the target match line feature subgraph, so that its fitted line segments can be run through step 31 and step 32 to perform similarity matching of segment lengths against the fitted line segments at corresponding positions in the reference line feature subgraph; during this matching, the coincidence rate in the segment length dimension (which can be regarded as the similarity of segment lengths) is calculated. Then, each time the robot completes matching every fitted line segment in the target match line feature subgraph with the fitted line segment at the corresponding position in the reference line feature subgraph, the robot judges whether the ratio of the number of successfully matched fitted line segment pairs to the number of all fitted line segment pairs participating in segment length similarity matching is greater than or equal to the first preset success rate, thereby judging the matching status of the target match line feature subgraph against the reference line feature subgraph after each translation.
It should be noted that the maximum preset offset is preferably 10 grid cells; the maximum preset offset may be the ratio of a preset maximum positioning error to the preset side length of a grid cell, or the rounded result of that ratio, with the grid cell as its unit. The preset maximum positioning error and the preset translation step do not exceed the side length of the window, nor the maximum ranging distance of the single-point ranging sensor. The preset maximum positioning error can be obtained from repeated comparison experiments on the sensing data of the TOF sensor and/or the gyroscope. To expand the scope of the search area within the global working area, the fitted line segment at the corresponding position in the reference line feature subgraph may be any fitted line segment in the reference line feature subgraph; alternatively, to reduce the amount of computation, it may be restricted to fitted line segments parallel to the fitted line segment in the target match line feature subgraph participating in the segment length similarity matching.
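The conversion from positioning error to the maximum preset offset is a single rounded ratio. In the sketch below, the concrete metric values in the test (a 0.5 m error bound with 5 cm grid cells, giving the preferred 10 cells) are illustrative assumptions.

```python
def max_preset_offset_cells(max_positioning_error_m, grid_side_m):
    """Maximum preset offset, in grid cells: the rounded ratio of the
    preset maximum positioning error to the grid cell side length."""
    return round(max_positioning_error_m / grid_side_m)
```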
On the basis of the above embodiment, provided that the coordinate offset of the target match line feature subgraph translated from the offset start position in the same coordinate axis direction has not reached the maximum preset offset, each time the target match line feature subgraph translates by the preset translation step and the robot completes matching each fitted line segment in it with the fitted line segment at the corresponding position in the reference line feature subgraph, the robot judges whether the ratio of the number of successfully matched fitted line segment pairs to the number of all fitted line segment pairs participating in segment length similarity matching is greater than or equal to the first preset success rate. If so, the target match line feature subgraph (which can be understood as either the untranslated or the translated target match line feature subgraph) is determined to be successfully matched with the reference line feature subgraph, and the coordinate offset by which the target match line feature subgraph has translated from the offset start position in the latest translation direction is set as the error coordinate offset; this offset is an integer multiple of the preset translation step accumulated in the same coordinate axis direction. The error coordinate offset carries a sign determined by the translation direction, which affects the coordinates of the robot's repositioned location during subsequent correction. At this point, the robot controls the target match line feature subgraph to stop translating.
It should be noted that the error coordinate offset is decomposed into a coordinate offset in the horizontal axis direction and a coordinate offset in the vertical axis direction, and may change as the translation direction changes: when the error coordinate offset is accumulated by translation of the target match line feature subgraph in the horizontal axis direction, the vertical axis error offset is set to 0; when it is accumulated by translation in the vertical axis direction, the horizontal axis error offset is set to 0.
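The decomposition rule above can be sketched in a few lines; the function name and the axis encoding ('x' / 'y') are illustrative assumptions.

```python
def decompose_error_offset(axis, signed_steps, step_length):
    """Decompose the signed error coordinate offset into horizontal and
    vertical components: accumulation along one axis forces the other
    component to 0. `signed_steps` carries the translation direction as
    its sign."""
    offset = signed_steps * step_length
    return (offset, 0.0) if axis == 'x' else (0.0, offset)
```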
Provided that the coordinate offset of the target match line feature subgraph translated from the offset start position in the same coordinate axis direction has not reached the maximum preset offset, if, each time the target match line feature subgraph translates by the preset translation step and the robot completes matching each fitted line segment in it with the fitted line segment at the corresponding position in the reference line feature subgraph, the robot judges that the ratio of the number of successfully matched fitted line segment pairs to the number of all fitted line segment pairs participating in similarity matching is smaller than the first preset success rate, then, whether the robot has matched every fitted line segment in the target match line feature subgraph with the fitted line segments at corresponding positions in the reference line feature subgraph or only most of them, the robot adjusts the given coordinate axis direction to its opposite or perpendicular direction and updates that opposite or perpendicular direction as the new given coordinate axis direction. This opposite or perpendicular direction is the latest translation direction set by the robot; it may change, for example, from the positive direction of the x axis (horizontal axis) to the negative direction of the x axis, or from the positive direction of the x axis to the positive direction of the y axis (vertical axis), so as to control the target match line feature subgraph to translate in directions different from before and overcome the influence of position offset errors along the corresponding coordinate axes. The robot then controls the target match line feature subgraph (whose corresponding rotation center position is the offset start position) to translate by the preset translation step from the offset start position along the updated given coordinate axis direction, updates the translated subgraph as the target match line feature subgraph, and repeatedly executes step 31 and step 32 until the robot completes matching each fitted line segment in the target match line feature subgraph with the fitted line segment at the corresponding position in the reference line feature subgraph; the robot then calculates the ratio of the number of successfully matched fitted line segment pairs to the number of all fitted line segment pairs participating in similarity matching, and decides whether to continue translating the target match line feature subgraph along the given coordinate axis direction by judging whether that ratio is greater than or equal to the first preset success rate.
It should be noted that, starting from the offset start position and before the coordinate offset translated in the given coordinate axis direction reaches the maximum preset offset, each time the target match line feature subgraph translates once by the preset translation step along the given coordinate axis direction, the translated subgraph is updated as the target match line feature subgraph, and step 31 and step 32 are then executed to calculate, in the new translation direction, the ratio of the number of successfully matched fitted line segment pairs to the number of all fitted line segment pairs participating in similarity matching. Each time the robot completes matching every fitted line segment in the target match line feature subgraph with the fitted line segment at the corresponding position in the reference line feature subgraph, it judges whether this ratio is greater than or equal to the first preset success rate.
If the coordinate offset of the target match line feature subgraph translated from the offset start position in the same coordinate axis direction has reached the maximum preset offset, then after the robot completes matching each fitted line segment in the target match line feature subgraph with the fitted line segment at the corresponding position in the reference line feature subgraph, it judges whether the ratio of the number of successfully matched fitted line segment pairs to the number of all fitted line segment pairs participating in similarity matching is greater than or equal to the first preset success rate. If so, it determines that the target match line feature subgraph (whether understood as untranslated or translated) is successfully matched with the reference line feature subgraph, and records the coordinate offset translated in the latest translation direction as the error coordinate offset, that is, sets the maximum preset offset as the error coordinate offset corresponding to the target match line feature subgraph without further updating it. The robot then controls the target match line feature subgraph to stop translating, and the segment length similarity matching between each fitted line segment in the target match line feature subgraph and the fitted line segment at the corresponding position in the reference line feature subgraph also ends.
If the robot judges that the ratio of the number of successfully matched fitted line segment pairs to the number of all fitted line segment pairs participating in similarity matching is smaller than the first preset success rate, it adjusts the given coordinate axis direction to its opposite or perpendicular direction, updates that direction as the new given coordinate axis direction, controls the original target match line feature subgraph to translate by the preset translation step from the offset start position along the new given coordinate axis direction, updates the translated subgraph as the target match line feature subgraph, and then executes step 31 and step 32. Similarly, in the updated given coordinate axis direction, starting from the same offset start position and before the coordinate offset translated in that direction reaches the maximum preset offset, the translated subgraph is updated as the target match line feature subgraph after each translation by the preset step, and step 31 and step 32 are then executed; and each time the robot completes matching every fitted line segment in the target match line feature subgraph with the fitted line segment at the corresponding position in the reference line feature subgraph, it judges whether the ratio of the number of successfully matched fitted line segment pairs to the number of all fitted line segment pairs participating in segment length similarity matching is greater than or equal to the first preset success rate.
On the basis of the foregoing embodiment, when the robot has controlled the original target match line feature subgraph to translate from the offset start position in every coordinate axis direction, the coordinate offsets translated in all those directions have all reached the maximum preset offset, and the robot still judges that the ratio of the number of successfully matched fitted line segment pairs to the number of all fitted line segment pairs participating in similarity matching is smaller than the first preset success rate, the robot determines that the matching of the original target match line feature subgraph with the reference line feature subgraph fails. Specifically, the robot controls the same target match line feature subgraph to translate from the offset start position in each coordinate axis direction, and at every translation judges that the ratio is smaller than the first preset success rate; even when the coordinate offset translated from the offset start position along each coordinate axis direction reaches the maximum preset offset, the ratio remains smaller than the first preset success rate, so the robot finally determines that the original target match line feature subgraph and the reference line feature subgraph fail to match.
In summary, when the robot detects that the ratio of the number of successfully matched fitted line segment pairs to the number of all fitted line segment pairs participating in similarity matching is smaller than the first preset success rate, it controls the target match line feature subgraph to translate by the preset translation step successively along different coordinate axis directions from the same position, updates the subgraph after each translation as the target match line feature subgraph, and repeatedly executes step 31 and step 32 until the ratio of the number of successfully matched fitted line segment pairs, among all fitted line segment pairs in the target match line feature subgraph before and after translation, to the number of all fitted line segment pairs participating in similarity matching is greater than or equal to the first preset success rate; it then determines that the original target match line feature subgraph and the reference line feature subgraph are successfully matched, and sets the coordinate offset generated in the latest translation direction as the error coordinate offset.
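The translation search summarised above can be sketched as follows. Fitted line segments are modelled as sets of grid cells, the pairwise similarity test is passed in as a function (for example, one of the step 31/32 variants), and the direction order and 80% default are illustrative assumptions.

```python
def translation_search(target_segs, ref_segs, step, max_offset_cells,
                       pair_matches, first_rate=0.8,
                       axis_directions=((1, 0), (-1, 0), (0, 1), (0, -1))):
    """Slide the target match line feature subgraph from the offset start
    position along one coordinate axis direction at a time, re-match all
    segment pairs after every step, and stop at the first offset whose
    success ratio reaches the first preset success rate."""
    def success_ratio(dx, dy):
        shifted = [{(x + dx, y + dy) for (x, y) in seg} for seg in target_segs]
        hits = sum(pair_matches(s, r) for s, r in zip(shifted, ref_segs))
        return hits / len(ref_segs)

    if success_ratio(0, 0) >= first_rate:    # untranslated subgraph already matches
        return (0, 0)
    for ux, uy in axis_directions:           # opposite/perpendicular directions in turn
        for n in range(1, max_offset_cells + 1):
            dx, dy = ux * n * step, uy * n * step
            if success_ratio(dx, dy) >= first_rate:
                return (dx, dy)              # signed error coordinate offset
    return None                              # fails in every direction: no match
```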
As another embodiment, for step S3, the robot sets one line feature subgraph of the pair participating in matching as the reference line feature subgraph and the other as the line feature subgraph to be matched; preferably, the line feature subgraph to be matched is a line feature subgraph within the current window other than the reference line feature subgraph. The robot sets the rotation center position corresponding to the line feature subgraph to be matched as the offset start position; it then controls the line feature subgraph to be matched to translate from the offset start position along a given coordinate axis direction by a preset translation step, updates the subgraph translated by the preset step as the line feature subgraph to be matched, and controls the fitted line segments in the line feature subgraph to be matched to perform similarity matching with the fitted line segments at the corresponding positions in the reference line feature subgraph, thereby performing the similarity matching once more.
In this embodiment, the robot controls the line feature subgraph to be matched to translate from the offset start position along the given coordinate axis direction by the preset translation step, and the translation is confined to the current window. With each translation by the preset step, the corresponding fitted line segments in the translated line feature subgraph to be matched are translated with it, and the translation direction is not limited to the positive or negative direction of a coordinate axis. Mapping from the line feature subgraph to be matched to the reference line feature subgraph is accomplished by translation along coordinate axis directions, which overcomes the influence of noise errors in the sensor data. Each time the same line feature subgraph to be matched translates by the preset step along the given coordinate axis direction, the translated subgraph is updated as the line feature subgraph to be matched, so that its fitted line segments can be matched for similarity against the fitted line segments at corresponding positions in the reference line feature subgraph; during the segment length similarity matching, the coincidence rate in the segment length dimension (which can be regarded as the similarity of segment lengths) is calculated. Then, each time the robot completes matching every fitted line segment in the line feature subgraph to be matched with the fitted line segment at the corresponding position in the reference line feature subgraph, the robot judges whether the ratio of the number of successfully matched fitted line segment pairs to the number of all fitted line segment pairs participating in segment length similarity matching is greater than or equal to the first preset success rate, thereby judging the matching status of the line feature subgraph to be matched against the reference line feature subgraph after each translation.
On the basis of the above embodiment, provided that the coordinate offset of the line feature subgraph to be matched translated from the offset start position in the same coordinate axis direction has not reached the maximum preset offset, each time the line feature subgraph to be matched translates by the preset translation step and the robot completes matching each fitted line segment in it with the fitted line segment at the corresponding position in the reference line feature subgraph, the robot judges whether the ratio of the number of successfully matched fitted line segment pairs to the number of all fitted line segment pairs participating in segment length similarity matching is greater than or equal to the first preset success rate. If so, the line feature subgraph to be matched (which can be understood as either the untranslated or the translated subgraph) is determined to be successfully matched with the reference line feature subgraph, and the coordinate offset by which it has translated from the offset start position in the latest translation direction is set as the error coordinate offset; this offset is an integer multiple of the preset translation step accumulated in the same coordinate axis direction. The error coordinate offset carries a sign determined by the translation direction, which affects the coordinates of the robot's repositioned location during subsequent correction. At this point, the robot controls the line feature subgraph to be matched to stop translating.
It should be noted that the error coordinate offset is decomposed into a coordinate offset in the horizontal axis direction and a coordinate offset in the vertical axis direction, and may change as the translation direction changes: when the error coordinate offset is accumulated by translation of the line feature subgraph to be matched in the horizontal axis direction, the vertical axis error offset is set to 0; when it is accumulated by translation in the vertical axis direction, the horizontal axis error offset is set to 0.
Under the premise that the coordinate offset by which one line feature subgraph to be matched has been translated in the same coordinate axis direction from the offset starting position has not reached the maximum preset offset, each time the line feature subgraph to be matched is translated by the preset translation step length, and after the robot has finished matching each fitting line segment in the line feature subgraph to be matched with the fitting line segment at the corresponding position in the reference line feature subgraph (or has finished matching most of the fitting line segments), if the robot judges that the ratio of the number of all fitting line segment pairs successfully matched to the number of all fitting line segment pairs participating in similarity matching is smaller than the first preset success rate, the robot adjusts the set coordinate axis direction to the opposite or a perpendicular direction, and updates that opposite or perpendicular direction as the new set coordinate axis direction. This updated direction is the latest translation direction adjusted by the robot; for example, it may change from the positive x-axis (horizontal axis) direction to the negative x-axis direction, or from the positive x-axis direction to the positive y-axis (vertical axis) direction. The line feature subgraph to be matched is thereby controlled to translate in a direction different from the previous one, so as to overcome the influence of position offset errors in the direction of the corresponding coordinate axis. The robot then controls the line feature subgraph to be matched (whose corresponding rotation center position is the offset starting position) to translate by the preset translation step length from the offset starting position along the updated set coordinate axis direction, updates the translated subgraph as the line feature subgraph to be matched, and controls the fitting line segments in it to perform similarity matching with the fitting line segments at the corresponding positions in the reference line feature subgraph. Once every fitting line segment in the line feature subgraph to be matched has been matched with the fitting line segment at the corresponding position in the reference line feature subgraph, the robot calculates the ratio of the number of all fitting line segment pairs successfully matched to the number of all fitting line segment pairs participating in similarity matching, and determines whether to continue translating the line feature subgraph along the set coordinate axis direction by judging whether that ratio is greater than or equal to the first preset success rate.
It should be noted that, starting from the offset starting position and before the coordinate offset translated in the set coordinate axis direction reaches the maximum preset offset, each time the same line feature subgraph to be matched is translated once by the preset translation step length along the set coordinate axis direction, the translated subgraph is updated as the line feature subgraph to be matched, and the fitting line segments in it are then matched for similarity against the fitting line segments at the corresponding positions in the reference line feature subgraph, so as to calculate, in the new translation direction, the ratio of the number of all fitting line segment pairs successfully matched to the number of all fitting line segment pairs participating in similarity matching. Each time the robot has matched every fitting line segment in the line feature subgraph to be matched with the fitting line segment at the corresponding position in the reference line feature subgraph, it judges whether that ratio is greater than or equal to the first preset success rate.
If the coordinate offset by which the line feature subgraph to be matched has been translated in the same coordinate axis direction from the offset starting position has already reached the maximum preset offset, then after the robot has matched each fitting line segment in the line feature subgraph to be matched with the fitting line segment at the corresponding position in the reference line feature subgraph, it judges whether the ratio of the number of all fitting line segment pairs successfully matched to the number of all fitting line segment pairs participating in similarity matching is greater than or equal to the first preset success rate. If so, the robot determines that the line feature subgraph to be matched (understood either as the untranslated or as the translated line feature subgraph to be matched) is successfully matched with the reference line feature subgraph, and records the coordinate offset translated in the latest translation direction as the error coordinate offset; that is, the maximum preset offset is set as the error coordinate offset, and the error coordinate offset corresponding to the line feature subgraph to be matched is not updated further. The robot then controls the line feature subgraph to be matched to stop translating, and the similarity matching of segment lengths between each fitting line segment in the line feature subgraph to be matched and the fitting line segment at the corresponding position in the reference line feature subgraph is also finished.
If the robot judges that the ratio of the number of all fitting line segment pairs successfully matched to the number of all fitting line segment pairs participating in similarity matching is smaller than the first preset success rate, it adjusts the set coordinate axis direction to the opposite or a perpendicular direction, updates that direction as the new set coordinate axis direction, controls the original line feature subgraph to be matched to translate by the preset translation step length from the offset starting position along the updated direction, updates the translated subgraph as the line feature subgraph to be matched, and controls the fitting line segments in it to perform similarity matching with the fitting line segments at the corresponding positions in the reference line feature subgraph. Likewise, in the updated set coordinate axis direction, the line feature subgraph to be matched starts from the same offset starting position; before the coordinate offset translated in that direction reaches the maximum preset offset, each time it is translated by the preset translation step length, the translated subgraph is updated as the line feature subgraph to be matched, and steps 31 and 32 are then executed. Each time the robot has matched every fitting line segment in the line feature subgraph to be matched with the fitting line segment at the corresponding position in the reference line feature subgraph, it judges whether the ratio of the number of all fitting line segment pairs successfully matched to the number of all fitting line segment pairs participating in similarity matching is greater than or equal to the first preset success rate.
On the basis of the foregoing embodiment, when the robot has controlled the original line feature subgraph to be matched to translate in all coordinate axis directions from the offset starting position, the coordinate offsets translated in all those directions have each reached the maximum preset offset, and the robot still judges that the ratio of the number of all fitting line segment pairs successfully matched to the number of all fitting line segment pairs participating in similarity matching is smaller than the first preset success rate, the robot determines that the original line feature subgraph to be matched fails to match the reference line feature subgraph. Specifically, the same line feature subgraph to be matched is controlled to translate in every coordinate axis direction from the offset starting position, and after each translation the ratio is judged to be smaller than the first preset success rate; even when the coordinate offset translated along each coordinate axis direction from the offset starting position reaches the maximum preset offset, the ratio remains smaller than the first preset success rate, and the robot finally determines that the original line feature subgraph to be matched and the reference line feature subgraph fail to match.
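The translate-and-rematch search of the preceding paragraphs can be sketched roughly as follows (a Python sketch under stated assumptions: the function name, the direction encoding, and the default parameter values are illustrative, and `match_ratio` stands in for the per-translation segment matching that the patent describes):

```python
def search_offset(match_ratio, step=1, max_offset=5, success_rate=0.8,
                  directions=("+x", "-x", "+y", "-y")):
    """Translate the line feature subgraph to be matched along each
    coordinate axis direction in turn, re-running the fitting line
    segment matching after every translation step, until the match ratio
    reaches the first preset success rate or every direction has been
    tried up to the maximum preset offset.

    `match_ratio(dx, dy)` is assumed to return the ratio of successfully
    matched fitting line segment pairs when the subgraph is translated by
    (dx, dy) from the offset starting position.  Returns the signed error
    coordinate offset on success, or None when matching fails in all
    directions (match failure).
    """
    unit = {"+x": (1, 0), "-x": (-1, 0), "+y": (0, 1), "-y": (0, -1)}
    for d in directions:
        ux, uy = unit[d]
        n = 0
        while n * step < max_offset:        # offset not yet at the maximum
            n += 1
            dx, dy = ux * n * step, uy * n * step
            if match_ratio(dx, dy) >= success_rate:
                return (dx, dy)             # error coordinate offset
        # ratio stayed below the success rate for this direction: reset to
        # the offset starting position and try the next (opposite or
        # perpendicular) direction
    return None
```

The design point is that each direction restarts from the same offset starting position, so a failed direction never pollutes the accumulated offset of the next one.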
On the basis of the above embodiment, the method by which the robot controls a fitting line segment in the line feature subgraph to be matched to perform similarity matching with the fitting line segment at the corresponding position in the reference line feature subgraph includes the following steps. Step 31: calculate the ratio of the absolute value of the difference between the length of one fitting line segment in the line feature subgraph to be matched and the length of the fitting line segment at the corresponding position in the reference line feature subgraph to the length of the fitting line segment at the corresponding position in the reference line feature subgraph, and record this ratio as the difference rate in the segment-length dimension; the length difference between the two fitting line segments is thus described by a combination of an absolute difference and a ratio. Calculate the ratio of the absolute value of the difference between the inclination angle of the fitting line segment in the line feature subgraph to be matched and the inclination angle of the fitting line segment at the corresponding position in the reference line feature subgraph to the inclination angle of the fitting line segment at the corresponding position in the reference line feature subgraph, and record this ratio as the difference rate in the inclination-angle dimension; since line segments are directional, the degree of difference in orientation between the two fitting line segments can be described as an angle ratio. Calculate the ratio of the number of coordinate positions that the two fitting line segments pass through in common to the number of coordinate positions that the fitting line segment at the corresponding position in the reference line feature subgraph passes through, and record this ratio as the coincidence rate in the position dimension, describing the proportion of the overlapping region between the two fitting line segments. Then execute step 32. Step 32: for a fitting line segment in the line feature subgraph to be matched and the fitting line segment at the corresponding position in the reference line feature subgraph, when the difference rate in the segment-length dimension is less than or equal to a preset difference rate, the difference rate in the inclination-angle dimension is less than or equal to the preset difference rate, and the coincidence rate in the position dimension is greater than or equal to a preset coincidence rate, determine that the overall coincidence rate of the two fitting line segments is greater than or equal to the preset coincidence rate, wherein the sum of the preset difference rate and the preset coincidence rate equals 100%; preferably, the preset coincidence rate is set to 80% and the preset difference rate to 20%. It is thereby further determined that the fitting line segment in the line feature subgraph to be matched is successfully matched, in similarity matching, with the fitting line segment at the corresponding position in the reference line feature subgraph.
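Steps 31 and 32 can be sketched in Python as follows (the segment representation with "length", "angle", and "cells" fields and the helper name are illustrative assumptions; the thresholds follow the preferred 20%/80% values):

```python
def segments_match(seg_a, seg_b, diff_rate=0.20, coincide_rate=0.80):
    """Similarity check between one fitting line segment in the subgraph
    to be matched (seg_a) and the fitting line segment at the
    corresponding position in the reference subgraph (seg_b).

    Each segment is assumed to carry "length", "angle" (inclination
    angle) and "cells" (the set of map coordinate positions the segment
    passes through).  Ratios are taken against the reference segment, as
    in step 31; nonzero reference length and angle are assumed, matching
    the patent's ratio definitions.
    """
    # difference rate in the segment-length dimension
    len_diff = abs(seg_a["length"] - seg_b["length"]) / seg_b["length"]
    # difference rate in the inclination-angle dimension
    ang_diff = abs(seg_a["angle"] - seg_b["angle"]) / seg_b["angle"]
    # coincidence rate in the position dimension: shared coordinate
    # positions over those of the reference segment
    pos_coincide = len(seg_a["cells"] & seg_b["cells"]) / len(seg_b["cells"])
    # step 32: all three per-dimension conditions must hold
    return (len_diff <= diff_rate and ang_diff <= diff_rate
            and pos_coincide >= coincide_rate)
```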
Alternatively, the method by which the robot controls a fitting line segment in the line feature subgraph to be matched to perform similarity matching with the fitting line segment at the corresponding position in the reference line feature subgraph includes the following steps. Step 31: calculate the ratio of the absolute value of the difference between the length of one fitting line segment in the line feature subgraph to be matched and the length of the fitting line segment at the corresponding position in the reference line feature subgraph to the length of whichever of the two fitting line segments participating in similarity matching is longer; record this ratio as a first ratio, and record the difference between the value 1 and the first ratio as the coincidence rate in the segment-length dimension, describing the length difference between the two fitting line segments by a combination of an absolute difference and a ratio. Calculate the ratio of the absolute value of the difference between the inclination angle of the fitting line segment in the line feature subgraph to be matched and the inclination angle of the fitting line segment at the corresponding position in the reference line feature subgraph to the larger of the two inclination angles of the fitting line segments participating in similarity matching; record this ratio as a second ratio, and record the difference between the value 1 and the second ratio as the coincidence rate in the inclination-angle dimension; since line segments are directional, the orientation difference between the two fitting line segments can be described as an angle ratio. Calculate the ratio of the number of coordinate positions that the two fitting line segments pass through in common to the number of coordinate positions passed through by whichever of the two fitting line segments passes through more coordinate positions (position points); record this ratio as the coincidence rate in the position dimension, describing the proportion of the overlapping region between the two fitting line segments. Here, the fitting line segment in the line feature subgraph to be matched and the fitting line segment at the corresponding position in the reference line feature subgraph are the two fitting line segments participating in similarity matching. Then execute step 32. Step 32: for a fitting line segment in the line feature subgraph to be matched and the fitting line segment at the corresponding position in the reference line feature subgraph, when the sum of the coincidence rate in the segment-length dimension, the coincidence rate in the inclination-angle dimension, and the coincidence rate in the position dimension is greater than or equal to a preset coincidence rate, determine that the overall coincidence rate of the two fitting line segments is greater than or equal to the preset coincidence rate; it is thereby further determined that the fitting line segment in the line feature subgraph to be matched is successfully matched, in similarity matching, with the fitting line segment at the corresponding position in the reference line feature subgraph.
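The alternative variant can be sketched the same way; here every dimension yields a coincidence rate taken against the larger of the two values, and their sum is compared against a single preset coincidence rate (the segment representation and the threshold value 2.4 are assumptions, since the patent gives no number for this variant):

```python
def segments_match_sum(seg_a, seg_b, coincide_rate=2.4):
    """Alternative similarity check: each per-dimension coincidence rate
    is 1 minus a ratio taken against the larger of the two lengths,
    angles, or cell counts, and the two fitting line segments match when
    the sum of the three coincidence rates reaches the preset overall
    coincidence rate.  Segments carry "length", "angle" and "cells"
    fields as in the previous sketch; nonzero values are assumed.
    """
    len_co = 1 - (abs(seg_a["length"] - seg_b["length"])
                  / max(seg_a["length"], seg_b["length"]))
    ang_co = 1 - (abs(seg_a["angle"] - seg_b["angle"])
                  / max(seg_a["angle"], seg_b["angle"]))
    shared = len(seg_a["cells"] & seg_b["cells"])
    pos_co = shared / max(len(seg_a["cells"]), len(seg_b["cells"]))
    return len_co + ang_co + pos_co >= coincide_rate
```

Dividing by the larger value keeps every coincidence rate in [0, 1], so the sum criterion weighs the three dimensions equally.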
As an embodiment, in step S5, when the ratio of the number of all line feature subgraph pairs that are successfully matched to the number of all line feature subgraph pairs participating in matching is greater than or equal to a second preset success rate, the robot obtains a plurality of error coordinate offsets: each time the robot determines, in step S4, that a pair of line feature subgraphs is successfully matched, one error coordinate offset is set, so that after traversing all line feature subgraphs in the current window and performing similarity matching on all fitting line segment pairs, the robot has accumulated a number of error coordinate offsets equal to the number of all successfully matched line feature subgraph pairs. Each error coordinate offset comprises a horizontal-axis error coordinate offset and a vertical-axis error coordinate offset and is expressed in coordinate form: the horizontal-axis error coordinate offset is the coordinate value of the error coordinate offset in the x-axis direction, and the vertical-axis error coordinate offset is its coordinate value in the y-axis direction. All line feature subgraph pairs participating in matching are either any two line feature subgraphs in the current window, or one fixed line feature subgraph in the current window paired with each of the other line feature subgraphs, so that all line feature subgraphs in the current window are covered.
In some embodiments, disregarding the sign differences caused by the translation direction, the horizontal-axis values of the error coordinate offsets are equal to one another and the vertical-axis values are equal to one another; therefore, in the ideal case where position error is not considered, the result of accumulating all error coordinate offsets along each coordinate axis is the value 0.
If the robot judges that the number of error coordinate offsets currently obtained is greater than a second preset number threshold, it removes, from all currently obtained error coordinate offsets, the one with the maximum abscissa value, the one with the minimum abscissa value, the one with the maximum ordinate value, and the one with the minimum ordinate value; since each error coordinate offset comprises a horizontal-axis error coordinate offset and a vertical-axis error coordinate offset, an error coordinate offset can be expressed as a coordinate value composed of the two. The robot then averages the horizontal-axis error coordinate offsets among the remaining error coordinate offsets to obtain a horizontal-axis average coordinate value, summing them and dividing by their count; it likewise averages the vertical-axis error coordinate offsets among the remaining error coordinate offsets to obtain a vertical-axis average coordinate value. The horizontal-axis average coordinate value and the vertical-axis average coordinate value then form an average coordinate offset; that is, averaging the remaining error coordinate offsets yields the average coordinate offset.
If the robot judges that the number of error coordinate offsets currently obtained is smaller than or equal to the second preset number threshold, it directly averages all currently obtained error coordinate offsets to obtain the average coordinate offset. The averaging principle is similar to the above embodiment, except that the number of averaged data equals the number of all currently obtained error coordinate offsets, which in turn equals both the number of horizontal-axis error coordinate offsets and the number of vertical-axis error coordinate offsets among them, since each error coordinate offset is expressed as a coordinate value composed of one horizontal-axis and one vertical-axis error coordinate offset.
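The averaging logic of the two preceding paragraphs, including the trimming of the four extreme offsets, might be sketched as follows (function and parameter names are assumed; offsets are (x, y) tuples):

```python
def compensation(offsets, count_threshold=3):
    """Compute the average coordinate offset from the error coordinate
    offsets collected in the current window.

    When more offsets than the second preset number threshold were
    collected, first discard the offsets holding the maximum and minimum
    abscissa and the maximum and minimum ordinate, then average the
    remainder component-wise; otherwise average all offsets directly.
    """
    pts = list(offsets)
    if len(pts) > count_threshold:
        drop = {max(pts, key=lambda p: p[0]), min(pts, key=lambda p: p[0]),
                max(pts, key=lambda p: p[1]), min(pts, key=lambda p: p[1])}
        # fall back to the full list if trimming would discard everything
        pts = [p for p in pts if p not in drop] or pts
    n = len(pts)
    return (sum(p[0] for p in pts) / n, sum(p[1] for p in pts) / n)
```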
It should be noted that, corresponding to the line feature subgraph matching process mentioned in the foregoing embodiments, the setting of the second preset number threshold is associated with the number of times the line feature subgraph to be matched is translated along the set coordinate axis direction from the offset starting position, or with the number of times the translation direction changes. Since the plane coordinate system in which the current window lies has four coordinate axis directions, namely the positive x-axis (positive abscissa) direction, the negative x-axis (negative abscissa) direction, the positive y-axis (positive ordinate) direction, and the negative y-axis (negative ordinate) direction, the second preset number threshold is set to the value 3 in order to reduce the accumulated offset error (position error).
The robot then sets the average coordinate offset as the positioning coordinate compensation amount, where the positioning coordinate compensation amount comprises a horizontal-axis positioning coordinate compensation amount and a vertical-axis positioning coordinate compensation amount, and can be understood as being expressed in the form of a coordinate value. The robot then adds the positioning coordinate compensation amount to its current position coordinate to obtain the corrected robot position coordinate; specifically, the horizontal-axis compensation amount is added to the abscissa of the robot's current position coordinate, and the vertical-axis compensation amount is added to the ordinate, yielding the corrected robot position coordinate.
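The final correction is simply a component-wise addition of the compensation amount to the current position coordinate; a minimal sketch with assumed names:

```python
def correct_position(current, compensation):
    """Apply the positioning coordinate compensation to the robot's
    current position: the horizontal-axis compensation amount is added
    to the abscissa, the vertical-axis amount to the ordinate."""
    x, y = current
    dx, dy = compensation
    return (x + dx, y + dy)
```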
It should be noted that the robot's current position coordinate is determined before step S5 is executed; specifically, it may be the rotation center position the robot traversed most recently, or the position at which the current window was constructed during step S1, so that the corrected robot position coordinate becomes the result of the robot scanning, fitting, and matching the line feature subgraphs of the map area within the current window.
In summary, for the matching of line feature subgraphs, the matching judgment uses the degree of coincidence between fitting line segments and the number of fitting line segments satisfying the corresponding coincidence rate, which reduces computational complexity and increases computation speed. For the position errors present in the line feature subgraphs participating in matching, the line feature subgraphs are controlled to translate along each coordinate axis direction, and the matching judgment is repeated using the degree of coincidence between fitting line segments and the number of fitting line segments satisfying the corresponding coincidence rate, until the number of successfully matched fitting line segments satisfies the preset matching success rate; this overcomes the interference of coordinate offset errors and extracts a plurality of error amounts, which are conveniently processed afterwards into the compensation amount for the robot's position coordinate. Specifically, each time two target subgraphs are successfully matched, one error coordinate offset for correcting or repositioning the robot's position is obtained, and a reasonable upper limit is set on the number of successfully matched line feature subgraphs in each window, reducing the computation required for retrieving target subgraphs and for matching-based positioning.
It should be noted that, in the several embodiments provided in the present application, the disclosed technical content may be implemented in other manners. The apparatus embodiments described above are merely exemplary; for example, the division of the units is merely a logical functional division, and other divisions may be adopted in actual implementation: multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. In addition, the mutual couplings, direct couplings, or communication connections shown or discussed may be implemented through certain interfaces, units, or modules, and may be electrical or take other forms.
The units described as separate components may or may not be physically separate, and the components shown as units may or may not be physical units; they may be located in one place or distributed across a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment. In addition, the functional units in the embodiments of the present application may be integrated in one processing unit, each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in the form of hardware or in the form of software functional units.
The integrated units, if implemented in the form of software functional units and sold or used as stand-alone products, may be stored in a computer-readable storage medium. Based on such understanding, the technical solution of the present application, in essence, or the part contributing to the prior art, or all or part of the technical solution, may be embodied in the form of a software product stored in a storage medium, including several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to perform all or part of the steps of the methods described in the embodiments of the present application. The aforementioned storage medium includes: a USB flash drive, a Read-Only Memory (ROM), a Random Access Memory (RAM), a removable hard disk, a magnetic disk, an optical disk, or other media capable of storing program code.

Claims (18)

1. A robot positioning method based on line segment matching within a window, characterized in that the execution subject of the robot positioning method is a robot equipped with a single-point ranging sensor and a gyroscope; the single-point ranging sensor is used to collect point cloud data of the environment in which the robot is located and mark the point cloud data in a map, and the gyroscope is used to collect the rotation angle of the robot;
The robot positioning method comprises the following steps:
step 1, the robot collects point cloud data through the single-point ranging sensor, and constructs a current window in the map according to the maximum ranging distance of the single-point ranging sensor;
step 2, the robot rotates at different rotation center positions in the current window in sequence, and performs fitting processing on the point cloud data collected while rotating at each rotation center position to obtain a corresponding group of fitting line segments in the current window; each group of fitting line segments forms one line feature subgraph, so that the robot obtains a plurality of line feature subgraphs in the current window; each time the robot performs fitting processing at one rotation center position, it records one group of fitting line segments in the current window;
step 3, in the current window, the robot controls the fitting line segments in each line feature subgraph to perform similarity matching with the fitting line segments in the other line feature subgraphs;
step 4, in the current window, when the matching success rate of the fitting line segments in two line feature subgraphs reaches a first preset success rate, determining that the two line feature subgraphs are successfully matched, and obtaining one error coordinate offset;
step 5, when the matching success rate between the line feature subgraphs in the current window reaches a second preset success rate, averaging all the obtained error coordinate offsets to obtain a positioning coordinate compensation amount, and correcting the current position coordinate of the robot with the positioning coordinate compensation amount, so as to complete one positioning within the current window.
2. The robotic positioning method of claim 1, wherein the robotic positioning method further comprises:
when the robot selects a window adjacent to the current window as the next window, the robot moves into the next window, updates the next window as the current window, and repeatedly executes steps 2 to 5 until the robot has traversed a first preset number of windows;
wherein, in the map where the current window is located, the robot builds a window in the neighborhood of the current window;
the area covered by the first preset number of windows comprises all windows constructed in the neighborhood of the current window;
wherein the window is a rectangular area framed in the map for defining the coverage of the line feature subgraph.
3. The robot positioning method of claim 2, wherein in step 1, the method of constructing the current window in the map according to the maximum ranging distance of the single-point ranging sensor comprises: taking the position of the robot when executing step 1 as the center of the current window, and extending a first extension distance along the horizontal left side of the robot and a second extension distance along the horizontal right side of the robot to form the transverse side length of the current window, wherein the first extension distance and the second extension distance are both equal to half of the maximum ranging distance of the single-point ranging sensor, so that the transverse side length of the current window is equal to the maximum ranging distance of the single-point ranging sensor; wherein the current window is rectangular in shape.
4. A method of positioning a robot as recited in claim 3, wherein the windows constructed in the neighborhood of the current window include windows adjacent to an upper side of the current window, windows adjacent to a lower side of the current window, windows adjacent to a left side of the current window, and windows adjacent to a right side of the current window; the shape of each window constructed in the neighborhood of the current window is the same as the shape of the current window, and the size of each window constructed in the neighborhood of the current window is equal to the size of the current window;
the abscissa of each point of the window adjacent to the upper side of the current window is equal to the abscissa of the corresponding point of the current window, and the difference between the ordinate of each vertex of the window adjacent to the upper side of the current window and the ordinate of the vertex in the same positional relationship in the current window is the longitudinal side length of the current window;
the abscissa of each point of the window adjacent to the lower side of the current window is equal to the abscissa of the corresponding point of the current window, and the difference between the ordinate of each vertex of the current window and the ordinate of the vertex in the same positional relationship in the window adjacent to the lower side of the current window is the longitudinal side length of the current window;
the difference between the abscissa of each vertex of the window adjacent to the right side of the current window and the abscissa of the vertex in the same positional relationship in the current window is the lateral side length of the current window;
the difference between the abscissa of each vertex of the current window and the abscissa of the vertex in the same positional relationship in the window adjacent to the left side of the current window is the lateral side length of the current window;
wherein the same positional relationship indicates that the two vertices have the same relative position with respect to the centers of the windows in which they are respectively located.
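The vertex relations of claim 4 amount to shifting the current window by exactly one side length along one axis. A minimal sketch (not part of the claims; the function name and the center-based window representation are assumptions):

```python
def neighbor_windows(cx, cy, lateral, longitudinal):
    """Centers of the four same-sized neighbour windows of claim 4: the
    upper/lower neighbours differ only by the longitudinal side length in
    the ordinate, and the left/right neighbours differ only by the
    lateral side length in the abscissa."""
    return {
        "up": (cx, cy + longitudinal),
        "down": (cx, cy - longitudinal),
        "left": (cx - lateral, cy),
        "right": (cx + lateral, cy),
    }
```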
5. The robot positioning method according to claim 2, wherein in step 4, the matching success rate of the fitting line segments of the two line feature subgraphs is the ratio of the number of all fitting line segment pairs successfully matched to the number of all fitting line segment pairs participating in the similarity matching; a fitting line segment pair participating in the similarity matching consists of one fitting line segment from each of two line feature subgraphs in the current window, wherein the relative position of the fitting line segment in one line feature subgraph with respect to the rotation center position of that subgraph is equivalent to the relative position of the fitting line segment in the other line feature subgraph with respect to the rotation center position of the other subgraph;
in step 5, the matching success rate between the line feature subgraphs in the current window is the ratio of the number of all line feature subgraph pairs successfully matched to the number of all line feature subgraph pairs participating in matching; a line feature subgraph pair consists of two different line feature subgraphs in the current window;
the line feature subgraphs in which the two fitting line segments participating in similarity matching are respectively located form one line feature subgraph pair participating in matching.
6. The robot positioning method according to claim 5, wherein in step 2, each time the robot rotates one turn at a rotation center position, the single-point ranging sensor is controlled to collect point cloud data during the rotation of the robot, the position points in each angle range are fitted into line segments of the corresponding trend, the parts of the fitted line segments of the corresponding trend lying within the current window are set as a group of fitting line segments in the current window, and this group of fitting line segments then forms a line feature subgraph, thereby completing the construction of one line feature subgraph in the current window;
the point cloud data comprises coordinate information of a position point scanned by a single-point ranging sensor and angle information of the position point; each time one of the line feature subgraphs is composed, the robot also records its coordinate information and initial angle information at the rotational center position.
7. The robot positioning method of claim 6, wherein the current window is rectangular in shape; the transverse side length of the current window is equal to the maximum ranging distance of the single-point ranging sensor; the longitudinal side length of the current window is larger than or equal to the diameter of the robot body; half of the maximum ranging distance of the single-point ranging sensor is equal to the first extension distance;
the robot sets the center of the current window as a first rotation center position;
the robot sets the central axis of the current window in the longitudinal direction as a baseline, and the longitudinal direction of the current window is set to be parallel to the advancing direction of the robot;
the robot sets the two position points that are separated from the center of the current window by half of the first extension distance and lie in the two symmetrical directions perpendicular to the baseline as a second rotation center position and a third rotation center position, respectively;
the robot sets the two position points that are separated from the center of the current window by the first extension distance and lie in the two symmetrical directions perpendicular to the baseline as a fourth rotation center position and a fifth rotation center position, respectively;
the robot traverses each rotation center position in the current window sequentially without repetition so as to construct a plurality of line feature subgraphs in the current window.
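One plausible reading of claim 7 (the ordinal labels in the translated claim are ambiguous) yields five rotation centers: the window center, two points at half the first extension distance, and two points at the full first extension distance on either side of the baseline. A sketch, assuming the baseline is the vertical axis through the window center so the symmetric offsets are horizontal:

```python
def rotation_centers(cx, cy, first_ext):
    """Rotation center positions inside the current window, offset
    perpendicular to the (vertical) baseline through the center (cx, cy).
    first_ext is the first extension distance (half the max ranging range)."""
    half = first_ext / 2.0
    return [
        (cx, cy),                                    # first: window center
        (cx - half, cy), (cx + half, cy),            # second and third
        (cx - first_ext, cy), (cx + first_ext, cy),  # fourth and fifth
    ]
```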
8. The robot positioning method according to claim 6, wherein in the step 2, there are the following cases:
when the fitted line segment with the corresponding trend is positioned in the current window, if the robot detects that the length of the fitted line segment with the corresponding trend is larger than a preset fitting length threshold value, setting the fitted line segment with the corresponding trend as a fitting line segment and marking the fitting line segment into a map, recording the coordinates of the starting point, the coordinates of the ending point and the inclination angle of the fitted line segment with the corresponding trend, and adding the fitting line segment into a group of corresponding fitting line segments in the current window;
when the fitted line segment with the corresponding trend extends from the inside of the current window to the outside of the current window or the fitted line segment with the corresponding trend extends from the outside of the current window to the inside of the current window, if the robot detects that the length of the line segment intercepted by the fitted line segment with the corresponding trend in the current window is larger than a preset fitting length threshold value, setting the line segment intercepted by the fitted line segment with the corresponding trend in the current window as the fitted line segment and marking the fitted line segment into a map, recording coordinates of a starting point, coordinates of an ending point and an inclination angle of the line segment intercepted by the fitted line segment with the corresponding trend in the current window, and adding the line segment into a corresponding group of fitted line segments in the current window;
wherein the inclination angle is expressed as the included angle formed between the fitted line segment of the corresponding trend and a coordinate axis.
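Clipping a fitted line segment to the current window and applying the preset fitting length threshold (the case analysis of claim 8) can be sketched with the standard Liang-Barsky clipping algorithm; the claims do not name a specific clipping method, so this algorithm choice is an assumption:

```python
def clip_segment_to_window(p0, p1, xmin, xmax, ymin, ymax, min_len):
    """Liang-Barsky clip of a fitted segment to the rectangular window.
    Returns the clipped (start, end) if the in-window length exceeds
    min_len (the preset fitting length threshold), else None."""
    x0, y0 = p0
    x1, y1 = p1
    dx, dy = x1 - x0, y1 - y0
    t0, t1 = 0.0, 1.0
    for p, q in ((-dx, x0 - xmin), (dx, xmax - x0),
                 (-dy, y0 - ymin), (dy, ymax - y0)):
        if p == 0:
            if q < 0:
                return None          # parallel to this edge and fully outside
        else:
            t = q / p
            if p < 0:
                t0 = max(t0, t)      # entering the window
            else:
                t1 = min(t1, t)      # leaving the window
    if t0 > t1:
        return None                  # no part of the segment lies inside
    a = (x0 + t0 * dx, y0 + t0 * dy)
    b = (x0 + t1 * dx, y0 + t1 * dy)
    length = ((b[0] - a[0]) ** 2 + (b[1] - a[1]) ** 2) ** 0.5
    return (a, b) if length > min_len else None
```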
9. The robot positioning method according to claim 8, wherein, in the process in which the TOF sensor of the robot detects the boundary of the same wall, each time the robot fits a line segment of a new trend from the boundary of the wall, an angle between the currently fitted line segment of the new trend and the previously recorded fitting line segment of the same trend is calculated; if this angle is greater than a preset fitting angle threshold, the inclination angle of the currently fitted line segment of the new trend and the inclination angle of the previously recorded fitting line segment of the same trend are subjected to weighted average processing to obtain a calibrated inclination angle, and the calibrated inclination angle is updated as the inclination angle of the previously recorded fitting line segment of the same trend;
the line segment fitted by the robot from the boundary of the wall body is determined by a target linear equation formed by fitting the point cloud data acquired in the corresponding angle range by the robot through a least square method.
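The line fit of claim 9 and the weighted-average angle calibration can be sketched as follows; the claims only state "least square method" and "weighted average", so the total-least-squares (principal axis) formulation, which also handles vertical walls, and the equal weights are assumptions:

```python
import math

def fit_line_angle(points):
    """Fit scanned (x, y) position points to a line and return its tilt
    angle. Uses the principal direction of the point scatter, a total
    least squares fit, so steep/vertical walls are handled too."""
    n = len(points)
    mx = sum(x for x, _ in points) / n
    my = sum(y for _, y in points) / n
    sxx = sum((x - mx) ** 2 for x, _ in points)
    sxy = sum((x - mx) * (y - my) for x, y in points)
    syy = sum((y - my) ** 2 for _, y in points)
    # orientation of the scatter's principal axis = fitted line direction
    return 0.5 * math.atan2(2 * sxy, sxx - syy)

def calibrate_angle(new_angle, old_angle, threshold, w_new=0.5):
    """Claim 9's calibration: if the angle between the newly fitted
    segment and the recorded segment of the same trend exceeds the preset
    fitting angle threshold, blend the two tilt angles by weighted
    average (weights assumed equal) and record the result."""
    if abs(new_angle - old_angle) > threshold:
        return w_new * new_angle + (1 - w_new) * old_angle
    return old_angle
```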
10. The robot positioning method according to claim 5, wherein in step 3, the robot sets one line feature subgraph of the pair participating in the matching as a reference line feature subgraph, and sets the other line feature subgraph of the pair as a line feature subgraph to be matched; the line feature subgraph to be matched is then translated until it completely coincides with, or completely covers, the reference line feature subgraph; the translated line feature subgraph to be matched is set as a target matching line feature subgraph, wherein all fitting line segments in the target matching line feature subgraph are translated relative to the corresponding fitting line segments in the line feature subgraph to be matched; the robot then performs similarity matching between the fitting line segments in the target matching line feature subgraph and the fitting line segments at the corresponding positions in the reference line feature subgraph.
11. The robot positioning method of claim 10, wherein the method for the robot to perform similarity matching between the fitting line segments in the target matching line feature subgraph and the fitting line segments at the corresponding positions in the reference line feature subgraph comprises:
step 31, calculating the ratio of the absolute value of the difference between the line segment length of a fitting line segment in the target matching line feature subgraph and the line segment length of the fitting line segment at the corresponding position in the reference line feature subgraph to the line segment length of the fitting line segment at the corresponding position in the reference line feature subgraph, and marking this ratio as the difference rate in the line segment length dimension; calculating the ratio of the absolute value of the difference between the inclination angle of the fitting line segment in the target matching line feature subgraph and the inclination angle of the fitting line segment at the corresponding position in the reference line feature subgraph to the inclination angle of the fitting line segment at the corresponding position in the reference line feature subgraph, and marking this ratio as the difference rate in the inclination angle dimension; calculating the ratio of the number of coordinate positions through which the fitting line segment in the target matching line feature subgraph and the fitting line segment at the corresponding position in the reference line feature subgraph pass jointly to the number of coordinate positions through which the fitting line segment at the corresponding position in the reference line feature subgraph passes, and marking this ratio as the coincidence rate in the position dimension;
step 32, for a fitting line segment in the target matching line feature subgraph and the fitting line segment at the corresponding position in the reference line feature subgraph, when the difference rate in the line segment length dimension is smaller than or equal to a preset difference rate, the difference rate in the inclination angle dimension is smaller than or equal to the preset difference rate, and the coincidence rate in the position dimension is greater than or equal to a preset coincidence rate, determining that the similarity matching between the two fitting line segments is successful;
the coordinate offset of the target matching line feature subgraph relative to the line feature subgraph to be matched is equal to the coordinate offset of the rotation center position corresponding to the target matching line feature subgraph relative to the rotation center position corresponding to the line feature subgraph to be matched;
wherein the sum of the preset difference rate and the preset coincidence rate is equal to 100 percent.
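The three per-dimension tests of claim 11 can be sketched as a single predicate; representing each segment's occupied coordinate positions as a set of grid cells and the specific threshold values are assumptions (the claim only requires the preset difference rate and preset coincidence rate to sum to 100%):

```python
def segments_match(len_a, ang_a, cells_a, len_b, ang_b, cells_b,
                   max_diff=0.2, min_overlap=0.8):
    """Claim 11's three-part test, with segment b taken from the reference
    subgraph: length and tilt-angle difference rates are computed relative
    to b's values, and the position coincidence rate relative to the
    coordinate positions b passes through. max_diff + min_overlap = 100%
    as the claim requires."""
    len_diff = abs(len_a - len_b) / len_b
    ang_diff = abs(ang_a - ang_b) / ang_b
    overlap = len(cells_a & cells_b) / len(cells_b)
    return (len_diff <= max_diff and ang_diff <= max_diff
            and overlap >= min_overlap)
```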
12. The robot positioning method of claim 10, wherein the method for the robot to perform similarity matching between the fitting line segments in the target matching line feature subgraph and the fitting line segments at the corresponding positions in the reference line feature subgraph comprises:
step 31, calculating the ratio of the absolute value of the difference between the line segment length of a fitting line segment in the target matching line feature subgraph and the line segment length of the fitting line segment at the corresponding position in the reference line feature subgraph to the longer of the two line segment lengths, marking this ratio as a first ratio, and marking the difference between the value 1 and the first ratio as the coincidence rate in the line segment length dimension; calculating the ratio of the absolute value of the difference between the inclination angle of the fitting line segment in the target matching line feature subgraph and the inclination angle of the fitting line segment at the corresponding position in the reference line feature subgraph to the larger of the two inclination angles, marking this ratio as a second ratio, and marking the difference between the value 1 and the second ratio as the coincidence rate in the inclination angle dimension; calculating the ratio of the number of coordinate positions through which the two fitting line segments pass jointly to the number of coordinate positions passed through by the fitting line segment that passes through relatively more coordinate positions, and marking this ratio as the coincidence rate in the position dimension;
step 32, for a fitting line segment in the target matching line feature subgraph and the fitting line segment at the corresponding position in the reference line feature subgraph, when the sum of the coincidence rate in the line segment length dimension, the coincidence rate in the inclination angle dimension, and the coincidence rate in the position dimension is greater than or equal to the preset coincidence rate, determining that the similarity matching between the two fitting line segments is successful.
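The summed-coincidence variant of claim 12 differs from claim 11 only in normalizing each dimension by the larger of the two values and thresholding the sum of the three rates. A sketch with an assumed threshold (the sum can be at most 3.0):

```python
def segments_match_sum(len_a, ang_a, cells_a, len_b, ang_b, cells_b,
                       min_total=2.4):
    """Claim 12's variant: each dimension yields a coincidence rate of
    1 - |difference| / (larger value); the position rate divides the
    jointly occupied coordinate positions by the larger occupancy count.
    The three rates must sum to at least the preset coincidence rate
    (min_total is an assumed value)."""
    len_rate = 1 - abs(len_a - len_b) / max(len_a, len_b)
    ang_rate = 1 - abs(ang_a - ang_b) / max(ang_a, ang_b)
    pos_rate = len(cells_a & cells_b) / max(len(cells_a), len(cells_b))
    return (len_rate + ang_rate + pos_rate) >= min_total
```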
13. The robot positioning method according to claim 11 or 12, wherein, before the robot has matched every fitting line segment in the target matching line feature subgraph with the fitting line segment at the corresponding position in the reference line feature subgraph, or when it is detected that the ratio of the number of all fitting line segment pairs successfully matched to the number of all fitting line segment pairs participating in the similarity matching is smaller than the first preset success rate, the rotation center position corresponding to the target matching line feature subgraph is set as an offset start position; the target matching line feature subgraph is then controlled to translate from the offset start position along a given coordinate axis direction by a preset translation step length;
while the coordinate offset by which the target matching line feature subgraph has been translated in the same coordinate axis direction from the offset start position has not reached the maximum preset offset, each time the target matching line feature subgraph is translated by one preset translation step length, the robot judges whether the ratio of the number of all fitting line segment pairs successfully matched to the number of all fitting line segment pairs participating in the similarity matching is greater than or equal to the first preset success rate; if so, the target matching line feature subgraph and the reference line feature subgraph are successfully matched, the coordinate offset by which the target matching line feature subgraph has been translated in the latest translation direction from the offset start position is set as an error coordinate offset, and the target matching line feature subgraph is then controlled to stop translating; otherwise, the robot adjusts the given coordinate axis direction to the opposite or a perpendicular coordinate axis direction, updates that opposite or perpendicular coordinate axis direction as the given coordinate axis direction, and controls the target matching line feature subgraph to translate from the offset start position along the given coordinate axis direction by the preset translation step length;
if the coordinate offset by which the target matching line feature subgraph has been translated in the same coordinate axis direction from the offset start position reaches the maximum preset offset, then, when the robot judges that the ratio of the number of all fitting line segment pairs successfully matched to the number of all fitting line segment pairs participating in the similarity matching is smaller than the first preset success rate, the robot adjusts the given coordinate axis direction to the opposite or a perpendicular coordinate axis direction, updates that opposite or perpendicular coordinate axis direction as the given coordinate axis direction, and then controls the target matching line feature subgraph to translate from the offset start position along the given coordinate axis direction by the preset translation step length;
before the coordinate offset translated in the given coordinate axis direction reaches the maximum preset offset, each time the same target matching line feature subgraph translates by the preset translation step length along the given coordinate axis direction, the translated subgraph is updated as the target matching line feature subgraph, and step 31 and step 32 are then executed; each time the robot has matched every fitting line segment in the target matching line feature subgraph with the fitting line segment at the corresponding position in the reference line feature subgraph, the robot judges whether the ratio of the number of all fitting line segment pairs successfully matched to the number of all fitting line segment pairs participating in the similarity matching is greater than or equal to the first preset success rate.
14. The robot positioning method according to claim 13, wherein, after the coordinate offsets by which the target matching line feature subgraph has been translated from the offset start position along all coordinate axis directions have reached the maximum preset offset, if the robot judges that the ratio of the number of all fitting line segment pairs successfully matched to the number of all fitting line segment pairs participating in the similarity matching is smaller than the first preset success rate, it is determined that the matching between the target matching line feature subgraph and the reference line feature subgraph fails;
wherein a preset maximum positioning error is associated with the maximum preset offset, and the maximum preset offset comprises a maximum preset offset in the horizontal axis direction and a maximum preset offset in the vertical axis direction.
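The translation search of claims 13 and 14 can be sketched as stepping the subgraph from the offset start position along each coordinate axis direction until the maximum preset offset is reached; the exact direction ordering and the callback signature `match_rate_at(x, y)` (which would re-run steps 31 and 32 at the shifted position) are assumptions:

```python
def search_offset(match_rate_at, start, step=1.0, max_off=3.0,
                  first_rate=0.8):
    """Slide the subgraph from the offset start position along +x, -x,
    +y, -y in increments of the preset translation step length, and
    return the first offset whose segment match rate reaches the first
    preset success rate; return None if every direction is exhausted
    (match failure, claim 14)."""
    for ux, uy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
        k = 1
        while k * step <= max_off:          # bounded by max preset offset
            dx, dy = ux * k * step, uy * k * step
            if match_rate_at(start[0] + dx, start[1] + dy) >= first_rate:
                return (dx, dy)             # the error coordinate offset
            k += 1
    return None
```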
15. The robot positioning method according to claim 5, wherein in step 3, the robot sets one line feature subgraph of the pair participating in the matching as a reference line feature subgraph, and sets the other line feature subgraph of the pair as a line feature subgraph to be matched;
the rotation center position corresponding to the line feature subgraph to be matched is set as an offset start position; the line feature subgraph to be matched is then controlled to translate from the offset start position along a given coordinate axis direction by a preset translation step length;
while the coordinate offset by which the line feature subgraph to be matched has been translated in the same coordinate axis direction from the offset start position has not reached the maximum preset offset, each time the line feature subgraph to be matched is translated by one preset translation step length, the robot judges whether the ratio of the number of all fitting line segment pairs successfully matched to the number of all fitting line segment pairs participating in the similarity matching is greater than or equal to the first preset success rate; if so, the line feature subgraph to be matched and the reference line feature subgraph are successfully matched, the coordinate offset by which the line feature subgraph to be matched has been translated in the latest translation direction from the offset start position is set as an error coordinate offset, and the line feature subgraph to be matched is then controlled to stop translating; otherwise, the robot adjusts the given coordinate axis direction to the opposite or a perpendicular coordinate axis direction, updates that opposite or perpendicular coordinate axis direction as the given coordinate axis direction, and controls the line feature subgraph to be matched to translate from the offset start position along the given coordinate axis direction by the preset translation step length;
if the coordinate offset by which the line feature subgraph to be matched has been translated in the same coordinate axis direction from the offset start position reaches the maximum preset offset, then, when the robot judges that the ratio of the number of all fitting line segment pairs successfully matched to the number of all fitting line segment pairs participating in the similarity matching is smaller than the first preset success rate, the robot adjusts the given coordinate axis direction to the opposite or a perpendicular coordinate axis direction, updates that opposite or perpendicular coordinate axis direction as the given coordinate axis direction, and then controls the line feature subgraph to be matched to translate from the offset start position along the given coordinate axis direction by the preset translation step length;
before the coordinate offset translated in the given coordinate axis direction reaches the maximum preset offset, each time the same line feature subgraph to be matched translates by the preset translation step length along the given coordinate axis direction, the translated subgraph is updated as the line feature subgraph to be matched, and similarity matching is then performed between the fitting line segments in the line feature subgraph to be matched and the fitting line segments at the corresponding positions in the reference line feature subgraph; each time the robot has matched every fitting line segment in the line feature subgraph to be matched with the fitting line segment at the corresponding position in the reference line feature subgraph, the robot judges whether the ratio of the number of all fitting line segment pairs successfully matched to the number of all fitting line segment pairs participating in the similarity matching is greater than or equal to the first preset success rate;
after the coordinate offsets by which the line feature subgraph to be matched has been translated from the offset start position along all coordinate axis directions have reached the maximum preset offset, if the robot judges that the ratio of the number of all fitting line segment pairs successfully matched to the number of all fitting line segment pairs participating in the similarity matching is smaller than the first preset success rate, it is determined that the matching between the line feature subgraph to be matched and the reference line feature subgraph fails;
wherein a preset maximum positioning error is associated with the maximum preset offset, and the maximum preset offset comprises a maximum preset offset in the horizontal axis direction and a maximum preset offset in the vertical axis direction.
16. The robot positioning method of claim 15, wherein the method for the robot to perform similarity matching between the fitting line segments in the line feature subgraph to be matched and the fitting line segments at the corresponding positions in the reference line feature subgraph comprises:
step 31, calculating the ratio of the absolute value of the difference between the line segment length of a fitting line segment in the line feature subgraph to be matched and the line segment length of the fitting line segment at the corresponding position in the reference line feature subgraph to the line segment length of the fitting line segment at the corresponding position in the reference line feature subgraph, and marking this ratio as the difference rate in the line segment length dimension; calculating the ratio of the absolute value of the difference between the inclination angle of the fitting line segment in the line feature subgraph to be matched and the inclination angle of the fitting line segment at the corresponding position in the reference line feature subgraph to the inclination angle of the fitting line segment at the corresponding position in the reference line feature subgraph, and marking this ratio as the difference rate in the inclination angle dimension; calculating the ratio of the number of coordinate positions through which the fitting line segment in the line feature subgraph to be matched and the fitting line segment at the corresponding position in the reference line feature subgraph pass jointly to the number of coordinate positions through which the fitting line segment at the corresponding position in the reference line feature subgraph passes, and marking this ratio as the coincidence rate in the position dimension;
step 32, for a fitting line segment in the line feature subgraph to be matched and the fitting line segment at the corresponding position in the reference line feature subgraph, when the difference rate in the line segment length dimension is smaller than or equal to a preset difference rate, the difference rate in the inclination angle dimension is smaller than or equal to the preset difference rate, and the coincidence rate in the position dimension is greater than or equal to a preset coincidence rate, determining that the overall coincidence rate of the two fitting line segments is greater than or equal to the preset coincidence rate, and further determining that the similarity matching between them is successful; wherein the sum of the preset difference rate and the preset coincidence rate is equal to 100 percent.
17. The robot positioning method of claim 15, wherein the method for the robot to perform similarity matching between the fitting line segments in the line feature subgraph to be matched and the fitting line segments at the corresponding positions in the reference line feature subgraph comprises:
step 31, calculating the ratio of the absolute value of the difference between the line segment length of a fitting line segment in the line feature subgraph to be matched and the line segment length of the fitting line segment at the corresponding position in the reference line feature subgraph to the longer of the two line segment lengths, marking this ratio as a first ratio, and marking the difference between the value 1 and the first ratio as the coincidence rate in the line segment length dimension; calculating the ratio of the absolute value of the difference between the inclination angle of the fitting line segment in the line feature subgraph to be matched and the inclination angle of the fitting line segment at the corresponding position in the reference line feature subgraph to the larger of the two inclination angles, marking this ratio as a second ratio, and marking the difference between the value 1 and the second ratio as the coincidence rate in the inclination angle dimension; calculating the ratio of the number of coordinate positions through which the two fitting line segments pass jointly to the number of coordinate positions passed through by the fitting line segment that passes through relatively more coordinate positions, and marking this ratio as the coincidence rate in the position dimension;
And step 32, for a fitting line segment in the line feature subgraph to be matched and the fitting line segment at the corresponding position in the reference line feature subgraph, when the sum of the coincidence rate in the line segment length dimension, the coincidence rate in the inclination angle dimension, and the coincidence rate in the position dimension is greater than or equal to the preset coincidence rate, determining that the two fitting line segments are successfully matched in the similarity matching.
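The three coincidence rates of claim 17 and the summed-rate test can be sketched as below. The function names, the representation of "coordinate positions passed" as sets of grid cells, and the summed threshold value of 2.4 are all illustrative assumptions; the claims do not fix a concrete representation or threshold.

```python
def coincidence_rates(len_a, len_b, ang_a, ang_b, cells_a, cells_b):
    """Compute the three per-dimension coincidence rates of claim 17.

    cells_a/cells_b: sets of (x, y) coordinate positions each fitted
    segment passes through (an assumed grid representation).
    Angles are assumed positive to keep the ratio well defined.
    """
    # length dimension: 1 - |dL| / max(length)
    r_len = 1.0 - abs(len_a - len_b) / max(len_a, len_b)
    # inclination angle dimension: 1 - |dθ| / max(θ)
    r_ang = 1.0 - abs(ang_a - ang_b) / max(ang_a, ang_b)
    # position dimension: shared cells / cells of the longer-coverage segment
    shared = len(cells_a & cells_b)
    r_pos = shared / max(len(cells_a), len(cells_b))
    return r_len, r_ang, r_pos

def match_claim17(rates, preset_total=2.4):
    # match succeeds when the three coincidence rates sum past the threshold
    return sum(rates) >= preset_total
```

Two identical-length, identical-angle segments sharing two of three grid cells would score 1 + 1 + 2/3 ≈ 2.67 and pass the assumed 2.4 threshold.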
18. The method according to claim 5, wherein in step 5, when the ratio of the number of pairs of successfully matched line feature subgraphs to the number of pairs of line feature subgraphs participating in matching is greater than or equal to a second preset success rate, the robot obtains a plurality of error coordinate offsets; if the robot determines that the number of currently obtained error coordinate offsets is greater than a second preset number threshold, the error coordinate offset with the largest abscissa value, the error coordinate offset with the smallest abscissa value, the error coordinate offset with the largest ordinate value, and the error coordinate offset with the smallest ordinate value are all removed from the currently obtained error coordinate offsets;
averaging the horizontal-axis components of the remaining error coordinate offsets and averaging the vertical-axis components of the remaining error coordinate offsets to obtain an average coordinate offset, setting the average coordinate offset as the positioning coordinate compensation amount, and controlling the robot to add the positioning coordinate compensation amount to its current position coordinate to obtain the corrected robot position coordinate;
wherein each error coordinate offset comprises a horizontal-axis error coordinate offset and a vertical-axis error coordinate offset.
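The trimmed-mean compensation of claim 18 can be sketched as below. The function name, the tuple representation of offsets, and the concrete value of the second preset number threshold (4 here) are illustrative assumptions; the claims leave the threshold unspecified.

```python
def positioning_compensation(offsets, current_pos, count_threshold=4):
    """Trim extreme error coordinate offsets, average the rest, and
    apply the result as a positioning coordinate compensation amount.

    offsets: list of (dx, dy) error coordinate offsets
    current_pos: (x, y) current position coordinate of the robot
    """
    if len(offsets) > count_threshold:
        xs = [dx for dx, _ in offsets]
        ys = [dy for _, dy in offsets]
        # offsets holding the extreme abscissa / ordinate values are removed
        extremes = {offsets[xs.index(max(xs))], offsets[xs.index(min(xs))],
                    offsets[ys.index(max(ys))], offsets[ys.index(min(ys))]}
        remaining = [o for o in offsets if o not in extremes]
    else:
        remaining = offsets
    # average the horizontal-axis and vertical-axis components separately
    avg_dx = sum(dx for dx, _ in remaining) / len(remaining)
    avg_dy = sum(dy for _, dy in remaining) / len(remaining)
    # add the compensation amount to the current position coordinate
    return (current_pos[0] + avg_dx, current_pos[1] + avg_dy)
```

With seven offsets, the four extreme ones are dropped and only the central three contribute to the correction, which makes the compensation robust to outlier matches.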
CN202210697792.7A 2022-06-20 2022-06-20 Robot positioning method based on line segment matching in window Pending CN117289689A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210697792.7A CN117289689A (en) 2022-06-20 2022-06-20 Robot positioning method based on line segment matching in window

Publications (1)

Publication Number Publication Date
CN117289689A true CN117289689A (en) 2023-12-26

Family

ID=89243199

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210697792.7A Pending CN117289689A (en) 2022-06-20 2022-06-20 Robot positioning method based on line segment matching in window

Country Status (1)

Country Link
CN (1) CN117289689A (en)

Similar Documents

Publication Publication Date Title
CN108507578B (en) Navigation method of robot
CN109060821B (en) Tunnel disease detection method and tunnel disease detection device based on laser detection
CN111590595B (en) Positioning method and device, mobile robot and storage medium
US6728608B2 (en) System and method for the creation of a terrain density model
CN107357297A (en) A kind of sweeping robot navigation system and its air navigation aid
CN207164586U (en) A kind of sweeping robot navigation system
JP2021516403A (en) Robot repositioning method
CN109933056A (en) A kind of robot navigation method and robot based on SLAM
CN114280625A (en) Unmanned aerial vehicle-based three-dimensional laser radar underground map construction method and device
US20240061442A1 (en) Mobile Robot Positioning Method and System Based on Wireless Ranging Sensors, and Chip
CN113475977B (en) Robot path planning method and device and robot
CN110895408B (en) Autonomous positioning method and device and mobile robot
CN112967189A (en) Point cloud splicing method, device, equipment and storage medium
CN112578392A (en) Environment boundary construction method based on remote sensor and mobile robot
CN112904845A (en) Robot jamming detection method, system and chip based on wireless distance measurement sensor
CN108876862A (en) A kind of noncooperative target point cloud position and attitude calculation method
CN114610032A (en) Target object following method and device, electronic equipment and readable storage medium
CN112880683B (en) Robot positioning control method, system and chip based on reference linear distance
CN115151836A (en) Method for detecting a moving object in the surroundings of a vehicle and motor vehicle
CN117289689A (en) Robot positioning method based on line segment matching in window
JP5953393B2 (en) Robot system and map updating method
CN116129669A (en) Parking space evaluation method, system, equipment and medium based on laser radar
AU2021273605B2 (en) Multi-agent map generation
CN117288183A (en) Angle-based robot repositioning method
CN109959935B (en) Map establishing method, map establishing device and robot

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination