CN110411446B - Path planning method for robot


Info

Publication number: CN110411446B
Authority: CN (China)
Prior art keywords: robot, area, target, yaw angle, pattern
Legal status: Active (granted)
Application number: CN201810401508.0A
Other languages: Chinese (zh)
Other versions: CN110411446A (application publication)
Inventors: 刘阳, 佀昶, 赵强
Original and current assignee: Shen Zhen Gli Technology Ltd
Priority to CN201810401508.0A

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 21/00: Navigation; Navigational instruments not provided for in groups G01C 1/00 - G01C 19/00
    • G01C 21/20: Instruments for performing navigational calculations

Abstract

The embodiment of the application provides a path planning method for a robot, which comprises the following steps: acquiring the target position and target yaw angle to which the robot is to travel; acquiring the current position and current yaw angle of the robot; determining, among a plurality of preset area types, the type of area in which the robot is currently located according to the relative relation between the current position and current yaw angle and the target position and target yaw angle; and carrying out the corresponding path planning for the robot according to the determined area type. In this way, the path planning modes of the robot are enriched.

Description

Path planning method for robot
Technical Field
The application relates to the technical field of robot control, in particular to a path planning method of a robot.
Background
At present, consumer robots are popular in the market; they are widely used in the education, service and entertainment fields and are gradually entering families, schools and various service places.
However, the path planning mode of existing consumer robots is single and cannot be adjusted to different motion and position conditions, so the robot has difficulty moving accurately to the target and its movement efficiency is low.
Disclosure of Invention
The application mainly solves the technical problem of providing a path planning method for a robot, which can address problems such as the low movement efficiency caused by the single planning mode in the prior art.
In order to solve the above technical problem, an embodiment of the present application provides a path planning method for a robot, comprising: acquiring the target position and target yaw angle to which the robot is to travel; acquiring the current position and current yaw angle of the robot; determining, among a plurality of preset area types, the type of area in which the robot is currently located according to the relative relation between the current position and current yaw angle and the target position and target yaw angle; and carrying out the corresponding path planning for the robot according to the determined area type.
Compared with the prior art, the application has the following beneficial effects. The target position and target yaw angle and the robot's current position and current yaw angle are obtained, and the type of area in which the robot is located is determined from the relative relation between them, so that different path planning is performed for different areas. A single path planning mode is thus avoided: planning can be performed in real time as the robot passes through different areas, dynamic planning is achieved, the robot's path is planned effectively, and path planning efficiency is improved.
Drawings
FIG. 1 is a schematic flow chart of a pattern recognition method based on element matching according to an embodiment of the present application;
FIG. 2 is a schematic diagram of a second flow chart of an embodiment of a pattern recognition method based on element matching according to the present application;
FIG. 3 is a schematic diagram of a third flow chart of an embodiment of a pattern recognition method based on element matching according to the present application;
FIG. 4 is a schematic diagram of a process of screening candidate frames according to an embodiment of a pattern recognition method based on element matching of the present application;
FIG. 5 is a schematic diagram of a polygon approximation process according to an embodiment of the pattern recognition method based on element matching of the present application;
FIG. 6 is a schematic diagram of a morphological filtering process according to an embodiment of the pattern recognition method based on element matching of the present application;
FIG. 7 is a fourth flow chart of an embodiment of a pattern recognition method based on element matching according to the present application;
FIG. 8 is a schematic diagram of an encoding process of an embodiment of a pattern recognition method based on element matching according to the present application;
FIG. 9 is a fifth flow chart of an embodiment of a pattern recognition method based on element matching according to the present application;
FIG. 10 is a schematic diagram of a sixth flow chart of an embodiment of a pattern recognition method based on element matching according to the present application;
FIG. 11 is a schematic diagram of a path planning method of the robot according to an embodiment of the present application;
FIG. 12 is a schematic diagram of a relationship between a robot and a target object according to an embodiment of a path planning method of the robot of the present application;
FIG. 13 is a first flow chart of an embodiment of a path planning method for a robot according to the present application;
FIG. 14 is a schematic view of a second flow chart of an embodiment of a path planning method of the robot of the present application;
FIG. 15 is a third flow chart of an embodiment of a path planning method for a robot according to the present application;
FIG. 16 is a fourth flow chart of an embodiment of a path planning method for a robot according to the present application;
FIG. 17 is a schematic diagram of a motion process of a robot when adjusting an area according to an embodiment of a path planning method of the robot of the present application;
FIG. 18 is a fifth flow chart of an embodiment of a path planning method for a robot of the present application;
FIG. 19 is a sixth flow chart of an embodiment of a path planning method for a robot of the present application.
Detailed Description
The following description of the embodiments of the present application will be made clearly and fully with reference to the accompanying drawings, in which it is evident that the embodiments described are only some, but not all embodiments of the application. All other embodiments, which can be made by those skilled in the art based on the embodiments of the application without making any inventive effort, are intended to be within the scope of the application.
Most existing pattern recognition technologies acquire information by recognizing a single black-and-white form such as a two-dimensional code or a bar code. The constituent elements of a two-dimensional code or bar code are uniform and cannot be distinguished by the user, so such codes are visually unattractive and add little interest to a consumer robot. Moreover, these technologies cannot recognize patterns composed of different elements, so the recognition mode is single.
Referring to fig. 1, the pattern recognition method based on element matching according to the embodiment of the present application can be applied to robots, for example, consumer robots, such as educational robots, entertainment robots, companion robots, home robots, etc., or other robots having a machine recognition device. Taking a robot as an example, the robot may include a sensor (not shown) for acquiring an image and a processor (not shown), the method comprising:
t1: and extracting the pattern to be identified from the image to be identified.
For example, a sensor of the robot collects an original image, the image to be recognized can be the original image, or the image of the original image after some image processing such as binarization processing is performed, and the processor obtains the pattern to be recognized through calculation and extraction. The pattern to be identified is, for example, a user-defined pattern in the above embodiment. Since the robot generally collects images to be recognized including some other pattern in addition to the pattern to be recognized when collecting the images, it is necessary to extract the pattern to be recognized from the images to be recognized.
Because different illumination conditions can exist in each place on the collected original image, different colors possibly exist on the pattern to be recognized are also available, if the image to be recognized is an image subjected to binarization processing, the original image can be binarized according to similarity by comparing brightness, color and other information of each pixel and pixels in the neighborhood of the pixel during the binarization process, and a binary image, namely the image to be recognized, is obtained.
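As an illustrative sketch only (not part of the original disclosure), this kind of neighborhood-based binarization can be performed with an adaptive threshold, for example with OpenCV; the block size and offset below are assumed values:

```python
import cv2

def binarize(original_bgr, block_size=31, offset=10):
    """Binarize an image by comparing each pixel with its local neighborhood.

    block_size and offset are illustrative values; in practice they would be
    tuned to the lighting conditions of the captured image.
    """
    gray = cv2.cvtColor(original_bgr, cv2.COLOR_BGR2GRAY)
    # Each pixel is thresholded against the mean of its block_size x block_size
    # neighborhood minus offset, which tolerates uneven illumination.
    binary = cv2.adaptiveThreshold(
        gray, 255, cv2.ADAPTIVE_THRESH_MEAN_C,
        cv2.THRESH_BINARY, block_size, offset)
    return binary
```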
T2: and decomposing the pattern to be identified into a plurality of pattern elements to be matched.
The pattern to be identified comprises a plurality of pattern elements to be matched. For example, the user-defined pattern in the embodiment of the method for generating a robot control pattern of the application can be divided into a plurality of sub-areas, which are filled with pattern elements in sequence; each sub-area is filled with at most one pattern element. Optionally, the pattern to be identified is decomposed into a plurality of pattern elements to be matched in a predetermined order. The pattern elements may be a variety of regular or irregular geometric figures, such as triangles, lines, circles, polygons, etc. The pattern to be identified can be a combination of different pattern elements, of identical elements, or of pattern elements and blanks, i.e. some sub-areas can be left blank.
T3: and respectively matching the pattern elements to be matched with a plurality of standard pattern elements, wherein different standard pattern elements correspond to different codes.
Specifically, the pattern elements to be matched are matched against a plurality of standard pattern elements, either sub-area by sub-area or in a preset order, so that each pattern element to be matched is associated with a corresponding standard pattern element and can thereby be encoded. For example, a processor of the robot matches the pattern elements to be matched with the standard pattern elements.
T4: and generating a code sequence to be matched according to codes corresponding to the standard pattern elements matched by the pattern elements to be matched.
Specifically, for example, traversing pattern elements to be matched in a sub-region of a pattern to be identified according to a predetermined sequence, matching the pattern elements with standard pattern elements, and generating a code sequence to be matched according to codes corresponding to the standard pattern elements according to the predetermined sequence. In this way, the pattern to be recognized, which is formed by a plurality of different pattern elements, can be converted into a coding sequence. For example, the processor of the robot generates a code sequence to be matched according to codes corresponding to standard pattern elements matched by the pattern elements to be matched.
T5: and matching the coding sequence to be matched with the standard coding sequences of the plurality of standard patterns to determine the corresponding relation between the pattern to be identified and the standard patterns.
For example, the processor matches the code sequence to be matched corresponding to the pattern to be identified with the standard code sequences of the plurality of standard patterns in the database, and if the code sequences are consistent, the processor determines that the pattern to be identified corresponds to one of the standard patterns.
For example, the pattern to be identified comprises 3 sub-areas, each filled with one pattern element, e.g. "straight line", "triangle" and "circle" respectively. The processor may decompose the pattern to be identified into 3 pattern elements to be matched, match the 3 pattern elements with a plurality of standard pattern elements to generate a code sequence to be matched, for example 000100030002, and then match the generated code sequence with the standard code sequences in the database, thereby determining the standard pattern corresponding to the pattern to be identified.
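The element-to-code mapping and the database lookup can be sketched as follows; the concrete element names, codes and the standard-pattern table are assumptions chosen only to match the 000100030002 example above:

```python
# Illustrative sketch only: the codes and the standard-pattern database below
# are assumed values consistent with the example in the text.
STANDARD_ELEMENT_CODES = {
    "straight": "0001",
    "circle":   "0002",
    "triangle": "0003",
}

STANDARD_PATTERNS = {
    "000100030002": "pattern_A",   # straight, triangle, circle
    "000200010003": "pattern_B",
}

def encode_pattern(matched_elements):
    """Concatenate the codes of the matched standard elements in sub-area order."""
    return "".join(STANDARD_ELEMENT_CODES[name] for name in matched_elements)

def identify(matched_elements):
    """Return the standard pattern whose code sequence equals the sequence to be matched."""
    sequence = encode_pattern(matched_elements)
    return STANDARD_PATTERNS.get(sequence)  # None if no standard pattern matches

print(identify(["straight", "triangle", "circle"]))  # -> "pattern_A"
```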
According to this embodiment, the pattern to be identified, which comprises a plurality of pattern elements to be matched, is decomposed; the decomposed elements are matched with standard pattern elements to generate a code sequence to be matched; and the code sequence to be matched is compared with the code sequences of the standard patterns, so that when they are consistent the standard pattern corresponding to the pattern to be identified is recognized. This enriches the form and design of the pattern to be identified, reduces the limitations of pattern design, and improves the aesthetics of pattern recognition. At the same time, different pattern elements can be identified to obtain multiple codes forming a code sequence, which enriches the visual recognition modes of the robot, and identical codes arranged in different code sequences yield different standard patterns, which enriches the interest and practicality of the robot.
Referring to fig. 2, alternatively, T1: the step of extracting the pattern to be recognized from the image to be recognized includes:
t11: a polygonal reference frame is identified from the image to be identified.
In this embodiment, the pattern to be recognized is surrounded by a polygonal reference frame. Since the image to be recognized generally has other patterns or objects collected into the image in addition to the pattern to be recognized, in order to be able to extract the pattern to be recognized, a polygonal reference frame needs to be recognized first, and the pattern to be recognized is positioned by the reference frame so as to determine the pattern to be recognized in the next step.
T12: and taking the pattern in the polygonal reference frame as the pattern to be identified.
That is, the reference frame encloses the pattern to be identified, so the polygonal reference frame can be said to position the pattern to be identified.
Referring to fig. 3-6, alternatively, T11: the step of identifying the polygonal reference frame from the image to be identified comprises:
t111: and carrying out boundary extraction on the image to be identified to obtain a plurality of contours.
The image to be recognized may include a plurality of patterns or objects including the pattern to be recognized, and thus may include a plurality of contours, for example, a pattern to be recognized is stuck on one surface of a cube, the pattern to be recognized is stuck on a central portion of one surface of the cube, and when the original image is acquired to obtain the image to be recognized, the image includes at least the contour of one surface of the cube and also includes the contour of the reference frame. The number of contours contained in the acquired image to be identified is different according to the actual condition of acquisition. The number of contours is related to the actual situation of the acquisition.
In order to be able to identify the reference frame, the identification image is first subjected to boundary extraction to obtain a plurality of contours. At the time of recognition, the robot obtains a plurality of contours by performing boundary extraction on the recognition image by a processor, for example.
T112: and screening out the contours meeting the preset hierarchical standard from the contours according to the hierarchical relationship of the contours, wherein the hierarchical relationship is the surrounding and surrounded relationship among the contours.
In this embodiment, the hierarchical relationship is the surrounding and surrounded relationship between contours: a contour may surround another, be surrounded by another, be juxtaposed with others without any surrounding, or partially surround or partially overlap others.
In this embodiment, for example, a processor of the robot screens out, from the plurality of contours, those contours satisfying a preset hierarchy requirement according to the hierarchical relationship between the contours.
For example, as shown in fig. 4, contours 0 and 1 neither contain any contour nor are contained by any contour, so their hierarchical relationship is {contains: 0, contained by: 0}. Contour 2 contains at least 4 contours; counting the two contours of the eyebrow portion of the "smiling face" it should contain 6 contours, and it is not contained by any contour (the outermost frame is the field-of-view boundary of the whole image, not a contour), so its hierarchical relationship is {contains: 6, contained by: 0}, and so on. The hierarchical relationships of contours 3, 4 and 5 are {contains: 4, contained by: 1}, {contains: 3, contained by: 2} and {contains: 0, contained by: 3}, respectively. The hierarchical relationship of contour 6 is {contains: 0, contained by: 1}. Assuming that the reference frame to be searched is as shown in fig. 4, and that the frame of the custom pattern has a certain thickness, the reference frame to be searched (taking the inner contour as the reference frame, for example) needs to satisfy {contains: >0, contained by: 0}, i.e. contour 3 and contour 4.
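A minimal sketch of this hierarchy-based screening, assuming the contour tree produced by OpenCV is used to represent the hierarchical relationship and the criterion follows the {contains: >0, contained by: 0} example above:

```python
import cv2

def screen_contours_by_hierarchy(binary_image):
    """Keep contours that contain at least one contour and are contained by none.

    binary_image: 8-bit single-channel image (OpenCV 4.x return convention).
    """
    contours, hierarchy = cv2.findContours(
        binary_image, cv2.RETR_TREE, cv2.CHAIN_APPROX_SIMPLE)
    candidates = []
    if hierarchy is None:
        return candidates
    # hierarchy[0][i] = [next, previous, first_child, parent]
    for i, (nxt, prev, first_child, parent) in enumerate(hierarchy[0]):
        contains_something = first_child != -1   # "contains: >0"
        contained_by_none = parent == -1         # "contained by: 0"
        if contains_something and contained_by_none:
            candidates.append(contours[i])
    return candidates
```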
T113: and performing polygon approximation on the screened outline to obtain a plurality of polygon candidate frames.
Specifically, since an extracted contour is a set of pixel points that may be irregular and cannot always be expressed in a simple mathematical form, the screened contours are approximated by polygons: each contour is approximated, according to its shape, by a contour composed of straight line segments, finally yielding a polygonal contour. For example, the processor of the robot performs the polygon approximation on the screened contours to obtain a plurality of polygon candidate frames.
T114: and screening a polygonal reference frame from the polygonal candidate frames according to a preset polygonal standard.
The preset polygon standard is, for example, the same polygon standard as the polygon reference frame, for example, a parallelogram, a square, or the like. Therefore, the polygon reference frame conforming to the polygon standard can be selected from the polygon candidate frames by the preset polygon standard. The processor of the robot, for example, screens out the polygon reference frames from the polygon candidate frames according to a preset polygon criterion by calculation.
Referring to fig. 5, take parallelogram screening as an example. Figure a satisfies the conditions of being a convex polygon with 4 edges and parallel opposite edges, and is stored as a candidate reference frame. Figure b is a convex polygon but has 5 edges, so it does not satisfy the conditions. Figure c is a convex polygon with 4 edges, but two opposite edges are not parallel, so it does not satisfy the conditions. Figure d has 4 edges but is not a convex polygon, so it does not satisfy the conditions.
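A sketch of this parallelogram screening, with polygon approximation followed by the convexity, edge-count and parallel-edge checks; the approximation tolerance and the parallelism tolerance are assumed values:

```python
import cv2
import numpy as np

def screen_parallelograms(contours, eps_ratio=0.02, parallel_tol=0.1):
    """Approximate each contour by a polygon and keep parallelogram-like frames."""
    frames = []
    for contour in contours:
        eps = eps_ratio * cv2.arcLength(contour, True)
        poly = cv2.approxPolyDP(contour, eps, True)
        if len(poly) != 4 or not cv2.isContourConvex(poly):
            continue                      # need a convex quadrilateral
        pts = poly.reshape(4, 2).astype(np.float64)
        edges = [pts[(i + 1) % 4] - pts[i] for i in range(4)]

        def parallel(a, b):
            # Opposite edges are parallel if the normalized cross product is ~0.
            cross = abs(a[0] * b[1] - a[1] * b[0])
            return cross / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-9) < parallel_tol

        if parallel(edges[0], edges[2]) and parallel(edges[1], edges[3]):
            frames.append(poly)
    return frames
```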
Referring to fig. 3 and 6, optionally, before T111 (the step of extracting the boundary of the image to be identified), the method further comprises:
t110: morphological filtering is carried out on the image to be identified so as to execute subsequent steps on the basis of the filtered image to be identified.
In this embodiment, morphological filtering may include processes such as swelling, etching, opening, closing, and the like. Morphological filtering is used in the processing of the image to be identified, so that the image to be identified can keep a basic shape, and uncorrelated characteristics and noise are removed. Whether the image to be identified is a binarized image or not, or whether the pattern to be identified is itself, there may be a situation that a part of details are lost, and at the same time, some noise may be generated to exist in the image or pattern to be identified, which may interfere with the identification of the pattern itself. Thus morphological filtering is required. For example, if a break point exists in a reference frame outside the pattern to be recognized, the break point belongs to a non-connected domain, or if the pattern to be recognized itself is designed for aesthetic reasons, part of details are intentionally lost or noise is added, or if the user has lost details such as breakage, pollution and the like, or noise exists, if the image to be recognized is a binarized image, the binarization process also causes the image to lose part of details or noise. For example, the expansion processing can fill fine holes in the pattern, such as missing details, connect adjacent objects and smooth boundaries, so that the frames of the non-connected domains can be connected, and can be effectively identified later. The etching process can refine the expanded image or eliminate fine points, noise, etc., to obtain a frame of similar standard, retaining the edge profile of the pattern. For example, the processor performs morphological filtering on the image to be identified and then performs boundary extraction to obtain a plurality of contours.
Referring to fig. 6, specifically, the image to be identified is morphologically filtered at least twice with different parameters.
Because the acquisition angle and distance differ from one capture to the next, the image to be identified appears at different scales and angles, and with a single filtering pass, or no filtering at all, the polygon reference frame and the pattern to be identified may not be extracted. It is therefore generally necessary to filter the image to be identified with more than one set of parameters, which mitigates the problems caused by different pattern distances and angles.
The image to be identified is filtered at least twice with different parameters, each filtering pass generating a new image to be identified, so at least two new images are produced. When the filtered images are passed to the subsequent steps, generally only one result needs to be output. After filtering, the same polygonal reference frame may be extracted from several of the new images; if these frames are not merged, duplicate reference frames appear in the output, reducing the computation and recognition efficiency. The duplicated identical reference frames therefore need to be merged to avoid repeated output.
Morphological filtering is performed, for example, using the function morphologyEx, which involves setting several parameters that can be regarded as one parameter set; performing multiple filtering passes then means filtering with multiple different parameter sets.
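A sketch of running morphologyEx with several parameter sets; each set produces a new filtered image that is then passed separately to boundary extraction. The particular operations and kernel sizes below are assumptions for illustration:

```python
import cv2

# Assumed example parameter sets: (operation, kernel shape, kernel size).
PARAM_SETS = [
    (cv2.MORPH_CLOSE, cv2.MORPH_RECT,    (3, 3)),   # connect small break points
    (cv2.MORPH_CLOSE, cv2.MORPH_ELLIPSE, (7, 7)),   # for patterns seen from farther away
    (cv2.MORPH_OPEN,  cv2.MORPH_ELLIPSE, (3, 3)),   # remove fine noise points
]

def filtered_variants(image_to_identify):
    """Return one filtered image per parameter set; each is processed separately."""
    variants = []
    for op, shape, size in PARAM_SETS:
        kernel = cv2.getStructuringElement(shape, size)
        variants.append(cv2.morphologyEx(image_to_identify, op, kernel))
    return variants
```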
Referring to fig. 3 and 6, alternatively, T114: the step of screening the polygon reference frames from the polygon candidate frames according to the preset polygon standard further comprises the following steps:
t115: and merging the screened polygonal reference frames.
Specifically, it is determined whether the distance between the vertices of at least two polygonal reference frames is smaller than a preset distance and the hierarchical relationship of at least two polygonal reference frames is the same.
In a plurality of new images to be identified, a plurality of reference frames are respectively extracted, and the repeated reference frames in one image are required to be combined when one image is output. And judging whether the at least two polygonal reference frames are repeated by judging whether the hierarchical relationship of the at least two polygonal reference frames is the same and whether the distance between the vertexes of the at least two polygonal reference frames is smaller than a preset distance.
If the judgment result is yes, taking the average value of the vertex positions of at least two polygonal reference frames as the vertex position of the polygonal reference frame after combination.
Specifically, the original frame is replaced by the combined polygonal frame, so that errors of single filtering are reduced, redundancy of information caused by repeated frames is reduced, and accuracy and stability of frame extraction are improved.
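A sketch of this merge rule: two frames with the same hierarchical relationship whose corresponding vertices are all closer than a preset distance are treated as duplicates and replaced by their vertex-wise average. The distance threshold is an assumed value, and the sketch assumes both frames list their vertices in the same order:

```python
import numpy as np

def merge_frames(frame_a, frame_b, hier_a, hier_b, max_dist=5.0):
    """Merge two 4x2 vertex arrays if they describe the same reference frame.

    hier_a / hier_b: hierarchy descriptors of the two frames (e.g. (contains,
    contained_by) tuples); max_dist: assumed pixel threshold.
    Returns the averaged vertices, or None if the frames are not duplicates.
    """
    a = np.asarray(frame_a, dtype=np.float64)
    b = np.asarray(frame_b, dtype=np.float64)
    if hier_a != hier_b:
        return None                              # different hierarchy: not duplicates
    if np.any(np.linalg.norm(a - b, axis=1) >= max_dist):
        return None                              # vertices too far apart
    return (a + b) / 2.0                         # averaged vertex positions
```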
Referring to fig. 2 and 8, T12: the step of taking the pattern in the polygonal reference frame as the pattern to be identified further comprises:
t13: and calculating a perspective transformation matrix according to the polygonal reference frame and the theoretical frame.
Because the difference of angle, distance and the like exists when the image to be recognized is acquired, the pattern to be recognized, which is equal to the actual pattern to be recognized, is not necessarily acquired from the front, the pattern to be recognized needs to be subjected to transformation processing of a perspective transformation matrix, and the perspective transformation matrix is calculated specifically through the recognized polygonal reference frame and theoretical frame. The perspective transformation matrix, which may also be referred to as perspective rotation transformation matrix.
T14: and performing perspective transformation on the pattern to be identified according to the perspective transformation matrix so as to execute subsequent steps on the basis of the pattern to be identified after perspective transformation.
Specifically, the pattern to be identified is perspective-transformed by the perspective transformation matrix: all pixels of the pattern to be identified are mapped to new pixels through the generated matrix. This converts the pattern to be recognized into a new pattern seen from the same viewing angle as the actually designed pattern, so that it can then be recognized, as illustrated by the perspective transformation shown in fig. 8.
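A sketch of this perspective correction with OpenCV: the matrix is computed from the four detected reference-frame corners and the four corners of the theoretical frame, then applied to the image. The theoretical frame size is an assumed value:

```python
import cv2
import numpy as np

def rectify_pattern(image, frame_corners, side=200):
    """Warp the pattern inside the detected frame to the theoretical front view.

    frame_corners: four detected corners in order (top-left, top-right,
    bottom-right, bottom-left); side: assumed side length in pixels of the
    theoretical frame.
    """
    src = np.asarray(frame_corners, dtype=np.float32)
    dst = np.array([[0, 0], [side, 0], [side, side], [0, side]], dtype=np.float32)
    matrix = cv2.getPerspectiveTransform(src, dst)   # 3x3 perspective matrix
    return cv2.warpPerspective(image, matrix, (side, side))
```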
Referring to fig. 7 and 8, alternatively, the pattern elements to be matched include line elements, and in this embodiment, the line elements refer to pattern elements that are formed by lines, such as circles formed by lines, triangles formed by lines, or straight lines. T2: the step of decomposing the pattern to be identified into a plurality of pattern elements to be matched comprises:
t21: and refining the width of the lines in the pattern to be identified, and extracting skeleton lines.
For example, the pattern is thinned by comparing the relation between each pixel of the line and the pixels in the neighborhood of the line, so as to obtain the skeleton line of the pattern. In the skeleton lines, the pixel width of all lines is one pixel.
T22: and decomposing the skeleton lines to form a plurality of line elements to be matched.
For example, a continuous line is broken into line segment units at the slope discontinuities within it. For each line segment unit, the standard-length segment closest to it is selected to replace it. For example, a continuous line segment "L" forming a right angle is decomposed into two line segment units at the slope discontinuity (the vertex), and each unit is replaced by a standard-length segment to form a line element to be matched. If a continuous line segment has no slope discontinuity, no decomposition is needed, or its two end points can be regarded as decomposition points so that the same continuous segment is obtained after decomposition.
After all the skeleton lines in the pattern to be identified are decomposed, a plurality of line elements to be matched are formed.
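A simplified sketch of splitting one skeleton polyline into line elements at slope discontinuities; the polyline is assumed to be given as an ordered list of pixel coordinates, and the angle tolerance is an assumed value. Each resulting segment would then be replaced by the closest standard-length segment as described above:

```python
import math

def split_at_slope_jumps(points, angle_tol_deg=20.0):
    """Split an ordered polyline into segments wherever the direction changes sharply."""
    if len(points) < 3:
        return [points]                  # nothing to split
    segments, start = [], 0

    def heading(p, q):
        return math.degrees(math.atan2(q[1] - p[1], q[0] - p[0]))

    for i in range(1, len(points) - 1):
        turn = abs(heading(points[i - 1], points[i]) - heading(points[i], points[i + 1]))
        turn = min(turn, 360.0 - turn)   # wrap angle difference into [0, 180]
        if turn > angle_tol_deg:         # slope discontinuity: start a new segment
            segments.append(points[start:i + 1])
            start = i
    segments.append(points[start:])
    return segments
```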
Of course, the pattern elements in this embodiment may also include non-line elements, such as a solid filled circle, as opposed to a circle (ring) made up of line elements. In this embodiment, the pattern to be recognized may be formed of line elements only, of non-line elements only, or of both.
Referring to fig. 7 and 8, alternatively, T2: the step of decomposing the pattern to be identified into a plurality of pattern elements to be matched comprises:
t23: and performing approximate processing on the pattern elements to be matched according to the types of the plurality of standard pattern elements, so that the types of the pattern elements to be matched after processing are the same as the types of the standard pattern elements.
For example, a continuous arc, may be approximated as a straight line, assuming that the straight line is of the standard pattern type. For example, a closed ellipse, may be approximated as a circle, assuming that the circle is of a standard pattern type.
T24: and carrying out standardized processing on the pattern elements to be matched according to the sizes and angles of the standard pattern elements, so that the sizes and angles of the processed pattern elements to be matched and the sizes and angles of the standard pattern elements meet the preset corresponding relation.
For example, if the straight line obtained by approximation is at an angle to the horizontal while the standard pattern element is horizontal, the approximated straight line is normalized and rotated to the horizontal direction. Of course, T24 and T23 may be performed simultaneously or sequentially.
Referring to fig. 9, alternatively, T3: the step of matching the pattern elements to be matched with the plurality of standard pattern elements respectively comprises the following steps:
t31: traversing pattern elements to be matched in the pattern to be identified according to a preset sequence.
T32: and matching the traversed pattern elements to be matched with the standard pattern elements.
Specifically, the pattern elements to be matched of the pattern to be recognized are matched in a predetermined order, and the code sequence is then generated in that order. For example, as shown in fig. 8, sequentially traversing pattern elements 1, 2 and 3 generates the code sequence 000100010002.
Referring to fig. 10, alternatively, T4: the step of generating the code sequence to be matched according to the codes corresponding to the standard pattern elements matched by the pattern elements to be matched comprises the following steps:
t41: and combining codes corresponding to the matched standard pattern elements or coded derivative codes according to the traversing sequence to form a code sequence to be matched.
In this embodiment, the derivative code is based on standard coding. For example, one standardized element of the pattern elements to be matched is a straight line with the length equal to 2, the length standard of the straight line in the standard pattern element is 1, the code corresponding to the straight line in the standard pattern element is 0001, and then the code derivative code such as 0011 is generated corresponding to the straight line with the length equal to 2. The coding sequences to be matched may thus be codes and/or combinations of code derivatives. For example, the codes and derived codes satisfy certain coding rules.
Alternatively, T4: the step of generating the code sequence to be matched according to the codes corresponding to the standard pattern elements matched by the pattern elements to be matched comprises the following steps:
t42: and judging whether the pattern elements to be matched and the standard pattern elements meet a preset scaling relation or not.
The predetermined scaling relationship, that is to say the pattern element to be matched, has a size, a multiple relationship, for example in terms of morphology, with the standard pattern element. For example, the length of the element to be matched is twice that of the standard pattern element, or the area of the element to be matched is twice that of the standard pattern element, and the code sequence to be matched can be the combination of codes and/or code derivative codes, not necessarily all the codes are standard codes, and the derivative codes can be generated when the scaling relation is met.
T43: if the predetermined scaling relationship is satisfied, generating a new code for the pattern elements to be matched according to the code of the standard pattern elements and a preset code rule, wherein the new code is different from the code of the standard pattern elements.
In this implementation, the new code is a derived code. When the predetermined scaling relationship is satisfied, a new code may be generated according to that relationship, yielding the derived code.
T4: the step of generating a code sequence to be matched according to the codes of the standard pattern elements matched by the pattern elements to be matched comprises the following steps:
t44: new codes are added to the code sequences to be matched.
Specifically, the coding sequences to be matched may be combinations of codes, combinations of codes and encoded derivative codes, combinations of derivative codes and derivative codes.
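A sketch of one possible coding rule for derived codes; the rule and values below are assumptions made for illustration only, chosen so that a length-2 straight line yields 0011 as in the example above:

```python
# Illustrative sketch only: this coding rule is an assumption chosen so that a
# straight line of length 2 (standard length 1, standard code 0001) yields 0011.
def derive_code(standard_code, scale):
    """Generate a derived code for an element that is `scale` times the standard size."""
    if scale == 1:
        return standard_code                      # standard element: keep its code
    derived = int(standard_code) + (scale - 1) * 10
    return f"{derived:04d}"

def encode_elements(elements):
    """elements: list of (standard_code, scale). Returns the sequence to be matched."""
    return "".join(derive_code(code, scale) for code, scale in elements)

print(encode_elements([("0001", 2), ("0003", 1), ("0002", 1)]))  # -> 001100030002
```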
In summary, in this embodiment, the pattern to be identified, which comprises a plurality of elements to be matched, is decomposed; the decomposed elements are matched with standard pattern elements to generate a code sequence to be matched; and that code sequence is matched with the code sequence of a standard pattern to identify the standard pattern corresponding to the pattern to be identified. This enriches the form and design of the pattern to be identified, reduces the limitations of pattern design, and enhances the aesthetics of pattern recognition. Meanwhile, unlike prior-art recognition modes, multiple elements correspond to multiple codes, and identical codes arranged in different code sequences yield different standard patterns, which enriches the interest and practicality of the robot.
The path planning mode of consumer robots in the prior art is single and cannot plan the robot's path effectively, so the robot has difficulty moving accurately to the target. To solve this technical problem, the application provides the following embodiments.
Referring to fig. 11-13, in an embodiment of a path planning method for a robot of the present application, the method may be performed by a control system independent of the robot, and the robot may be controlled to perform path planning, or the robot may autonomously identify a target and calculate and plan a path by a processor of the robot, where the method includes:
d1: the target position and the target yaw angle of the robot to be traveled are acquired.
For example, the control system acquires the image of the target object through a sensor, such as an image sensor, of the control system, then determines the target position, and then calculates the target yaw angle after acquiring the target position, or the robot acquires the target position and calculates the target yaw angle after acquiring the target position through a sensor of the robot, or the robot acquires the target position and calculates the target yaw angle and then uploads the target yaw angle to the control system.
D2: the current position and the current yaw angle of the robot are acquired.
For example, on the basis of the acquired target position and target yaw angle, the current position and current yaw angle of the robot relative to the target position may be acquired, so that the relative relationship between the robot and the target position can be determined. Specifically, the current position and yaw angle of the robot may be acquired by a position sensor and an inertial measurement unit on the robot, respectively, and may be uploaded to the control system after acquisition. The position sensor may be, for example, a grating encoder and/or a Hall encoder.
In this embodiment, a standard direction may be preset, and the current yaw angle, i.e. the angle between the robot's current heading and the standard direction, may be measured by the inertial measurement unit.
D3: and determining the type of the region where the robot is currently located in a preset plurality of region types according to the relative relation between the current position and the current yaw angle and the target position and the target yaw angle.
That is, robots in different position ranges with different yaw angles may fall into different area types; robots in the same position range but with different yaw angles may also fall into different area types; and robots with the same yaw angle but in different position ranges may also fall into different area types.
D4: and carrying out corresponding path planning on the robot according to the determined type of the area.
In this embodiment, in one case, the distance between the target position and the target object may be 0, that is, the target position is the position of the target object. In another case, the distance between the target position and the target object is not 0. When the target object is identified, the target position can be determined according to the situation, for example, when the target object is identified, the target position information preset by the target object is acquired to determine the target position.
According to this embodiment, the target position and target yaw angle and the current position and current yaw angle of the robot are obtained, and the area type of the robot is determined from the relative relation between them. A plurality of area types correspond to a plurality of different path plans; when the robot moves from one area type into another after executing one plan, another plan is carried out, so dynamic planning is achieved according to the robot's position, and the path of the robot can be planned effectively. Dividing the combinations of current position and current yaw angle into different area types enriches the path planning modes of the robot, and performing different path planning improves path planning efficiency.
Referring to fig. 12 and 14, optionally, the region type includes a target region.
D3: the step of determining the type of the area where the robot is currently located in the preset plurality of area types according to the relative relation between the current position and the current yaw angle and the target position and the target yaw angle comprises the following steps:
d31: when the distance H between the current position and the target position is between a first distance threshold and a second distance threshold, and the difference value theta between the current yaw angle and the target yaw angle is smaller than a preset angle threshold, the belonging region is a target region, and the second distance threshold is larger than the first distance threshold.
Specifically, the control system may measure the distance H between the current position of the robot and the target position. For example, the control system acquires images of the robot and the target position by an image sensor or the like, calculates the distance H between the two, and can also measure the distance H by the distance sensor. The distance H between the current position of the robot and the target position may be measured by a distance sensor on the robot. The distance sensor is, for example, an ultrasonic sensor, an infrared sensor, a laser ranging sensor, a radar sensor, or the like.
D4: the step of carrying out corresponding path planning on the robot according to the determined type of the area comprises the following steps:
d41: and when the type of the area is the target area, controlling the robot to stop advancing.
Specifically, when the distance H between the current position and the target position is between the first distance threshold and the second distance threshold, and the difference θ between the current yaw angle and the target yaw angle is smaller than the preset angle threshold, the robot is controlled to stop advancing, and the robot can be considered to have reached the target area, and the yaw angle meets the requirements.
For example, the first threshold is 0.5m, the second threshold is 2m, and the preset angle threshold is 15 °. When the distance H between the current position and the target position is between 0.5m and 2m, and the difference between the current yaw angle and the target yaw angle is smaller than 15 degrees, the robot is in the target area type, and the robot is controlled to stop advancing.
The target area can be regarded as an area range around the target position, and the first threshold may be set to 0 or to a value larger than 0. In this embodiment, for example, the robot includes an actuator (not shown), such as a mechanical arm, which includes a first arm and a second arm (not shown) rotatably connected, i.e. the joint between the first arm and the second arm has a first rotation center (not shown), and the second arm carries a claw. The first arm is rotatably mounted on the top of the robot, and the joint between the first arm and the top of the robot has a second rotation center. Keeping the distance H between the robot's current position and the target position between the first threshold and the second threshold ensures that the actuator has enough space to grasp the target object; if the robot is too close to the target object, the actuator cannot complete the task.
Specifically, the control system may control the robot to stop advancing, or may control the robot to stop advancing when the area where the processor of the robot is calculated is the target area type.
Referring to fig. 12 and 15, optionally, the area type includes a forward area.
D3: The step of determining the type of the area where the robot is currently located in the preset plurality of area types according to the relative relation between the current position and the current yaw angle and the target position and the target yaw angle comprises the following steps:
D32: When the distance H between the current position and the target position is larger than the second distance threshold, and the difference between the current yaw angle and the target yaw angle is smaller than the preset angle threshold, the area to which the robot belongs is the forward area.
For example, the first threshold is 0.5m, the second threshold is 2m, and the preset angle threshold is 15 °. When the distance H between the current position and the target position is greater than 2m, and the difference theta between the current yaw angle and the target yaw angle is smaller than 15 DEG, the robot is in a forward region type, and the robot is controlled to advance.
D4: the step of carrying out corresponding path planning on the robot according to the determined type of the area comprises the following steps:
when the type of the area is a forward area, the robot is controlled to advance along a straight line.
Specifically, the robot is controlled to linearly advance, the current yaw angle of the robot is not changed in the case of no error, and therefore, the difference between the current yaw angle and the target yaw angle is not changed much or is not changed, and when the robot enters the target area type, the robot is controlled to stop.
Optionally, before the controlling the robot to advance along the straight line, further comprises:
d421: it is determined whether a difference θ between a current yaw angle and a target yaw angle due to motion accuracy after the robot moves from a current position to a target position is smaller than a preset angle threshold.
In the present embodiment, the motion accuracy means that the robot is affected by errors caused by design and manufacturing of a driving wheel, a driving motor, and the like, structural mounting, friction between each member, sensor accuracy, and the like, and the motion of the driving wheel is different from the theoretical output motion.
Due to the influence of the motion accuracy, the robot may deviate further from its current yaw angle during motion, and the difference θ between the current yaw angle and the target yaw angle may become larger or smaller; a smaller difference θ means the heading is closer to the target, and a larger difference θ means it is farther from the target.
Therefore, in step D421, it is necessary to determine whether the difference θ between the current yaw angle and the target yaw angle due to the motion accuracy when the robot moves from the current position to the target position is smaller or larger than a preset angle threshold.
D422: if the angle is larger than the preset angle threshold, the robot is controlled to rotate, and then the robot is controlled to linearly advance.
If the difference value theta between the current yaw angle and the target yaw angle is larger than the preset angle threshold value through pre-calculation under the condition that the motion accuracy exists, the robot is controlled to rotate towards the target direction, the difference value theta between the current yaw angle and the target yaw angle is further reduced, and after the robot rotates and linearly advances to the target area type, the difference value theta between the current yaw angle and the target yaw angle is still ensured to be smaller than the preset angle threshold value. That is, the robot eliminates an angle error generated by straight walking by rotating in advance.
When the robot is located in the forward area, it may need to perform some action, such as grasping the target object with the actuator, but the distance is not yet small enough; further advance is then needed and can be performed according to the planning strategy of the forward area.
Referring to fig. 12 and 16, optionally, the area type includes an adjustment area.
D3: The step of determining the type of the area where the robot is currently located in the preset plurality of area types according to the relative relation between the current position and the current yaw angle and the target position and the target yaw angle comprises the following steps:
D33: When the distance H between the current position and the target position is larger than the second distance threshold and the difference θ between the current yaw angle and the target yaw angle is larger than the preset angle threshold, the type of the area is the adjustment area.
D4: the step of carrying out corresponding path planning on the robot according to the determined type of the area comprises the following steps:
d43: and determining a transition position and at least one transition angle according to the current position, the current yaw angle, the target position and the target yaw angle.
D44: and controlling the robot to linearly advance to a transition position, and controlling the robot to rotate to a transition angle before or after advancing so as to enable the type of the area to transition from the adjustment area to the advancing area, wherein the transition position is the same as or different from the current position, and the transition angle is different from the current yaw angle.
For example, the first threshold is 0.5 m, the second threshold is 2 m, and the preset angle threshold is 15°. When the distance between the current position and the target position is 5 m, which is larger than 2 m, and the difference between the current yaw angle and the target yaw angle is 30°, which is larger than 15°, the robot is in the adjustment area type. At this time a transition position and at least one transition angle are determined from the current position, the current yaw angle, the target position and the target yaw angle; for example, the distance between the transition position and the target position is 3 m, there is one transition angle, and the difference between the transition angle and the target yaw angle is smaller than 15°. The robot is controlled to advance in a straight line to the transition position, at which point the distance between its current position and the target position is 3 m, still larger than 2 m. After reaching the transition position, the robot is controlled to rotate to the transition angle, so that the difference between its current yaw angle and the target yaw angle becomes smaller than 15° and the robot is in the forward area type. Of course, the robot may also rotate before advancing to the transition position.
And when the transition position is the same as the current position, the robot rotates to a transition angle at the current position. When the transition position is different from the current position, the robot can advance to the transition position and then rotate by a transition angle, or advance to the transition position after rotating by the transition angle, so that the robot is positioned in the advancing area type, and then the path planning corresponding to the advancing area type is executed according to the advancing area type.
For example, as shown in fig. 17, assume that the starting position 1 is the current position, the distance H from the target position 4 is greater than the second threshold, and the difference θ between the current yaw angle and the target yaw angle is greater than the preset angle threshold, so the robot is in the adjustment area. The robot may first rotate in place at the starting position 1 by the transition angle θ1, the first transition position 2 being the same as the starting position 1; the robot then moves the distance H from the first transition position 2 to the second transition position 3, and by rotating by the transition angle θ2 at the second transition position 3 it reaches the target position 4.
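A sketch of the rotate-move-rotate adjustment shown in fig. 17, computing the two transition angles from the current pose and the target pose; the geometry below is an assumption consistent with the figure (first turn toward the target point, then drive straight, then turn to the target yaw angle), not the patent's prescribed formula:

```python
import math

def plan_adjustment(cur_x, cur_y, cur_yaw, tgt_x, tgt_y, tgt_yaw):
    """Return (theta1, H, theta2) for the in-place-rotate / straight / rotate plan.

    Angles are in radians; theta1 and theta2 are the rotations executed at the
    first and second transition positions respectively.
    """
    heading_to_target = math.atan2(tgt_y - cur_y, tgt_x - cur_x)

    def wrap(a):                      # wrap an angle difference into (-pi, pi]
        return math.atan2(math.sin(a), math.cos(a))

    theta1 = wrap(heading_to_target - cur_yaw)     # rotate in place to face the target
    distance = math.hypot(tgt_x - cur_x, tgt_y - cur_y)
    theta2 = wrap(tgt_yaw - heading_to_target)     # final rotation to the target yaw
    return theta1, distance, theta2
```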
Providing the adjustment area means that, when the distance is larger than the second threshold and the angle difference exceeds the preset angle threshold, the robot is effectively controlled to adjust itself into the forward area type; the adjustment area and the forward area thus cooperate, which ensures the effectiveness of the robot's path planning and makes the robot more intelligent.
Referring to fig. 12 and 18, optionally, the region type includes a fallback region.
D3: the step of determining the type of the area where the robot is currently located in the preset plurality of area types according to the relative relation between the current position and the current yaw angle and the target position and the target yaw angle comprises the following steps:
D34: when the distance H between the current position and the target position is between the first distance threshold value and the second distance threshold value, and the difference value theta between the current yaw angle and the target yaw angle is larger than a preset angle threshold value, the belonging area is a backward area.
Specifically, when the robot is located in or travels to the backward area, it may be unable to recognize the target within its effective field of view, or the difference θ between the current yaw angle and the target yaw angle may be too large, for example greater than 90°, so that in-place rotation is not possible; backing up is therefore required.
D4: the step of carrying out corresponding path planning on the robot according to the determined type of the area comprises the following steps:
d45: the robot is controlled to linearly retreat so that the type of the affiliated area transits from the retreating area to the adjusting area.
When the robot is positioned in the backward region type, the robot is controlled to linearly backward to the adjustment region type, namely, the distance H between the current position and the target position is larger than a second distance threshold value, and the difference value theta between the current yaw angle and the target yaw angle is larger than a preset angle threshold value.
Referring to fig. 12 and 19, optionally, the region type includes a near point region.
D3: The step of determining the type of the area where the robot is currently located in the preset plurality of area types according to the relative relation between the current position and the current yaw angle and the target position and the target yaw angle comprises the following steps:
D35: when the distance H between the current position and the target position is smaller than the first distance threshold value, the belonging area type is a near point area.
For example, the first threshold is 0.5m and the second threshold is 2m. When the difference between the current position and the target position is smaller than 0.5m, the current position is located in the near point area.
D4: the step of carrying out corresponding path planning on the robot according to the determined type of the area comprises the following steps:
d46: the robot is controlled to linearly retreat so that the type of the affiliated region transits from the near point region to the retreating region or the target region.
Specifically, when the distance H between the current position and the target position is smaller than a first distance threshold value, and the difference θ between the current yaw angle and the target yaw angle is smaller than a preset angle threshold value, the robot is controlled to linearly retreat, so that the robot is located in the target area.
When the distance H between the current position and the target position is smaller than a first distance threshold value, and the difference value theta between the current yaw angle and the target yaw angle is larger than a preset angle threshold value, controlling the robot to linearly retreat, enabling the robot to be in a retreating area, and then executing path planning of the retreating area.
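Taken together with the target, advancing and adjustment regions referred to above and defined in the claims, the region determination reduces to a simple classification over the distance H and the yaw difference θ. The Python sketch below is illustrative only; the threshold values, the angle wrapping and the function name are assumptions rather than part of the claimed method.

import math

FIRST_DIST = 0.5     # m, first distance threshold (example value from this embodiment)
SECOND_DIST = 2.0    # m, second distance threshold (example value from this embodiment)
ANGLE_THRESH = 30.0  # deg, preset angle threshold (assumed value)

def classify_region(current_pos, current_yaw, target_pos, target_yaw):
    """Return the region type the robot currently belongs to (positions in m, yaws in deg)."""
    h = math.hypot(target_pos[0] - current_pos[0], target_pos[1] - current_pos[1])
    theta = abs((current_yaw - target_yaw + 180.0) % 360.0 - 180.0)  # wrapped yaw difference
    if h < FIRST_DIST:
        return "near_point"                                       # closer than the first threshold
    if h <= SECOND_DIST:
        return "target" if theta < ANGLE_THRESH else "backward"   # between the two thresholds
    return "advance" if theta < ANGLE_THRESH else "adjustment"    # beyond the second threshold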
In this embodiment, the following steps mainly relate to pattern recognition, for which the element-matching-based pattern recognition method of the present application may be used; the details are given in the embodiment of that method and are not repeated here.
Optionally, D1: the step of acquiring the target position and the target yaw angle to which the robot is to travel includes:
a pattern on the target object is identified.
Specifically, the pattern on the target object may be the pattern to be identified in the embodiment of the element-matching-based pattern recognition method of the present application, or the user-defined pattern in the embodiment of the method for generating a robot control pattern of the present application.
A target position and a target yaw angle are determined from the identified pattern.
Optionally, the step of identifying the pattern on the target object includes:
Extracting the pattern to be identified from the acquired image.
Decomposing the pattern to be identified into a plurality of pattern elements to be matched.
Matching the pattern elements to be matched respectively with a plurality of standard pattern elements, wherein different standard pattern elements correspond to different codes.
Generating a code sequence to be matched according to the codes corresponding to the standard pattern elements matched by the pattern elements to be matched.
Matching the code sequence to be matched against the standard code sequences of a plurality of standard patterns.
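As a rough illustration of this element-matching flow, the sketch below maps each decomposed element to the code of the standard element it matches, concatenates the codes in the preset order, and looks the resulting sequence up among the standard code sequences. The element names, codes and sequences are invented placeholders, not the ones actually used by the method.

STANDARD_ELEMENT_CODES = {
    "short_arc": "A",   # each standard pattern element corresponds to a different code
    "long_arc": "B",
    "line": "C",
}

STANDARD_CODE_SEQUENCES = {
    "ABCA": "pattern_1",
    "CABB": "pattern_2",
}

def recognise(elements_in_preset_order):
    """elements_in_preset_order: pattern elements decomposed from the pattern to be
    identified, traversed in the preset order, each reduced to the standard element it matches."""
    # Generate the code sequence to be matched from the matched elements' codes.
    code_sequence = "".join(STANDARD_ELEMENT_CODES[e] for e in elements_in_preset_order)
    # Match the code sequence against the standard code sequences of the standard patterns.
    return STANDARD_CODE_SEQUENCES.get(code_sequence)  # None if no standard pattern matches

print(recognise(["short_arc", "long_arc", "line", "short_arc"]))  # -> pattern_1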
Referring to fig. 11, optionally, D12: the step of determining the target position and the target yaw angle from the identified pattern includes:
When the code sequence to be matched matches the standard code sequence of a standard pattern, determining the target position and the target yaw angle according to the position and the azimuth angle of the pattern to be identified in a preset coordinate system.
In this embodiment, the preset coordinate system may be constructed with the current robot at its center, and the target position and the target yaw angle may be determined from the position and the azimuth angle of the pattern to be identified on the target object in this coordinate system. For example, when the mechanical arm of the robot needs to clamp the target object, the robot must stop a certain distance away from the object so that the arm can reach it; because the motion range of the mechanical arm is known, this distance is known, and once the position and azimuth angle of the pattern to be identified are obtained, the robot can calculate a target position and a target yaw angle.
Optionally, the step of matching the pattern elements to be matched with the plurality of standard pattern elements includes:
Traversing the pattern elements to be matched in the pattern to be identified according to a preset order.
Matching each traversed pattern element to be matched with the standard pattern elements.
Optionally, the step of generating the code sequence to be matched according to the codes corresponding to the standard pattern elements matched by the pattern elements to be matched includes:
Combining, in a preset order, the codes corresponding to the matched standard pattern elements, or codes derived from those codes, to form the code sequence to be matched.
Optionally, the standard pattern is further associated with a position setting parameter and/or an angle setting parameter.
For example, information associated with the position setting parameter and/or the angle setting parameter may be written into a standard pattern. After the robot matches the standard pattern, it reads this information, which instructs the robot to finally move to the position corresponding to the position setting parameter and/or the angle corresponding to the angle setting parameter. In other words, the position setting parameter and the angle setting parameter indicate the target position and the target yaw angle.
Optionally, D121: the step of determining the target position and the target yaw angle according to the position and the azimuth angle of the pattern to be identified under a preset coordinate system comprises the following steps:
Determining the target position and the target yaw angle according to the position and the azimuth angle of the pattern to be identified in the preset coordinate system, together with the position setting parameter and/or the angle setting parameter.
Referring to fig. 12, for example, in the x-y plane of the preset coordinate system the coordinates of the pattern to be identified on the target object are (3 m, 4 m) and its azimuth angle is 120° with respect to the x-axis; the robot is currently located at the center point o with its initial heading along the x-axis. The position setting parameter indicates that the target position of the robot is 1 m away from the pattern to be identified, and the angle setting parameter indicates that the robot should stop at 90° with respect to the plane of the pattern, that is, the robot and the pattern are "face to face". The distance between the center point and the pattern to be identified is then 5 m, so the target position lies 5 - 1 = 4 m from the center point along the line toward the pattern, and the target yaw angle is 120° - 90° = 30°.
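The arithmetic of this example can be reproduced in a few lines; the snippet below only re-derives the numbers above under the simplification used in the text (the target yaw equals the pattern azimuth minus the face-to-face angle), and its variable names are purely illustrative.

import math

pattern_xy = (3.0, 4.0)        # position of the pattern to be identified, m
pattern_azimuth_deg = 120.0    # azimuth of the pattern with respect to the x-axis
stand_off_m = 1.0              # position setting parameter: stop 1 m from the pattern
face_angle_deg = 90.0          # angle setting parameter: face-to-face with the pattern

dist = math.hypot(*pattern_xy)                  # 5.0 m from the center point o
target_dist = dist - stand_off_m                # 5 - 1 = 4 m along the line toward the pattern
scale = target_dist / dist
target_xy = (pattern_xy[0] * scale, pattern_xy[1] * scale)   # (2.4, 3.2)
target_yaw_deg = pattern_azimuth_deg - face_angle_deg        # 120 - 90 = 30 deg

print(target_dist, target_xy, target_yaw_deg)   # 4.0 (2.4, 3.2) 30.0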
In this embodiment, each time the region type is determined, the pattern to be identified is located again and its position and azimuth angle are analyzed. Each motion instruction is obtained by theoretical calculation, so the robot may not have fully reached the intended region after executing it; the path planning in this implementation is therefore dynamic, re-planned in real time from the current position.
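A minimal sketch of this re-planning loop, assuming a region classifier such as the one sketched earlier and stand-in names for the localisation and motion primitives (none of which are the patent's actual interfaces), could look as follows.

ACTIONS = {
    "target": "stop",                     # target region: stop advancing
    "advance": "advance_straight",        # advancing region: go straight ahead
    "adjustment": "rotate_then_advance",  # adjustment region: turn to a transition angle, then advance
    "backward": "back_up_straight",       # backward region: back up into the adjustment region
    "near_point": "back_up_straight",     # near-point region: back up toward the backward or target region
}

def plan_step(current_pos, current_yaw, locate_pattern, classify_region):
    # Re-locate the pattern and re-derive the target pose on every cycle, because each
    # motion instruction is only a theoretical estimate and may not be executed exactly.
    target_pos, target_yaw = locate_pattern()
    region = classify_region(current_pos, current_yaw, target_pos, target_yaw)
    return region, ACTIONS[region]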
In summary, unlike the single path planning method of the prior art, this embodiment acquires the target position and target yaw angle of the robot together with its current position and current yaw angle, determines the type of the region where the robot is located from the relative relation between them, and executes different path planning accordingly. Planning is carried out in real time according to the region in which the robot currently lies, realizing dynamic planning, so that path planning is performed on the robot effectively and planning efficiency is improved.
The foregoing description is only of embodiments of the present application, and is not intended to limit the scope of the application, and all equivalent structures or equivalent processes using the descriptions and the drawings of the present application or directly or indirectly applied to other related technical fields are included in the scope of the present application.

Claims (8)

1. A method for path planning for a robot, the method comprising:
extracting a pattern to be identified from an image captured of the pattern on the target object;
decomposing the pattern to be identified into a plurality of pattern elements to be matched;
matching the pattern elements to be matched with a plurality of standard pattern elements respectively, wherein different standard pattern elements correspond to different codes;
generating a code sequence to be matched according to codes corresponding to the standard pattern elements matched by the pattern elements to be matched;
matching the coding sequence to be matched with standard coding sequences of a plurality of standard patterns to determine the corresponding relation between the pattern to be identified and the standard patterns;
when the coding sequence to be matched is matched with the standard coding sequence of the standard pattern, determining a target position and a target yaw angle to be traveled by the robot according to the position and the azimuth angle of the pattern to be identified under a preset coordinate system;
acquiring the current position and the current yaw angle of the robot;
determining the type of the region where the robot is currently located in a preset plurality of region types according to the relative relation between the current position and the current yaw angle and the target position and the target yaw angle;
Performing corresponding path planning on the robot according to the determined type of the area;
the decomposing the pattern to be identified into a plurality of pattern elements to be matched includes:
performing approximate processing on the pattern elements to be matched according to the types of the standard pattern elements, so that the types of the pattern elements to be matched after processing are the same as the types of the standard pattern elements;
and carrying out standardized processing on the pattern elements to be matched according to the sizes and angles of the standard pattern elements, so that the sizes and angles of the pattern elements to be matched after processing and the sizes and angles of the standard pattern elements meet the preset corresponding relation.
2. The method of claim 1, wherein the region type comprises a target region,
the step of determining the area type of the area where the robot is currently located in the preset plurality of area types according to the relative relation between the current position and the current yaw angle and the target position and the target yaw angle comprises the following steps:
when the distance between the current position and the target position is between a first distance threshold and a second distance threshold, and the difference between the current yaw angle and the target yaw angle is smaller than a preset angle threshold, the belonging region type is a target region, wherein the second distance threshold is larger than the first distance threshold;
The step of performing corresponding path planning on the robot according to the determined type of the area comprises the following steps:
and when the type of the area is the target area, controlling the robot to stop advancing.
3. The method of claim 2, wherein the region type comprises an advance region,
the step of determining the area type of the area where the robot is currently located in the preset plurality of area types according to the relative relation between the current position and the current yaw angle and the target position and the target yaw angle comprises the following steps:
when the distance between the current position and the target position is greater than the second distance threshold and the difference between the current yaw angle and the target yaw angle is smaller than a preset angle threshold, the belonging region type is an advancing region;
the step of performing corresponding path planning on the robot according to the determined type of the area comprises the following steps:
and when the type of the area is an advancing area, controlling the robot to advance along a straight line.
4. A method according to claim 3, wherein said controlling the robot before proceeding in a straight line further comprises:
Determining whether a difference between the current yaw angle and the target yaw angle due to motion accuracy after the robot moves from the current position to the target position is less than the angle threshold;
and if the difference is larger than the angle threshold, controlling the robot to rotate and then controlling the robot to advance in a straight line.
5. The method of claim 3, wherein the region type comprises an adjustment region,
the step of determining the area type of the area where the robot is currently located in the preset plurality of area types according to the relative relation between the current position and the current yaw angle and the target position and the target yaw angle comprises the following steps:
when the distance between the current position and the target position is greater than a second distance threshold value and the difference between the current yaw angle and the target yaw angle is greater than a preset angle threshold value, the type of the area is an adjustment area;
the step of performing corresponding path planning on the robot according to the determined type of the area comprises the following steps:
determining a transition position and at least one transition angle according to the current position, the current yaw angle, the target position and the target yaw angle;
And controlling the robot to linearly advance to the transition position, and controlling the robot to rotate to the transition angle before or after the advance so as to enable the type of the area to transition from the adjustment area to the advance area, wherein the transition position is the same as or different from the current position, and the transition angle is different from the current yaw angle.
6. The method of claim 5, wherein the region type comprises a backward region,
the step of determining the area type of the area where the robot is currently located in the preset plurality of area types according to the relative relation between the current position and the current yaw angle and the target position and the target yaw angle comprises the following steps:
when the distance between the current position and the target position is between the first distance threshold and the second distance threshold and the difference between the current yaw angle and the target yaw angle is larger than a preset angle threshold, the belonging area type is a backward area;
the step of performing corresponding path planning on the robot according to the determined type of the area comprises the following steps:
and controlling the robot to move backward in a straight line so that the belonging region type transitions from the backward region to the adjustment region.
7. The method of claim 6, wherein the region type comprises a near point region,
the step of determining the area type of the area where the robot is currently located in the preset plurality of area types according to the relative relation between the current position and the current yaw angle and the target position and the target yaw angle comprises the following steps:
when the distance between the current position and the target position is smaller than the first distance threshold value, the type of the area is a near-point area;
the step of performing corresponding path planning on the robot according to the determined type of the area comprises the following steps:
controlling the robot to move backward in a straight line so that the belonging region type transitions from the near-point region to the backward region or the target region.
8. The method according to claim 1, characterized in that the standard pattern is further associated with position setting parameters and/or angle setting parameters;
the step of determining the target position and the target yaw angle according to the position and the azimuth angle of the pattern to be identified under a preset coordinate system comprises the following steps:
and determining the target position and the target yaw angle according to the position and the azimuth angle of the pattern to be identified under a preset coordinate system and the position setting parameter and/or the angle setting parameter.
CN201810401508.0A 2018-04-28 2018-04-28 Path planning method for robot Active CN110411446B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810401508.0A CN110411446B (en) 2018-04-28 2018-04-28 Path planning method for robot

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810401508.0A CN110411446B (en) 2018-04-28 2018-04-28 Path planning method for robot

Publications (2)

Publication Number Publication Date
CN110411446A CN110411446A (en) 2019-11-05
CN110411446B (en) 2023-09-08

Family

ID=68357065

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810401508.0A Active CN110411446B (en) 2018-04-28 2018-04-28 Path planning method for robot

Country Status (1)

Country Link
CN (1) CN110411446B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111504328B (en) * 2020-05-22 2022-07-08 梅卡曼德(北京)机器人科技有限公司 Robot motion planning method, path planning method, grabbing method and device
CN112393731B (en) * 2020-10-10 2023-04-25 上海钛米机器人股份有限公司 Method, device, electronic equipment and storage medium for tracking path
CN114869171A (en) * 2022-04-21 2022-08-09 美智纵横科技有限责任公司 Cleaning robot, control method and device thereof, and readable storage medium

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2008084135A (en) * 2006-09-28 2008-04-10 Toshiba Corp Movement control method, mobile robot and movement control program
US20080101693A1 (en) * 2006-10-26 2008-05-01 Intelligence Frontier Media Laboratory Ltd Video image based tracking system for identifying and tracking encoded color surface
JP5271031B2 (en) * 2008-08-09 2013-08-21 株式会社キーエンス Image data compression method, pattern model positioning method in image processing, image processing apparatus, image processing program, and computer-readable recording medium
US8602893B2 (en) * 2010-06-02 2013-12-10 Sony Computer Entertainment Inc. Input for computer device using pattern-based computer vision

Patent Citations (33)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0107789A2 (en) * 1982-09-30 1984-05-09 Siemens Aktiengesellschaft Method of coding printed forms as arc structures with magnitude and orientation independence for the purpose of document analysis, especially character recognition
CN86100683A (en) * 1986-01-28 1987-08-19 中国人民解放军58026部队 A kind of ONLINE RECOGNITION device of handwritten Chinese character
JPH0537786A (en) * 1991-07-26 1993-02-12 Sony Corp Picture data encoder and decoder
JPH08235359A (en) * 1995-02-23 1996-09-13 Matsushita Electric Works Ltd Pattern matching method and device thereof
CN1535028A (en) * 2003-02-28 2004-10-06 株式会社Ntt都科摩 Image encoding apparatus, method and program, and image decoding apparatus, method and program
CN1908955A (en) * 2006-08-21 2007-02-07 北京中星微电子有限公司 Trilateral poly-dimensional bar code easy for omnibearing recognition and reading method thereof
CN101500165A (en) * 2009-02-26 2009-08-05 北京中星微电子有限公司 Encoding and decoding method and apparatus for displaying line pattern overlapped onto image
CN102460478A (en) * 2009-04-08 2012-05-16 大陆-特韦斯贸易合伙股份公司及两合公司 Two-dimensional symbol code and method for reading the symbol code
CN101847011A (en) * 2010-03-31 2010-09-29 深圳市银星智能电器有限公司 Method for positioning and covering portable areas of mobile robots
CN101964053A (en) * 2010-09-28 2011-02-02 中国船舶重工集团公司第七○九研究所 On-line identification method of compound pattern
CN102162736A (en) * 2010-12-13 2011-08-24 深圳市凯立德科技股份有限公司 Method for displaying planned paths, navigation method and location-based service terminal
CN102818568A (en) * 2012-08-24 2012-12-12 中国科学院深圳先进技术研究院 Positioning and navigation system and method of indoor robot
CN103353758A (en) * 2013-08-05 2013-10-16 青岛海通机器人系统有限公司 Indoor robot navigation device and navigation technology thereof
CN104914866A (en) * 2015-05-29 2015-09-16 国网山东省电力公司电力科学研究院 Tour inspection robot global path planning method based on topological point classification and system
CN105034018A (en) * 2015-09-09 2015-11-11 刘阳 Flexible unit and flexible wrist for industrial robot precision assembly
CN105509729A (en) * 2015-11-16 2016-04-20 中国航天时代电子公司 Bionic-tentacle-based robot autonomous navigation method
CN105844277A (en) * 2016-03-22 2016-08-10 江苏木盟智能科技有限公司 Label identification method and device
CN107957727A (en) * 2016-10-17 2018-04-24 江苏舾普泰克自动化科技有限公司 Underwater robot control system and dynamic localization method
CN106529635A (en) * 2016-10-18 2017-03-22 网易(杭州)网络有限公司 Coding pattern generating and identifying method and apparatus
CN206224246U (en) * 2016-10-19 2017-06-06 九阳股份有限公司 A kind of robot for realizing target positioning and tracking
CN106774310A (en) * 2016-12-01 2017-05-31 中科金睛视觉科技(北京)有限公司 A kind of robot navigation method
CN106774315A (en) * 2016-12-12 2017-05-31 深圳市智美达科技股份有限公司 Autonomous navigation method of robot and device
CN106778441A (en) * 2017-01-12 2017-05-31 西安科技大学 A kind of graph image intelligent identifying system and its recognition methods
CN106919171A (en) * 2017-03-02 2017-07-04 安科智慧城市技术(中国)有限公司 A kind of robot indoor positioning navigation system and method
CN107097228A (en) * 2017-05-11 2017-08-29 东北大学秦皇岛分校 Autonomous traveling robot system
CN107139172A (en) * 2017-05-18 2017-09-08 深圳市微付充科技有限公司 Robot control method and device
CN107065883A (en) * 2017-05-18 2017-08-18 广州视源电子科技股份有限公司 Control method for movement, device, robot and storage medium
CN107168334A (en) * 2017-06-26 2017-09-15 上海与德通讯技术有限公司 A kind of paths planning method and robot
CN107369177A (en) * 2017-07-03 2017-11-21 东南大学 A kind of roadside assistance equipment capstan winch rope based on figure identification is anti-to cross drawing method for early warning
CN107368074A (en) * 2017-07-27 2017-11-21 南京理工大学 A kind of autonomous navigation method of robot based on video monitoring
CN107450544A (en) * 2017-08-14 2017-12-08 深圳市思维树科技有限公司 A kind of robot tracking running gear and method based on pattern identification
CN107671863A (en) * 2017-08-22 2018-02-09 广东美的智能机器人有限公司 Robot control method, device and robot based on Quick Response Code
CN107571260A (en) * 2017-10-25 2018-01-12 南京阿凡达机器人科技有限公司 The method and apparatus that control machine people captures object

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
A New Method of Image Coding: Fractal Geometry; Liu Bin; Journal of Nanchang University (Engineering Edition); Vol. 19, No. 3; pp. 97-100, 104 *

Also Published As

Publication number Publication date
CN110411446A (en) 2019-11-05

Similar Documents

Publication Publication Date Title
US11238615B2 (en) Sensor calibration
Liu et al. Vectormapnet: End-to-end vectorized hd map learning
CN110411446B (en) Path planning method for robot
Qin et al. A light-weight semantic map for visual localization towards autonomous driving
CN111801711A (en) Image annotation
CN110458161B (en) Mobile robot doorplate positioning method combined with deep learning
Hu et al. A multi-modal system for road detection and segmentation
Bruls et al. The right (angled) perspective: Improving the understanding of road scenes using boosted inverse perspective mapping
CN106780484A (en) Robot interframe position and orientation estimation method based on convolutional neural networks Feature Descriptor
Manz et al. Detection and tracking of road networks in rural terrain by fusing vision and LIDAR
Aziz et al. Implementation of lane detection algorithm for self-driving car on toll road cipularang using Python language
Pauls et al. Monocular localization in hd maps by combining semantic segmentation and distance transform
Rangesh et al. Ground plane polling for 6dof pose estimation of objects on the road
Maier et al. Real-time detection and classification of arrow markings using curve-based prototype fitting
CN110018633B (en) Two-dimensional code design method for AGV positioning and navigation
Fries et al. Autonomous convoy driving by night: The vehicle tracking system
Ding et al. Pivotnet: Vectorized pivot learning for end-to-end hd map construction
CN110986945A (en) Local navigation method and system based on semantic height map
Zhu et al. A review of 6d object pose estimation
Liu et al. Vision-based uneven bev representation learning with polar rasterization and surface estimation
Weber et al. Direct 3d detection of vehicles in monocular images with a cnn based 3d decoder
Ballardini et al. Visual localization at intersections with digital maps
Pershina et al. Methods of mobile robot visual navigation and environment mapping
Pauls et al. Automatic mapping of tailored landmark representations for automated driving and map learning
Goronzy et al. QRPos: Indoor positioning system for self-balancing robots based on QR codes

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant