CN113190009B - Robot curve virtual wall implementation method based on grid map


Info

Publication number
CN113190009B
Authority
CN
China
Prior art keywords
robot
curve
virtual wall
scattered point
scattered
Prior art date
Legal status
Active
Application number
CN202110501175.0A
Other languages
Chinese (zh)
Other versions
CN113190009A (en)
Inventor
孙永强
Current Assignee
Zhuhai Amicro Semiconductor Co Ltd
Original Assignee
Zhuhai Amicro Semiconductor Co Ltd
Priority date
Filing date
Publication date
Application filed by Zhuhai Amicro Semiconductor Co Ltd filed Critical Zhuhai Amicro Semiconductor Co Ltd
Priority to CN202110501175.0A
Publication of CN113190009A
Application granted
Publication of CN113190009B


Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02 Control of position or course in two dimensions
    • G05D1/021 Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0212 Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory
    • G05D1/0214 Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory in accordance with safety or protection criteria, e.g. avoiding hazardous areas
    • G05D1/0221 Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory involving a learning process
    • G05D1/0276 Control of position or course in two dimensions specially adapted to land vehicles using signals provided by a source external to the vehicle

Landscapes

  • Engineering & Computer Science (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Numerical Control (AREA)

Abstract

The invention discloses a method for implementing a robot curve virtual wall based on a grid map, which comprises the following steps: S1: the robot sets a curve virtual wall based on the constructed grid map; S2: the robot builds a mathematical model of the curve virtual wall; S3: the robot converts the grid coordinates of its current position into model coordinates of the mathematical model; S4: the robot determines the state between itself and the virtual wall according to the model coordinates. The robot can set the curve virtual wall according to the actual environment, which makes virtual wall setup more flexible; the corresponding mathematical model built from the curve virtual wall supports any continuous function curve, and the method is simple to implement and highly practical.

Description

Robot curve virtual wall implementation method based on grid map
Technical Field
The invention relates to the technical field of robots, in particular to a robot curve virtual wall implementation method based on a grid map.
Background
To prevent an existing sweeping robot from mistakenly entering areas that do not need to be cleaned, a virtual wall emitter is usually placed at the entrance of such an area. The emitter generates an infrared signal, and when the robot detects this signal it stops moving forward, so that it does not enter the area. This approach requires an additional virtual wall emitter, which increases hardware cost. At present, some manufacturers instead set a virtual wall in software: a virtual line is drawn, on the map constructed by the robot, at the entrance of the area that does not need to be cleaned; the corresponding coordinate positions in the map are marked as a virtual wall, and when the robot moves to those coordinates it stops or turns around, thereby achieving the blocking effect. Compared with a virtual wall emitter this saves hardware cost, but current software-based virtual wall implementations only support straight-line virtual walls, which is a significant limitation.
Disclosure of Invention
To solve the above problems, the present application provides a method for implementing a robot curve virtual wall based on a grid map, together with a mathematical model of the curve virtual wall. The virtual wall may take the form of any continuous function curve, so a curved virtual wall can be set according to the real environment, which broadens the range of applications of software-based virtual wall implementations. The specific technical scheme of the application is as follows:
a robot curve virtual wall implementation method based on a grid map comprises the following steps: s1: the robot sets a curve virtual wall based on the constructed grid map; s2: the robot builds a mathematical model on the curve virtual wall; s3: converting the grid coordinates of the current position into model coordinates of a mathematical model by the robot; s4: and the robot determines the states of the robot and the virtual wall according to the model coordinates. The robot can set the curve virtual wall according to the actual environment, so that the setting of the virtual wall is more flexible; the corresponding mathematical model is set according to the curve virtual wall, and the model can support continuous function curves and has high realization and functionality.
Further, in step S1, the robot setting a curve virtual wall based on the constructed grid map comprises the following steps: the intelligent mobile terminal receives the map information constructed by the robot and displays it on a screen; the intelligent mobile terminal detects a signal for setting a virtual wall and converts the touch input on the screen into a virtual wall displayed on the map shown on the screen; the intelligent mobile terminal then transmits the map information with the virtual wall set to the robot. Displaying the virtual wall information on the intelligent mobile terminal makes setting the virtual wall simpler and more convenient to operate.
Further, in step S2, the robot building the mathematical model comprises the following steps: the robot converts the curve virtual wall into a scattered point curve on the grid map; taking the center point of the scattered point curve as the reference point, it translates that center point to the origin of the coordinate system and rotates the scattered point curve so that its points are distributed on both sides of the X axis; it then translates the scattered point curve forward and backward along the Y axis by preset values to divide the areas on the two sides of the scattered point curve. Transforming the virtual wall into a scattered point curve to build the mathematical model makes the robot's calculations convenient.
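By way of illustration only, the following Python sketch shows one way such a model frame could be set up. It is not taken from the patent: the rotation angle is assumed to come from the chord between the two endpoints of the wall, and the function and variable names are hypothetical.

    import math

    def build_virtual_wall_model(wall_cells):
        """Normalize a curve virtual wall given as grid cells [(x, y), ...].

        Returns the scattered points in the model frame and a transform that
        maps any grid coordinate into the same frame.
        """
        # Reference point: the center point of the scattered point curve.
        cx = sum(x for x, _ in wall_cells) / len(wall_cells)
        cy = sum(y for _, y in wall_cells) / len(wall_cells)

        # Rotation angle (assumption): the chord between the two endpoints,
        # so the rotated points end up distributed on both sides of the X axis.
        (x0, y0), (x1, y1) = wall_cells[0], wall_cells[-1]
        theta = math.atan2(y1 - y0, x1 - x0)
        cos_t, sin_t = math.cos(-theta), math.sin(-theta)

        def to_model(x, y):
            # Translate the center point to the origin, then rotate by -theta.
            dx, dy = x - cx, y - cy
            return (dx * cos_t - dy * sin_t, dx * sin_t + dy * cos_t)

        scatter = [to_model(x, y) for x, y in wall_cells]
        return scatter, to_model

Any rotation that leaves the points distributed on both sides of the X axis would serve equally well; the chord fit is simply one convenient choice.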
Further, the robot dividing the areas on both sides of the scattered point curve comprises the following steps: the scattered point curve is translated forward and backward along the Y axis by a first preset value to form a touch area, with the virtual wall located at the center of the touch area; the scattered point curve is translated forward and backward along the Y axis by a second preset value of basic units, and on each side the curve after the second translation and the curve after the first translation enclose an early warning area, so that early warning areas lie on both sides of the touch area; the areas other than the early warning areas and the touch area are set as the safe area. The scattered point curves after the first translation comprise a first forward scattered point curve and a first reverse scattered point curve, the scattered point curves after the second translation comprise a second forward scattered point curve and a second reverse scattered point curve, and the second preset value is larger than the first preset value. The corresponding areas can be constructed through simple forward and backward translations, so little computation is needed and the robot's operation speed is improved.
Further, in step S3, the robot converting the grid coordinates of the current position into model coordinates of the mathematical model comprises the following step: the robot determines the grid coordinates of its current position and applies to them the same translation and rotation that were applied to the virtual wall, obtaining the model coordinates in the mathematical model. The robot's grid coordinates can thus be converted into the corresponding model coordinates through a simple transformation, which is simple and convenient.
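Continuing the same illustrative sketch, step S3 would then apply the identical transform to the robot's pose; robot_grid_position is an assumed pose query, not something defined in the patent.

    # Build the model once from the wall cells received in step S1.
    scatter, to_model = build_virtual_wall_model(wall_cells)

    # Step S3: apply the very same translation and rotation to the robot's
    # current grid coordinates to obtain its model coordinates.
    gx, gy = robot_grid_position()          # hypothetical pose query
    model_x, model_y = to_model(gx, gy)     # used by the region test of step S4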
Further, in step S4, the robot determining the state between itself and the virtual wall according to the model coordinates comprises the following steps: the robot compares the model coordinates with the coordinates of the corresponding scattered points of the translated scattered point curves, and determines from the comparison whether it is in the safe area, the early warning area or the touch area of the mathematical model; if the robot is in the safe area, it judges that its state relative to the virtual wall is the safe state; if the robot is in the early warning area, it judges that its state relative to the virtual wall is the approaching state; if the robot is in the touch area, it judges that its state relative to the virtual wall is the touch state; the translated scattered point curves comprise a first forward scattered point curve, a first reverse scattered point curve, a second forward scattered point curve and a second reverse scattered point curve. Judging the state between the robot and the virtual wall from the robot's position in the mathematical model makes it convenient for the robot to set a corresponding action for each state.
Further, the robot comparing the model coordinates with the coordinates of the corresponding scattered points of the translated scattered point curves comprises the following steps: a straight line perpendicular to the X axis is drawn through the model coordinates; if the straight line does not intersect the translated scattered point curves, the robot is judged to be in the safe area; if the straight line intersects the translated scattered point curves, the Y-axis coordinate of the model coordinates is compared with the Y-axis coordinates of the intersection points of the straight line and the translated scattered point curves; if the Y-axis coordinate of the model coordinates is larger than that of the intersection with the second forward scattered point curve, or smaller than that of the intersection with the second reverse scattered point curve, the robot is judged to be in the safe area; if it is larger than that of the intersection with the first forward scattered point curve and smaller than that of the intersection with the second forward scattered point curve, or smaller than that of the intersection with the first reverse scattered point curve and larger than that of the intersection with the second reverse scattered point curve, the robot is judged to be in the early warning area; and if it is smaller than that of the intersection with the first forward scattered point curve and larger than that of the intersection with the first reverse scattered point curve, the robot is judged to be in the touch area.
Further, the robot determining the coordinates of the intersection points of the straight line and the translated scattered point curves comprises the following steps: determine the coordinates of the scattered point on the virtual wall in the mathematical model whose X-axis coordinate equals that of the model coordinates; the Y-axis coordinate of the intersection of the straight line with the first forward scattered point curve is the Y-axis coordinate of that scattered point plus the first preset value; the Y-axis coordinate of the intersection with the first reverse scattered point curve is the Y-axis coordinate of the scattered point minus the first preset value; the Y-axis coordinate of the intersection with the second forward scattered point curve is the Y-axis coordinate of the scattered point plus the second preset value; and the Y-axis coordinate of the intersection with the second reverse scattered point curve is the Y-axis coordinate of the scattered point minus the second preset value. The Y-axis coordinates used for comparison on the translated scattered point curves are thus obtained through simple addition and subtraction, so little computation is needed and the robot's operation speed is improved.
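A minimal sketch of this region test, under the assumption that the wall scatter points are stored in a dictionary keyed by X-axis grid coordinate; scatter_by_x, d1 and d2 are illustrative names for that lookup table and the first and second preset values, and the handling of exact boundary values is an assumption.

    SAFE, WARNING, TOUCH = "safe", "early_warning", "touch"

    def classify(model_x, model_y, scatter_by_x, d1, d2):
        """Return which area of the mathematical model the robot occupies."""
        # Vertical line through the model coordinates: if the wall has no
        # scattered point at this X, the line meets none of the translated
        # curves and the robot is in the safe area.
        if model_x not in scatter_by_x:
            return SAFE

        y_wall = scatter_by_x[model_x]
        # Intersection Y coordinates are the wall point's Y plus or minus
        # the preset values, as described above (d2 > d1).
        if model_y > y_wall + d2 or model_y < y_wall - d2:
            return SAFE
        if model_y > y_wall + d1 or model_y < y_wall - d1:
            return WARNING
        return TOUCH  # y_wall - d1 <= model_y <= y_wall + d1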
Further, if the robot judges that it is in the touch area, the robot moves to the nearest early warning area.
Further, if the robot judges that it is in the early warning area, it continues working and monitors its own position in real time. The robot performs the corresponding operation according to the actual situation, which improves its flexibility.
Drawings
FIG. 1 is a flow chart of a method for realizing a robot curve virtual wall based on a grid map according to an embodiment of the invention;
fig. 2 is a schematic diagram of a mathematical model according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention are described in detail below with reference to the accompanying drawings. It should be understood that the following detailed description is merely illustrative of the invention and is not intended to limit it.
As shown in fig. 1, a method for implementing a robot curve virtual wall based on a grid map comprises the following steps: S1: the robot sets a curve virtual wall based on the constructed grid map; S2: the robot builds a mathematical model of the curve virtual wall; S3: the robot converts the grid coordinates of its current position into model coordinates of the mathematical model; S4: the robot determines the state between itself and the virtual wall according to the model coordinates. The robot can set the curve virtual wall according to the actual environment, which makes virtual wall setup more flexible; the corresponding mathematical model built from the curve virtual wall supports any continuous function curve, and the method is simple to implement and highly practical. The execution subject of the method is the processor or control chip of the robot, which is referred to simply as the robot for convenience of description. The robot may be a sweeping robot, a floor washing robot, an air purifying robot, a logistics robot, a weeding robot, a commercial service robot, or the like. First, the robot sets a virtual wall based on the map constructed while it walks. The constructed map may be a grid map, a lattice map, a color block map or another type of map that reflects the robot's current environment; the embodiments of the invention take a grid map as an example. The virtual wall can be set in different ways. For example, the robot is made to walk once along the position where the virtual wall is required, the coordinate positions and directions during this walk are recorded, and those coordinate positions are marked as virtual obstacle units; these virtual obstacle units form the virtual wall. A virtual obstacle unit is a grid cell that the robot could in fact traverse normally, but must not traverse when navigating by the map. Alternatively, the user operates directly on the display terminal showing the map and marks the corresponding grid cells at the desired positions by mouse or touch; the marked grid cells constitute the virtual wall. A grid cell is the smallest unit making up the grid map. The robot setting a curve virtual wall based on the constructed grid map comprises the following steps: the intelligent mobile terminal receives the map information constructed by the robot and displays it on a screen; the intelligent mobile terminal detects a signal for setting a virtual wall and converts the touch input on the screen into a virtual wall displayed on the map shown on the screen; the intelligent mobile terminal then transmits the map information with the virtual wall set to the robot. Displaying the virtual wall information on the intelligent mobile terminal makes setting the virtual wall simpler and more convenient to operate.
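Purely as an illustration of marking virtual obstacle units on a grid map, and not as the patent's own implementation, the cell value and the function name below are assumptions.

    VIRTUAL_WALL = 2   # assumed marker value for a virtual obstacle unit

    def mark_virtual_wall(grid_map, wall_cells):
        """Mark the grid cells drawn on the terminal (or recorded while the
        robot was led along the boundary) as virtual obstacle units."""
        for x, y in wall_cells:
            grid_map[y][x] = VIRTUAL_WALL   # grid_map assumed as a list of rows
        return grid_map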
Further, in step S2, the robot building the mathematical model comprises the following steps: the robot converts the curve virtual wall into a scattered point curve on the grid map; taking the center point of the scattered point curve as the reference point, it translates that center point to the origin of the coordinate system and rotates the scattered point curve so that its points are distributed on both sides of the X axis; it then translates the scattered point curve forward and backward along the Y axis by preset values to divide the areas on the two sides of the scattered point curve. Transforming the virtual wall into a scattered point curve to build the mathematical model makes the robot's calculations convenient. The robot dividing the areas on both sides of the scattered point curve comprises the following steps: the scattered point curve is translated forward and backward along the Y axis by a first preset value to form the touch area, with the virtual wall located at the center of the touch area; the scattered point curve is translated forward and backward along the Y axis by a second preset value of basic units, and on each side the curve after the second translation and the curve after the first translation enclose an early warning area, so that early warning areas lie on both sides of the touch area; the areas other than the early warning areas and the touch area are set as the safe area. The scattered point curves after the first translation comprise a first forward scattered point curve and a first reverse scattered point curve, the scattered point curves after the second translation comprise a second forward scattered point curve and a second reverse scattered point curve, and the second preset value is larger than the first preset value. The corresponding areas can be constructed through simple forward and backward translations, so little computation is needed and the robot's operation speed is improved. In step S3, the robot converting the grid coordinates of the current position into model coordinates of the mathematical model comprises the following step: the robot determines the grid coordinates of its current position and applies to them the translation and rotation corresponding to the virtual wall, obtaining the model coordinates in the mathematical model. The robot's grid coordinates can thus be converted into the corresponding model coordinates through a simple transformation, which is simple and convenient.
Further, in step S4, the robot determining the state between itself and the virtual wall according to the model coordinates comprises the following steps: the robot compares the model coordinates with the coordinates of the corresponding scattered points of the translated scattered point curves, and determines from the comparison whether it is in the safe area, the early warning area or the touch area of the mathematical model; if the robot is in the safe area, it judges that its state relative to the virtual wall is the safe state; if it is in the early warning area, the approaching state; if it is in the touch area, the touch state; the translated scattered point curves comprise a first forward scattered point curve, a first reverse scattered point curve, a second forward scattered point curve and a second reverse scattered point curve. Judging the state between the robot and the virtual wall from the robot's position in the mathematical model makes it convenient for the robot to set a corresponding action for each state. The robot comparing the model coordinates with the coordinates of the corresponding scattered points of the translated scattered point curves comprises the following steps: a straight line perpendicular to the X axis is drawn through the model coordinates; if the straight line does not intersect the translated scattered point curves, the robot is judged to be in the safe area; if the straight line intersects the translated scattered point curves, the Y-axis coordinate of the model coordinates is compared with the Y-axis coordinates of the intersection points of the straight line and the translated scattered point curves; if the Y-axis coordinate of the model coordinates is larger than that of the intersection with the second forward scattered point curve, or smaller than that of the intersection with the second reverse scattered point curve, the robot is judged to be in the safe area; if it is larger than that of the intersection with the first forward scattered point curve and smaller than that of the intersection with the second forward scattered point curve, or smaller than that of the intersection with the first reverse scattered point curve and larger than that of the intersection with the second reverse scattered point curve, the robot is judged to be in the early warning area; and if it is smaller than that of the intersection with the first forward scattered point curve and larger than that of the intersection with the first reverse scattered point curve, the robot is judged to be in the touch area.
The robot determining the coordinates of the intersection points of the straight line and the translated scattered point curves comprises the following steps: determine the coordinates of the scattered point on the virtual wall in the mathematical model whose X-axis coordinate equals that of the model coordinates; the Y-axis coordinate of the intersection of the straight line with the first forward scattered point curve is the Y-axis coordinate of that scattered point plus the first preset value; the Y-axis coordinate of the intersection with the first reverse scattered point curve is the Y-axis coordinate of the scattered point minus the first preset value; the Y-axis coordinate of the intersection with the second forward scattered point curve is the Y-axis coordinate of the scattered point plus the second preset value; and the Y-axis coordinate of the intersection with the second reverse scattered point curve is the Y-axis coordinate of the scattered point minus the second preset value. The Y-axis coordinates used for comparison on the translated scattered point curves are thus obtained through simple addition and subtraction, so little computation is needed and the robot's operation speed is improved. If the robot judges that it is in the touch area, it moves to the nearest early warning area. If the robot judges that it is in the early warning area, it continues working and monitors its own position in real time. The robot performs the corresponding operation according to the actual situation, which improves its flexibility.
As shown in fig. 2, fig. 2 is the mathematical model built by the robot for a curve virtual wall, where L is the scattered point curve on the grid map converted from the curve virtual wall, I is the first forward scattered point curve obtained by translating L forward along the Y axis by m basic units, I' is the first reverse scattered point curve obtained by translating L backward along the Y axis by m basic units, U is the second forward scattered point curve obtained by translating L forward along the Y axis by n basic units, U' is the second reverse scattered point curve obtained by translating L backward along the Y axis by n basic units, and n > m. The area 2 enclosed by U and I and the area 2' enclosed by U' and I' are the early warning areas, and the area 1 enclosed by I and I' is the touch area. When judging the robot's position, let the robot's current position be the point G(X, Y) in the coordinate system relative to the curve virtual wall, and draw a straight line Q through G perpendicular to the X axis. If the straight line Q has no intersection with U, I, I' or U', the robot is judged to be in the safe area. If the straight line Q intersects the scattered point curves U, I, I' and U' at the points A, B, C and D respectively, the judgment is as follows: if Y > YA or Y < YD, the machine is not in any of the areas and is judged to be in the safe area; if YB < Y < YA the robot is in area 2, or if YD < Y < YC the robot is in area 2', and in either case it is judged to be in the early warning area; if YC < Y < YB, the robot is in area 1, the touch area. The state between the robot and the virtual wall can be represented by a number according to the area the robot is in: when the robot is in area 2 its state is 1; when it is in area 2' its state is -1; when it is in none of the areas 1, 2 and 2' its state is 2; when it is in area 1 the state keeps the value it had on entry, that is, 1 if it entered area 1 from area 2 and -1 if it entered from area 2'. After the robot enters area 1 it is judged that the virtual wall has been triggered: if the robot triggers the virtual wall with state 1, its avoidance direction is towards area 2; if it triggers the virtual wall with state -1, its avoidance direction is towards area 2'. If the robot's state changes from 1 to -1 or from -1 to 1, it can be judged that the robot has passed through the virtual wall, and the robot can return via the point G calculated in area 2 or area 2'.
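The state numbering of this example could be sketched as follows; the function name, the None return for area 1 and the strictness of the boundary comparisons are assumptions, with y_wall the Y coordinate of the wall scatter point at the robot's X, and m and n the two translation amounts.

    def wall_state(model_y, y_wall, m, n):
        """State code from the fig. 2 example: 2 = safe area, 1 = early
        warning area 2, -1 = early warning area 2'. In the touch area 1 the
        caller keeps the previous state, so None is returned there."""
        ya, yb, yc, yd = y_wall + n, y_wall + m, y_wall - m, y_wall - n
        if model_y > ya or model_y < yd:
            return 2            # safe area
        if yb < model_y <= ya:
            return 1            # area 2, above the wall
        if yd <= model_y < yc:
            return -1           # area 2', below the wall
        return None             # area 1 (touch): keep the state held on entry

    # A change of the held state from 1 to -1 (or -1 to 1) between two
    # successive positions would indicate that the robot has crossed the wall.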
It is obvious that the above-mentioned embodiments are only some embodiments of the present invention, but not all embodiments, and that the technical solutions of the embodiments may be combined with each other. Furthermore, if terms such as "center", "upper", "lower", "left", "right", "vertical", "horizontal", "inner", "outer", etc. are used in the embodiments, the indicated orientation or positional relationship is based on that shown in the drawings, only for convenience in describing the present invention and simplifying the description, and does not indicate or imply that the indicated apparatus or element must have a specific orientation or be constructed and operated in a specific orientation, and thus should not be construed as limiting the present invention. If the terms "first," "second," "third," etc. are used in an embodiment to facilitate distinguishing between related features, they are not to be construed as indicating or implying a relative importance, order, or number of technical features.
Those of ordinary skill in the art will appreciate that all or part of the steps of the method embodiments described above may be implemented by hardware executing program instructions. The program may be stored in a computer-readable storage medium (such as a ROM, a RAM, a magnetic disk, an optical disk, or any other medium that can store program code), and when executed it performs the steps of the method embodiments described above.
Finally, it should be noted that: the above embodiments are only for illustrating the technical solution of the present invention, and not for limiting the same; although the invention has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical scheme described in the foregoing embodiments can be modified or some or all of the technical features thereof can be replaced by equivalents; such modifications and substitutions do not depart from the spirit of the invention.

Claims (8)

1. A robot curve virtual wall implementation method based on a grid map, characterized by comprising the following steps:
S1: the robot sets a curve virtual wall based on the constructed grid map;
S2: the robot builds a mathematical model of the curve virtual wall;
S3: the robot converts the grid coordinates of the current position into model coordinates of the mathematical model;
S4: the robot determines the state between itself and the virtual wall according to the model coordinates;
wherein in step S4, the robot determining the state between itself and the virtual wall according to the model coordinates comprises the following steps: the robot compares the model coordinates with the coordinates of the corresponding scattered points of the translated scattered point curves, and determines from the comparison whether it is in a safe area, an early warning area or a touch area of the mathematical model; if the robot is in the safe area, the robot judges that its state relative to the virtual wall is a safe state; if the robot is in the early warning area, the robot judges that its state relative to the virtual wall is an approaching state; if the robot is in the touch area, the robot judges that its state relative to the virtual wall is a touch state; the translated scattered point curves comprise a first forward scattered point curve, a first reverse scattered point curve, a second forward scattered point curve and a second reverse scattered point curve;
the robot comparing the model coordinates with the coordinates of the corresponding scattered points of the translated scattered point curves comprises the following steps:
drawing a straight line through the model coordinates perpendicular to the X axis;
if the straight line does not intersect the translated scattered point curves, judging that the robot is in the safe area;
if the straight line intersects the translated scattered point curves, comparing the Y-axis coordinate of the model coordinates with the Y-axis coordinates of the intersection points of the straight line and the translated scattered point curves;
if the Y-axis coordinate of the model coordinates is larger than the Y-axis coordinate of the intersection of the straight line and the second forward scattered point curve, or smaller than the Y-axis coordinate of the intersection of the straight line and the second reverse scattered point curve, judging that the robot is in the safe area;
if the Y-axis coordinate of the model coordinates is larger than the Y-axis coordinate of the intersection of the straight line and the first forward scattered point curve and smaller than the Y-axis coordinate of the intersection of the straight line and the second forward scattered point curve, or smaller than the Y-axis coordinate of the intersection of the straight line and the first reverse scattered point curve and larger than the Y-axis coordinate of the intersection of the straight line and the second reverse scattered point curve, judging that the robot is in the early warning area;
and if the Y-axis coordinate of the model coordinates is smaller than the Y-axis coordinate of the intersection of the straight line and the first forward scattered point curve and larger than the Y-axis coordinate of the intersection of the straight line and the first reverse scattered point curve, judging that the robot is in the touch area.
2. The robot curve virtual wall implementation method based on a grid map according to claim 1, wherein in step S1, the robot setting a curve virtual wall based on the constructed grid map comprises the following steps:
the intelligent mobile terminal receives the map information constructed by the robot and displays it on a screen;
the intelligent mobile terminal detects a signal for setting a virtual wall and converts the touch input on the screen into a virtual wall displayed on the map shown on the screen;
and the intelligent mobile terminal transmits the map information with the virtual wall set to the robot.
3. The robot curve virtual wall implementation method based on a grid map according to claim 1, wherein in step S2, the robot building the mathematical model comprises the following steps:
the robot converts the curve virtual wall into a scattered point curve on the grid map;
taking the center point of the scattered point curve as the reference point, translating the center point to the origin of the coordinate system and rotating the scattered point curve so that the points on the scattered point curve are distributed on both sides of the X axis;
and translating the scattered point curve forward and backward along the Y axis by preset values to divide the areas on the two sides of the scattered point curve.
4. The robot curve virtual wall implementation method based on a grid map according to claim 3, wherein the robot dividing the areas on both sides of the scattered point curve comprises the following steps:
the scattered point curve is translated forward and backward along the Y axis by a first preset value to form a touch area, and the virtual wall is located at the center of the touch area;
the scattered point curve is translated forward and backward along the Y axis by a second preset value of basic units, and on each side the scattered point curve after the second translation and the scattered point curve after the first translation enclose an early warning area, so that early warning areas are distributed on both sides of the touch area;
the areas other than the early warning areas and the touch area are set as a safe area;
the scattered point curves after the first translation comprise a first forward scattered point curve and a first reverse scattered point curve, the scattered point curves after the second translation comprise a second forward scattered point curve and a second reverse scattered point curve, and the second preset value is larger than the first preset value.
5. The robot curve virtual wall implementation method based on a grid map according to claim 3, wherein in step S3, the robot converting the grid coordinates of the current position into model coordinates of the mathematical model comprises the following step:
the robot determines the grid coordinates of its current position, and translates and rotates the grid coordinates of the current position in correspondence with the virtual wall to obtain the model coordinates in the mathematical model.
6. The robot curve virtual wall implementation method based on a grid map according to claim 4, wherein the robot determining the coordinates of the intersection points of the straight line and the translated scattered point curves comprises the following steps:
determining the coordinates of the scattered point on the virtual wall in the mathematical model whose X-axis coordinate equals that of the model coordinates;
the Y-axis coordinate of the intersection of the straight line and the first forward scattered point curve is the Y-axis coordinate of the scattered point plus the first preset value;
the Y-axis coordinate of the intersection of the straight line and the first reverse scattered point curve is the Y-axis coordinate of the scattered point minus the first preset value;
the Y-axis coordinate of the intersection of the straight line and the second forward scattered point curve is the Y-axis coordinate of the scattered point plus the second preset value;
and the Y-axis coordinate of the intersection of the straight line and the second reverse scattered point curve is the Y-axis coordinate of the scattered point minus the second preset value.
7. The robot curve virtual wall implementation method based on a grid map according to claim 4, wherein if the robot judges that it is in the touch area, the robot moves to the nearest early warning area.
8. The robot curve virtual wall implementation method based on a grid map according to claim 4, wherein if the robot judges that it is in the early warning area, the robot continues working and detects its own position in real time.
CN202110501175.0A 2021-05-08 2021-05-08 Robot curve virtual wall implementation method based on grid map Active CN113190009B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110501175.0A CN113190009B (en) 2021-05-08 2021-05-08 Robot curve virtual wall implementation method based on grid map


Publications (2)

Publication Number Publication Date
CN113190009A CN113190009A (en) 2021-07-30
CN113190009B (en) 2024-05-07

Family

ID=76984467

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110501175.0A Active CN113190009B (en) 2021-05-08 2021-05-08 Robot curve virtual wall implementation method based on grid map

Country Status (1)

Country Link
CN (1) CN113190009B (en)

Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1712898A (en) * 2004-06-25 2005-12-28 私立逢甲大学 Geographic space conversion
CN101349908A (en) * 2008-08-29 2009-01-21 江门市科杰机械自动化有限公司 Data partition method of numerical control machine tool
CN105828292A (en) * 2016-05-09 2016-08-03 青岛海信移动通信技术股份有限公司 Position detecting method and device based on geo-fencing
CN108803589A (en) * 2017-04-28 2018-11-13 深圳乐动机器人有限公司 Robot virtual wall system
KR20190007285A (en) * 2017-07-12 2019-01-22 엘지전자 주식회사 Moving Robot and controlling method
KR20190007284A (en) * 2017-07-12 2019-01-22 엘지전자 주식회사 Moving Robot and controlling method
CN109918468A (en) * 2019-03-21 2019-06-21 四川长虹电器股份有限公司 Internet of things equipment position data region screening technique based on Mercator projection
CN110246202A (en) * 2018-03-07 2019-09-17 苏州猫耳网络科技有限公司 A kind of grid drawing method based on map area boundary GPS coordinate
CN110385719A (en) * 2019-07-23 2019-10-29 珠海市一微半导体有限公司 Robot judges whether to collide the method for virtual wall and chip and intelligent robot
CN110874102A (en) * 2020-01-16 2020-03-10 天津联汇智造科技有限公司 Virtual safety protection area protection system and method for mobile robot
CN111360808A (en) * 2018-12-25 2020-07-03 深圳市优必选科技有限公司 Method and device for controlling robot to move and robot


Also Published As

Publication number Publication date
CN113190009A (en) 2021-07-30

Similar Documents

Publication Publication Date Title
US20230409032A1 (en) Method for controlling an autonomous, mobile robot
US11059174B2 (en) System and method of controlling obstacle avoidance of robot, robot and storage medium
EP3706414B1 (en) Video monitoring method for mobile robot
CN110385719B (en) Method and chip for judging whether virtual wall is collided by robot and intelligent robot
US8897947B2 (en) Autonomous mobile device
CN111693050B (en) Indoor medium and large robot navigation method based on building information model
CN102890507B (en) Self-walking robot, cleaning robot and positioning method thereof
CN112074383A (en) Robot navigation using 2D and 3D path planning
CN111433697A (en) Motion planning for autonomous mobile robots
CN112008722B (en) Control method and control device for construction robot and robot
CN112894758A (en) Robot cleaning control system, method and device and computer equipment
CN115008465A (en) Robot control method, robot, and computer-readable storage medium
CN113001544A (en) Robot control method and device and robot
CN113190009B (en) Robot curve virtual wall implementation method based on grid map
CN114779777A (en) Sensor control method and device for self-moving robot, medium and robot
CN114003036A (en) Robot obstacle avoidance control method, device, equipment and medium
EP4033325B1 (en) Method for determining a working start point in a robot movement limiting frame and robot movement control method
CN113985894A (en) Autonomous obstacle avoidance path planning method, device, equipment and storage medium
CN113703439A (en) Autonomous mobile device control method, device, equipment and readable storage medium
CN110825083B (en) Control method, apparatus, and computer-readable storage medium for vehicle
CN113172630B (en) Robot control method based on virtual wall
CN112462768B (en) Mobile robot navigation map creation method and device and mobile robot
CN117589153B (en) Map updating method and robot
JP2018120482A (en) Robot and method of controlling the same
CN116400687A (en) Obstacle avoidance method, device and equipment of self-mobile equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: 519000 2706, No. 3000, Huandao East Road, Hengqin New Area, Zhuhai, Guangdong

Applicant after: Zhuhai Yiwei Semiconductor Co.,Ltd.

Address before: 519000 Room 105-514, No. 6, Baohua Road, Hengqin New Area, Zhuhai City, Guangdong Province (centralized office area)

Applicant before: AMICRO SEMICONDUCTOR Co.,Ltd.

GR01 Patent grant