CN112987748A - Robot narrow space control method and device, terminal and storage medium


Info

Publication number
CN112987748A
CN112987748A
Authority
CN
China
Prior art keywords
robot
theta
distance
narrow space
rotation angle
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110246533.8A
Other languages
Chinese (zh)
Inventor
牟其龙
李岩
赵明
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai Yogo Robot Co Ltd
Original Assignee
Shanghai Yogo Robot Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai Yogo Robot Co Ltd filed Critical Shanghai Yogo Robot Co Ltd
Priority to CN202110246533.8A priority Critical patent/CN112987748A/en
Publication of CN112987748A publication Critical patent/CN112987748A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02 Control of position or course in two dimensions
    • G05D1/021 Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0212 Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory
    • G05D1/0214 Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory in accordance with safety or protection criteria, e.g. avoiding hazardous areas
    • G05D1/0221 Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory involving a learning process
    • G05D1/0231 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D1/0238 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using obstacle or wall sensors
    • G05D1/024 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using obstacle or wall sensors in combination with a laser
    • G05D1/0242 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using non-visible light signals, e.g. IR or UV signals
    • G05D1/0255 Control of position or course in two dimensions specially adapted to land vehicles using acoustic signals, e.g. ultrasonic signals
    • G05D1/0276 Control of position or course in two dimensions specially adapted to land vehicles using signals provided by a source external to the vehicle

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Electromagnetism (AREA)
  • Optics & Photonics (AREA)
  • Acoustics & Sound (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The invention discloses a method for controlling a robot in a narrow space, comprising the following steps: establishing a coordinate system with the center of the robot as the origin; calculating a mapping relation table between the rotation angle and the distance cost value of each coordinate point of the robot; detecting and calculating the distance cost value between each coordinate point and the nearest obstacle in the surrounding space; querying the allowable rotation angle range [θmin, θmax] of the surrounding space according to the distance cost values; acquiring the deflection angle θ of the robot on the current moving path; and judging whether the deflection angle is within the rotation angle range: if θmin < θ < θmax, controlling the robot to advance along the current moving path; if θ < θmin or θ > θmax, controlling the robot to decelerate and retreat. By detecting the distances to the surrounding space and evaluating the current deflection angle, the robot obtains the information of whether it can pass through the current narrow space, avoiding the problem of the robot becoming stuck because it cannot steer, and improving the traffic capacity of the robot.

Description

Robot narrow space control method and device, terminal and storage medium
[ technical field ]
The present invention relates to the field of robot technology, and in particular, to a method, an apparatus, a terminal, and a storage medium for controlling a narrow space of a robot.
[ background of the invention ]
As mobile robot technology matures, robots take on more functions and work in more scenarios, which places higher demands on their trafficability. To meet the functional requirements of robots currently on the market, robot outlines are usually designed to be non-circular, and a robot's perception sensors usually only detect the area in front of it, so once the robot enters a narrow space or an obstacle comes close, its ability to rotate is limited. If the robot's intended moving route lies outside the range in which it can turn, its movement is often blocked, which greatly limits its passing capacity and reduces the scenarios in which it can be applied.
In view of the above, it is desirable to provide a method, an apparatus, a terminal and a storage medium for controlling a robot in a narrow space to overcome the above-mentioned drawbacks.
[ summary of the invention ]
The invention aims to provide a control method, device, terminal and storage medium for a robot in a narrow space, to solve the problem that existing non-circular robots easily become stuck when moving in narrow spaces, and to improve the robot's passing capacity in narrow spaces.
In order to achieve the above object, a first aspect of the present invention provides a method for controlling a narrow space of a robot, comprising the steps of:
establishing a coordinate system with the center of the robot as an origin;
calculating a mapping relation table of a rotation angle and a distance cost value of each coordinate point on the physical contour of the robot when the robot realizes 360-degree rotation and does not touch an obstacle according to the actual physical contour information of the robot;
detecting and calculating a distance cost value of each coordinate point on a physical contour of the robot and a nearest obstacle of a surrounding space;
inquiring, in the mapping relation table according to the distance cost value, the allowable rotation angle range [θmin, θmax] of the surrounding space;
Acquiring a deflection angle theta of the robot in a current moving path;
judging whether the deflection angle is within the rotation angle range: if θmin < θ < θmax, controlling the robot to advance along the current moving path; if θ < θmin or θ > θmax, controlling the robot to decelerate and retreat.
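The six steps above can be sketched end-to-end as follows. This is a minimal illustration only: the DemoRobot class, the dictionary-based cost map, and the mapping-table values are hypothetical stand-ins, not the patent's actual data structures.

```python
class DemoRobot:
    """Hypothetical stand-in: two contour points and a configurable deflection angle."""
    def __init__(self, theta):
        self._theta = theta
    def contour_points(self):
        # Coordinate points on the physical contour (robot-centered frame, meters)
        return [(0.3, 0.0), (0.0, 0.2)]
    def path_deflection_angle(self):
        # Deflection angle theta of the current moving path, in degrees
        return self._theta

def control_step(robot, cost_map, mapping_table):
    """One control cycle: look up each contour point's distance cost value,
    map each cost to that point's allowed rotation range, intersect the
    ranges, and compare the deflection angle with [theta_min, theta_max]."""
    costs = [cost_map[p] for p in robot.contour_points()]
    ranges = [mapping_table[c] for c in costs]
    theta_min = max(r[0] for r in ranges)  # intersection of all per-point ranges
    theta_max = min(r[1] for r in ranges)
    theta = robot.path_deflection_angle()
    if theta_min < theta < theta_max:
        return "advance"
    return "decelerate_and_retreat"

# Illustrative cost map and rotation-angle mapping table
cost_map = {(0.3, 0.0): 0.5, (0.0, 0.2): 0.8}
mapping_table = {0.5: (-45.0, 45.0), 0.8: (-20.0, 30.0)}
```

With these values the allowed range is the intersection [−20°, 30°], so a 10° deflection leads to advancing while a 40° deflection triggers deceleration and retreat.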
In a preferred embodiment, the step of establishing a coordinate system with the robot center as an origin comprises:
and establishing a coordinate system by taking the center of the robot as an origin, taking the right front of the robot as an X-axis forward direction, taking the right side of the robot as a Y-axis forward direction and taking the center of the robot which is vertically upwards and upwards as a Z-axis forward direction.
In a preferred embodiment, the step of detecting and calculating the distance cost value between each coordinate point on the physical contour of the robot and the nearest obstacle in the surrounding space comprises the following sub-steps:
receiving distance information of the robot and the surrounding space detected by a sensor, mapping the distance information to the coordinate system, and generating a two-dimensional coordinate map containing the surrounding space information;
calculating the nearest distance from each coordinate position in the two-dimensional coordinate map to the nearest barrier, defining the nearest distance as the distance cost of each coordinate position, and generating a distance cost map;
obtaining a distance cost value in the distance cost map from coordinates on a physical contour of the robot.
In a preferred embodiment, the step of looking up, in the mapping relation table according to the distance cost value, the allowed rotation angle range [θmin, θmax] of the surrounding space comprises the following sub-steps:
acquiring an allowable rotation angle range of each coordinate point on a physical outline of the robot in the surrounding space;
taking the intersection of the rotation angle ranges of all coordinate points on the physical outline of the robot to obtain the rotation angle allowed by the robot in the surrounding spaceRange [ theta ]min,θmax]。
In a preferred embodiment, when the robot is controlled to decelerate and retreat, if θ < θmin, the robot is simultaneously controlled to turn left; if θ > θmax, the robot is simultaneously controlled to turn right.
The second aspect of the present invention provides a control apparatus for a narrow space of a robot, including a space detection module for collecting a distance between a physical contour of the robot and an obstacle, and a motion control module for controlling the robot to move, further including:
the coordinate system establishing module is used for establishing a coordinate system with the center of the robot as an origin;
the mapping table generating module is used for calculating a mapping relation table of the rotation angle and the distance cost value of each coordinate point on the physical outline of the robot when the robot realizes 360-degree rotation and does not touch an obstacle according to the actual physical outline information of the robot;
the cost value calculation module is used for calculating the distance cost value of each coordinate point on the physical contour of the robot and the nearest obstacle in the surrounding space according to the distance between the physical contour of the robot and the obstacle;
an angle range determining module, configured to query, according to the distance cost value, an allowable rotation angle range [ θ ] of the surrounding space in the mapping relation tablemin,θmax];
The deflection angle acquisition module is used for acquiring a deflection angle theta of the robot in the current moving path;
an angle judging module, configured to judge whether the deflection angle is within the rotation angle range: if θmin < θ < θmax, control the robot to advance along the current moving path; if θ < θmin or θ > θmax, control the robot to decelerate and retreat.
In a preferred embodiment, the space detection module comprises a laser sensor, an ultrasonic sensor and an infrared sensor; the infrared sensor is located at the bottom of the robot, the ultrasonic sensor is located above the infrared sensor, and the laser sensor is located above the ultrasonic sensor.
A third aspect of the present invention provides a terminal, which includes a memory, a processor, and a control program of a narrow space of a robot stored in the memory and executable on the processor, wherein the control program of the narrow space of the robot realizes the steps of the control method of the narrow space of the robot according to any one of the above embodiments when executed by the processor.
A fourth aspect of the present invention provides a computer-readable storage medium storing a control program for a narrow space of a robot, the control program for the narrow space of the robot implementing the steps of the method for controlling a narrow space of a robot according to any one of the above embodiments when executed by a processor.
The control method for a robot in a narrow space provided by the invention first calculates, from the physical outline of the robot, the mapping relation between the rotation angle and the distance cost value of each coordinate point during rotation. Then, while the robot moves through a narrow space, the distance cost value between each coordinate point of the robot and the nearest obstacle is detected and calculated in real time, and the allowed rotation angle range of each coordinate point in the surrounding space is obtained from the mapping relation. Finally, the next movement direction of the robot is determined by judging whether the deflection angle of the robot on its moving path is within that rotation angle range. In this way, by detecting the distances to the surrounding space and evaluating the current deflection angle, the robot obtains the information of whether it can pass through the current narrow space, avoiding the problem of becoming stuck because it cannot steer, and improving its traffic capacity.
[ description of the drawings ]
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings needed to be used in the embodiments will be briefly described below, it should be understood that the following drawings only illustrate some embodiments of the present invention and therefore should not be considered as limiting the scope, and for those skilled in the art, other related drawings can be obtained according to the drawings without inventive efforts.
FIG. 1 is a flowchart of a method for controlling a narrow space of a robot according to the present invention;
FIG. 2 is a frame diagram of a control device for a narrow space of a robot according to the present invention;
fig. 3 is a schematic structural diagram of the robot.
[ detailed description ]
In order to make the objects, technical solutions and advantageous effects of the present invention more clearly apparent, the present invention is further described in detail below with reference to the accompanying drawings and the detailed description. It should be understood that the detailed description and specific examples, while indicating the preferred embodiment of the invention, are intended for purposes of illustration only and are not intended to limit the scope of the invention.
It is also to be understood that the terminology used in the description of the invention herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. As used in the specification of the present invention and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise.
It should be further understood that the term "and/or" as used in this specification and the appended claims refers to and includes any and all possible combinations of one or more of the associated listed items.
In an embodiment of the present invention, a first aspect provides a method for controlling a robot in a narrow space, executed through a control device for the robot narrow space, so that the robot can judge in real time whether it can pass through a narrow space; this prevents the robot from becoming stuck in the narrow space and further improves its passing capacity.
In this embodiment, the control device for the robot narrow space comprises a space detection module for collecting the distance between the physical contour of the robot and obstacles, and a motion control module for controlling the movement of the robot. The space detection module comprises a laser sensor 11, an ultrasonic sensor 12 and an infrared sensor 13. The infrared sensor 13 is located at the bottom of the robot, the ultrasonic sensor 12 is located above the infrared sensor, and the laser sensor 11 is located above the ultrasonic sensor. It can be understood that, as shown in fig. 3, some non-circular robots are generally divided into two parts: a driving device 101 disposed near the ground for realizing autonomous movement of the robot, and a carrying device 102 arranged above the driving device for realizing the robot's carrying function. Generally speaking, the physical profile of the driving device 101 is larger than that of the carrying device 102, so the infrared sensor 13 can be disposed at the bottom of the driving device 101, the ultrasonic sensor 12 at the middle of the driving device 101, and the laser sensor 11 at the connection between the driving device 101 and the carrying device 102, thereby effectively improving the coverage of each sensor and avoiding occlusion.
It should be noted that the present invention is applicable to various sensor fusion for detecting object distance, position or space, also applicable to fusion for acquiring position, distance or space through wireless communication or calibration, and also applicable to obstacle distance or position information acquired through computer image recognition, and the like, and is not limited to the sensor types mentioned in the present embodiment.
As shown in FIG. 1, the method includes the following steps S11-S16.
In step S11, a coordinate system with the robot center as the origin is established. Specifically, the coordinate system is established with the center of the robot as the origin, the direction directly in front of the robot as the positive X axis, the right side of the robot as the positive Y axis, and the vertically upward direction through the robot center as the positive Z axis; the coordinate unit can be set to meters.
Step S12, calculating a mapping relation table of the rotation angle and the distance cost value of each coordinate point on the physical contour of the robot when the robot realizes 360-degree rotation and does not touch the obstacle according to the actual physical contour information of the robot.
Specifically, 2D rasterization is performed in the coordinate system, and a center grid of the robot with a corresponding grid value map is generated based on the physical dimensions of the robot. Several characteristic coordinate points are selected according to the physical outline of the robot, such as the edge coordinate point with the maximum rotation radius and the corner coordinate point with the minimum rotation radius, so as to generate the inscribed circle area and the circumscribed circle area of the robot. It can be understood that the inscribed circle region of the robot's central grid is a complete collision region, the outside of the circumscribed circle region is a free safety region, and the area between the inscribed circle and the circumscribed circle can be defined as a dangerous collision region. In the dangerous collision region, the robot can pass through some narrow spaces by rotating its pose; for the coordinate points on the robot, particularly those in the dangerous collision region, there is a definite mapping between a point's rotatable angle and its distance cost value when passing through a narrow space. A mapping relation table of the rotation angle and the distance cost value of each coordinate point can therefore be generated in advance by calculation from the physical contour of the robot, and is denoted Table1.
In the 2D grid map, the size of each grid cell may be preset. Let D(p) be the minimum Euclidean distance from a grid cell p to an obstacle in the coordinate system; the distance cost value of p is denoted C(p) and satisfies C(p) = Q(D(p)), where Q is a decreasing function of the variable D(p) (e.g., an exponentially decreasing function). This function converts a distance value in the coordinate system into the grid cost value in the distance cost map. The Q function may refer to any existing decreasing function or cost function; the invention is not limited here.
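A minimal sketch of such a Q function, assuming an exponential form with illustrative constants (the patent does not fix a particular Q):

```python
import math

def q_cost(d_p, max_cost=254.0, decay=5.0):
    """Convert D(p), the minimum Euclidean distance (in meters) from grid
    cell p to the nearest obstacle, into the distance cost value C(p).
    Q is a decreasing function of D(p): a cell touching an obstacle gets
    the maximum cost, and the cost decays exponentially with distance."""
    return max_cost * math.exp(-decay * d_p)
```

The `max_cost` and `decay` constants are assumptions chosen for illustration; any monotonically decreasing function would serve equally under the text above.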
Step S13 is performed to detect and calculate the cost value of the distance between each coordinate point on the physical outline of the robot and the nearest obstacle in the surrounding space.
In this step, when the robot moves, each sensor in the space detection module detects the surrounding space to generate an initial static map, and then the distance measured by the sensor is converted into a distance cost value in real time according to a Q function, so that the initial static map is converted into a dynamic cost map more conforming to the navigation of the robot, and further real-time environment information is provided for the functions of autonomous obstacle avoidance, path planning and the like of the robot. Specifically, the step includes the following substeps:
and receiving the distance information between the robot and the surrounding space detected by the sensor, mapping the distance information to a coordinate system, and generating a two-dimensional coordinate map containing the surrounding space information. The distance information detected by sensors such as infrared sensors, ultrasonic sensors, laser sensors and the like is mapped to a coordinate system of the robot, and a group of two-dimensional coordinate maps of surrounding space information can be obtained and can be marked as Map 1;
calculating, in the two-dimensional coordinate map (Map1), the nearest distance from each coordinate position to the nearest obstacle, defining this nearest distance as the distance cost of that coordinate position, i.e., taking the minimum Euclidean distance as the obstacle avoidance cost from the coordinate position to the nearest obstacle, and generating a distance cost map, denoted Map2; the smaller the distance to an obstacle, the more limited the moving space of the robot;
the distance cost value in the distance cost Map (Map2) is obtained from coordinates on the physical outline of the robot.
It should be noted that, in one embodiment, in order to ensure that the robot does not risk colliding with an obstacle during the movement process, a Q function (for example, an exponential descent function) is constructed by using the minimum euclidean distance between each grid and the nearest obstacle, and it is ensured that the farther the grid is from the obstacle, the lower the required distance cost is, and the Q function processing is performed on Map1 to generate a safety cost Map. Of course, in other embodiments, the environmental energy consumption parameter may also be added to the Q function, and different grids may be given different cost weight values according to the road surface ruggedness, so as to generate an energy consumption cost map. The safety cost map and the energy consumption cost map can be linearly superposed, so that a final distance cost map is obtained.
Continuing to execute step S14, according to the distance cost value, querying the surrounding space allowable rotation angle range [ theta ] in the mapping relation Table (Table1)min,θmax]. Specifically, the method comprises the following substeps:
and acquiring the rotation angle range allowed by each coordinate point on the physical outline of the robot in the surrounding space. It can be understood that when the robot runs in a narrow space, the coordinates of different coordinate points on the robot contour are different due to different distributed positions, and thus the distance cost value in the distance cost Map (Map2) is different, which means that the rotatable angle of some parts of the robot is large and the rotatable angle of some parts is small when the robot passes through the narrow space during the movement process. For example, in a robot having a rectangular cross section, when passing through a narrow space, the rotation angle of the coordinate point of the side portion is large, and the rotation angle of the coordinate point of the edge between the two sides is small. Therefore, the rotation angle ranges of no coordinate points need to be calculated and queried separately according to the cost values of the rotation angle ranges in Map2, and the rotation angle range of each coordinate point needs to be calculated and queried separately.
Taking the intersection of the rotation angle ranges of all coordinate points on the physical outline of the robot gives the rotation angle range [θmin, θmax] allowed for the robot in the surrounding space. It can be understood that for the robot to pass through a narrow space, all coordinate points on the robot contour must be able to pass according to the distance cost map (Map2); the intersection [θmin, θmax] is therefore taken, so that the rotation ranges of all coordinate points are satisfied.
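The intersection step can be sketched as follows (the per-point ranges below are illustrative):

```python
def allowed_rotation_range(point_ranges):
    """point_ranges: list of (theta_min, theta_max) tuples in degrees, one
    per coordinate point on the robot contour. Returns the intersection
    [theta_min, theta_max] satisfying every point, or None if it is empty."""
    theta_min = max(r[0] for r in point_ranges)
    theta_max = min(r[1] for r in point_ranges)
    return (theta_min, theta_max) if theta_min <= theta_max else None

# Side points allow wide rotation; the corner point constrains the result.
ranges = [(-90.0, 90.0), (-30.0, 45.0), (-60.0, 80.0)]
```

An empty intersection would mean no rotation satisfies every contour point at once, which corresponds to the case where the robot cannot turn at all in the current space.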
Continuing to step S15, the deflection angle θ of the robot in the current moving path is obtained.
Specifically, the deflection angle θ is measured from the positive X-axis direction of the robot; for convenience of description, angles when the robot turns left may be described as 0 to −180°, and angles when the robot turns right as 0 to +180°. The robot runs along a moving path; for path planning, existing algorithms such as A* and D* may be used, and the invention is not limited here. When the robot runs along the moving path, it sometimes needs to turn, and the rotation angle required for the turn is the deflection angle θ.
Step S16: judging whether the deflection angle is within the rotation angle range. If θmin < θ < θmax, the motion control module controls the robot to advance along the current moving path; controlling the robot to follow a path can refer to existing methods such as the DWA control algorithm or PID tracking of a look-ahead point, and the invention is not limited here. If θ < θmin or θ > θmax, the rotation angle of the robot while turning exceeds the rotation range of some coordinate points; those coordinate points would collide with surrounding obstacles, causing the robot to jam or be damaged, so the motion control module controls the robot to decelerate and retreat, avoiding jamming. When the robot is controlled to decelerate and retreat, if θ < θmin, the robot is simultaneously controlled to turn left; if θ > θmax, the robot is simultaneously controlled to turn right.
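The branching in step S16, including the left/right turn while retreating, can be sketched as follows (the command names are hypothetical labels, not the patent's actual interface):

```python
def narrow_space_command(theta, theta_min, theta_max):
    """Decide the next motion from the deflection angle theta (degrees) and
    the allowed rotation range [theta_min, theta_max] of the surrounding space."""
    if theta_min < theta < theta_max:
        return "advance"             # follow the current moving path
    if theta < theta_min:
        return "retreat_turn_left"   # decelerate, retreat, and turn left
    return "retreat_turn_right"      # decelerate, retreat, and turn right
```

A left turn corresponds to a negative deflection angle under the sign convention of step S15, which is why undershooting θmin pairs with turning left.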
In summary, the control method for a robot in a narrow space provided by the invention first calculates, from the physical contour of the robot, the mapping relation between the rotation angle and the distance cost value of each coordinate point during rotation. Then, while the robot moves through a narrow space, the distance cost value between each coordinate point of the robot and the nearest obstacle is detected and calculated in real time, and the allowed rotation angle range of each coordinate point in the surrounding space is obtained from the mapping relation. Finally, the next movement direction of the robot is determined by judging whether the deflection angle of the robot on its moving path is within that rotation angle range. By detecting the distances to the surrounding space and evaluating the current deflection angle, the robot obtains the information of whether it can pass through the current narrow space, avoiding the problem of becoming stuck because it cannot steer, and improving its traffic capacity.
The second aspect of the present invention provides a control device 100 for a robot in a narrow space, which is used for controlling the robot to pass through the narrow space, so as to improve the traffic capacity. The implementation principle and the embodiment of the control device 100 for a robot narrow space are consistent with the control method for a robot narrow space described above, and therefore, the description thereof is omitted below.
As shown in fig. 2, the control apparatus 100 for a narrow space of a robot includes, in addition to a space detection module 10 and a motion control module 20:
a coordinate system establishing module 30, configured to establish a coordinate system with the center of the robot as an origin;
the mapping table generating module 40 is configured to calculate a mapping relationship table between a rotation angle and a distance cost value of each coordinate point on the physical profile of the robot when the robot realizes 360-degree rotation and does not touch an obstacle according to the actual physical profile information of the robot;
a cost value calculation module 50, configured to calculate a cost value of a distance between each coordinate point on the physical contour of the robot and a nearest obstacle in the surrounding space according to the distance between the physical contour of the robot and the obstacle;
an angle range determining module 60, configured to query the mapping relationship table for the rotation angle range [θmin, θmax] allowed by the surrounding space according to the distance cost value;
A deflection angle obtaining module 70, configured to obtain a deflection angle θ of the robot in the current moving path;
an angle determining module 80, configured to determine whether the deflection angle is within the rotation angle range: if θmin < θ < θmax, the robot is controlled to advance along the current moving path; if θ < θmin or θ > θmax, the robot is controlled to decelerate and retreat.
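The angle range determination performed by module 60, described in more detail in claim 4, intersects the allowed rotation ranges of all contour points to obtain the range for the whole robot. A minimal sketch under that reading (function name and tuple encoding are assumptions):

```python
def allowed_rotation_range(point_ranges):
    """Intersect per-point allowed rotation ranges [theta_min, theta_max]
    (each looked up from the precomputed mapping table by distance cost)
    to obtain the range the whole physical contour may rotate through.
    Returns None when the intersection is empty, i.e. no safe rotation."""
    theta_min = max(lo for lo, hi in point_ranges)   # tightest lower bound
    theta_max = min(hi for lo, hi in point_ranges)   # tightest upper bound
    return (theta_min, theta_max) if theta_min < theta_max else None
```

The deflection angle θ from module 70 is then tested against this intersected range by module 80.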
A third aspect of the present invention provides a terminal (not shown in the drawings), where the terminal includes a memory, a processor, and a control program of a narrow space of a robot, which is stored in the memory and is executable on the processor, and the control program of the narrow space of the robot is executed by the processor to implement the steps of the control method of the narrow space of the robot according to any one of the above embodiments.
A fourth aspect of the present invention provides a computer-readable storage medium (not shown in the drawings), wherein a control program of a narrow space of a robot is stored in the computer-readable storage medium, and when the control program of the narrow space of the robot is executed by a processor, the steps of the control method of the narrow space of the robot according to any one of the above-mentioned embodiments are implemented.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-mentioned division of the functional units and modules is illustrated, and in practical applications, the above-mentioned function distribution may be performed by different functional units and modules according to needs, that is, the internal structure of the apparatus is divided into different functional units or modules to perform all or part of the above-mentioned functions. Each functional unit and module in the embodiments may be integrated in one processing unit, or each unit may exist alone physically, or two or more units are integrated in one unit, and the integrated unit may be implemented in a form of hardware, or in a form of software functional unit. In addition, specific names of the functional units and modules are only for convenience of distinguishing from each other, and are not used for limiting the protection scope of the present application. The specific working processes of the units and modules in the system may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and reference may be made to the related descriptions of other embodiments for parts that are not described or illustrated in a certain embodiment.
Those of ordinary skill in the art will appreciate that the various illustrative elements and method steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
In the embodiments provided in the present invention, it should be understood that the disclosed system or apparatus/terminal device and method can be implemented in other ways. For example, the above-described system or apparatus/terminal device embodiments are merely illustrative, and for example, the division of the modules or units is only one logical division, and there may be other divisions when actually implemented, for example, multiple units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The invention is not limited solely to that described in the specification and embodiments, and additional advantages and modifications will readily occur to those skilled in the art, so that the invention is not limited to the specific details, representative apparatus, and illustrative examples shown and described herein, without departing from the spirit and scope of the general concept as defined by the appended claims and their equivalents.

Claims (9)

1. A control method for a robot narrow space is characterized by comprising the following steps:
establishing a coordinate system with the center of the robot as an origin;
calculating a mapping relation table of a rotation angle and a distance cost value of each coordinate point on the physical contour of the robot when the robot realizes 360-degree rotation and does not touch an obstacle according to the actual physical contour information of the robot;
detecting and calculating a distance cost value of each coordinate point on a physical contour of the robot and a nearest obstacle of a surrounding space;
querying the mapping relationship table for the rotation angle range [θmin, θmax] allowed by the surrounding space according to the distance cost value;
Acquiring a deflection angle theta of the robot in a current moving path;
judging whether the deflection angle is within the rotation angle range: if θmin < θ < θmax, controlling the robot to advance along the current moving path; if θ < θmin or θ > θmax, controlling the robot to decelerate and retreat.
2. The method for controlling a narrow space of a robot according to claim 1, wherein the step of establishing a coordinate system with a center of the robot as an origin comprises:
establishing a coordinate system by taking the center of the robot as the origin, the direction straight ahead of the robot as the positive X-axis, the right side of the robot as the positive Y-axis, and the vertically upward direction through the center of the robot as the positive Z-axis.
3. The method for controlling a narrow space of a robot according to claim 1, wherein the step of detecting and calculating a distance cost value of coordinates on a physical contour of the robot from a nearest obstacle of a surrounding space comprises the sub-steps of:
receiving distance information of the robot and the surrounding space detected by a sensor, mapping the distance information to the coordinate system, and generating a two-dimensional coordinate map containing the surrounding space information;
calculating the nearest distance from each coordinate position in the two-dimensional coordinate map to the nearest barrier, defining the nearest distance as the distance cost of each coordinate position, and generating a distance cost map;
obtaining a distance cost value in the distance cost map from coordinates on a physical contour of the robot.
4. The method for controlling a robot narrow space according to claim 1, wherein the step of querying the mapping relationship table for the rotation angle range [θmin, θmax] allowed by the surrounding space according to the distance cost value comprises the following sub-steps:
acquiring an allowable rotation angle range of each coordinate point on a physical outline of the robot in the surrounding space;
taking the intersection of the rotation angle ranges of all coordinate points on the physical contour of the robot to obtain the rotation angle range [θmin, θmax] allowed to the robot in the surrounding space.
5. The method of controlling a robot narrow space according to claim 2, wherein, when controlling the robot to decelerate and retreat, if θ < θmin, the robot is simultaneously controlled to turn left; if θ > θmax, the robot is simultaneously controlled to turn right.
6. A control device for robot narrow space comprises a space detection module for collecting the distance between the physical outline of the robot and an obstacle and a motion control module for controlling the robot to move, and is characterized by further comprising:
the coordinate system establishing module is used for establishing a coordinate system with the center of the robot as an origin;
the mapping table generating module is used for calculating a mapping relation table of the rotation angle and the distance cost value of each coordinate point on the physical outline of the robot when the robot realizes 360-degree rotation and does not touch an obstacle according to the actual physical outline information of the robot;
the cost value calculation module is used for calculating the distance cost value of each coordinate point on the physical contour of the robot and the nearest obstacle in the surrounding space according to the distance between the physical contour of the robot and the obstacle;
an angle range determination module, configured to query the mapping relationship table for the rotation angle range [θmin, θmax] allowed by the surrounding space according to the distance cost value;
The deflection angle acquisition module is used for acquiring a deflection angle theta of the robot in the current moving path;
an angle judging module, configured to judge whether the deflection angle is within the rotation angle range: if θmin < θ < θmax, controlling the robot to advance along the current moving path; if θ < θmin or θ > θmax, controlling the robot to decelerate and retreat.
7. The robot narrow space control device of claim 6, wherein the space detecting module comprises a laser sensor, an ultrasonic sensor and an infrared sensor; the infrared sensor is located at the bottom of the robot, the ultrasonic sensor is located above the infrared sensor, and the laser sensor is located above the ultrasonic sensor.
8. A terminal, characterized in that the terminal comprises a memory, a processor and a control program of a narrow space of a robot stored in the memory and executable on the processor, the control program of the narrow space of the robot realizing the steps of the control method of the narrow space of the robot according to any one of claims 1 to 5 when executed by the processor.
9. A computer-readable storage medium, in which a control program of a narrow space of a robot is stored, and the control program of the narrow space of the robot realizes the steps of the control method of the narrow space of the robot according to any one of claims 1 to 5 when being executed by a processor.
CN202110246533.8A 2021-03-05 2021-03-05 Robot narrow space control method and device, terminal and storage medium Pending CN112987748A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110246533.8A CN112987748A (en) 2021-03-05 2021-03-05 Robot narrow space control method and device, terminal and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110246533.8A CN112987748A (en) 2021-03-05 2021-03-05 Robot narrow space control method and device, terminal and storage medium

Publications (1)

Publication Number Publication Date
CN112987748A true CN112987748A (en) 2021-06-18

Family

ID=76353133

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110246533.8A Pending CN112987748A (en) 2021-03-05 2021-03-05 Robot narrow space control method and device, terminal and storage medium

Country Status (1)

Country Link
CN (1) CN112987748A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115177186A (en) * 2022-07-21 2022-10-14 美智纵横科技有限责任公司 Sweeping method, sweeping device, sweeping robot and computer readable storage medium
US20230400856A1 (en) * 2022-06-09 2023-12-14 Kristopher Douglas Rupay Autonomous guidance through narrow spaces

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20230400856A1 (en) * 2022-06-09 2023-12-14 Kristopher Douglas Rupay Autonomous guidance through narrow spaces
CN115177186A (en) * 2022-07-21 2022-10-14 美智纵横科技有限责任公司 Sweeping method, sweeping device, sweeping robot and computer readable storage medium

Similar Documents

Publication Publication Date Title
CN111797734B (en) Vehicle point cloud data processing method, device, equipment and storage medium
US8736820B2 (en) Apparatus and method for distinguishing ground and obstacles for autonomous mobile vehicle
EP4033324B1 (en) Obstacle information sensing method and device for mobile robot
Cherubini et al. Visual navigation with obstacle avoidance
WO2022111017A1 (en) Tof-camera-based obstacle classification and obstacle avoidance control method
CN112987748A (en) Robot narrow space control method and device, terminal and storage medium
CN110837814A (en) Vehicle navigation method, device and computer readable storage medium
CN111240310A (en) Robot obstacle avoidance processing method and device and electronic equipment
CN113112491B (en) Cliff detection method, cliff detection device, robot and storage medium
CN112947464A (en) Method, device, terminal and storage medium for robot to pass through narrow space
CN112508912A (en) Ground point cloud data filtering method and device and boom anti-collision method and system
CN113610910A (en) Obstacle avoidance method for mobile robot
CN114078247A (en) Target detection method and device
Garcia-Alegre et al. Real-time fusion of visual images and laser data images for safe navigation in outdoor environments
CN114661051A (en) Front obstacle avoidance system based on RGB-D
CN115289966A (en) Goods shelf detecting and positioning system and method based on TOF camera
CN110027018B (en) Omnidirectional detection system and method
CN112045654B (en) Detection method and device for unmanned closed space and robot
CN114489050A (en) Obstacle avoidance route control method, device, equipment and storage medium for straight line driving
Li et al. Obstacle information detection based on fusion of 3D LADAR and camera
Yang et al. A new algorithm for obstacle segmentation in dynamic environments using a RGB-D sensor
CN114341761B (en) Anti-collision method, mobile machine and storage medium
Sun et al. Detection and state estimation of moving objects on a moving base for indoor navigation
EP3229173B1 (en) Method and apparatus for determining a traversable path
Canh et al. Multisensor data fusion for reliable obstacle avoidance

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination