CN114815791A - Method and device for planning travelable space - Google Patents

Method and device for planning travelable space

Info

Publication number
CN114815791A
Authority
CN
China
Prior art keywords
projection
target
obstacle
autonomous mobile
sampling
Prior art date
Legal status
Pending
Application number
CN202110064822.6A
Other languages
Chinese (zh)
Inventor
徐志浩 (Xu Zhihao)
胡晋 (Hu Jin)
沈慧 (Shen Hui)
Current Assignee
Wuzhou Online E Commerce Beijing Co ltd
Original Assignee
Alibaba Group Holding Ltd
Priority date
Filing date
Publication date
Application filed by Alibaba Group Holding Ltd filed Critical Alibaba Group Holding Ltd
Priority to CN202110064822.6A
Publication of CN114815791A
Legal status: Pending

Classifications

    • G: PHYSICS
    • G05: CONTROLLING; REGULATING
    • G05D: SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00: Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02: Control of position or course in two dimensions
    • G05D1/021: Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0212: ... with means for defining a desired trajectory
    • G05D1/0214: ... in accordance with safety or protection criteria, e.g. avoiding hazardous areas
    • G05D1/0221: ... involving a learning process
    • G05D1/0223: ... involving speed control of the vehicle
    • G05D1/0231: ... using optical position detecting means
    • G05D1/0238: ... using obstacle or wall sensors
    • G05D1/024: ... using obstacle or wall sensors in combination with a laser
    • G05D1/0242: ... using non-visible light signals, e.g. IR or UV signals
    • G05D1/0246: ... using a video camera in combination with image processing means
    • G05D1/0259: ... using magnetic or electromagnetic means
    • G05D1/0261: ... using magnetic plots
    • G05D1/0276: ... using signals provided by a source external to the vehicle
    • G05D1/028: ... using a RF signal

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Electromagnetism (AREA)
  • Optics & Photonics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

Embodiments of the present application provide a method and a device for planning a travelable space. In these embodiments, a cost map of the area in which the autonomous mobile device is located is established under a set coordinate system, and the travelable space of the autonomous mobile device is explored around the obstacles based on that cost map. This reduces the exploration of empty areas and improves the efficiency of travelable-space exploration.

Description

Method and device for planning travelable space
Technical Field
The present application relates to the technical field of intelligent decision-making, and in particular to a travelable space planning method and a travelable space planning device.
Background
With the continuous development of artificial intelligence technology, autonomous mobile devices are gradually entering people's lives. During autonomous movement, an autonomous mobile device needs to explore a travelable space, plan a travel route based on the explored travelable space, and then move along the planned route. The efficiency of travelable-space exploration directly affects the efficiency of subsequent route planning. Improving this efficiency is therefore a problem that those skilled in the art continue to study.
Disclosure of Invention
Aspects of the present disclosure provide a method and apparatus for planning a travelable space, so as to improve travelable space planning efficiency.
An embodiment of the present application provides a method for planning a travelable space, which comprises the following steps:
obtaining a cost map of the area where the autonomous mobile device is located under a set coordinate system;
controlling the target projection of the autonomous mobile device in the set coordinate system to move around the projections of obstacles, according to the obstacle projection position information in the cost map, so as to obtain the target projection position information generated in the cost map while the target projection moves to the destination;
performing collision detection between the target projection and the obstacle projections based on the target projection position information and the obstacle projection position information;
and determining the travelable space of the autonomous mobile device around the corresponding obstacles according to the collision detection result.
An embodiment of the present application further provides a computing device, including a memory and a processor. The memory is used for storing a computer program; the processor, coupled to the memory, executes the computer program to perform the steps of the travelable space planning method described above.
An embodiment of the present application further provides an autonomous mobile device, including a machine body and a computing device that is arranged in the machine body and executes the travelable space planning method described above.
Embodiments of the present application also provide a computer-readable storage medium storing a computer program which, when executed by one or more processors, causes the one or more processors to perform the steps of the travelable space planning method.
In the embodiments of the present application, a cost map of the area in which the autonomous mobile device is located is established under a set coordinate system, and the travelable space of the autonomous mobile device is explored around the obstacles based on the cost map. This reduces the exploration of empty areas and improves the efficiency of travelable-space exploration.
Drawings
The accompanying drawings, which are included to provide a further understanding of the application and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the application and together with the description serve to explain the application and not to limit the application. In the drawings:
fig. 1a is a schematic process diagram of dynamically planning a travelable space through horizontal and vertical sampling according to an embodiment of the present application;
fig. 1b is a schematic diagram of the effect of a travelable space planned by the Lattice planner method according to an embodiment of the present application;
fig. 1c is a schematic process diagram of planning a travelable space by the Voronoi diagram method according to an embodiment of the present application;
fig. 2a is a schematic flow chart of a travelable space planning method according to an embodiment of the present application;
FIG. 2b is a schematic diagram of a method for determining an expansion region according to an embodiment of the present application;
FIG. 2c is a schematic diagram of a Frenet coordinate system construction process provided by the embodiment of the present application;
fig. 2d is a schematic diagram illustrating a process of simulating movement of an autonomous mobile device to a destination according to an embodiment of the present application;
fig. 2e is a schematic flow chart of a cost map construction method provided in the embodiment of the present application;
fig. 2f is a schematic flow chart of a travelable space planning method according to an embodiment of the present application;
FIG. 2g is a schematic structural diagram of an explored search tree according to an embodiment of the present application;
fig. 2h and fig. 2i are schematic diagrams of a detour effect obtained by using the travelable space planning method provided by the embodiment of the present application;
fig. 2j is a schematic diagram of a pile-passing effect obtained by using the travelable space planning method provided in the embodiment of the present application;
fig. 3 is a block diagram of a hardware structure of an autonomous mobile device according to an embodiment of the present disclosure;
fig. 4 is a schematic structural diagram of a computing device according to an embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the technical solutions of the present application will be described in detail and completely with reference to the following specific embodiments of the present application and the accompanying drawings. It should be apparent that the described embodiments are only some of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
In the field of autonomous driving (intelligent or unmanned driving), planning the travelable space for autonomous mobile devices is a problem that constantly needs to be optimized and solved. In practical applications, planning (exploring) a travelable space often runs into track spaces that the autonomous mobile device cannot pass through, or is not suited to pass through (narrow spaces or spaces with many obstacles). Faced with such problems, the travelable space of an autonomous mobile device can be explored in the following ways:
scheme 1: horizontal and vertical sampling method + Dynamic Programming (DP)
In the scheme 1, a multi-layer candidate point set is obtained by uniformly sampling in the horizontal and vertical directions of the lane, and then a primary solution is obtained by solving based on the DP. The main idea of dynamic planning is to take the current position of the autonomous mobile device as a starting point, sample some points along the horizontal and vertical directions of a lane, a group of points sampled horizontally is called horizontal sampling points (levels), and after the points are packaged into nodes (nodes), the cost (cost) of the nodes between different levels is calculated respectively, so as to form a graph (graph). And updating the minimum cost of the node by using DP, and finding a path with the minimum cost of the optimal central line connecting the starting point and the end point. The shortest path is found by the method compared with the traditional graph search method, and the difference is that the cost comprises smooth and collision-free indexes.
The method comprises the following specific steps: as shown in fig. 1a, local center lines between adjacent layers are obtained by interpolation using a spline curve between every two points between adjacent layers, and then the cost of each local center line between adjacent layers is evaluated using an evaluation function containing a plurality of costs such as collision, smoothness and the like, and the best local center line among the local center lines is obtained. And then, taking the terminal point of the optimal local central line as the starting point of the next calculation, and calculating the local optimal connecting lines of the two subsequent adjacent layers layer by layer in a recursion manner until the terminal point is reached.
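For illustration only, the following minimal Python sketch shows the layered sampling plus dynamic-programming idea described above. All function names, cost terms and weights are assumptions introduced for the example, not the implementation of Scheme 1:

```python
import math

def dp_centerline(levels, obstacles, w_smooth=1.0, w_obs=5.0, start_l=0.0):
    """Pick one lateral sample per longitudinal level so that the summed
    smoothness + obstacle cost is minimal (DP over a layered graph).
    levels:    list of (s, [candidate lateral offsets]) per layer
    obstacles: list of (s, l) obstacle positions"""
    def edge_cost(s0, l0, s1, l1):
        smooth = w_smooth * abs(l1 - l0)                      # penalize lateral jumps
        near = sum(w_obs / (1e-3 + math.hypot(s1 - so, l1 - lo))
                   for so, lo in obstacles)                   # penalize obstacle proximity
        return smooth + near

    # best[i][j] = (minimum cost to reach sample j of layer i, back-pointer)
    best = [{j: (edge_cost(0.0, start_l, levels[0][0], l), None)
             for j, l in enumerate(levels[0][1])}]
    for i in range(1, len(levels)):
        s_prev, ls_prev = levels[i - 1]
        s_cur, ls_cur = levels[i]
        best.append({j: min((best[i - 1][k][0] + edge_cost(s_prev, lp, s_cur, l), k)
                            for k, lp in enumerate(ls_prev))
                     for j, l in enumerate(ls_cur)})

    # backtrack the minimum-cost "local centerline" through the layers
    j = min(best[-1], key=lambda k: best[-1][k][0])
    path = []
    for i in range(len(levels) - 1, -1, -1):
        path.append((levels[i][0], levels[i][1][j]))
        if best[i][j][1] is not None:
            j = best[i][j][1]
    return list(reversed(path))

# usage: three layers of five lateral samples, one obstacle on the centerline
layers = [(s, [-2.0, -1.0, 0.0, 1.0, 2.0]) for s in (5.0, 10.0, 15.0)]
print(dp_centerline(layers, obstacles=[(10.0, 0.0)]))
```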
The disadvantages of scheme 1 are as follows:
(1) Only one result is produced per run; all possibilities cannot be covered in a single computation, and the computational cost is high when there are many obstacles, so the method is not well suited to practical deployment.
(2) Lateral and longitudinal sampling-interval parameters must be set, so efficiency and precision are hard to balance.
(3) There are many parameters, tuning is complex, and the method is difficult to adapt to changing scenes.
(4) A complex set of trajectory evaluation methods is required.
Scheme 2: lattice planning method
The biggest design difference between the Lattice planner and the EM planner is that the Lattice planner solves the lateral and longitudinal dimensions jointly, whereas the EM planner solves them separately. Like the EM planner, the Lattice planner also decomposes the trajectory planning problem into two independent 1-D problems, one in the lateral dimension and one in the longitudinal dimension, which reduces the planning difficulty. The lateral problem is still the SL (Station-Lateral) problem, and the longitudinal problem is the ST (Station-Time) problem.
The first step of the Lattice planning algorithm is to sample end states ("scatter points") simultaneously in position space and in time, according to the states of the starting point and the end point. The initial state and each end state have 6 parameters: 3 lateral parameters, namely the lateral position and its first and second derivatives, and 3 longitudinal parameters, namely the longitudinal position, its first derivative (velocity) and its second derivative (acceleration).
The parameters of the starting point are the real state of the vehicle at that moment, or the stitched planning state; the end states are the enumerated scatter points. Once the start and end states are determined, they are connected by a fifth-order or fourth-order polynomial, which yields the planned lateral and longitudinal trajectories.
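As a hedged illustration of this step, the sketch below connects a start state and an end state (position, first derivative, second derivative) with a fifth-order polynomial by solving the six boundary conditions; the numbers in the usage example are arbitrary assumptions:

```python
import numpy as np

def quintic_coeffs(x0, dx0, ddx0, x1, dx1, ddx1, T):
    """Coefficients a0..a5 of x(t) = sum(a_i * t**i) that match position,
    first derivative and second derivative at t = 0 and t = T."""
    A = np.array([
        [1, 0, 0,    0,       0,        0],
        [0, 1, 0,    0,       0,        0],
        [0, 0, 2,    0,       0,        0],
        [1, T, T**2, T**3,    T**4,     T**5],
        [0, 1, 2*T,  3*T**2,  4*T**3,   5*T**4],
        [0, 0, 2,    6*T,     12*T**2,  20*T**3],
    ], dtype=float)
    b = np.array([x0, dx0, ddx0, x1, dx1, ddx1], dtype=float)
    return np.linalg.solve(A, b)

# usage: lateral offset from l = 0 to l = 1.5 m over 4 s, with zero lateral
# velocity and acceleration at both ends (one 1-D Lattice trajectory)
a = quintic_coeffs(0.0, 0.0, 0.0, 1.5, 0.0, 0.0, T=4.0)
l_of = lambda t: sum(c * t**i for i, c in enumerate(a))
print(round(l_of(2.0), 3))   # lateral offset at the midpoint of the maneuver
```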
The second step of the Lattice planning algorithm is to sample enough trajectories and provide as many choices as possible. This step is the essence of the Lattice algorithm and directly determines its efficiency and solution space. For example, constraints on the vehicle's physical and dynamic limits can be added when sampling, feasible regions can be pre-screened according to the obstacles, and invalid space can be effectively pruned, which improves performance.
The third step of the Lattice planning algorithm is to calculate the cost of each trajectory. After all the lateral and longitudinal one-dimensional trajectories are generated, they are combined to synthesize a large number of two-dimensional trajectories (as shown in fig. 1b). The best synthesized trajectory is then screened out according to a loss function; the cost considers factors such as the feasibility and safety of the trajectory. Similar to the EM planner, the loss function of the Lattice planner can be divided into three types: safety-related, ride-comfort-related and goal-completion-related.
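For illustration, a minimal weighted loss covering the three kinds of terms mentioned (safety, ride comfort, goal completion) might look like the sketch below; the weights and terms are assumptions made for the example, not the Lattice planner's actual loss:

```python
import math

def trajectory_cost(traj, obstacles, target_s,
                    w_safe=10.0, w_comfort=1.0, w_goal=2.0):
    """Weighted loss for one synthesized trajectory given as (s, l, acc) points:
    safety (obstacle proximity), comfort (acceleration), goal completion."""
    safety = sum(1.0 / (1e-3 + min(math.hypot(s - so, l - lo) for so, lo in obstacles))
                 for s, l, _ in traj)                   # large when close to obstacles
    comfort = sum(acc * acc for _, _, acc in traj)      # large for harsh accelerations
    goal = abs(target_s - traj[-1][0])                  # remaining distance to the goal
    return w_safe * safety + w_comfort * comfort + w_goal * goal

# usage: pick the cheaper of two toy trajectories
t1 = [(0, 0, 0.0), (5, 0.0, 0.1), (10, 0.0, 0.1)]   # drives straight at the obstacle
t2 = [(0, 0, 0.0), (5, 1.5, 0.4), (10, 1.5, 0.2)]   # detours around it
obs = [(10.0, 0.0)]
print(min((t1, t2), key=lambda t: trajectory_cost(t, obs, target_s=10.0)) is t2)
```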
The fourth step of the Lattice planning algorithm is a loop of checks. In each iteration, the trajectory with the lowest cost is selected and subjected to physical-limit detection and collision detection. If the selected trajectory cannot pass both checks, it is discarded and the trajectory with the next-lowest cost is examined.
The disadvantages of scheme 2 are as follows:
(1) Although multiple candidate trajectories are computed, all possibilities still cannot be covered in a single computation, and the computational cost is high when there are many obstacles, so the method is not well suited to practical deployment.
(2) Lateral and longitudinal sampling-interval parameters must be set, so efficiency and precision are hard to balance.
(3) There are many parameters, tuning is complex, and the method is difficult to adapt to changing scenes.
(4) A complex set of trajectory evaluation methods is required.
Scheme 3: veno Diagram (Voronoi Diagram)
In scenario 3, as shown in fig. 1c, this scenario obtains a description of the feasible space by pressing the obstacle sensing information onto a 2d grid map and using voronoi diagram method to generate a series of bisectors between the obstacles. The method mainly comprises the following steps:
A. The steps for establishing the Voronoi diagram are as follows:
(1) Construct a triangulation of the discrete points, i.e. a Delaunay triangulation. Number the discrete points and the resulting triangles, and record which three discrete points make up each triangle.
(2) Compute and record the circumcenter (center of the circumscribed circle) of each triangle; see the sketch after this list.
(3) Traverse the triangle list and find the adjacent triangles TriA, TriB and TriC that share the three edges of the current triangle pTri.
(4) If an adjacent triangle is found, connect its circumcenter with the circumcenter of pTri and store the segment in the Voronoi edge list. If not, construct the perpendicular-bisector ray of the outermost edge and store it in the Voronoi list.
(5) After the traversal is finished, all the Voronoi edges have been found; draw the Voronoi diagram from these edges.
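As a small illustrative aid for step (2) of the list above, the sketch below computes the circumcenter of a triangle; the Voronoi vertices are exactly these circumcenters of the Delaunay triangles. This is the standard formula, not code taken from the patent:

```python
def circumcenter(ax, ay, bx, by, cx, cy):
    """Center of the circle circumscribing triangle A(ax, ay), B, C."""
    d = 2.0 * (ax * (by - cy) + bx * (cy - ay) + cx * (ay - by))
    if abs(d) < 1e-12:
        raise ValueError("degenerate (collinear) triangle")
    ux = ((ax**2 + ay**2) * (by - cy) + (bx**2 + by**2) * (cy - ay)
          + (cx**2 + cy**2) * (ay - by)) / d
    uy = ((ax**2 + ay**2) * (cx - bx) + (bx**2 + by**2) * (ax - cx)
          + (cx**2 + cy**2) * (bx - ax)) / d
    return ux, uy

# usage: right triangle with legs on the axes; its circumcenter is the
# midpoint of the hypotenuse
print(circumcenter(0, 0, 4, 0, 0, 4))   # (2.0, 2.0)
```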
B. The generation of the Delaunay triangulation mainly comprises the following steps:
(1) Construct a super-triangle containing all the scatter points and put it into the triangle list.
(2) Insert the scatter points of the point set one by one: for each inserted point, find from the triangle list the triangles whose circumscribed circles contain the point (called the influence triangles of that point), delete the common edges of the influence triangles, and connect the inserted point with all vertices of the influence triangles, thereby completing the insertion of one point into the Delaunay triangle list.
(3) Optimize the locally newly formed triangles according to an optimization criterion and put them into the Delaunay triangle list.
(4) Repeat step (2) until all scatter points have been inserted.
The disadvantages of scheme 3 are as follows:
(1) The overall time complexity is O(1) + O(n^2) + O(n^2) + O(n) + 3O(n) + O(n)·O(n^2), which is relatively high.
(2) Unreasonable routes cannot be eliminated on the basis of vehicle kinematics during solving; they can only be removed by subsequent pruning.
(3) The result cannot be output directly as a complete trajectory solution space.
(4) If the trajectory space is needed as output, its average complexity is high: with n obstacles there are on the order of 2^n candidates.
(5) If a route is searched on the Voronoi diagram, a complex trajectory evaluation method is still needed, and only a single, incomplete result is obtained.
In summary, the travelable-space planning schemes provided by the prior art generally involve a large amount of computation: with relatively few computing resources it is difficult to obtain a complete travelable-space set in a short time, while using more computing resources only partly solves the solution-time problem and comes at high cost. The prior-art solutions therefore do not strike a good balance between the efficiency and the cost of travelable-space planning.
To address the technical problems faced by existing travelable-space planning, the basic principle of the embodiments of the present application is to establish a cost map (costmap) of the area in which the autonomous mobile device is located under a set coordinate system (preferably a Frenet coordinate system). The cost map comprises an obstacle layer, an expansion (inflation) layer and a static map layer. Based on the cost map, the travelable space of the autonomous mobile device is explored around the obstacles, which reduces the exploration of blank areas (areas of the cost map without obstacles) and improves the efficiency of travelable-space exploration.
It should be noted that the autonomous mobile devices to which the travelable space planning method of this embodiment applies include robots, unmanned vehicles and other devices that need to travel autonomously in an area where obstacles exist. The shape of the robot or unmanned vehicle is not limited in this embodiment; for example, the robot may be circular, elliptical, triangular, a convex polygon, humanoid, and so on, and the unmanned vehicle may be a small car, a bus, a truck, and so on. The autonomous mobile device may implement the method provided by this embodiment by installing software or an APP, or by writing program code into the corresponding device. The travelable space planning method provided by this embodiment can also be applied to computer equipment that provides navigation services; for example, in a vehicle navigation scenario, the method may be applied to a server-side device providing the navigation service.
The technical solutions provided by the embodiments of the present application are described in detail below with reference to the accompanying drawings. It should be noted that: like reference numerals refer to like objects in the following figures and embodiments, and thus, once an object is defined in one figure or embodiment, further discussion thereof is not required in subsequent figures and embodiments.
Fig. 2a is a schematic flow chart of a travelable space planning method according to an embodiment of the present application. As shown in fig. 2a, the method comprises:
201. Acquire a cost map of the area in which the autonomous mobile device is located under a set coordinate system.
202. Control the target projection of the autonomous mobile device in the set coordinate system to move around the obstacle projections, according to the obstacle projection position information in the cost map, so as to obtain the target projection position information generated in the cost map while the target projection moves toward the destination.
203. Perform collision detection between the target projection and the obstacle projections in the cost map based on the target projection position information and the obstacle projection position information.
204. Determine the travelable space of the autonomous mobile device around the corresponding obstacles according to the collision detection results.
The above is the travelable space planning method provided by the present application. The purpose of exploring the travelable space of the autonomous mobile device is to avoid obstacles in the area where the device is located during travel, thereby realizing autonomous movement. Different application scenarios involve different obstacles. For example, if the autonomous mobile device is an unmanned vehicle, the obstacles in its area may be other vehicles, pedestrians, road stubs, signboards, and so on. If the autonomous mobile device is a sweeping robot, the obstacles may be furniture, walls or people in the cleaning zone. These examples only illustrate the types of obstacles and should not be construed as limiting the scheme provided by this embodiment.
In this embodiment, in order to obtain the travelable space of the autonomous mobile device, a cost map of the area where the autonomous mobile device is located under a set coordinate system may be obtained in step 201. The implementation form of the set coordinate system is not limited in this embodiment; optionally, it may be a Cartesian coordinate system, a polar coordinate system, a Frenet coordinate system, or the like, but is not limited thereto.
Preferably, the set coordinate system is a Frenet coordinate system. The Frenet coordinate system is established from the relative relation between the autonomous mobile device and the road, and converts a curved road into a straight-line representation, which reduces computational complexity. In Frenet coordinates, the location of the autonomous mobile device on the road is described by the variables s and l: the s-coordinate represents the distance along the road (also referred to as longitudinal displacement), and the l-coordinate represents the left-right position on the road (also referred to as lateral displacement). Accordingly, in step 201 a cost map of the area in which the autonomous mobile device is currently located may be obtained in the Frenet coordinate system.
The cost map of the current area of the autonomous mobile device in the Frenet coordinate system can be a pre-stored cost map, or a cost map constructed in real time from the environmental information of the autonomous mobile device.
When the cost map is constructed in real time from the environmental information of the autonomous mobile device, a Frenet coordinate system can be established before the cost map is constructed. Optionally, as shown in fig. 2b, a Frenet coordinate system may be established with the centerline of the road between the current location of the autonomous mobile device and the destination as the reference line R, and with the point O on R that is closest to the autonomous mobile device as the starting point of the reference line. The line connecting the autonomous mobile device and the starting point O is perpendicular to the tangent of R at that point. As shown in fig. 2b, the s-axis of the Frenet coordinate system is its longitudinal axis and the l-axis is its lateral axis. When the autonomous mobile device moves along the s-axis or along a curve parallel to the s-axis, its l-coordinate does not change.
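For illustration only, the sketch below projects a Cartesian point onto a densely sampled reference line and returns the (s, l) coordinates in the sense described above (arc length to the closest reference point O, signed lateral offset). The discretized projection and sign convention are assumptions made for the example:

```python
import math

def to_frenet(px, py, ref_line):
    """Return (s, l) for point (px, py) relative to a polyline reference line:
    s = arc length to the closest reference sample (the foot point O),
    l = signed lateral offset (positive to the left of the line)."""
    # accumulated arc length along the reference polyline
    s_acc = [0.0]
    for (x0, y0), (x1, y1) in zip(ref_line, ref_line[1:]):
        s_acc.append(s_acc[-1] + math.hypot(x1 - x0, y1 - y0))

    # closest reference sample = starting point O described in the text
    i = min(range(len(ref_line)),
            key=lambda k: math.hypot(px - ref_line[k][0], py - ref_line[k][1]))
    ox, oy = ref_line[i]

    # local tangent at O (forward difference, clamped at the last sample)
    j = min(i + 1, len(ref_line) - 1)
    tx, ty = ref_line[j][0] - ox, ref_line[j][1] - oy
    norm = math.hypot(tx, ty) or 1.0

    # 2-D cross product of tangent and offset vector gives the signed l
    l = (tx * (py - oy) - ty * (px - ox)) / norm
    return s_acc[i], l

# usage: straight reference line along x; a point 1 m to its left
ref = [(float(x), 0.0) for x in range(20)]
print(to_frenet(5.3, 1.0, ref))   # approximately (5.0, 1.0)
```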
Further, for the autonomous mobile device, an obstacle in a current area may be detected based on a mounted vision sensor. Alternatively, the autonomous mobile device may carry an infrared sensor or a laser sensor. An infrared sensor or laser sensor may detect the distance and direction of an obstacle relative to the autonomous mobile device. The autonomous mobile device may determine the location of the obstacle based on its current location and the distance and direction of the obstacle relative to the autonomous mobile device.
Alternatively, the autonomous mobile device may carry a camera, which acquires environmental images around the device's current position while it moves. The camera may be a binocular camera, a monocular camera, a depth camera, or the like, but is not limited thereto. The position of an obstacle in the area where the autonomous mobile device is located may then be obtained from such an environmental image; optionally, image features of the environmental image are extracted, and the obstacle position is obtained from those image features.
The position of the obstacle may be the position of the obstacle in any set coordinate system. For example, the location of the obstacle may be a location in the coordinate system of the autonomous mobile device, or a location in the world coordinate system, or a location in the Frenet coordinate system, or the like.
Further, an obstacle in the area where the autonomous mobile device is located may be projected into the Frenet coordinate system to obtain the obstacle projection, i.e. the projection of the obstacle in the Frenet coordinate system. The expansion area of the obstacle in the Frenet coordinate system can then be determined from the size of the target projection of the autonomous mobile device in the Frenet coordinate system and the position information of the obstacle projection (i.e. the obstacle projection position information). Optionally, as shown in fig. 2c, circles may be drawn with the radius r of the inscribed circle of the autonomous mobile device as radius and several projected position points of the obstacle as centers, so as to obtain a region formed by several circles. The parts of this region that lie outside the area occupied by the obstacle itself are the expansion areas. Fig. 2c illustrates only 3 circles, but this is not limiting. A cost map can then be constructed from the obstacle projection position information and the expansion areas of the obstacles in the Frenet coordinate system. In the cost map, the target projection of the autonomous mobile device should not intersect the area where an obstacle is located, and the center of the target projection should not fall into an expansion area; otherwise the autonomous mobile device would collide with the obstacle.
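A minimal grid sketch of the obstacle layer plus expansion area described above; the cell values and the circle approximation are assumptions made for the example, not the patent's cost-map format:

```python
import math

def build_costmap(obstacle_cells, rows, cols, r_cells):
    """Cost map over an (s, l) grid: 2 = obstacle cell, 1 = expansion cell
    within r_cells (inscribed-circle radius) of an obstacle, 0 = free.
    The projection center must not enter cells valued 1 or 2."""
    grid = [[0] * cols for _ in range(rows)]
    for i, j in obstacle_cells:
        grid[i][j] = 2
    for oi, oj in obstacle_cells:
        for i in range(max(0, oi - r_cells), min(rows, oi + r_cells + 1)):
            for j in range(max(0, oj - r_cells), min(cols, oj + r_cells + 1)):
                if grid[i][j] == 0 and math.hypot(i - oi, j - oj) <= r_cells:
                    grid[i][j] = 1   # inside an expansion circle, outside the obstacle
    return grid

# usage: a single obstacle cell inflated by a 2-cell inscribed-circle radius
cm = build_costmap({(5, 5)}, rows=11, cols=11, r_cells=2)
print(cm[5][3], cm[5][5], cm[5][8])   # 1 (expansion), 2 (obstacle), 0 (free)
```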
Based on the cost map, in step 202 the target projection of the autonomous mobile device in the set coordinate system may be controlled to move around the obstacle projections, according to the obstacle projection position information in the cost map, so as to simulate the autonomous mobile device moving to the destination. During this simulated movement, the target projection position information generated in the cost map can be obtained. If the target projection, when moved to some projection position, overlaps an obstacle in the cost map, or its center falls into an expansion area in the cost map, this indicates that the autonomous mobile device would collide with the obstacle if it moved to that position; that position is therefore not part of the travelable space. Based on this, in step 203 collision detection may be performed between the target projection and the obstacle projections in the cost map, based on the target projection position information generated while the target projection moves to the destination and the obstacle projection position information; and in step 204 the travelable space of the autonomous mobile device around the corresponding obstacles is determined according to the collision detection result.
In this embodiment, a cost map of the area in which the autonomous mobile device is located is established under a set coordinate system, and the travelable space of the autonomous mobile device is explored around the obstacles based on the cost map. This reduces the exploration of obstacle-free blank areas and improves the efficiency of travelable-space exploration.
In this embodiment, in order to further improve the efficiency of the travelable-space search, as shown in figs. 2b and 2d, a plurality of sampling curves (e.g., R1 to R6 in figs. 2b and 2d) may be arranged on the left and right sides of the reference line of the Frenet coordinate system at a set sampling interval. "A plurality" means 2 or more. Figs. 2b and 2d illustrate only 6 sampling curves, but this is not limiting. The sampling curves are parallel curves of the reference line R of the Frenet coordinate system, as shown by the dashed lines R1 to R6 parallel to R in figs. 2b and 2d; they can be understood as a set of trajectory lines distributed along the extension of the road. The distance between any two adjacent sampling curves is equal. The sampling interval can be set flexibly according to the precision requirement and the exploration-efficiency requirement of the travelable space: the higher the precision requirement, the smaller the sampling interval; the higher the efficiency requirement, the larger the sampling interval.
In this embodiment, when the target projection of the autonomous mobile device in the Frenet coordinate system is controlled to move in the cost map, it may be controlled to move forward toward sampling curves along which no obstacles are distributed. In this way the space occupied by obstacles is excluded, and the search of obstacle-free blank areas is reduced.
It is worth mentioning that, for an autonomous mobile device located on a sampling curve that is itself free of obstacles, "moving toward an obstacle-free sampling curve" can be understood as the device moving forward along the sampling curve on which it is currently located. Based on this, an optional implementation of step 202 is: determining, from the plurality of sampling curves and according to the obstacle projection position information in the cost map, the target sampling curves on which no obstacles are distributed; and controlling the target projection of the autonomous mobile device in the Frenet coordinate system to move forward toward a target sampling curve so as to simulate the autonomous mobile device moving to the destination, obtaining the target projection position information generated in the cost map while the target projection moves toward the lateral target line on which the destination is located.
In practical applications, an autonomous mobile device generally moves forward in one of 3 directions: straight ahead, forward to the left, or forward to the right. Accordingly, when exploring the travelable space, the exploration can be restricted to the space adjacent to the autonomous mobile device, which further reduces the exploration workload and improves the exploration efficiency.
Optionally, as shown in fig. 2d, the distribution of the obstacle projections on the slot line T may be obtained from the obstacle projection position information in the cost map. The slot line T is the lateral axis perpendicular to the tangent at the starting point O of the reference line R, i.e. the lateral axis at s = 0. Optionally, the projected lateral coordinates of the obstacles, i.e. their coordinates on the slot line T, may be acquired and compressed into one dimension in the Frenet coordinate system. The slot positions occupied by the obstacle projections can then be obtained from this distribution on T. For the Frenet coordinate system shown in fig. 2d, l ∈ (-f, -e) and l ∈ (-b, -a) are the slots occupied by obstacles. The sampling curves passing through slots occupied by obstacle projections can then be treated as the sampling curves on which obstacles are distributed.
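For illustration, the sketch below compresses obstacle projections onto the slot line T as lateral intervals and splits the sampling curves into obstacle-free and obstacle-occupied ones; the interval and offset values are assumptions made for the example:

```python
def split_sampling_curves(occupied_intervals, curve_offsets):
    """occupied_intervals: (l_min, l_max) spans on the slot line T covered by
    obstacle projections, e.g. (-f, -e) and (-b, -a) in the text.
    curve_offsets: lateral offsets l of the sampling curves R1..Rn.
    Returns (obstacle-free offsets, obstacle-occupied offsets)."""
    def occupied(l):
        return any(lo <= l <= hi for lo, hi in occupied_intervals)
    free = [l for l in curve_offsets if not occupied(l)]
    blocked = [l for l in curve_offsets if occupied(l)]
    return free, blocked

# usage: six sampling curves and two occupied slots on the slot line T
offsets = [-2.5, -1.5, -0.5, 0.5, 1.5, 2.5]
print(split_sampling_curves([(-2.8, -2.0), (-0.8, -0.2)], offsets))
# -> ([-1.5, 0.5, 1.5, 2.5], [-2.5, -0.5])
```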
Further, from the sampling curves on which obstacles are distributed, the boundary sampling curves closest to the target projection of the autonomous mobile device in the Frenet coordinate system are obtained. Optionally, the distances between the target projection and the obstacle-occupied sampling curves may be computed, and the boundary sampling curves closest to the target projection selected accordingly. The boundary sampling curves include: the obstacle-occupied sampling curve on the left side of the target projection that is closest to it, and the obstacle-occupied sampling curve on the right side of the target projection that is closest to it.
After the boundary sampling curves on the left and right sides of the target projection are obtained, the travelable space can be searched within the sampling space bounded by these left and right boundary sampling curves. Accordingly, the sampling curves between the target projection and the boundary sampling curves on which no obstacles are distributed can be taken as the target sampling curves; the target projection of the autonomous mobile device in the Frenet coordinate system is controlled to move forward toward a target sampling curve to simulate the autonomous mobile device moving to the destination. Collision detection between the target projection and the obstacle projections in the cost map can then be performed based on the size of the target projection and the target projection position information generated during the simulated movement. This realizes the search of the travelable space around the obstacles, reduces the search of blank areas without obstacles, and improves the efficiency of the travelable-space search.
In this embodiment, a multi-branch tree iteration may be adopted to search the travelable space of the autonomous mobile device, by simulating the movement of the device to the destination and performing collision detection between the target projection and the obstacle projections while the target projection moves toward the lateral target line on which the destination is located. Optionally, the current position of the target projection is taken as the root node, and a forward heuristic exploration is performed from the root node based on the obstacle-free target sampling curves to determine the subtrees of the root node. Every node in a subtree is an explored drivable position of the autonomous mobile device. In this embodiment the area where the autonomous mobile device is located is regarded as an obstacle topological space, in which the device can be in three driving states: driving straight, detouring to the left, and detouring to the right. Therefore, when searching for the travelable space, 3 driving states of the autonomous mobile device can be explored: moving straight ahead, moving forward to the left, and moving forward to the right. The main process is as follows:
as shown in fig. 2d, the target projection may be moved forward from the current location toward the direction of the first sampling curve by a set distance to simulate the autonomous mobile device moving toward the destination, i.e., to simulate the autonomous mobile device moving one step toward the destination. The set distance may be set autonomously. Alternatively, the setting may be made according to the moving speed, inertia, or the like of the autonomous mobile apparatus. In the process of moving the autonomous mobile equipment to the destination, each forward movement step mainly comprises the following steps: moving forward along the current direction of travel, moving forward left or moving forward right, and thus, the first sampling curve may be a sampling curve of the target sampling curve where the target projection is currently located, or a sampling curve of the target sampling curve located at the left side of the target projection and closest to the target projection distance (e.g., l-d in fig. 2d, i.e., R2), or a sampling curve of the target sampling curve located at the right side of the target projection and closest to the target projection distance (e.g., l-c in fig. 2d, i.e., R3). It should be noted that, for the case where the first sampling curve is the sampling curve where the target projection is currently located, the target projection moves forward from the current position toward the direction of the first sampling curve, which is to be understood as the target projection moving forward along the first sampling curve.
Accordingly, the first projection position of the target projection obtained after the target projection is moved forward from the current position toward the direction of the first sampling curve by the set distance can be used as target projection position information generated by simulating the autonomous mobile device to move toward the destination. Next, collision detection may be performed on the projection of the target and the projection of the obstacle at the first projection position.
Optionally, based on the size of the target projection, the obstacle projection position information in the cost map, and the expansion areas in the cost map, it may be determined whether the target projection moved to the first projection position overlaps any obstacle projection in the cost map and whether its center falls within an expansion area. If neither is the case, it is determined that the target projection moved to the first projection position does not collide with the obstacle; that is, the judgment result is "no" when the target projection at the first projection position does not overlap any obstacle projection in the cost map and its center does not fall within an expansion area.
Accordingly, if either condition holds, it is determined that the target projection moved to the first projection position collides with the obstacle projection. That is, if the target projection at the first projection position overlaps an obstacle projection in the cost map, and/or its center falls within an expansion area, a collision between the obstacle projection and the target projection at the first projection position is determined.
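For illustration, a minimal version of this collision test (projection overlap, or projection center inside an expansion area) could look as follows; approximating both projections by circles is an assumption made for the example:

```python
import math

def collides(center, r_robot, obstacle_circles, r_inflate):
    """True if the target projection (inscribed circle of radius r_robot at
    'center') overlaps an obstacle circle, or its center lies inside the
    obstacle's expansion region (obstacle grown by r_inflate)."""
    cx, cy = center
    for ox, oy, r_obs in obstacle_circles:
        d = math.hypot(cx - ox, cy - oy)
        if d < r_robot + r_obs:        # projections overlap
            return True
        if d < r_obs + r_inflate:      # center falls inside the expansion area
            return True
    return False

# usage: obstacle circle of radius 0.5 at (3, 0); robot radius 0.4; inflation 0.4
print(collides((3.0, 1.5), 0.4, [(3.0, 0.0, 0.5)], 0.4))   # False (clear)
print(collides((3.0, 0.8), 0.4, [(3.0, 0.0, 0.5)], 0.4))   # True (0.8 < 0.9)
```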
Further, in the event that the movement of the target projection to the first projection location does not collide with the projection of the obstacle, the forward heuristic search may be continued based on the first projection location to determine a sub-tree of the root node.
Accordingly, in the case where the target projection moved to the first projection position does not collide with the obstacle projection, one child node of the root node may be determined based on the first projection position. Optionally, it is determined whether the first projection position lies on the first sampling curve; if so, the first projection position is taken as a child node of the root node. If not, the second projection position on the first sampling curve that has the same longitudinal coordinate as the first projection position is taken as a child node of the root node. Further, the distance D1 for the target projection to move from the first projection position to the second projection position may be computed, and the target projection is controlled to move laterally by D1 from the first projection position onto the first sampling curve.
Then, based on the explored child nodes of the root node, the forward heuristic search continues to determine the lower-level child nodes of the explored child nodes, until the explored child nodes reach the lateral target line on which the destination is located. The child nodes that reach the lateral target line are the leaf nodes of the search tree. The lateral target line of the destination is the line of constant longitudinal displacement through the destination in the Frenet coordinate system; for example, if the destination is located at s = X in Frenet coordinates, then s = X is the lateral target line of the destination.
The forward heuristic search process is exemplified below by taking any of the explored child nodes as an example. Wherein, for convenience of description, any child node is defined as a first child node.
For an explored first child node that has not yet reached the lateral target line of the destination, the target projection of the autonomous mobile device is moved forward by the set distance from the position corresponding to the first child node toward the second sampling curve, to obtain the third projection position of the target projection. The second sampling curve is the sampling curve on which the first child node is located; or the obstacle-free target sampling curve on the left side of the first child node that is closest to it; or the target sampling curve on the right side of the first child node that is closest to it. It is worth mentioning that, when the second sampling curve is the sampling curve on which the first child node is located, moving the target projection forward toward the second sampling curve is to be understood as moving it forward along that curve.
Further, collision detection can be performed on the target projection moved to the third projection position and the projection of the obstacle according to the size of the target projection, the projection position information of the obstacle in the cost map, and the expansion area in the cost map. For a specific implementation of performing collision detection on the target projection and the projection of the obstacle at the third projection position, reference may be made to the above-mentioned contents related to performing collision detection on the target projection and the projection of the obstacle at the first projection position, and details are not described here again.
If the target projection at the third projection position does not collide with the obstacle projection, a lower-level child node of the first child node is determined based on the third projection position. Optionally, the third projection position itself may be taken as the lower-level child node of the first child node. Alternatively, it may be determined whether the third projection position lies on the second sampling curve; if so, the third projection position is taken as a child node of the first child node. If not, the position on the second sampling curve that has the same longitudinal coordinate as the third projection position is taken as a child node of the first child node. Further, the lateral distance D2 for the target projection to move from the third projection position to that position on the second sampling curve may be computed, and the target projection is controlled to move laterally by D2 from the third projection position onto the second sampling curve.
Correspondingly, if the target projection at the third projection position collides with the obstacle projection, the target projection is moved by a set distance in the lateral direction of the first child node to determine the moved fourth projection position. Here, moving in the lateral direction of the first child node means moving a set distance from the first child node along the l-axis; optionally, the target projection may be moved from the first child node to the left and/or right along the l-axis by a set distance.
Further, collision detection can be performed on the target projection at the fourth projection position and the projection of the obstacle according to the size of the target projection, the projection position information of the obstacle in the cost map and the expansion area in the cost map; for a specific implementation of performing collision detection on the projection of the target and the projection of the obstacle at the fourth projection position, reference may be made to the above-mentioned contents related to performing collision detection on the projection of the target and the projection of the obstacle at the first projection position, and details are not described here again.
Further, if the target projection at the fourth projection position does not collide with the obstacle projection, a lower-level child node of the first child node is determined based on the fourth projection position. Optionally, the fourth projection position itself may be taken as the lower-level child node of the first child node. Alternatively, it may be determined whether the fourth projection position lies on the second sampling curve; if so, the fourth projection position is taken as a child node of the first child node. If not, the position on the second sampling curve that has the same longitudinal coordinate as the fourth projection position is taken as a child node of the first child node. Further, the lateral distance D3 for the target projection to move from the fourth projection position to that position on the second sampling curve may be computed, and the target projection is controlled to move laterally by D3 from the fourth projection position onto the second sampling curve.
After obtaining a child node of the first child node, the forward heuristic search may continue to explore new underlying child nodes based on the newly explored child node.
Correspondingly, if the projection of the target at the fourth projection position collides with the projection of the obstacle, the first child node is taken as a leaf node of the search tree.
Forward heuristic search is performed on each explored child node in the above manner until all leaf nodes of the search tree have been explored, so that all subtrees of the root node are obtained. The number of subtrees depends on the particular environment in which the autonomous mobile device is located.
Further, the travelable space of the autonomous mobile device around the corresponding obstacle can be determined according to the search tree formed by the root node and the subtrees of the root node. Optionally, the search tree may be traversed to obtain all travelable spaces of the autonomous mobile device around the corresponding obstacle.
Furthermore, the blank area without obstacles and the travelable space of the autonomous mobile device around the corresponding obstacle can be combined to obtain all travelable spaces of the autonomous mobile device from the current position to the destination.
The travelable space of the autonomous mobile device covers all routes that the autonomous mobile device can travel. To realize navigation of the autonomous mobile device, path planning may be performed for the autonomous mobile device according to its travelable space, so as to obtain a navigation path of the autonomous mobile device. Further, the navigation path may be output so that the autonomous mobile device moves along it. Optionally, the execution body of the travelable space planning method may be the autonomous mobile device itself; after planning the navigation path, the autonomous mobile device may control itself to move along the navigation path. If the execution body of the travelable space planning method is a server-side device providing the navigation service, the server-side device may send the navigation path to the autonomous mobile device; accordingly, the autonomous mobile device receives the navigation path and moves along it.
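As a hedged illustration of how a navigation path might be chosen from the planned travelable spaces, the small sketch below picks the candidate with the smallest accumulated lateral offset. The embodiment does not prescribe any particular selection criterion; both the criterion and the choose_navigation_path name are assumptions made purely for this example.

```python
def choose_navigation_path(travelable_spaces):
    """Pick one navigation path from the searched travelable spaces.

    Each travelable space is given here as a list of (s, l) nodes from the
    root to a leaf.  Minimising the accumulated lateral offset is used purely
    for illustration; the embodiment does not fix a selection criterion.
    """
    return min(travelable_spaces, key=lambda path: sum(abs(l) for _, l in path))


# Two hypothetical corridors around an obstacle: the planner keeps the one
# that stays closer to the reference line.
candidates = [[(0, 0.0), (2, 1.5), (4, 1.5), (6, 0.0)],
              [(0, 0.0), (2, -3.0), (4, -3.0), (6, 0.0)]]
print(choose_navigation_path(candidates))  # [(0, 0.0), (2, 1.5), (4, 1.5), (6, 0.0)]
```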
To more clearly illustrate the above procedure of searching the travelable space, the travelable space planning method is exemplarily described below with reference to a specific embodiment. As shown in fig. 2e, the method mainly comprises:
S1: establishing a Frenet coordinate system by taking the center line of the road between the current position of the autonomous mobile device and the destination as a reference line R, and taking the point on the reference line R that is closest to the autonomous mobile device as the starting point of the reference line R.
S2: projecting the obstacle in the area where the autonomous mobile device is located into a Frenet coordinate system to obtain a projection of the obstacle.
S3: and determining the expansion area of the projection of the obstacle in the Frenet coordinate system according to the size of the target projection and the projection position information of the obstacle.
S4: and constructing a cost map according to the projection position information of the obstacle and the expansion area of the obstacle in the Frenet coordinate system.
S5: respectively arranging a plurality of sampling curves on the left side and the right side of a reference line R of a Frenet coordinate system according to a set sampling interval; the multiple sampling curves are parallel curves of the reference line R of the Frenet coordinate system.
S6: acquiring the distribution of the projections of the obstacles on the slot line T according to the projection position information of the obstacles in the cost map; the slot line T is the transverse axis perpendicular to the tangent at the starting point of the reference line R, i.e., the slot line T is the l-axis at s = 0.
S7: acquiring the slots occupied by the obstacles according to the distribution of the projections of the obstacles on the slot line.
S8: taking the sampling curves corresponding to the slots occupied by the obstacles as the sampling curves on which obstacles are distributed.
S9: acquiring, from the sampling curves on which obstacles are distributed, the boundary sampling curve closest to the target projection of the autonomous mobile device in the Frenet coordinate system.
S10: acquiring the sampling curves with no obstacle distribution between the target projection and the boundary sampling curve as the target sampling curves.
Steps S1 to S10 above describe the process of constructing the cost map and the target sampling curves. The process of searching the travelable space of the autonomous mobile device based on the cost map and the target sampling curves is described below with reference to fig. 2f. As shown in fig. 2f, the main steps include:
S11: as shown in fig. 2g, the current position of the target projection is taken as the root node A. The current position of the target projection is the position of the target projection when the travelable space search is started, i.e. the initial projection position.
Next, a forward heuristic exploration is performed starting from the root node to explore the subtrees of the root node. The method mainly comprises the following steps:
S12: as shown in fig. 2d, moving the target projection forward from the current position toward the first sampling curve by a set distance to obtain a first projection position of the target projection. For the first sampling curve, reference may be made to the relevant contents of the above embodiments, which are not repeated here.
S13: performing collision detection on the projection of the obstacle and the target projection moved to the first projection position according to the size of the target projection, the projection position information of the obstacle in the cost map and the expansion area in the cost map. If the target projection at the first projection position collides with the projection of the obstacle, step S14a is executed; if the target projection at the first projection position does not collide with the projection of the obstacle, step S14b is executed.
S14a: moving the target projection forward from the current position by the set distance again, and returning to step S12.
It should be noted that, when the process returns to step S12, the first sampling curve is chosen from among the not-yet-searched ones of the following target sampling curves: the sampling curve on which the target projection is currently located, the sampling curve on the left side of the target projection and closest to it, and the sampling curve on the right side of the target projection and closest to it. That is, when the process returns to step S12, the first sampling curve is different from the sampling curve used in the previous search.
S14b: judging whether the first projection position is located on the first sampling curve. If yes, step S15 is executed; if not, step S16 is executed.
S15: taking the first projected position as a child node of the root node, such as node A1 or A2 in FIG. 2 g; and proceeds to step S17.
S16: taking a second projected position on the first sampling curve having the same longitudinal coordinate as the first projected position as a child node of the root node, such as node a1 or a2 in fig. 2 g; and proceeds to step S17.
S17: detecting whether the position corresponding to the first sub-node explored currently reaches a transverse target line where the destination is located; if yes, go to step S27; if not, step S18 is executed.
S18: and moving the target projection forward from the position corresponding to the first sub-node to the direction of the second sampling curve by a set distance to obtain a third projection position of the target projection aiming at the currently explored first sub-node. For the description of the second sampling curve, reference may be made to the related contents of the foregoing embodiments, and details are not repeated herein.
S19: and performing collision detection on the projection of the obstacle and the projection of the target at the third projection position according to the size of the target projection, the projection position information of the obstacle in the cost map and the expansion area in the cost map. If the projection of the target at the third projection position collides with the projection of the obstacle, step S20 is executed; if the projection of the target at the third projection position does not collide with the projection of the obstacle, step S24 is executed.
S20: moving the target projection from the first child node to the left and to the right in the lateral direction by a set distance, respectively, to determine a fourth projection position and a fifth projection position after the movement.
S21: performing collision detection between the projection of the obstacle and the target projections at the fourth projection position and the fifth projection position, respectively, according to the size of the target projection, the position information of the obstacle in the cost map and the expansion area in the cost map. If the target projections at both the fourth projection position and the fifth projection position collide with the projection of the obstacle, the first child node is taken as a leaf node of the search tree and step S27 is executed; if the target projection at the fourth projection position does not collide with the projection of the obstacle, step S22 is executed; if the target projection at the fifth projection position does not collide with the projection of the obstacle, step S23 is executed.
S22: taking the fourth projection position as a lower-layer node of the first child node, taking the fourth projection position as a new first child node, and returning to step S17.
S23: taking the fifth projection position as a lower-layer node of the first child node, taking the fifth projection position as a new first child node, and returning to step S17.
S24: and judging whether the third projection position is positioned on the second sampling curve. If yes, go to step S25; if the determination result is negative, step S26 is executed.
S25: and taking the third projection position as a lower node of the first child node and taking the third projection position as a new first child node, and returning to execute the step S17.
S26: and taking the sixth position on the second sampling curve with the same longitudinal coordinate as the third projection position as the lower node of the first child node, taking the sixth position as a new first child node, and returning to execute the step S17.
S27: and ending the exploration of the current subtree to form a subtree of the root node.
S28: judging whether all subtrees of the root node have been explored; if so, step S29 is executed; if not, the process returns to step S12.
It should be noted that, when the process returns to step S12, the first sampling curve is chosen from among the unexplored ones of the following target sampling curves: the sampling curve on which the target projection is currently located, the sampling curve on the left side of the target projection that is closest to it, and the sampling curve on the right side of the target projection that is closest to it. For example, as shown in fig. 2d, when step S12 is executed for the first time, the target projection is moved forward by the set distance toward the target sampling curve on its right side that is closest to it (i.e., right-forward); when step S12 is executed again, the target projection may be moved by the set distance along the target sampling curve on which its initial projection position lies (i.e., straight forward); and when step S12 is executed once more, the target projection is moved forward by the set distance toward the target sampling curve on its left side that is closest to it (i.e., left-forward). The embodiment of the present application does not limit the order of the directions in which the target projection moves forward; the above is only an exemplary description and is not limiting.
It is also worth mentioning that, when the process returns to step S12 after S28 is executed, the current position of the target projection in step S12 is the position where the target projection was located when the travelable space search was started, i.e., the initial projection position. For example, assuming that the initial projection position of the target projection when the travelable space search is started is (s = 0, l = 100), then when step S12 is executed after S28, the current position of the target projection still refers to the initial projection position (s = 0, l = 100).
S29: and outputting the search tree formed by the root node and the subtrees of the root node.
S30: traversing the search tree to obtain the travelable spaces of the autonomous mobile device around the corresponding obstacles, i.e. traversing the search tree to obtain all feasible solutions.
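The outer loop S11-S30 can be sketched as follows, reusing the Node and expand helpers from the earlier sketch. The ordering of the first sampling curves, the visited-set pruning and the lateral limit are simplifications added here so the sketch terminates; they are assumptions, not steps of the embodiment, which instead retries step S12 with a different first sampling curve when the first move collides.

```python
def search_travelable_spaces(root, target_curves, target_s, step, collide):
    """Sketch of steps S11-S30, built on the Node/expand helpers above.

    root: the initial projection position (S11).  target_curves: lateral
    offsets of the obstacle-free target sampling curves.  target_s: the
    longitudinal coordinate of the lateral target line.  collide(s, l): the
    cost-map collision test.
    """
    l_limit = max(abs(l) for l in target_curves) + step
    visited = set()

    # S12/S28: one subtree per first sampling curve; here the curve the root
    # is on and the nearest curves on either side (order not fixed by the
    # embodiment; for brevity the root's first move reuses expand()).
    first_curves = sorted(target_curves, key=lambda l: abs(l - root.l))[:3]

    for first_l in first_curves:
        frontier = expand(root, first_l, step, collide)        # S12-S16
        while frontier:                                        # forward search
            node = frontier.pop()
            key = (round(node.s, 3), round(node.l, 3))
            if key in visited or abs(node.l) > l_limit:
                continue
            visited.add(key)
            if node.s >= target_s:                             # S17/S27
                node.is_leaf = True
                continue
            # S18-S26: the second sampling curve is taken here as the target
            # sampling curve closest to the node (it may also be a neighbour).
            second_l = min(target_curves, key=lambda l: abs(l - node.l))
            frontier.extend(expand(node, second_l, step, collide))

    # S29-S30: traverse the search tree; every root-to-leaf branch is one
    # travelable space of the autonomous mobile device around the obstacles.
    spaces, stack = [], [(root, [(root.s, root.l)])]
    while stack:
        node, path = stack.pop()
        if not node.children:
            spaces.append(path)
        for child in node.children:
            stack.append((child, path + [(child.s, child.l)]))
    return spaces
```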
With the travelable space planning method provided by the embodiment of the application, travelable space planning was performed for an autonomous mobile device detouring around pedestrians and passing between road piles, respectively; the resulting planning effects are shown in fig. 2h to 2j. Fig. 2h and 2i show the detour effect, where the dashed lines represent the planned travelable space and the squares with arrows represent pedestrians, i.e. obstacles. Fig. 2j shows the effect of the autonomous mobile device passing between road piles; the dashed lines represent the planned travelable space, the black rectangles represent road piles 1-4, and the grey squares represent other vehicles. The road piles and the other vehicles are obstacles to the autonomous mobile device.
Further, in fig. 2h to 2j, a navigation path of the autonomous mobile device may be planned according to the planned travelable space; the obtained navigation path is shown as a gray curved surface in fig. 2h to 2j. Optionally, the moving speeds of the pedestrians and the autonomous mobile device may also be taken into account when planning the navigation path.
In this embodiment, the initial projection position of the autonomous mobile device is used as the root node, the sampling curves and the lateral target line are used as references, and a full tree search is performed over a plurality of slots; through iteration, all feasible solutions, i.e. all travelable spaces of the autonomous mobile device, can be obtained at one time within a short time. By contrast, a Voronoi diagram cannot directly output a result: it can only output a grid map, an additional search method is still needed to output a trajectory, and the Voronoi diagram alone cannot complete the computation. Dynamic programming (DP) and lattice algorithms require discrete sampling and then compute one trajectory at a time, which is computationally inefficient. Moreover, none of the Voronoi diagram, dynamic programming and lattice algorithms can compute the complete travelable space in a short time, so the travelable space remains unclear. When the total travelable space is unknown, it is difficult to guarantee the effectiveness of the search; a complicated search may eventually select a feasible space with dense obstacles, or the result after optimization may deviate from the expected direction of travel.
It should be noted that the execution subjects of the steps of the methods provided in the above embodiments may be the same device, or different devices may be used as the execution subjects of the methods. For example, the execution subjects of steps 201 and 202 may be device a; for another example, the execution subject of step 201 may be device a, and the execution subject of step 202 may be device B; and so on.
In addition, in some of the flows described in the above embodiments and the drawings, a plurality of operations are included in a specific order, but it should be clearly understood that the operations may be executed out of the order presented herein or in parallel, and the sequence numbers of the operations, such as 201, 202, etc., are merely used for distinguishing different operations, and the sequence numbers do not represent any execution order per se. Additionally, the flows may include more or fewer operations, and the operations may be performed sequentially or in parallel.
Accordingly, embodiments of the present application also provide a computer-readable storage medium storing a computer program, which, when executed by one or more processors, causes the one or more processors to execute the steps in the travelable space planning method.
The method for planning the travelable space provided by the embodiment of the application can be suitable for autonomous mobile equipment and also can be suitable for computer equipment for providing navigation service. Correspondingly, the embodiment of the application also provides the autonomous mobile equipment and the computer equipment.
Fig. 3 is a block diagram of a hardware structure of an autonomous mobile device according to an embodiment of the present disclosure. As shown in fig. 3, the autonomous mobile device 30 includes: a machine body 30a. A computing device 30a2 is disposed within the machine body 30a. The computing device may include: a processor 30b and a memory 30c. The processor 30b and the memory 30c may be disposed inside the machine body 30a, or disposed on the surface of the machine body 30a.
The machine body 30a is the actuator of the autonomous mobile device 30 and can perform, in a given environment, the operations designated by the one or more processors 30b. To some extent, the machine body 30a represents the appearance of the autonomous mobile device 30. In this embodiment, the appearance of the autonomous mobile device 30 is not limited; the autonomous mobile device may be, for example, a robot, an unmanned vehicle, or an unmanned aerial vehicle.
The machine body 30a may further be provided with a sensor 30d for collecting environmental information of the autonomous moving apparatus 30 or status information of the autonomous moving apparatus 30. Alternatively, the sensor 30d may include: a vision sensor, an inertial sensor, an acceleration sensor, a velocity sensor, and the like, but is not limited thereto.
It is worth mentioning that some basic components of the autonomous mobile device 30, such as a driving component, an odometer, a power supply component, an audio component, etc., are also provided on the machine body 30a. Optionally, the driving component may include driving wheels, a driving motor, universal wheels, and the like. The basic components included in different autonomous mobile devices, and the structures of these basic components, may differ; the embodiments of the present application only give some examples.
The memory 30c is used primarily for storing computer programs that may be executed by the one or more processors 30b to cause the one or more processors 30b to control the autonomous mobile device 30 to perform corresponding functions, perform corresponding actions, or tasks. In addition to storing computer programs, the one or more memories 30c may also be configured to store other various data to support operations on the autonomous mobile device 30. Examples of such data include instructions for any application or method operating on autonomous mobile device 30.
One or more processors 30b, which may be considered control systems for the autonomous mobile devices 30, may be coupled to the one or more memories 30c for executing computer programs stored in the one or more memories 30c to control the autonomous mobile devices 30 to perform corresponding functions, perform corresponding actions, or tasks. It should be noted that, when the autonomous mobile device 30 is in different scenes, the functions, actions or tasks required to be performed by the autonomous mobile device may be different; accordingly, the computer programs stored in the one or more memories 30c may vary, and execution of the different computer programs by the one or more processors 30b may control the autonomous mobile device 30 to perform different functions, perform different actions, or tasks.
In the present embodiment, the processor 30b is mainly configured to: obtaining a cost map of an area where the autonomous mobile equipment is located under a set coordinate system; controlling the target projection of the autonomous mobile equipment in a set coordinate system to move around the projection of the obstacle according to the obstacle projection position information of the obstacle in the cost map so as to obtain target projection position information generated in the cost map in the process that the target projection moves to the destination; collision detection is carried out on the projection of the target and the projection of the obstacle based on the projection position information of the target and the projection position information of the obstacle; and determining the travelable space of the autonomous mobile equipment around the corresponding obstacle according to the collision detection result.
Optionally, the set coordinate system is a Frenet coordinate system. Accordingly, the processor 30b is further configured to: establishing the Frenet coordinate system by taking the center line of the road between the current position of the autonomous mobile device and the destination as a reference line, and taking the point on the reference line that is closest to the autonomous mobile device as the starting point of the reference line.
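As a rough illustration of how such a Frenet frame could be built, the sketch below treats the reference line as a polyline of Cartesian waypoints along the road center line, uses the waypoint closest to the autonomous mobile device as the start of the reference line, and returns the (s, l) coordinates of a query point by a coarse waypoint-based projection. The function and its arguments are assumptions for this example only; the embodiment does not specify the projection details.

```python
import math


def to_frenet(point, reference_line, device_position):
    """Project a Cartesian point into a Frenet frame built as described above.

    reference_line: list of (x, y) waypoints along the road centre line.
    The waypoint closest to device_position is used as the start of the
    reference line (s = 0).  Returns (s, l): arc length along the reference
    line and signed lateral offset.
    """
    # Start of the reference line: waypoint closest to the autonomous device.
    start = min(range(len(reference_line)),
                key=lambda i: math.dist(reference_line[i], device_position))

    # Waypoint closest to the query point, searched from the start onwards.
    idx = min(range(start, len(reference_line)),
              key=lambda i: math.dist(reference_line[i], point))

    # s: accumulated arc length from the start waypoint to that waypoint.
    s = sum(math.dist(reference_line[i], reference_line[i + 1])
            for i in range(start, idx))

    # l: signed lateral offset, sign taken from the cross product with the
    # local tangent direction of the reference line.
    j = min(idx + 1, len(reference_line) - 1)
    tx = reference_line[j][0] - reference_line[idx][0]
    ty = reference_line[j][1] - reference_line[idx][1]
    dx = point[0] - reference_line[idx][0]
    dy = point[1] - reference_line[idx][1]
    l = math.copysign(math.dist(reference_line[idx], point), tx * dy - ty * dx)
    return s, l
```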
Accordingly, when acquiring the cost map of the area where the autonomous mobile device is located in the set coordinate system, the processor 30b is specifically configured to: projecting the obstacle in the area where the autonomous mobile device is located into a Frenet coordinate system to obtain a projection of the obstacle; determining the expansion area of the obstacle in a Frenet coordinate system according to the size of the target projection and the projection position information of the obstacle; and constructing a cost map according to the projection position information of the obstacle and the expansion area of the obstacle in the Frenet coordinate system.
In some embodiments, the set coordinate system is a Frenet coordinate system. The processor 30b is further configured to: respectively arranging a plurality of sampling curves on the left side and the right side of a reference line of a Frenet coordinate system according to a set sampling interval; the plurality of sampling curves are parallel curves of the reference line.
Accordingly, the processor 30b, when controlling the target projection of the autonomous mobile apparatus in the set coordinate system to move around the projection of the obstacle, is specifically configured to: determining sampling curves distributed with obstacles in the cost map from the plurality of sampling curves according to the projection position information of the obstacles; acquiring a boundary sampling curve closest to the autonomous mobile equipment from the sampling curves distributed with the obstacles; acquiring a target sampling curve without obstacle distribution between the autonomous mobile equipment and the boundary sampling curve; and controlling the target projection to move forward towards the direction of the target sampling curve so as to obtain target projection position information generated in the cost map when the target projection moves towards the destination.
Optionally, when determining the target sampling curve with no obstacle distribution in the cost map, the processor 30b is further configured to: acquiring the distribution of the obstacles on the slot line according to the projection position information of the obstacles, the slot line being a transverse axis perpendicular to the tangent at the starting point of the reference line; acquiring the slots occupied by the obstacles according to the distribution of the obstacles on the slot line; and taking the sampling curves of the slots occupied by the obstacles as the sampling curves on which obstacles are distributed.
Further, when determining the sampling curves on which obstacles are distributed in the cost map, the processor 30b is specifically configured to: acquiring the distribution of the obstacles on the slot line according to the position information of the obstacles in the cost map, the slot line being a transverse axis perpendicular to the tangent at the starting point of the reference line; acquiring the slots occupied by the obstacles according to the distribution of the obstacles on the slot line; and taking the sampling curves corresponding to the slots occupied by the obstacles as the sampling curves on which obstacles are distributed.
Optionally, when the processor 30b controls the target projection to move towards the direction of the target sampling curve, it is specifically configured to: and moving the target projection forward from the current position to the direction of the first sampling curve by a set distance to obtain a first projection position of the target projection in the cost map.
The first sampling curve is a sampling curve where the target projection is located currently in the target sampling curve, or is a sampling curve which is located on the left side of the target projection and is closest to the target projection distance in the target sampling curve, or is a sampling curve which is located on the right side of the target projection and is closest to the target projection distance in the target sampling curve.
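The three candidate first sampling curves described above can be enumerated with a small helper like the one below; the sign convention for left and right and the helper's name are assumptions of this sketch, not part of the embodiment.

```python
def first_curve_candidates(current_l, target_curves):
    """Candidate first sampling curves for one forward step.

    Returns the target sampling curve the target projection is currently on
    (if any), and the nearest target sampling curves on either side of it.
    The embodiment does not fix the order in which they are tried.
    """
    on_curve = [l for l in target_curves if l == current_l]
    above = [l for l in target_curves if l > current_l]
    below = [l for l in target_curves if l < current_l]
    candidates = on_curve[:1]
    if above:
        candidates.append(min(above))   # nearest curve on one side
    if below:
        candidates.append(max(below))   # nearest curve on the other side
    return candidates


print(first_curve_candidates(0.0, [-1.0, 0.0, 1.0, 2.0]))  # [0.0, 1.0, -1.0]
```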
Accordingly, the processor 30b, when performing collision detection on the target projection and the projection of the obstacle, is specifically configured to: judging whether the projection of the target moved to the first projection position overlaps with the projection of the obstacle or not and judging whether the center of the projection of the target moved to the first projection position falls into the expansion area or not according to the size of the projection of the target, the projection position information of the obstacle and the expansion area in the cost map; and if the judgment result is negative, determining that the projection of the target at the first projection position does not collide with the projection of the obstacle.
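The two collision criteria just described (overlap between the target projection and an obstacle projection, and the center of the target projection falling into an expansion area) can be sketched as follows, with obstacles and expansion areas represented as axis-aligned (s, l) boxes as in the earlier cost-map sketch. A closure over this function can play the role of the collide(s, l) callback used in the other sketches; all names here are illustrative.

```python
def collides(target_center, target_size, obstacle_boxes, inflated_boxes):
    """Collision test for the target projection at a new projection position.

    The projection collides if its rectangle overlaps any obstacle projection,
    or if its centre falls inside any expansion area.  Boxes are
    (s_min, s_max, l_min, l_max) in the Frenet frame.
    """
    s, l = target_center
    length, width = target_size
    half_s, half_l = length / 2, width / 2

    # Overlap between the target projection and any obstacle projection.
    for (s0, s1, l0, l1) in obstacle_boxes:
        if s + half_s > s0 and s - half_s < s1 and \
           l + half_l > l0 and l - half_l < l1:
            return True

    # Centre of the target projection inside any expansion area.
    for (s0, s1, l0, l1) in inflated_boxes:
        if s0 <= s <= s1 and l0 <= l <= l1:
            return True
    return False
```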
In some embodiments, the processor 30b is further configured to: the current position of the autonomous mobile device is taken as a root node. Accordingly, the processor 30b, when determining the travelable space of the autonomous mobile device around the respective obstacle, is specifically configured to: under the condition that the projection of the target projection at the first projection position does not collide with the projection of the obstacle, performing forward heuristic search based on the first projection position to determine a subtree of a root node; and determining the travelable space of the autonomous mobile equipment around the corresponding obstacle according to the search tree formed by the root node and the subtree of the root node.
Further, when determining the subtree of the root node, the processor 30b is specifically configured to: judging whether the first projection position is positioned on the first sampling curve or not; if the judgment result is yes, taking the first projection position as a child node of the root node; if the judgment result is negative, taking a second projection position on the first sampling curve, which has the same longitudinal coordinate with the first projection position, as a child node of the root node; continuing forward heuristic search based on the explored child nodes to determine lower-layer child nodes of the explored child nodes until the explored child nodes reach a transverse target line where the destination is located; and taking the child nodes reaching the horizontal target line as leaf nodes of the search tree.
Optionally, when determining the lower-layer child node of the explored child node, the processor 30b is specifically configured to: for the searched first sub-node, under the condition that the first sub-node does not reach the transverse target line, moving the target projection forward from the position corresponding to the first sub-node to the direction of the second sampling curve by a set distance so as to simulate the autonomous mobile equipment to move to the destination, and obtaining a third projection position of the target projection; performing collision detection on the target projection at the third projection position and the projection of the obstacle according to the size of the target projection, the position information of the obstacle in the cost map and the expansion area in the cost map; and if the projection of the target at the third projection position does not collide with the projection of the obstacle, determining a lower-layer child node of the first child node based on the third projection position.
Correspondingly, if the target projection at the third projection position collides with the projection of the obstacle, the target projection is moved by a set distance in the lateral direction from the first child node to determine a fourth projection position after the movement; collision detection is performed on the projection of the obstacle and the target projection at the fourth projection position according to the size of the target projection, the projection position information of the obstacle and the expansion area in the cost map; and if the target projection at the fourth projection position does not collide with the projection of the obstacle, the lower-layer child node of the first child node is determined based on the fourth projection position. Correspondingly, if the target projection at the fourth projection position collides with the projection of the obstacle, the first child node is taken as a leaf node of the search tree.
The second sampling curve is the sampling curve on which the first child node is located; or the sampling curve in the target sampling curves that is located on the left side of the first child node and closest to the first child node; or the sampling curve in the target sampling curves that is located on the right side of the first child node and closest to the first child node.
Optionally, the processor 30b, when determining the travelable space of the autonomous mobile device around the respective obstacle, is specifically configured to: and traversing the search tree to obtain a travelable space of the autonomous mobile equipment around the corresponding obstacle.
Optionally, the processor 30b is further configured to: and planning a path of the autonomous mobile equipment according to the travelable space of the autonomous mobile equipment around the corresponding obstacle to obtain the navigation path of the autonomous mobile equipment.
The autonomous mobile device provided by the embodiment can establish a cost map of an area space where the autonomous mobile device is located in a set coordinate system by using the set coordinate system; and the travelable space of the autonomous mobile equipment is explored around the obstacle based on the cost map, so that the exploration of blank areas is reduced, and the exploration efficiency of the travelable space is improved.
Fig. 4 is a schematic structural diagram of a computing device according to an embodiment of the present application. As shown in fig. 4, the computing device includes: a memory 40a and a processor 40b.
In the present embodiment, the memory 40a is used for storing a computer program.
The processor 40b is coupled to the memory 40a for executing a computer program for: obtaining a cost map of an area where the autonomous mobile equipment is located under a set coordinate system; controlling the target projection of the autonomous mobile equipment in a set coordinate system to move around the projection of the obstacle according to the obstacle projection position information of the obstacle in the cost map so as to obtain target projection position information generated in the cost map in the process that the target projection moves to the destination; collision detection is carried out on the projection of the target and the projection of the obstacle based on the target projection position information and the obstacle projection position information; and determining the travelable space of the autonomous mobile equipment around the corresponding obstacle according to the collision detection result.
Optionally, the set coordinate system is a Frenet coordinate system. Accordingly, the processor 40b is further configured to: establishing the Frenet coordinate system by taking the center line of the road between the current position of the autonomous mobile device and the destination as a reference line, and taking the point on the reference line that is closest to the autonomous mobile device as the starting point of the reference line.
Accordingly, when acquiring the cost map of the area where the autonomous mobile device is located in the set coordinate system, the processor 40b is specifically configured to: projecting the obstacle in the area where the autonomous mobile device is located into a Frenet coordinate system to obtain a projection of the obstacle; determining the expansion area of the obstacle in a Frenet coordinate system according to the size of the target projection and the projection position information of the obstacle; and constructing a cost map according to the projection position information of the obstacle and the expansion area of the obstacle in the Frenet coordinate system.
In some embodiments, the set coordinate system is a Frenet coordinate system. The processor 40b is further configured to: respectively arranging a plurality of sampling curves on the left side and the right side of a reference line of a Frenet coordinate system according to a set sampling interval; the plurality of sampling curves are parallel curves of the reference line.
Accordingly, the processor 40b, when controlling the target projection of the autonomous mobile apparatus in the set coordinate system to move around the projection of the obstacle, is specifically configured to: determining sampling curves distributed with obstacles in the cost map from the plurality of sampling curves according to the projection position information of the obstacles; acquiring a boundary sampling curve closest to the autonomous mobile equipment from the sampling curves distributed with the obstacles; acquiring a target sampling curve without obstacle distribution between the autonomous mobile equipment and the boundary sampling curve; and controlling the target projection to move forward towards the direction of the target sampling curve so as to obtain target projection position information generated in the cost map when the target projection moves towards the destination.
Optionally, when determining the target sampling curve with no obstacle distribution in the cost map, the processor 40b is further configured to: acquiring the distribution of the obstacles on the slot line according to the projection position information of the obstacles, the slot line being a transverse axis perpendicular to the tangent at the starting point of the reference line; acquiring the slots occupied by the obstacles according to the distribution of the obstacles on the slot line; and taking the sampling curves of the slots occupied by the obstacles as the sampling curves on which obstacles are distributed.
Further, when determining the sampling curves on which obstacles are distributed in the cost map, the processor 40b is specifically configured to: acquiring the distribution of the obstacles on the slot line according to the position information of the obstacles in the cost map, the slot line being a transverse axis perpendicular to the tangent at the starting point of the reference line; acquiring the slots occupied by the obstacles according to the distribution of the obstacles on the slot line; and taking the sampling curves corresponding to the slots occupied by the obstacles as the sampling curves on which obstacles are distributed.
Optionally, when the processor 40b controls the target projection to move towards the direction of the target sampling curve, it is specifically configured to: and moving the target projection forward from the current position to the direction of the first sampling curve by a set distance to obtain a first projection position of the target projection in the cost map.
The first sampling curve is a sampling curve where the target projection is located currently in the target sampling curve, or is a sampling curve which is located on the left side of the target projection and is closest to the target projection distance in the target sampling curve, or is a sampling curve which is located on the right side of the target projection and is closest to the target projection distance in the target sampling curve.
Accordingly, the processor 40b, when performing collision detection on the target projection and the projection of the obstacle, is specifically configured to: judging whether the projection of the target moved to the first projection position overlaps with the projection of the obstacle or not and judging whether the center of the projection of the target moved to the first projection position falls into the expansion area or not according to the size of the projection of the target, the projection position information of the obstacle and the expansion area in the cost map; and if the judgment result is negative, determining that the projection of the target at the first projection position does not collide with the projection of the obstacle.
In some embodiments, the processor 40b is further configured to: the current position of the autonomous mobile device is taken as a root node. Accordingly, the processor 40b, when determining the travelable space of the autonomous mobile device around the respective obstacle, is specifically configured to: under the condition that the projection of the target projection at the first projection position does not collide with the projection of the obstacle, performing forward heuristic search based on the first projection position to determine a subtree of a root node; and determining the travelable space of the autonomous mobile equipment around the corresponding obstacle according to the search tree formed by the root node and the subtree of the root node.
Further, when determining the subtree of the root node, the processor 40b is specifically configured to: judging whether the first projection position is positioned on the first sampling curve or not; if the judgment result is yes, taking the first projection position as a child node of the root node; if the judgment result is negative, taking a second projection position on the first sampling curve, which has the same longitudinal coordinate with the first projection position, as a child node of the root node; continuing forward heuristic search based on the explored child nodes to determine lower-layer child nodes of the explored child nodes until the explored child nodes reach a transverse target line where the destination is located; and taking the child nodes reaching the horizontal target line as leaf nodes of the search tree.
Optionally, when determining the lower-layer child node of the explored child node, the processor 40b is specifically configured to: for the searched first sub-node, under the condition that the first sub-node does not reach the transverse target line, moving the target projection forward from the position corresponding to the first sub-node to the direction of the second sampling curve by a set distance so as to simulate the autonomous mobile equipment to move to the destination, and obtaining a third projection position of the target projection; performing collision detection on the target projection at the third projection position and the projection of the obstacle according to the size of the target projection, the position information of the obstacle in the cost map and the expansion area in the cost map; and if the projection of the target at the third projection position does not collide with the projection of the obstacle, determining a lower-layer child node of the first child node based on the third projection position.
Correspondingly, if the target projection at the third projection position collides with the projection of the obstacle, the target projection is moved by a set distance in the lateral direction from the first child node to determine a fourth projection position after the movement; collision detection is performed on the projection of the obstacle and the target projection at the fourth projection position according to the size of the target projection, the projection position information of the obstacle and the expansion area in the cost map; and if the target projection at the fourth projection position does not collide with the projection of the obstacle, the lower-layer child node of the first child node is determined based on the fourth projection position. Correspondingly, if the target projection at the fourth projection position collides with the projection of the obstacle, the first child node is taken as a leaf node of the search tree.
The second sampling curve is the sampling curve on which the first child node is located; or the sampling curve in the target sampling curves that is located on the left side of the first child node and closest to the first child node; or the sampling curve in the target sampling curves that is located on the right side of the first child node and closest to the first child node.
Optionally, the processor 40b, when determining the travelable space of the autonomous mobile device around the respective obstacle, is specifically configured to: and traversing the search tree to obtain a travelable space of the autonomous mobile equipment around the corresponding obstacle.
Optionally, the processor 40b is further configured to: and planning a path of the autonomous mobile equipment according to the travelable space of the autonomous mobile equipment around the corresponding obstacle to obtain the navigation path of the autonomous mobile equipment.
In some optional embodiments, as shown in fig. 4, the computer device may further include optional components such as a communication component 40c, a power component 40d, a display component 40e, and an audio component 40f. Only some components are schematically shown in fig. 4, which does not mean that the computer device must include all of the components shown in fig. 4, nor that the computer device can include only the components shown in fig. 4.
The computing device provided by the embodiment can establish a cost map of the area space where the autonomous mobile device is located in the set coordinate system by using the set coordinate system; and the travelable space of the autonomous mobile equipment is explored around the obstacle based on the cost map, so that the exploration of blank areas is reduced, and the exploration efficiency of the travelable space is improved.
In embodiments of the present application, the memory is used to store computer programs and may be configured to store other various data to support operations on the device on which it is located. Wherein the processor may execute a computer program stored in the memory to implement the corresponding control logic. The memory may be implemented by any type or combination of volatile or non-volatile memory devices, such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disks.
In the embodiments of the present application, the processor may be any hardware processing device that can execute the above described method logic. Alternatively, the processor may be a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), or a Micro Controller Unit (MCU); programmable devices such as Field-Programmable Gate Arrays (FPGAs), Programmable Array Logic devices (PALs), Generic Array Logic devices (GALs), Complex Programmable Logic Devices (CPLDs), etc. may also be used; or Advanced RISC Machine (ARM) processors, or Systems on Chip (SoC), etc., but is not limited thereto.
In embodiments of the present application, the communication component is configured to facilitate wired or wireless communication between the device in which it is located and other devices. The device in which the communication component is located can access a wireless network based on a communication standard, such as WiFi, 2G or 3G, 4G, 5G or a combination thereof. In an exemplary embodiment, the communication component receives a broadcast signal or broadcast related information from an external broadcast management system via a broadcast channel. In an exemplary embodiment, the communication component may also be implemented based on Near Field Communication (NFC) technology, Radio Frequency Identification (RFID) technology, infrared data association (IrDA) technology, Ultra Wideband (UWB) technology, Bluetooth (BT) technology, or other technologies.
In the embodiment of the present application, the display assembly may include a Liquid Crystal Display (LCD) and a Touch Panel (TP). If the display assembly includes a touch panel, the display assembly may be implemented as a touch screen to receive input signals from a user. The touch panel includes one or more touch sensors to sense touch, slide, and gestures on the touch panel. The touch sensor may not only sense the boundary of a touch or slide action, but also detect the duration and pressure associated with the touch or slide operation.
In embodiments of the present application, a power supply component is configured to provide power to various components of the device in which it is located. The power components may include a power management system, one or more power supplies, and other components associated with generating, managing, and distributing power for the device in which the power component is located.
In embodiments of the present application, the audio component may be configured to output and/or input audio signals. For example, the audio component includes a Microphone (MIC) configured to receive an external audio signal when the device in which the audio component is located is in an operational mode, such as a call mode, a recording mode, and a voice recognition mode. The received audio signal may further be stored in a memory or transmitted via a communication component. In some embodiments, the audio assembly further comprises a speaker for outputting audio signals. For example, for devices with language interaction functionality, voice interaction with a user may be enabled through an audio component, and so forth.
It should be noted that, the descriptions of "first", "second", etc. in this document are used for distinguishing different messages, devices, modules, etc., and do not represent a sequential order, nor limit the types of "first" and "second" to be different.
As will be appreciated by one skilled in the art, embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
In a typical configuration, a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
The memory may include forms of volatile memory in a computer readable medium, Random Access Memory (RAM) and/or non-volatile memory, such as Read Only Memory (ROM) or flash memory (flash RAM). Memory is an example of a computer-readable medium.
Computer-readable media, including both permanent and non-permanent, removable and non-removable media, may implement information storage by any method or technology. The information may be computer readable instructions, data structures, modules of a program, or other data. Examples of computer storage media include, but are not limited to, phase change memory (PRAM), Static Random Access Memory (SRAM), Dynamic Random Access Memory (DRAM), other types of Random Access Memory (RAM), Read Only Memory (ROM), Electrically Erasable Programmable Read Only Memory (EEPROM), flash memory or other memory technology, compact disc read only memory (CD-ROM), Digital Versatile Discs (DVD) or other optical storage, magnetic cassettes, magnetic tape and magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information that can be accessed by a computing device. As defined herein, a computer readable medium does not include a transitory computer readable medium such as a modulated data signal and a carrier wave.
It should also be noted that the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a ..." does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
The above description is only an example of the present application and is not intended to limit the present application. Various modifications and changes may occur to those skilled in the art. Any modification, equivalent replacement, improvement, etc. made within the spirit and principle of the present application should be included in the scope of the claims of the present application.

Claims (15)

1. A travelable space planning method, comprising:
obtaining a cost map of an area where the autonomous mobile equipment is located under a set coordinate system;
controlling the target projection of the autonomous mobile equipment in a set coordinate system to move around the projection of the obstacle according to the obstacle projection position information of the obstacle in the cost map so as to obtain target projection position information generated in the cost map in the process that the target projection moves to the destination;
performing collision detection on the projection of the target and the projection of the obstacle based on the target projection position information and the obstacle projection position information;
and determining the travelable space of the autonomous mobile equipment around the corresponding obstacle according to the collision detection result.
2. The method of claim 1, wherein the set coordinate system is a Frenet coordinate system, the method further comprising:
respectively arranging a plurality of sampling curves on the left side and the right side of a reference line of a Frenet coordinate system according to a set sampling interval, wherein the plurality of sampling curves are parallel curves of the reference line;
the controlling the target projection of the autonomous mobile equipment in the set coordinate system to move around the projection of the obstacle according to the obstacle projection position information of the obstacle in the cost map comprises:
determining sampling curves distributed with obstacles in the cost map from the plurality of sampling curves according to the obstacle projection position information;
acquiring a boundary sampling curve closest to the autonomous mobile equipment from the sampling curves distributed with the obstacles;
acquiring a target sampling curve of barrier-free distribution between the autonomous mobile equipment and the boundary sampling curve;
and controlling the target projection to move forwards towards the direction of the target sampling curve so as to obtain target projection position information generated in the cost map when the target projection moves towards the destination.
3. The method according to claim 2, wherein the determining a sampling curve in which obstacles are distributed in the cost map according to obstacle projection position information of the obstacles in the cost map comprises:
acquiring the distribution of the obstacles on a slot line according to the projection position information of the obstacles; the slot line is a transverse axis perpendicular to a tangent of the starting point of the reference line;
acquiring a slot position occupied by the obstacles according to the distribution of the obstacles on the slot line;
and taking the sampling curve of the slot position occupied by the obstacles as the sampling curve distributed with the obstacles.
4. The method of claim 2, wherein the controlling the target projection to move toward the target sampling curve comprises:
moving the target projection forward from the current position to a direction of a first sampling curve by a set distance to obtain a first projection position of the target projection in the cost map;
the first sampling curve is a sampling curve where the target projection is located currently in the target sampling curve, or is a sampling curve located on the left side of the target projection and closest to the target projection distance in the target sampling curve, or is a sampling curve located on the right side of the target projection and closest to the target projection distance in the target sampling curve.
5. The method of claim 4, wherein performing collision detection between the target projection and the projection of the obstacle based on the target projection position information and the obstacle projection position information comprises:
determining, according to the size of the target projection, the obstacle projection position information and the expansion area in the cost map, whether the target projection moved to the first projection position overlaps the projection of the obstacle, and whether the center of the target projection moved to the first projection position falls within the expansion area;
and if neither is the case, determining that the target projection at the first projection position does not collide with the projection of the obstacle.
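The two checks of this claim can be sketched as below, assuming obstacle projections and expansion areas are stored as axis-aligned boxes in the (s, l) plane; the `Box` type and field names are illustrative.

```python
from dataclasses import dataclass

@dataclass
class Box:
    s: float       # center along the reference line
    l: float       # lateral center
    half_s: float
    half_l: float

def boxes_overlap(a: Box, b: Box) -> bool:
    return abs(a.s - b.s) < a.half_s + b.half_s and abs(a.l - b.l) < a.half_l + b.half_l

def point_in_box(s: float, l: float, b: Box) -> bool:
    return abs(s - b.s) < b.half_s and abs(l - b.l) < b.half_l

def collides(target: Box, obstacle: Box, expansion: Box) -> bool:
    """Collision if the target projection overlaps the obstacle projection,
    or if the target projection's center falls inside the expansion area."""
    return boxes_overlap(target, obstacle) or point_in_box(target.s, target.l, expansion)

obstacle  = Box(5.0, 0.0, 1.0, 0.8)
expansion = Box(5.0, 0.0, 1.5, 1.3)           # obstacle inflated by half the target size
target    = Box(3.0, 1.5, 0.5, 0.5)
print(collides(target, obstacle, expansion))  # -> False
```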
6. The method of claim 5, further comprising:
taking the current position of the autonomous mobile device as a root node;
wherein determining the travelable space of the autonomous mobile device according to the collision detection result comprises:
performing a forward heuristic search based on the first projection position to determine a subtree of the root node if the target projection does not collide with the projection of the obstacle at the first projection position;
and determining the travelable space of the autonomous mobile device around the corresponding obstacle according to a search tree formed by the root node and the subtrees of the root node.
7. The method of claim 6, wherein performing the forward heuristic search based on the first projection position to determine the subtree of the root node comprises:
determining whether the first projection position lies on the first sampling curve;
if so, taking the first projection position as a child node of the root node;
if not, taking, as a child node of the root node, a second projection position on the first sampling curve having the same longitudinal coordinate as the first projection position;
continuing the forward heuristic search from the explored child nodes to determine lower-layer child nodes of the explored child nodes, until an explored child node reaches a transverse target line on which the destination is located;
and taking the child nodes that reach the transverse target line as leaf nodes of the search tree.
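A compact sketch of this forward heuristic search, with simplifying assumptions: nodes are (s, l) points, each step advances a fixed distance, off-curve positions are snapped to the first sampling curve at the same longitudinal coordinate, and the transverse target line is modeled as s >= target_s; the collision test is supplied by the caller.

```python
def build_search_tree(root, target_s, curve_l, step_s, is_free):
    """root: (s, l) of the device; curve_l: lateral offset of the first sampling curve;
    is_free(s, l): collision test. Returns an adjacency dict {node: [children]}."""
    tree = {root: []}
    frontier = [root]
    while frontier:
        s, l = frontier.pop()
        if s >= target_s:
            continue                              # leaf: reached the transverse target line
        child = (s + step_s, l)
        if child[1] != curve_l:                   # not on the first sampling curve:
            child = (child[0], curve_l)           # take the same-s point on that curve instead
        if not is_free(*child):
            continue                              # collision: do not extend this branch
        tree[(s, l)].append(child)
        tree.setdefault(child, [])
        frontier.append(child)
    return tree

def is_free(s, l):
    return not (4.0 <= s <= 6.0 and abs(l) < 1.0)     # toy obstacle around s = 5

print(sorted(build_search_tree((0.0, 0.0), target_s=10.0, curve_l=1.5,
                               step_s=1.0, is_free=is_free)))
```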
8. The method of claim 7, wherein continuing the forward heuristic search from an explored child node to determine lower-layer child nodes of the explored child node comprises:
for an explored first child node that has not reached the transverse target line, moving the target projection forward by the set distance from the position corresponding to the first child node toward a second sampling curve, so as to simulate the autonomous mobile device moving toward the destination, and obtaining a third projection position of the target projection;
performing collision detection between the target projection at the third projection position and the projection of the obstacle according to the size of the target projection, the obstacle projection position information and the expansion area in the cost map;
if the target projection at the third projection position does not collide with the projection of the obstacle, determining a lower-layer child node of the first child node based on the third projection position;
wherein the second sampling curve is the sampling curve on which the first child node is located, or, among the target sampling curves, the sampling curve on the left side of the first child node that is closest to the first child node, or the sampling curve on the right side of the first child node that is closest to the first child node.
9. The method of claim 8, further comprising:
if the target projection at the third projection position collides with the projection of the obstacle, moving the target projection by a set distance in the transverse direction from the first child node to determine a fourth projection position after the movement;
performing collision detection between the target projection at the fourth projection position and the projection of the obstacle according to the size of the target projection, the obstacle projection position information and the expansion area in the cost map;
and if the target projection at the fourth projection position does not collide with the projection of the obstacle, determining a lower-layer child node of the first child node based on the fourth projection position.
10. The method of claim 9, further comprising:
and if the target projection at the fourth projection position collides with the projection of the obstacle, taking the first child node as a leaf node of the search tree.
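Claims 8-10 together define the expansion rule for one explored node; a hedged sketch follows, in which `is_free` stands in for the collision detection of the preceding claims and the step sizes are placeholders.

```python
def expand_node(node, second_curve_l, step_s, lateral_step, is_free):
    """node: (s, l). First try a forward step toward the second sampling curve; if it
    collides, try a purely lateral step; if both collide, the node becomes a leaf.
    Returns (child_or_None, is_leaf)."""
    s, l = node
    forward = (s + step_s, second_curve_l)          # third projection position
    if is_free(*forward):
        return forward, False
    direction = 1.0 if second_curve_l >= l else -1.0
    lateral = (s, l + direction * lateral_step)     # fourth projection position
    if is_free(*lateral):
        return lateral, False
    return None, True                               # both blocked: current node is a leaf

def is_free(s, l):
    return not (4.0 <= s <= 6.0 and abs(l) < 1.0)   # toy obstacle around s = 5

print(expand_node((4.0, 0.5), second_curve_l=0.5, step_s=1.0,
                  lateral_step=0.5, is_free=is_free))   # -> ((4.0, 1.0), False)
```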
11. The method of claim 8, wherein determining the travelable space of the autonomous mobile device from the search tree formed by the root node and the subtrees of the root node comprises:
traversing the search tree to obtain the travelable space of the autonomous mobile device around the corresponding obstacle.
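The traversal itself can be as simple as the sketch below: collect every node reachable from the root, the collected (s, l) samples being one possible representation of the travelable space.

```python
def traverse(tree, root):
    """tree: adjacency dict {node: [children]}. Returns all nodes reachable from root."""
    space, stack = [], [root]
    while stack:
        node = stack.pop()
        space.append(node)
        stack.extend(tree.get(node, []))
    return space

tree = {(0, 0.0): [(1, 0.0)], (1, 0.0): [(2, 1.5)], (2, 1.5): []}
print(traverse(tree, (0, 0.0)))   # -> [(0, 0.0), (1, 0.0), (2, 1.5)]
```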
12. The method of claim 2, further comprising:
establishing the Frenet coordinate system by taking a centerline of the road between the autonomous mobile device and the destination as the reference line, and the point on the reference line closest to the autonomous mobile device as the starting point of the reference line;
wherein acquiring the cost map of the area where the autonomous mobile device is located in the set coordinate system comprises:
projecting an obstacle in the area where the autonomous mobile device is located into the Frenet coordinate system to obtain the projection of the obstacle;
determining an expansion area of the obstacle in the Frenet coordinate system according to the size of the target projection and the obstacle projection position information;
and constructing the cost map according to the obstacle projection position information and the expansion area of the obstacle in the Frenet coordinate system.
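A toy version of the cost-map construction, assuming a straight reference line (so the projection into Frenet coordinates is a plain translation), a uniform grid resolution, and inflation of each obstacle by half the size of the target projection; the grid encoding (0 free, 1 expansion area, 2 obstacle) is an assumption of this sketch.

```python
def build_cost_map(obstacles, target_half_s, target_half_l,
                   s_range=(0.0, 20.0), l_range=(-3.0, 3.0), res=0.5):
    """obstacles: list of (s_min, s_max, l_min, l_max) obstacle projections in Frenet
    coordinates. Returns a 2D grid: 0 free, 1 expansion area, 2 obstacle."""
    ns = int((s_range[1] - s_range[0]) / res) + 1
    nl = int((l_range[1] - l_range[0]) / res) + 1
    grid = [[0] * nl for _ in range(ns)]
    for s_min, s_max, l_min, l_max in obstacles:
        for i in range(ns):
            s = s_range[0] + i * res
            for j in range(nl):
                l = l_range[0] + j * res
                if s_min <= s <= s_max and l_min <= l <= l_max:
                    grid[i][j] = 2                                   # obstacle projection
                elif (s_min - target_half_s <= s <= s_max + target_half_s and
                      l_min - target_half_l <= l <= l_max + target_half_l):
                    grid[i][j] = max(grid[i][j], 1)                  # expansion area
    return grid

grid = build_cost_map([(4.0, 6.0, -0.5, 0.5)], target_half_s=1.0, target_half_l=0.5)
print(sum(cell == 2 for row in grid for cell in row), "obstacle cells")
```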
13. The method of any of claims 1-12, further comprising:
and planning a path for the autonomous mobile device according to its travelable space around the corresponding obstacle, to obtain a navigation path of the autonomous mobile device.
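Claim 13 leaves the path planner open; one illustrative (not claimed) choice is to chain, station by station, the lateral sample of the travelable space closest to the previously chosen one:

```python
def pick_path(travelable):
    """travelable: list of (s, l) samples of the travelable space. Greedily chain,
    station by station, the lateral offset closest to the previously chosen one."""
    by_s = {}
    for s, l in travelable:
        by_s.setdefault(s, []).append(l)
    path, prev_l = [], 0.0
    for s in sorted(by_s):
        l = min(by_s[s], key=lambda v: abs(v - prev_l))
        path.append((s, l))
        prev_l = l
    return path

space = [(0, 0.0), (1, 0.0), (1, 1.5), (2, 1.5), (3, 0.0), (3, 1.5)]
print(pick_path(space))   # -> [(0, 0.0), (1, 0.0), (2, 1.5), (3, 0.0)]
```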
14. A computing device, comprising: a memory and a processor; wherein the memory is configured to store a computer program;
and the processor, coupled to the memory, is configured to execute the computer program to perform the steps of the method of any one of claims 1-13.
15. An autonomous mobile device, comprising: a machine body and the computing device of claim 14 disposed within the machine body.
CN202110064822.6A 2021-01-18 2021-01-18 Method and device for planning travelable space Pending CN114815791A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110064822.6A CN114815791A (en) 2021-01-18 2021-01-18 Method and device for planning travelable space

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110064822.6A CN114815791A (en) 2021-01-18 2021-01-18 Method and device for planning travelable space

Publications (1)

Publication Number Publication Date
CN114815791A true CN114815791A (en) 2022-07-29

Family

ID=82524487

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110064822.6A Pending CN114815791A (en) 2021-01-18 2021-01-18 Method and device for planning travelable space

Country Status (1)

Country Link
CN (1) CN114815791A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115930969A (en) * 2023-01-09 2023-04-07 季华实验室 Path planning method and device for mobile robot, electronic equipment and storage medium

Similar Documents

Publication Publication Date Title
Qin et al. Autonomous exploration and mapping system using heterogeneous UAVs and UGVs in GPS-denied environments
EP3623759B1 (en) A computer-implemented method and a system for defining a path for a vehicle within an environment with obstacles
US10705528B2 (en) Autonomous visual navigation
Schwesinger et al. Automated valet parking and charging for e-mobility
CN113405558B (en) Automatic driving map construction method and related device
CN106774347A (en) Robot path planning method, device and robot under indoor dynamic environment
KR20200096115A (en) Method and device for short-term path planning of autonomous driving through information fusion by using v2x communication and image processing
US20140334713A1 (en) Method and apparatus for constructing map for mobile robot
JP2006350776A (en) Traveling object route generating device
US20220073101A1 (en) Autonomous Parking with Hybrid Exploration of Parking Space
CN112284393A (en) Global path planning method and system for intelligent mobile robot
CN116045998A (en) Context-aware path planning for autonomously driven vehicles using dynamic step search
CN115079702B (en) Intelligent vehicle planning method and system under mixed road scene
Bonetto et al. irotate: Active visual slam for omnidirectional robots
CN114815791A (en) Method and device for planning travelable space
Benelmir et al. An efficient autonomous vehicle navigation scheme based on LiDAR sensor in vehicular network
Sedighi et al. Implementing voronoi-based guided hybrid a* in global path planning for autonomous vehicles
Zhao et al. Complete coverage path planning scheme for autonomous navigation ROS-based robots
Martin Interactive motion prediction using game theory
CN114610042A (en) Robot path dynamic planning method and device and robot
Hao et al. Cell a* for navigation of unmanned aerial vehicles in partially-known environments
Chang et al. On-road Trajectory Planning with Spatio-temporal Informed RRT
Kala Navigating multiple mobile robots without direct communication
Desiderio Development of a modified A-Star Algorithm for path planning
Premkumar et al. Combining geometric and information-theoretic approaches for multi-robot exploration

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20230705

Address after: Room 437, Floor 4, Building 3, No. 969, Wenyi West Road, Wuchang Subdistrict, Yuhang District, Hangzhou City, Zhejiang Province

Applicant after: Wuzhou Online E-Commerce (Beijing) Co.,Ltd.

Address before: Fourth Floor, P.O. Box 847, Capital Place, Grand Cayman, Cayman Islands

Applicant before: ALIBABA GROUP HOLDING Ltd.
