CN115240160A - Obstacle avoidance method, device and equipment for edgewise sweeping by an unmanned sweeper - Google Patents

Obstacle avoidance method, device and equipment for edgewise sweeping by an unmanned sweeper

Info

Publication number
CN115240160A
CN115240160A (application CN202111423157.1A / CN202111423157A)
Authority
CN
China
Prior art keywords
point
obstacle
obstacles
unmanned
rule
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111423157.1A
Other languages
Chinese (zh)
Inventor
黄超 (Huang Chao)
叶玥 (Ye Yue)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai Xiantu Intelligent Technology Co Ltd
Original Assignee
Shanghai Xiantu Intelligent Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai Xiantu Intelligent Technology Co Ltd filed Critical Shanghai Xiantu Intelligent Technology Co Ltd
Priority to CN202111423157.1A priority Critical patent/CN115240160A/en
Priority to PCT/CN2022/071307 priority patent/WO2023092835A1/en
Publication of CN115240160A publication Critical patent/CN115240160A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D 1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D 1/02 Control of position or course in two dimensions
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00 Arrangements for image or video recognition or understanding
    • G06V 10/20 Image preprocessing
    • G06V 10/26 Segmentation of patterns in the image field; Cutting or merging of image elements to establish the pattern region, e.g. clustering-based techniques; Detection of occlusion
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00 Arrangements for image or video recognition or understanding
    • G06V 10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V 10/74 Image or video pattern matching; Proximity measures in feature spaces
    • G06V 10/75 Organisation of the matching processes, e.g. simultaneous or sequential comparisons of image or video features; Coarse-fine approaches, e.g. multi-scale approaches; using context analysis; Selection of dictionaries
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00 Arrangements for image or video recognition or understanding
    • G06V 10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V 10/762 Arrangements for image or video recognition or understanding using pattern recognition or machine learning using clustering, e.g. of similar faces in social networks
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00 Arrangements for image or video recognition or understanding
    • G06V 10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V 10/764 Arrangements for image or video recognition or understanding using pattern recognition or machine learning using classification, e.g. of video objects
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 20/00 Scenes; Scene-specific elements
    • G06V 20/50 Context or environment of the image
    • G06V 20/56 Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06V 20/58 Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Computing Systems (AREA)
  • Artificial Intelligence (AREA)
  • Health & Medical Sciences (AREA)
  • Databases & Information Systems (AREA)
  • Evolutionary Computation (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Software Systems (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The application provides an obstacle avoidance method, device and equipment for edgewise sweeping by an unmanned sweeper. Before any collision with an obstacle occurs, point clouds and images of the obstacles are acquired, the point clouds are segmented, and the points in the obstacle point clouds are classified. The points are then matched against preset ignoring rules, whether to ignore each point is decided from the matching result, and the sweeping path of the unmanned sweeper is planned according to the processing result. Introducing ignore handling into obstacle processing reduces how often the unmanned sweeper detours, improves the quality of edgewise sweeping, and reduces safety risks. In addition, the preset ignoring rules include a tree ignoring rule: a point that matches it must lie inside a tree region marked in an offline map, and the tree region is updated periodically, which reduces matching errors and limits the effect that seasonal change and manual pruning of the trees have on matching accuracy.

Description

Obstacle avoidance method, device and equipment for edgewise sweeping by an unmanned sweeper
Technical Field
The application relates to the field of intelligent driving, and in particular to an obstacle avoidance method, device and equipment for edgewise sweeping by an unmanned sweeper.
Background
Urban garbage tends to accumulate along road edges that are hard to clean. Cleaning road edges with a manually driven sanitation vehicle requires a driver with good driving skills, so both the time cost and the labor cost are high. Using an unmanned sweeper for edgewise sweeping reduces the cleaning cost, but to avoid collisions as far as possible the unmanned sweeper detours around every recognized obstacle, which degrades the sweeping result and introduces safety risks.
Disclosure of Invention
To solve the problems in the prior art, the application provides an obstacle avoidance method, device and equipment for edgewise sweeping by an unmanned sweeper.
According to a first aspect of the application, an obstacle avoidance method for edgewise sweeping by an unmanned sweeper is provided. The method comprises the following steps:
before a collision with an obstacle occurs, acquiring point clouds of the obstacles and images of the obstacles, wherein the points in the obstacle point clouds are obstacle points;
segmenting the point cloud of each obstacle;
classifying the obstacle points by combining the images of the obstacles and the segmentation result of the point cloud;
matching the types of the obstacle points and the data in the obstacle point clouds against preset ignoring rules, wherein the ignoring rules include a tree ignoring rule, the conditions met by an obstacle point matching the tree ignoring rule include that the obstacle point lies inside a tree region marked in an offline map, and the tree region marked in the offline map is updated on a preset period;
determining, according to the matching result, whether to perform ignore processing for the obstacle point, wherein the ignore processing is performing no detour operation;
and planning the sweeping path of the unmanned sweeper according to the processing result for the obstacle points.
According to a second aspect of the application, an obstacle avoidance device for edgewise sweeping by an unmanned sweeper is provided. The device comprises:
an acquisition module, configured to acquire point clouds of the obstacles and images of the obstacles before a collision with an obstacle occurs, wherein the points in the obstacle point clouds are obstacle points;
a segmentation module, configured to segment the point cloud of each obstacle;
a classification module, configured to classify the obstacle points by combining the images of the obstacles with the segmentation result of the point clouds;
a matching module, configured to match the types of the obstacle points and the data in the obstacle point clouds against preset ignoring rules, wherein the ignoring rules include a tree ignoring rule, the conditions met by an obstacle point matching the tree ignoring rule include that the obstacle point lies inside a tree region marked in an offline map, and the tree region marked in the offline map is updated on a preset period;
a determining module, configured to determine, according to the matching result, whether to perform ignore processing for the obstacle point, wherein the ignore processing is performing no detour operation;
and a planning module, configured to plan the sweeping path of the unmanned sweeper according to the processing result for the obstacle points.
According to a third aspect of the present application, there is provided an electronic device comprising:
a processor;
a memory for storing processor-executable instructions;
wherein the processor, when executing the executable instructions, is configured to implement the method of the first aspect.
According to a fourth aspect of embodiments herein, there is provided a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, performs the steps of the method of the first aspect described above.
The technical solutions provided by the embodiments of the application can have the following beneficial effects:
In the embodiments of the application, when the unmanned sweeper cleans garbage along the road edge, a second way of handling obstacles is introduced before any collision occurs: besides detouring around an obstacle, the sweeper may ignore it, that is, perform no detour operation for a recognized obstacle. The points in the obstacle point clouds (the obstacle points) are classified from the acquired point clouds and images of the obstacles; the types of the obstacle points, together with the data in the obstacle point clouds, are matched against the preset ignoring rules; whether each obstacle point is ignored is decided from the matching result; and the sweeping path is planned according to whether the ignore operation is performed. Because the unmanned sweeper no longer detours around every obstacle, the reduced sweeping quality and the safety risks caused by frequent detours are avoided. In addition, an obstacle point matching the tree ignoring rule must lie inside a tree region marked in the offline map, which reduces possible matching errors, and the tree region marked in the offline map is updated periodically, which limits the effect of seasonal changes, manual pruning and similar factors on the accuracy of the matching result.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the application.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present application and together with the description, serve to explain the principles of the application.
Fig. 1 is a flowchart of an obstacle avoidance method for edgewise sweeping by an unmanned sweeper according to an exemplary embodiment of the present application.
Fig. 2 is a schematic diagram of a tree region marked in an offline map according to an exemplary embodiment of the present application.
Fig. 3 is a schematic view of how the unmanned sweeper handles an obstacle above the sweeping brush according to an exemplary embodiment of the present application.
Fig. 4 is a flowchart of an obstacle avoidance method for edgewise sweeping by an unmanned sweeper according to another exemplary embodiment of the present application.
Fig. 5 is a schematic structural diagram of an obstacle avoidance device for edgewise sweeping by an unmanned sweeper according to an exemplary embodiment of the present application.
Fig. 6 is a schematic structural diagram of an electronic device according to an exemplary embodiment of the present application.
Detailed Description
Reference will now be made in detail to the exemplary embodiments, examples of which are illustrated in the accompanying drawings. The following description refers to the accompanying drawings in which the same numbers in different drawings represent the same or similar elements unless otherwise indicated. The implementations described in the following exemplary examples do not represent all implementations consistent with the present application. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the application, as detailed in the appended claims.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the application. As used in this application and the appended claims, the singular forms "a", "an", and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It should also be understood that the term "and/or" as used herein refers to and encompasses any and all possible combinations of one or more of the associated listed items.
It is to be understood that although the terms first, second, third, etc. may be used herein to describe various information, the information should not be limited by these terms. These terms are only used to distinguish one category of information from another. For example, first information may also be referred to as second information, and similarly, second information may also be referred to as first information, without departing from the scope of the present application. Depending on the context, the word "if" as used herein may be interpreted as "when", "upon", or "in response to determining".
Urban garbage tends to accumulate along road edges that are hard to clean, and cleaning the road edge with a manually driven sanitation truck requires a driver with good driving and operating skills, so the time cost and labor cost are high. Using an unmanned sweeper for edgewise sweeping reduces the cleaning cost and completes the edgewise sweeping task without manual intervention. However, to avoid collisions as far as possible and guarantee driving safety, the unmanned sweeper detours around every recognized obstacle without further judging whether the obstacle would actually collide with it, which degrades the sweeping result. In addition, frequent detours make the unmanned sweeper frequently occupy the outer lane, which introduces safety risks.
The application provides an obstacle avoidance method for edgewise sweeping by an unmanned sweeper in which obstacles can be ignored. Before a collision occurs, the recognized obstacles are classified; from information such as their type, height and position, it is judged whether an obstacle would actually collide with the unmanned sweeper and whether such a collision would affect driving safety; and obstacles that would not actually collide, or whose collision would not affect driving safety, are ignored, that is, no detour operation is performed for them. The judgment is made by matching the above information about the obstacle against preset rules and deciding from the matching result whether to detour. The obstacle avoidance method is described in detail below with reference to Fig. 1.
S101, before a collision with an obstacle occurs, acquiring point clouds of the obstacles and images of the obstacles, wherein the points in the obstacle point clouds are obstacle points;
When the unmanned sweeper performs edgewise sweeping, existing obstacles are recognized and handled before the sweeper collides with them. The point cloud of each obstacle contains many points, and the points in the obstacle point clouds are referred to as obstacle points in this application. The acquired point cloud of each obstacle carries three-dimensional spatial information: each obstacle point corresponds to a three-dimensional coordinate, which also reflects the height of the point. The image of each obstacle is acquired in order to classify the obstacle points. Point clouds can be acquired in various ways, for example with a lidar, a binocular camera or a depth camera, and depending on the acquisition method each point may additionally carry color information (RGB values) and reflection-intensity information.
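As a concrete illustration of the per-point data just described, the sketch below defines one possible record for an obstacle point: three-dimensional coordinates (which carry the height information), plus optional color and reflection-intensity fields that are only present for some sensors. The field names are assumptions made for this sketch, not terms used by the patent.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class ObstaclePoint:
    x: float
    y: float
    z: float                                     # height above the road surface
    rgb: Optional[Tuple[int, int, int]] = None   # e.g. from a depth or stereo camera
    intensity: Optional[float] = None            # e.g. from a lidar return
```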
S102, segmenting the point cloud of each obstacle;
The acquired point cloud is in fact just a set of points, and it is not yet known which obstacle each point (i.e. each obstacle point) belongs to. The point cloud is therefore segmented, that is, the obstacle points belonging to the same obstacle are grouped into one cluster, so that different obstacles are distinguished and the obstacle points within the segmented point cloud of one obstacle share similar characteristics.
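The patent does not name a segmentation algorithm for this step, so the sketch below uses DBSCAN-style Euclidean clustering from scikit-learn purely as one plausible stand-in: points that are close together in space form one cluster per obstacle, and sparse noise is discarded. The eps and min_samples values are placeholders.

```python
import numpy as np
from sklearn.cluster import DBSCAN

def segment_obstacle_points(points_xyz: np.ndarray,
                            eps: float = 0.3,
                            min_samples: int = 5) -> dict:
    """Group an (N, 3) array of obstacle points into per-obstacle clusters.

    Returns a mapping from cluster id to the points belonging to that cluster;
    DBSCAN noise points (label -1) are dropped.
    """
    labels = DBSCAN(eps=eps, min_samples=min_samples).fit_predict(points_xyz)
    clusters = {}
    for label in set(labels):
        if label == -1:                 # sparse points not assigned to any obstacle
            continue
        clusters[int(label)] = points_xyz[labels == label]
    return clusters
```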
S103, classifying the obstacle points by combining the images of the obstacles and the point cloud segmentation result;
The segmented obstacle point clouds correspond one-to-one to the obstacles in the acquired images, so the type of each segmented obstacle point cloud can be determined. For example, each segmented obstacle point cloud may be projected onto the image of the corresponding obstacle, and the types of the obstacle points in that point cloud can then be read from the image regions onto which the points project.
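The sketch below illustrates one way such a projection could be implemented, assuming a pinhole camera model with intrinsics K, lidar-to-camera extrinsics (R, t) and a per-pixel semantic label image; all of these inputs are assumptions for the sketch and are not specified by the patent.

```python
import numpy as np

def classify_cluster(points_xyz: np.ndarray,
                     K: np.ndarray, R: np.ndarray, t: np.ndarray,
                     semantic_mask: np.ndarray) -> int:
    """Return the majority class label of the pixels a point cluster projects onto."""
    cam = R @ points_xyz.T + t.reshape(3, 1)       # lidar frame -> camera frame
    cam = cam[:, cam[2] > 0]                       # keep points in front of the camera
    uv = K @ cam
    uv = (uv[:2] / uv[2]).round().astype(int)      # perspective division to pixel coords
    h, w = semantic_mask.shape[:2]
    ok = (uv[0] >= 0) & (uv[0] < w) & (uv[1] >= 0) & (uv[1] < h)
    labels = semantic_mask[uv[1, ok], uv[0, ok]]   # class label of each projected pixel
    if labels.size == 0:
        return -1                                  # cluster not visible in the image
    values, counts = np.unique(labels, return_counts=True)
    return int(values[np.argmax(counts)])          # majority vote over the cluster
```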
S104, matching the types of the obstacle points and the data in the obstacle point clouds against preset ignoring rules, wherein the ignoring rules include a tree ignoring rule, the conditions met by an obstacle point matching the tree ignoring rule include that the obstacle point lies inside a tree region marked in an offline map, and the tree region marked in the offline map is updated on a preset period;
The data in the point cloud of an obstacle is the information carried by each obstacle point in that point cloud, including the three-dimensional coordinates of the point and, depending on how the point cloud was acquired, possibly color information (RGB values) and reflection-intensity information. The preset ignoring rules are formulated for obstacles that would not actually collide with the unmanned sweeper, or whose collision would not affect its driving safety. They include a tree ignoring rule, and an obstacle point matching the tree ignoring rule must lie inside a tree region 110 marked in the offline map, i.e. the closed polygonal region 110 shown in Fig. 2. In addition, because a tree region may change with the seasons or because of manual pruning, which would affect the accuracy of the matching result, the tree region 110 marked in the offline map needs to be updated periodically; the update period may be set according to seasonal changes or historical operating data, or with reference to other factors, which is not limited by the application.
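Because the marked tree region 110 is a closed polygon in the offline map, the membership condition of the tree ignoring rule reduces to a point-in-polygon test. A minimal ray-casting sketch is given below; the polygon vertices come from the offline map and are an input here.

```python
def point_in_tree_region(x: float, y: float, polygon) -> bool:
    """Return True if the (x, y) ground projection of an obstacle point lies inside
    the closed polygon (list of (x, y) vertices) describing a marked tree region."""
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        # Count how many polygon edges a horizontal ray from (x, y) to +infinity crosses.
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside
```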
The tree regions 110 marked in the offline map could be generated by manual annotation, but doing so across many large areas would require an enormous amount of work and is hardly feasible. In an embodiment of the application an automatic method is therefore used: to ensure the reliability of the data, the tree regions 110 marked in the offline map are generated from historical point clouds of tree obstacles collected by at least one unmanned sweeper over at least one day. Automatically generating the tree regions 110 may include the following steps: first obtain the historical point clouds of tree obstacles collected by at least one unmanned sweeper over at least one day; then filter out the false-detection points in the historical point clouds; then cluster the filtered historical point clouds 130 (shown in Fig. 2); and finally compute a convex hull around each cluster whose number of points exceeds a preset value, producing the tree regions 110 marked in the offline map. The preset value is set according to the actual operating conditions of the unmanned sweeper and the operator's own requirements. Because the density of the clusters in the historical point clouds varies little, the number of clusters is not fixed, and offline processing has no strict speed requirement, many clustering algorithms can be used when generating the tree regions 110, such as the KNN algorithm, the K-means algorithm and the Mean-shift algorithm. Likewise, many algorithms can be used to compute the convex hull of the clusters whose point count exceeds the preset value, such as the Graham scan algorithm, the Jarvis march algorithm and the Melkman algorithm.
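The sketch below illustrates this offline generation step under stated assumptions: Mean-shift is used for clustering and Andrew's monotone chain (a Graham-scan variant) for the convex hull, both picked from the algorithm families named above; the bandwidth and the minimum cluster size are placeholder values.

```python
import numpy as np
from sklearn.cluster import MeanShift

def convex_hull_2d(points: np.ndarray) -> np.ndarray:
    """Andrew's monotone-chain convex hull of an (N, 2) array, returned in CCW order."""
    pts = np.unique(points, axis=0)                  # sorts rows and drops duplicates
    if len(pts) <= 2:
        return pts

    def cross(o, a, b):
        return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

    lower, upper = [], []
    for p in pts:                                    # lower hull
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    for p in pts[::-1]:                              # upper hull
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    return np.array(lower[:-1] + upper[:-1])

def generate_tree_regions(historical_xy: np.ndarray, min_points: int = 50) -> list:
    """Cluster filtered historical tree points (N x 2, ground plane) and return one
    convex polygon per sufficiently large cluster as the marked tree regions."""
    labels = MeanShift(bandwidth=2.0).fit_predict(historical_xy)
    regions = []
    for label in np.unique(labels):
        cluster = historical_xy[labels == label]
        if len(cluster) > min_points:                # preset minimum number of points
            regions.append(convex_hull_2d(cluster))
    return regions
```

The resulting polygons are the kind of data that would be stored as tree regions 110 in the offline map and refreshed on the preset update period.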
In another embodiment of the application, filtering the false-detection points while generating the tree regions 110 marked in the offline map includes: dividing the offline map into blocks; rasterizing each divided block into a three-dimensional grid; projecting the acquired historical point clouds into the grid cells; filtering out the points of the historical point clouds in any grid cell that does not satisfy a preset voting mechanism; and finally merging the filtered historical point clouds of all grid cells. The preset voting mechanism may be set according to the historical operating data of the unmanned sweeper. For example, if the historical point clouds assigned to a block were collected over n days and the point clouds of more than 2n/3 of those days project points into a given grid cell of that block, the points in that cell are kept; otherwise all historical points projected into that cell are treated as false detections.
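A minimal sketch of this voting filter for a single map block is shown below, assuming one point array per collection day and a placeholder grid-cell size; the more-than-2n/3 threshold follows the example just given.

```python
import numpy as np

def vote_filter(daily_points: list, cell_size: float = 0.5) -> np.ndarray:
    """daily_points: one (N_i, 3) array of historical points per collection day,
    all belonging to the same map block. Returns the points lying in grid cells
    that received projections on more than 2n/3 of the n days."""
    n_days = len(daily_points)
    seen_on_days = {}                                # cell index -> set of day indices
    for day, pts in enumerate(daily_points):
        cells = np.floor(pts / cell_size).astype(int)
        for cell in {tuple(c) for c in cells}:
            seen_on_days.setdefault(cell, set()).add(day)

    kept = []
    for pts in daily_points:
        cells = np.floor(pts / cell_size).astype(int)
        mask = np.array([len(seen_on_days[tuple(c)]) > 2 * n_days / 3 for c in cells],
                        dtype=bool)
        if mask.any():
            kept.append(pts[mask])
    return np.vstack(kept) if kept else np.empty((0, 3))
```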
The tree ignoring rule targets tree obstacles located at the road-edge surface, including trees whose branches and leaves lean out over the road edge or hang down, and which may touch the unmanned sweeper without affecting its driving safety. In an embodiment of the application, besides lying inside a tree region 110 marked in the offline map, an obstacle point matching the tree ignoring rule must also satisfy the following: its type is tree, it lies within a preset range of the working range of the unmanned sweeper, and its height exceeds a preset height. The preset range is set as required and may be a circle centered on the unmanned sweeper or a rectangular area containing it; the application is not limited in this respect. The preset height may be set, under the condition that driving safety is not affected even if contact occurs, according to the size of the unmanned sweeper, the height of its body and its historical operating data.
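Putting the four conditions together, a tree-rule matcher could look like the following sketch. The 1-meter range and 1-meter height are placeholders (the patent leaves the preset range and preset height to the operator), the category strings are assumptions, and matplotlib's Path is used here only as a convenient point-in-polygon test for the marked tree regions.

```python
from dataclasses import dataclass
from matplotlib.path import Path

@dataclass
class ClassifiedPoint:
    x: float
    y: float
    z: float          # height above the road surface
    category: str     # e.g. "tree", "person", "vehicle", "road_edge" (assumed labels)

def matches_tree_rule(p: ClassifiedPoint,
                      sweeper_xy: tuple,
                      tree_regions: list,
                      work_radius: float = 1.0,
                      min_height: float = 1.0) -> bool:
    """True if an obstacle point satisfies all conditions of the tree ignoring rule."""
    if p.category != "tree":
        return False
    dx, dy = p.x - sweeper_xy[0], p.y - sweeper_xy[1]
    if dx * dx + dy * dy > work_radius ** 2:      # outside the preset working-range window
        return False
    if p.z <= min_height:                         # not above the preset height
        return False
    # Inside at least one tree region marked in the offline map.
    return any(Path(poly).contains_point((p.x, p.y)) for poly in tree_regions)
```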
Besides the tree obstacles targeted by the tree ignoring rule, other obstacles also meet the two conditions for which the preset ignoring rules are formulated. In an embodiment of the application, the preset ignoring rules further include a first ignoring rule, which targets obstacles that are higher than the sweeping brush of the unmanned sweeper and therefore do not collide with it, standing at the road edge but partly protruding over the road surface. For computational efficiency, the acquired point cloud is projected onto a two-dimensional plane when collision detection is performed, so in the prior art some obstacles that are above the brush and would not actually touch it are treated as potential collisions and detoured around, which degrades the edgewise sweeping result; an example is the ornamental shrub 330 shown in Fig. 3. To make the conditions of the first ignoring rule easier to understand, they are explained with reference to Fig. 3. The conditions met by an obstacle point matching the first ignoring rule include:
(1) the height 210 of the obstacle point is greater than the height 220 of the sweeping brush 320 of the unmanned sweeper;
(2) the type of the obstacle point is neither person nor vehicle;
(3) the obstacle point is within the preset range of the working range of the unmanned sweeper;
(4) the obstacle point is on the inner side of the road-edge line segment; the road-edge line segment is generated from the road-edge detection result and lies at the intersection of the vertical face of the road edge with the road surface on which the unmanned sweeper travels, and the unmanned sweeper is on the inner side of the road-edge line segment. The line segment 400 in Fig. 3 lies in the vertical face of the road edge and is perpendicular to the road-edge line segment, so viewed from above it appears as a point on the road-edge line segment; in Fig. 3, inner and outer are defined relative to the road-edge line segment, and the side on which the unmanned sweeper is located is the inner side 401;
(5) at least one point of the obstacle point cloud containing the obstacle point is on the outer side 402 of the road-edge line segment;
(6) the lateral distance 230 between the obstacle point and the point of the same obstacle point cloud that is closest to the unmanned sweeper and on the inner side of the road-edge line segment is smaller than the distance 240 from the body 310 of the unmanned sweeper to the edge of its sweeping brush 320.
In the situation shown in Fig. 3, the parts of the shrub 330 that are on the inner side 401 of the road-edge line segment are all ignored by the unmanned sweeper. Note that Fig. 3 is only schematic and does not cover every case: the height 210 of an obstacle point, and the lateral distance 230 between an obstacle point and the point of the same obstacle point cloud closest to the unmanned sweeper on the inner side 401 of the road-edge line segment, are determined from the three-dimensional positions of the individual obstacle points and are not limited to the situation shown in Fig. 3.
To judge whether an obstacle point is on the inner or outer side of the road-edge line segment, the distance from the obstacle point to the road-edge line segment is computed with a vector cross product; the distance is defined as positive for points on the inner side and negative for points on the outer side (or the sign convention may be reversed), so the sign of the computed distance tells on which side of the road-edge line segment the obstacle point lies.
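A minimal sketch of this cross-product test follows. Which sign corresponds to the inner side depends only on the orientation of the road-edge segment, so the sketch fixes the convention by comparing against the sweeper's own position, which by definition is on the inner side.

```python
import math

def signed_distance_to_edge(px, py, ax, ay, bx, by):
    """Signed perpendicular distance from point P to the line through A and B,
    computed with the 2D vector cross product (positive on one side, negative on the other)."""
    cross = (bx - ax) * (py - ay) - (by - ay) * (px - ax)
    return cross / math.hypot(bx - ax, by - ay)

def is_on_inner_side(px, py, ax, ay, bx, by, sweeper_x, sweeper_y):
    """An obstacle point is on the inner side of the road-edge line segment if it
    lies on the same side as the unmanned sweeper."""
    d_point = signed_distance_to_edge(px, py, ax, ay, bx, by)
    d_sweeper = signed_distance_to_edge(sweeper_x, sweeper_y, ax, ay, bx, by)
    return d_point * d_sweeper > 0
```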
In another embodiment of the application, the preset ignoring rules further include a second ignoring rule. The second ignoring rule targets obstacles on the road-edge surface that cannot intersect the unmanned sweeper at all, for which the ignore operation improves computational efficiency, as well as false-detection points, mistaken for obstacles because the road edge is uneven, that are low enough to touch the sweeping brush without affecting the edgewise travel of the unmanned sweeper; ignoring them improves the quality of edgewise sweeping. The conditions met by an obstacle point matching the second ignoring rule include: the obstacle point is within the preset range of the working range of the unmanned sweeper; its type is neither person nor vehicle; and its type is road edge, or it is on the outer side of the road-edge line segment, where the road-edge line segment is generated from the road-edge detection result, lies at the intersection of the vertical face of the road edge with the road surface on which the unmanned sweeper travels, and the unmanned sweeper is on its inner side. The way of determining whether an obstacle point is on the inner or outer side of the road-edge line segment is the same as above and is not repeated here.
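For illustration, a compact matcher for the second ignoring rule under the same assumptions (placeholder category names, an assumed road-edge segment from A to B, the sweeper position, and a placeholder working-range radius) might look like this:

```python
def matches_second_rule(px, py, category,
                        sweeper_x, sweeper_y,
                        ax, ay, bx, by,              # road-edge line segment A -> B
                        work_radius: float = 1.0) -> bool:
    """True if an obstacle point satisfies the conditions of the second ignoring rule."""
    if (px - sweeper_x) ** 2 + (py - sweeper_y) ** 2 > work_radius ** 2:
        return False                                 # outside the working-range window
    if category in ("person", "vehicle"):
        return False
    if category == "road_edge":
        return True
    # Otherwise the point must be on the outer side, i.e. on the opposite side of
    # the road-edge segment from the sweeper (sign of the 2D cross product).
    cross_p = (bx - ax) * (py - ay) - (by - ay) * (px - ax)
    cross_s = (bx - ax) * (sweeper_y - ay) - (by - ay) * (sweeper_x - ax)
    return cross_p * cross_s < 0
```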
S105, determining, according to the matching result, whether to perform ignore processing for the obstacle point, wherein the ignore processing is performing no detour operation;
In an embodiment of the application, if an obstacle point matches any one of the preset ignoring rules, the ignore processing is performed for that obstacle point.
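A minimal sketch of this decision step: an obstacle point is ignored as soon as any one preset rule matches it, and only the remaining points are passed on as obstacles the path planner must detour around. The rule callables are placeholders standing in for matchers such as those sketched above.

```python
from typing import Callable, Iterable, List, Tuple

def should_ignore(point, rules: Iterable[Callable]) -> bool:
    """True if the obstacle point matches at least one preset ignoring rule."""
    return any(rule(point) for rule in rules)

def split_for_planning(points: Iterable, rules: List[Callable]) -> Tuple[list, list]:
    """Split obstacle points into those to ignore and those to detour around."""
    ignored, detour = [], []
    for p in points:
        (ignored if should_ignore(p, rules) else detour).append(p)
    return ignored, detour
```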
S106, planning the sweeping path of the unmanned sweeper according to the processing result for the obstacle points.
A preferred embodiment of the obstacle avoidance method for edgewise sweeping by an unmanned sweeper is described below; as shown in Fig. 4, the specific steps are as follows:
S201, before a collision with an obstacle occurs, acquiring the point clouds of the obstacles within the working range of the unmanned sweeper with a lidar and acquiring the images of the obstacles with a camera, wherein the points in the obstacle point clouds are obstacle points and each obstacle point corresponds to a three-dimensional coordinate;
S202, segmenting the point cloud of each obstacle;
S203, projecting the segmented point clouds onto the images of the obstacles, and classifying the obstacle points according to where the point clouds project onto the images;
S204, matching the type and the three-dimensional coordinates of each obstacle point against a preset tree ignoring rule, wherein the conditions met by an obstacle point matching the tree ignoring rule include: its type is tree, it is within a radius of 1 meter of the unmanned sweeper, its height exceeds 1 meter, and it lies inside a tree region marked in the offline map, the tree region marked in the offline map being updated on a preset period;
S205, if the obstacle point matches the tree ignoring rule, performing the ignore processing for it;
S206, planning the sweeping path of the unmanned sweeper according to whether the ignore processing is performed for the obstacle points.
Based on the above method embodiments, the application provides an obstacle avoidance device for edgewise sweeping by an unmanned sweeper; as shown in Fig. 5, the device includes:
an acquisition module 510, configured to acquire point clouds of the obstacles and images of the obstacles before a collision with an obstacle occurs, wherein the points in the obstacle point clouds are obstacle points;
a segmentation module 520, configured to segment the point cloud of each obstacle;
a classification module 530, configured to classify the obstacle points by combining the images of the obstacles with the segmentation result of the point clouds;
a matching module 540, configured to match the types of the obstacle points and the data in the obstacle point clouds against preset ignoring rules, wherein the ignoring rules include a tree ignoring rule, the conditions met by an obstacle point matching the tree ignoring rule include that the obstacle point lies inside a tree region marked in an offline map, and the tree region marked in the offline map is updated on a preset period;
a determining module 550, configured to determine, according to the matching result, whether to perform ignore processing for the obstacle point, wherein the ignore processing is performing no detour operation;
and a planning module 560, configured to plan the sweeping path of the unmanned sweeper according to the processing result for the obstacle points.
The implementation of the functions and roles of each module in the above device is described in detail in the implementation of the corresponding steps of the above method and is not repeated here.
The embodiment of the obstacle avoidance device for edgewise sweeping by an unmanned sweeper can be applied to an electronic device, and the device embodiment can be implemented in software, in hardware, or in a combination of both. Taking a software implementation as an example, as a logical device it is formed by the processor of the electronic device in which it is located reading the corresponding computer program instructions from the non-volatile memory into memory and running them. In terms of hardware, Fig. 6 shows a hardware structure diagram of the electronic device in which the obstacle avoidance device is located; besides the processor 610, the memory 630, the network interface 620 and the non-volatile memory 640 shown in Fig. 6, the electronic device in which the device 631 is located may also include other hardware according to its actual function, which is not described again here.
In addition, corresponding to the method embodiments, the application further provides a computer-readable storage medium storing a computer program which, when executed, performs the above obstacle avoidance method for edgewise sweeping by an unmanned sweeper.
The foregoing description of specific embodiments has been presented for purposes of illustration and description. Other embodiments are within the scope of the following claims. In some cases, the actions or steps recited in the claims may be performed in a different order than in the embodiments and still achieve desirable results. In addition, the processes depicted in the accompanying figures do not necessarily require the particular order shown, or sequential order, to achieve desirable results; in some embodiments, multitasking and parallel processing are also possible or may be advantageous. The technical features of the above embodiments may be combined arbitrarily; any combination that contains no conflict or contradiction also belongs to the scope of the present application, even though, for brevity, not every combination is described individually.
Other embodiments of the present disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the invention disclosed herein. This specification is intended to cover any variations, uses, or adaptations of the specification following, in general, the principles of the specification and including such departures from the present disclosure as come within known or customary practice within the art to which the specification pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the specification being indicated by the following claims.
The above description is only a preferred embodiment of the present disclosure, and should not be taken as limiting the present disclosure, and any modifications, equivalents, improvements, etc. made within the spirit and principle of the present disclosure should be included in the scope of the present disclosure.

Claims (10)

1. An obstacle avoidance method for edgewise sweeping by an unmanned sweeper, characterized by comprising:
before a collision with an obstacle occurs, acquiring point clouds of the obstacles and images of the obstacles, wherein the points in the obstacle point clouds are obstacle points;
segmenting the point cloud of each obstacle;
classifying the obstacle points by combining the images of the obstacles with the segmentation result of the point clouds;
matching the types of the obstacle points and the data in the obstacle point clouds against preset ignoring rules, wherein the ignoring rules comprise a tree ignoring rule, the conditions met by an obstacle point matching the tree ignoring rule comprise that the obstacle point lies inside a tree region marked in an offline map, and the tree region marked in the offline map is updated on a preset period;
determining, according to the matching result, whether to perform ignore processing for the obstacle point, wherein the ignore processing is performing no detour operation;
and planning the sweeping path of the unmanned sweeper according to the processing result for the obstacle point.
2. The method of claim 1, wherein the tree regions marked in the offline map are generated from historical point clouds of tree obstacles collected by at least one unmanned sweeper over at least one day.
3. The method of claim 2, wherein generating the tree regions marked in the offline map comprises:
dividing the offline map into blocks;
performing three-dimensional rasterization on each divided block and projecting the historical point clouds into the grid cells;
filtering out the points of the historical point clouds in grid cells that do not satisfy a preset voting mechanism;
and merging the filtered historical point clouds of the grid cells.
4. The method of claim 1, wherein the conditions met by an obstacle point matching the tree ignoring rule further comprise:
the type of the obstacle point is tree, the obstacle point is within a preset range of the working range of the unmanned sweeper, and the height of the obstacle point is greater than a preset height.
5. The method of claim 1, wherein the ignoring rules further comprise a first ignoring rule, and the conditions met by an obstacle point matching the first ignoring rule comprise:
the height of the obstacle point is greater than the height of the sweeping brush of the unmanned sweeper;
the type of the obstacle point is neither person nor vehicle;
the obstacle point is within a preset range of the working range of the unmanned sweeper;
the obstacle point is on the inner side of a road-edge line segment, wherein the road-edge line segment is generated from a road-edge detection result and lies at the intersection of the vertical face of the road edge with the road surface on which the unmanned sweeper travels, and the unmanned sweeper is on the inner side of the road-edge line segment;
at least one point of the obstacle point cloud containing the obstacle point is on the outer side of the road-edge line segment;
and the lateral distance between the obstacle point and the point of the same obstacle point cloud that is closest to the unmanned sweeper and on the inner side of the road-edge line segment is smaller than the distance from the body of the unmanned sweeper to the edge of its sweeping brush.
6. The method of claim 1, wherein the ignoring rules further comprise a second ignoring rule, and the conditions met by an obstacle point matching the second ignoring rule comprise:
the obstacle point is within a preset range of the working range of the unmanned sweeper;
the type of the obstacle point is neither person nor vehicle;
and the type of the obstacle point is road edge, or the obstacle point is on the outer side of a road-edge line segment, wherein the road-edge line segment is generated from a road-edge detection result and lies at the intersection of the vertical face of the road edge with the road surface on which the unmanned sweeper travels, and the unmanned sweeper is on the inner side of the road-edge line segment.
7. The method of claim 1, wherein determining, according to the matching result, whether to perform the ignore processing for the obstacle point comprises:
if the obstacle point matches any one of the ignoring rules, performing the ignore processing for the obstacle point.
8. An obstacle avoidance device for edgewise sweeping by an unmanned sweeper, characterized by comprising:
an acquisition module, configured to acquire point clouds of the obstacles and images of the obstacles before a collision with an obstacle occurs, wherein the points in the obstacle point clouds are obstacle points;
a segmentation module, configured to segment the point cloud of each obstacle;
a classification module, configured to classify the obstacle points by combining the images of the obstacles with the segmentation result of the point clouds;
a matching module, configured to match the types of the obstacle points and the data in the obstacle point clouds against preset ignoring rules, wherein the ignoring rules comprise a tree ignoring rule, the conditions met by an obstacle point matching the tree ignoring rule comprise that the obstacle point lies inside a tree region marked in an offline map, and the tree region marked in the offline map is updated on a preset period;
a determining module, configured to determine, according to the matching result, whether to perform ignore processing for the obstacle point, wherein the ignore processing is performing no detour operation;
and a planning module, configured to plan the sweeping path of the unmanned sweeper according to the processing result for the obstacle point.
9. An electronic device, comprising:
a processor;
a memory for storing processor-executable instructions;
wherein the processor, when executing the executable instructions, is configured to implement the method of any one of claims 1-7.
10. A computer-readable storage medium, on which a computer program is stored which, when being executed by a processor, carries out the method of any one of claims 1 to 7.
CN202111423157.1A 2021-11-26 2021-11-26 Obstacle avoidance method, device and equipment for edge pasting cleaning of unmanned sweeper Pending CN115240160A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202111423157.1A CN115240160A (en) 2021-11-26 2021-11-26 Obstacle avoidance method, device and equipment for edge pasting cleaning of unmanned sweeper
PCT/CN2022/071307 WO2023092835A1 (en) 2021-11-26 2022-01-11 Obstacle avoidance during edgewise sweeping of unmanned sweeper

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111423157.1A CN115240160A (en) 2021-11-26 2021-11-26 Obstacle avoidance method, device and equipment for edge pasting cleaning of unmanned sweeper

Publications (1)

Publication Number Publication Date
CN115240160A true CN115240160A (en) 2022-10-25

Family

ID=83665947

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111423157.1A Pending CN115240160A (en) 2021-11-26 2021-11-26 Obstacle avoidance method, device and equipment for edge pasting cleaning of unmanned sweeper

Country Status (2)

Country Link
CN (1) CN115240160A (en)
WO (1) WO2023092835A1 (en)

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9457718B2 (en) * 2014-12-19 2016-10-04 Caterpillar Inc. Obstacle detection system
CN105955275B (en) * 2016-05-26 2021-07-13 华讯方舟科技有限公司 Robot path planning method and system
KR20180059188A (en) * 2016-11-25 2018-06-04 연세대학교 산학협력단 Method of Generating 3d-Background Map Except Dynamic Obstacles Using Deep Learning
GB201803292D0 (en) * 2018-02-28 2018-04-11 Five Ai Ltd Efficient computation of collision probabilities for safe motion planning
CN110515095B (en) * 2019-09-29 2021-09-10 北京智行者科技有限公司 Data processing method and system based on multiple laser radars
CN111796299A (en) * 2020-06-10 2020-10-20 东风汽车集团有限公司 Obstacle sensing method and device and unmanned sweeper
CN111736603B (en) * 2020-06-22 2023-06-09 广州赛特智能科技有限公司 Unmanned sweeper and long-distance welt sweeping method thereof
CN113377111A (en) * 2021-06-30 2021-09-10 杭州电子科技大学 Task scheduling system and method for unmanned sweeper
CN113467457A (en) * 2021-07-08 2021-10-01 无锡太机脑智能科技有限公司 Graph optimization path planning method for edge-pasting sweeping of unmanned sanitation vehicle

Also Published As

Publication number Publication date
WO2023092835A1 (en) 2023-06-01

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination