CN112306061A - Robot control method and robot - Google Patents

Robot control method and robot

Info

Publication number
CN112306061A
CN112306061A
Authority
CN
China
Prior art keywords: laser, laser point, robot, point pair, area
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202011176294.5A
Other languages
Chinese (zh)
Inventor
夏舸
刘文泽
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Uditech Co Ltd
Original Assignee
Uditech Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Uditech Co Ltd filed Critical Uditech Co Ltd
Priority to CN202011176294.5A
Publication of CN112306061A
Legal status: Pending

Classifications

    All of the listed classifications share the hierarchy G (PHYSICS) > G05 (CONTROLLING; REGULATING) > G05D (SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES) > G05D 1/00 (Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots) > G05D 1/02 (Control of position or course in two dimensions) > G05D 1/021 (specially adapted to land vehicles):
    • G05D 1/0236: using optical position detecting means; using optical markers or beacons in combination with a laser
    • G05D 1/0214: with means for defining a desired trajectory in accordance with safety or protection criteria, e.g. avoiding hazardous areas
    • G05D 1/024: using optical position detecting means; using obstacle or wall sensors in combination with a laser
    • G05D 1/0242: using optical position detecting means; using non-visible light signals, e.g. IR or UV signals
    • G05D 1/0257: using a radar
    • G05D 1/0276: using signals provided by a source external to the vehicle

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Electromagnetism (AREA)
  • Optics & Photonics (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The application is applicable to the technical field of robots, and provides a robot control method comprising the following steps: performing laser scanning on a first area where the robot is located to obtain first scanning information; determining, from within the first area based on the first scanning information, a second area in which the robot can travel and a corresponding travel path; and controlling the robot to travel based on the travel path. Compared with the traditional escape method, this scheme does not require the robot to make repeated trial-and-error attempts: the first scanning information is obtained through a single scan, the travel path within the second area is derived from the first scanning information, and the robot escapes from its trapped position by following the travel path, thereby improving the escape efficiency of the robot.

Description

Robot control method and robot
Technical Field
The application belongs to the technical field of robots, and particularly relates to a robot control method and a robot.
Background
With the gradual popularization of artificial intelligence, robots are increasingly integrated into home life and work production. A robot plays a different role in different situations: for example, a common sweeping robot acts as a cleaner, and a restaurant robot acts as a waiter.
Although robot functions are increasingly rich, existing robots still leave much room for improvement. For example, in a scene where the robot needs to escape from a trapped position, an existing robot often escapes by repeatedly bumping into its surroundings and trying again, which is an inefficient escape method.
Disclosure of Invention
In view of this, embodiments of the present application provide a robot control method, a robot, and a computer-readable storage medium, which can solve the technical problem in the prior art that robot escape methods are inefficient.
A first aspect of an embodiment of the present application provides a robot control method, which is applied to a robot, and includes:
performing laser scanning on a first area where the robot is located to obtain first scanning information;
determining, from within the first area based on the first scanning information, a second area in which the robot can travel and a corresponding travel path;
and controlling the robot to travel based on the travel path.
A second aspect of an embodiment of the present application provides a robot control apparatus, where the apparatus includes:
the scanning unit is used for carrying out laser scanning on a first area where the robot is located to obtain first scanning information;
a determining unit, configured to determine, based on the first scanning information, a second area where the robot can travel and a corresponding travel path from within the first area;
and the control unit is used for controlling the robot to travel based on the travel path.
A third aspect of embodiments of the present application provides a robot, including a scanning radar, a motion module, a memory, a processor, and a computer program stored in the memory and executable on the processor, where the processor implements the steps of the method of the first aspect when executing the computer program.
A fourth aspect of embodiments of the present application provides a computer-readable storage medium, which stores a computer program, and the computer program, when executed by a processor, implements the steps of the method according to the first aspect.
Compared with the prior art, the embodiments of the present application have the following advantages: laser scanning is performed on a first area where the robot is located to obtain first scanning information; a second area in which the robot can travel and a corresponding travel path are determined from within the first area based on the first scanning information; and the robot is controlled to travel based on the travel path. Compared with the traditional escape method, this scheme does not require the robot to make repeated trial-and-error attempts: the first scanning information is obtained through a single scan, the travel path within the second area is derived from the first scanning information, and the robot escapes from its trapped position by following the travel path, thereby improving the escape efficiency of the robot.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed in the description of the embodiments or the related art are briefly introduced below. It is obvious that the drawings in the following description are only some embodiments of the present application, and other drawings can be obtained by those skilled in the art from these drawings without creative effort.
FIG. 1 shows a schematic flowchart of a robot control method provided by the present application;
FIG. 2 shows a schematic diagram of laser point cloud information provided by the present application;
FIG. 3 shows a detailed flowchart of a robot control method provided by the present application;
FIG. 4 shows a schematic diagram of laser point cloud information provided by the present application;
FIG. 5 shows a detailed flowchart of a robot control method provided by the present application;
FIG. 6 shows a detailed flowchart of a robot control method provided by the present application;
FIG. 7 shows a schematic diagram of a laser point pair provided by the present application;
FIG. 8 shows a detailed flowchart of a robot control method provided by the present application;
FIG. 9 shows a detailed flowchart of a robot control method provided by the present application;
FIG. 10 shows a schematic diagram of the perpendicular bisector translation provided by the present application;
FIG. 11 shows a detailed flowchart of a robot control method provided by the present application;
FIG. 12 shows a detailed flowchart of a robot control method provided by the present application;
FIG. 13 shows a detailed flowchart of a robot control method provided by the present application;
FIG. 14 shows a detailed flowchart of a robot control method provided by the present application;
FIG. 15 shows a schematic diagram of a travel path provided by the present application;
FIG. 16 shows a schematic diagram of a travel path provided by the present application;
FIG. 17 shows a schematic diagram of an apparatus for the robot control method provided by the present application;
FIG. 18 shows a schematic diagram of a robot according to an embodiment of the present application.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth, such as particular system structures, techniques, etc. in order to provide a thorough understanding of the embodiments of the present application. It will be apparent, however, to one skilled in the art that the present application may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present application with unnecessary detail.
It will be understood that the terms "comprises" and/or "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It should be noted that the present application is applicable to any type of robot. When the solution of the embodiments of the present application is applied to a robot, that robot is the execution subject of the embodiments of the present application.
To better explain the technical solution of the present application, the sweeping robot is taken as an example below.
Sweeping robots are common on the market, and traditional sweeping robots usually travel along preset paths. If the robot body is pushed away, or an obstacle approaches the robot body during travel, the robot cannot leave along the original route. Especially in a passage with a narrow passing width, the robot must bump and retry repeatedly before it can escape.
To better understand the technical solution of the present application, the overall process and the beneficial effects of the scheme are briefly analyzed here:
When the robot cannot travel along the preset path, it scans the surrounding environment to obtain laser point information about the obstacles around it. The robot determines a passable area from the obstacle laser point information, calculates its travel path within the passable area from the laser points on the two sides of that area, and travels along the travel path to escape. Compared with the traditional escape method, the robot does not need to bump and retry repeatedly; it only needs to identify the passable area and calculate the travel path from it, which achieves the escape and improves escape efficiency.
In view of the above, embodiments of the present application provide a robot control method, an apparatus, a robot, and a computer-readable storage medium, which can solve the above technical problems.
Referring to fig. 1, fig. 1 shows a schematic flow chart of a robot control method provided in the present application. The method is applicable to robots.
As shown in fig. 1, the method may include the steps of:
step 101, performing laser scanning on a first area where the robot is located to obtain first scanning information.
The first area is the peripheral area around the robot. The robot emits laser signals into the peripheral area through its laser radar. A laser signal that reaches the surface of an obstacle is reflected. The robot receives the reflected laser and obtains the first scanning information from the reflected laser signals. The first scanning information includes, but is not limited to, any one or a combination of the distance of each laser point, the coordinate values of each laser point, and the serial number of each laser point. It is understood that the first scanning information comprises information about a plurality of laser points.
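For concreteness, a minimal sketch of how one laser point record in the first scanning information might be represented is given below; the field names are illustrative assumptions rather than definitions from the patent:

```python
from dataclasses import dataclass

@dataclass
class LaserPoint:
    """One reflected laser return (illustrative field names)."""
    index: int        # serial number of the laser point within the scan
    angle: float      # emission direction of the laser signal, in radians
    distance: float   # measured distance to the reflecting surface, meters
    x: float          # Cartesian coordinates of the laser point
    y: float          # in the robot body frame
```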
After the robot emits laser signals toward its surroundings, part of the laser signals are reflected back to the sensor and part are not. The laser signals reflected back to the sensor appear as a series of laser points, which belong to physical obstacles.
As an optional embodiment of the application, since an obstacle that is too far away does not obstruct the robot's passage, the application screens the laser points by a threshold: laser points whose distance is greater than the threshold are eliminated, and only laser points closer to the robot are retained. That is, the positions of laser points whose distance is greater than the threshold are treated as "clear positions", which avoids unnecessary calculation.
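A minimal sketch of this screening step, reusing the illustrative `LaserPoint` record above; the threshold value is an assumption:

```python
RANGE_THRESHOLD = 2.0  # meters; assumed value, tuned per robot and scene

def screen_points(points):
    """Keep only laser points close enough to obstruct the robot.

    Points farther than the threshold are treated as 'clear positions'
    and dropped to avoid unnecessary calculation.
    """
    return [p for p in points if p.distance <= RANGE_THRESHOLD]
```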
Step 102, determining, from within the first area based on the first scanning information, a second area in which the robot can travel and a corresponding travel path.
The first area includes an obstacle area and an open area, and the open area is where the robot can travel. The robot therefore needs to determine, within the first area, a second area in which it can travel; the second area includes, but is not limited to, an open area or a traversable obstacle area.
The robot may determine the second area based on different pieces of the first scanning information. For example: acquire the reflection path of each laser point in the first scanning information and take the area covered by the reflection paths as the second area, that is, take the area between the robot body and each laser point as the second area. As another example: acquire the positions of the laser points in the first scanning information and take the area enclosed by those positions as the second area.
Because each laser signal corresponds to a different emission direction, laser point cloud information formed by the laser points can be obtained from the emission directions and the reflection distances. Referring to fig. 2, fig. 2 shows a schematic diagram of the laser point cloud information provided by the present application. As shown in fig. 2, the robot obtains the laser point cloud information by laser scanning. The laser point cloud information includes the number and positions of the laser points. The robot may take the area enclosed by the laser points as the second area. After obtaining the second area, the robot can determine a travel path from the first scanning information of the laser points at the edge of the second area, and thereby escape.
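The conversion from (emission direction, reflection distance) pairs to a Cartesian point cloud is standard polar-to-Cartesian geometry; a brief sketch, with the robot body as the origin:

```python
import math

def to_point_cloud(ranges, angles):
    """Convert (reflection distance, emission angle) pairs into Cartesian
    laser points in the robot body frame.

    `ranges` and `angles` are parallel lists; returns (x, y) tuples.
    """
    return [(r * math.cos(a), r * math.sin(a)) for r, a in zip(ranges, angles)]
```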
Because the spacing between the obstacles on the two sides of the driving direction varies, the robot needs to confirm the travel path within the second area according to the first scanning information, and it may do so based on different pieces of that information. For example: determine regression lines for the laser points on each side of the driving direction from the first scanning information, and take the center line between the two regression lines as the travel path. As another example: determine the closest laser points on each side of the driving direction from the first scanning information, and take the perpendicular bisector of the line segment formed by those closest points as the travel path. The driving direction can be determined in two ways: first, the advancing direction of the preset path is taken as the driving direction (the preset path being the original path along which the traditional robot failed to escape); second, the emission direction of the laser signals that were not reflected back is taken as the driving direction.
As an optional embodiment of the present application, step 102 may also be implemented as follows. This embodiment confirms the travel path within the second area according to the closest laser points on the two sides of the driving direction. Referring to fig. 3, fig. 3 shows a detailed flowchart of a robot control method provided by the present application, including:
step 1021, screens out second scanning information related to the second area from the first scanning information.
The second scanning information includes, but is not limited to, any one or a combination of the distance, coordinate values, and serial numbers of the laser points. It is understood that the second scanning information comprises information about a plurality of laser points.
The width of the second area is determined by the positions of the laser points at the edge of the second area; laser points beyond that edge do not influence the width, so they need to be eliminated to avoid invalid calculation. To better explain this, please refer to fig. 4, which shows a schematic diagram of the laser point cloud information provided by the present application. As shown in fig. 4, circles a and b mark laser points at the edge of the second area, and circles c and d mark laser points beyond the edge. Only circles a and b affect the width of the second area; the size and number of the laser points in circles c and d do not. Therefore, the second scanning information associated with the second area needs to be screened out of the first scanning information.
The robot may determine the second scanning information associated with the second area based on different pieces of the first scanning information. For example: calculate the distance of each laser point pair (a laser point pair being a combination of two laser points on different sides of the driving direction) from the coordinate values of the laser points in the first scanning information, and select the closest pair as the second scanning information. As another example: calculate the distances between laser points from their coordinate values, and eliminate the laser points closer to the robot according to a certain distance threshold, obtaining the second scanning information associated with the second area.
As an optional embodiment of the present application, step 1021 may also be implemented as follows. This embodiment selects the closest laser point pair as the second scanning information. Referring to fig. 5, fig. 5 shows a detailed flowchart of a robot control method provided by the present application, including:
and A1, combining the laser points on different sides of the driving direction in pairs to obtain a plurality of first laser point pairs, wherein each first laser point pair comprises two laser points on different sides of the driving direction.
The first laser point pairs may be all pairwise combinations of the laser points on the two sides of the driving direction, or they may be a subset screened from all such combinations in order to reduce the subsequent amount of calculation.
As an alternative embodiment of the present application, step a1 may also be implemented by the following embodiment. In this embodiment, all the laser points on different sides of the driving direction are combined in pairs, and a first laser point pair is obtained through screening. Step a1 specifically includes the following steps. Referring to fig. 6, fig. 6 shows a specific schematic flowchart in a robot control method provided by the present application, including:
and step A1a, combining the laser points on different sides of the driving direction in pairs to obtain a plurality of second laser point pairs, wherein each second laser point pair comprises two laser points on different sides of the driving direction.
In this embodiment, all the laser points on the two sides of the driving direction are combined in pairs, and the resulting pairs are called second laser point pairs. That is, the second laser point pairs are the original, unscreened laser point pairs.
Step A1b, a first distance between two laser points within each second laser point pair is calculated.
Step A1c, based on the first distance, selecting a plurality of first laser point pairs from the second laser point pairs.
The passable width of the second area for the robot is determined by the distance between the laser points on the two sides of the driving direction. Therefore, this embodiment screens the first laser point pairs according to the first distance, so that the travel path can be planned from the first laser point pairs.
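Steps A1a and A1b amount to forming the cross product of the two sides and measuring each pair; a brief sketch, assuming points are (x, y) tuples already split by side of the driving direction:

```python
import math
from itertools import product

def second_laser_point_pairs(left_points, right_points):
    """Step A1a/A1b sketch: combine every laser point on one side of the
    driving direction with every point on the other side, and compute the
    first distance between the two points of each resulting pair.

    Returns a list of (first_distance, left_point, right_point) tuples.
    """
    return [(math.hypot(p[0] - q[0], p[1] - q[1]), p, q)
            for p, q in product(left_points, right_points)]
```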
The robot may perform a single screening or multiple screenings based on the first distance.
The single-screening mode is as follows: from all the second laser point pairs to which a single first laser point belongs, select only the pair with the smallest first distance, and take the selected pair as a first laser point pair.
The multiple-screening mode is as follows: from all the second laser point pairs to which a single first laser point belongs, select only the pair with the smallest first distance, and take the selected pair as a third laser point pair. From the plurality of third laser point pairs, select the fourth laser point pair with the smallest first distance. Determine a distance range from the first distance of the fourth laser point pair and a preset value. Among the third laser point pairs, take the pairs whose first distance falls within the distance range as the first laser point pairs.
The following examples illustrate the single-and multi-screening formats, respectively.
As an optional embodiment of the present application, step A1c may also be implemented as follows. This example is the single-screening mode. Mark the laser points on one side of the driving direction as first laser points and the laser points on the other side as second laser points. From all the second laser point pairs to which a single first laser point belongs, select only the pair with the smallest first distance, and take the selected pair as a first laser point pair.
To better explain the technical solution of this embodiment, please refer to fig. 7, which shows a schematic diagram of a laser point pair provided by the present application. As shown in fig. 7, laser points e, f, g and k are located at the edge of the second area, while laser points h and j are located beyond the edge. If all laser points were permuted and combined pairwise, the number of resulting pairs would be large and would contain repetitions, for example: (laser point e, laser point g) and (laser point g, laser point e) repeat. Therefore, this embodiment takes the laser points on one side as reference points; that is, among all the second laser point pairs to which a first laser point belongs, only the pair with the smallest first distance is selected. It can be understood that, since laser point e is at a different distance in each pair it forms with the laser points on the other side, and the passing width of the second area is determined by the nearest pair, only the pair with the smallest first distance is selected as the first laser point pair. A first laser point pair is screened out for each reference point in this manner.
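A brief sketch of the single-screening mode, reusing the (first_distance, left, right) pairs from the sketch above; the left side is taken as the reference side by assumption:

```python
def single_screening(pairs):
    """For each first laser point (the reference side), keep only the pair
    with the smallest first distance among all pairs it belongs to.

    `pairs` is a list of (first_distance, left_point, right_point).
    """
    best = {}
    for d, p, q in pairs:
        if p not in best or d < best[p][0]:
            best[p] = (d, p, q)
    return list(best.values())
```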
As an alternative embodiment of the present application, step A1c may also be implemented by the following embodiment. In this embodiment, the first laser point pair is screened from the second laser point pair in a multi-screening manner. Referring to fig. 8, fig. 8 shows a specific schematic flowchart in a robot control method provided by the present application, including:
step A1ca, based on the first distance, selecting a plurality of third laser point pairs from the second laser point pairs.
Step A1ca is the first of the multiple screenings, that is, screening third laser point pairs out of the second laser point pairs. This can be done in two ways: first, take the second laser point pairs whose first distance does not exceed a preset value as the third laser point pairs; second, from all the second laser point pairs to which a single first laser point belongs, select only the pair with the smallest first distance, and take the selected pair as a third laser point pair.
As an optional embodiment of the present application, step A1ca may also be implemented as follows. This example is the first screening in the multiple-screening mode, and proceeds as follows: mark the laser points on one side of the first area as first laser points and the laser points on the other side as second laser points; from all the second laser point pairs to which a single first laser point belongs, select only the pair with the smallest first distance, and take the selected pair as a third laser point pair. The screening process of this embodiment is the same as that of the embodiment above, to which reference is made; it is not repeated here.
Step A1cb, selecting a fourth laser point pair with the smallest first distance from the plurality of third laser point pairs.
Step A1cb through step A1cd are the second of the multiple screens.
It can be understood that the width of the second area is determined by the laser point pair with the smallest distance. Therefore, in this embodiment, the fourth laser point pair with the smallest first distance is selected from the plurality of third laser point pairs.
As an optional embodiment of the present application, the width of the passing area must be greater than the width of the robot body. Therefore, this embodiment determines whether the first distance of the fourth laser point pair is greater than a first threshold; if it is, step A1cc is performed. If the first distance of the fourth laser point pair is less than or equal to the first threshold, a laser point pair whose first distance is greater than the first threshold and differs from it by the smallest amount is screened out of the plurality of third laser point pairs, and the distance range is determined based on the first distance of that pair and a preset value.
And step A1cc, determining a screening range according to the first distance of the fourth laser point pair and a preset value.
The screening range may be obtained by adding a preset value to the first distance of the fourth laser point pair, or by multiplying that first distance by a preset factor.
Step A1cd, among the plurality of third laser point pairs, taking the laser point pairs whose first distance is within the screening range as the first laser point pairs.
Because of the unavoidable error of the laser radar, the fourth laser point pair does not absolutely have "the smallest first distance". To tolerate measurement errors and possible calculation errors, this embodiment uses the screening range to also admit laser point pairs whose first distance is slightly larger than that of the fourth laser point pair: among the plurality of third laser point pairs, the pairs whose first distance falls within the screening range are taken as the first laser point pairs.
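A brief sketch of this second screening pass; the multiplicative tolerance is an assumed choice of the "preset value":

```python
def multiple_screening(third_pairs, tolerance=0.05):
    """Find the fourth laser point pair (smallest first distance), widen
    its distance into a screening range to absorb lidar error, and keep
    every third pair whose first distance falls within that range.

    `third_pairs` is a list of (first_distance, left_point, right_point).
    """
    d_min = min(d for d, _, _ in third_pairs)
    upper = d_min * (1.0 + tolerance)   # screening range [d_min, upper]
    return [pair for pair in third_pairs if pair[0] <= upper]
```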
Step A2, calculating the total obstacle score of each first laser point pair; the total obstacle score is positively correlated with the number of obstacles between the two laser points of the first laser point pair.
There may be other laser points (i.e., obstacles) between the two laser points of a first laser point pair. Therefore, this embodiment selects the first laser point pair with the fewest other laser points between its two laser points by calculating the total obstacle score of each first laser point pair.
The total obstacle score may be the number of all laser points in the area formed by the first laser point pair, or the number of part of the laser points in that area.
As an optional embodiment of the present application, step A2 may also be implemented as follows. In this embodiment, the number of part of the laser points in the area formed by the first laser point pair is used as the total obstacle score. Within the operation of calculating the total obstacle score of each first laser point pair, the operation of calculating the total obstacle score of a single first laser point pair includes the following steps. Referring to fig. 9 and fig. 10, fig. 9 shows a detailed flowchart of a robot control method provided by the present application. To better explain the technical solution of this embodiment, fig. 10 is taken as an example.
Fig. 10 shows a schematic diagram of the perpendicular bisector translation provided by the present application. As shown in fig. 10, laser point C and laser point D lie on the two sides of the edge of the passing area. Connecting laser point C and laser point D yields a second line segment, from which a second perpendicular bisector 11 is drawn. Translation segments 12, 13, 14, 15, 16 and 17 are the line segments obtained by translating the second perpendicular bisector 11.
Taking fig. 10 as an example, the method includes:
and step A2a, connecting the positions of the two laser points in the first laser point pair to obtain a second line segment, and drawing a second perpendicular bisector of the second line segment.
As in fig. 10, a second line segment is obtained by connecting laser point C and laser point D, and a second perpendicular bisector 11 of the second line segment is drawn.
And step A2b, translating the second perpendicular bisector equidistantly on the second line segment to obtain a plurality of translation line segments parallel to the second perpendicular bisector.
The second perpendicular bisector 11 is translated by a preset distance to obtain translation segments 12, 13, 14, 15, 16 and 17. A translation segment may be unbounded or of finite length. When a translation segment has a finite length, the position of the robot may be taken as one end, and a point at a preset length from the second line segment in the advancing direction as the other end.
And step A2c, counting the total number of laser points contained in all the translation line segments in the first area.
And counting the number of laser points on each translation line segment, and taking the number of the laser points on all the translation line segments as the total number of the laser points.
Step A2d, screening out adjacent translation segments that each contain 0 laser points, and counting the first number of such segments.
That is, count the number of consecutive translation segments whose laser point count is 0.
And step A2e, determining the total obstacle value of the first laser point pair according to the total number of the laser points and the first number.
The robot can obtain the total obstacle score of the first laser point pair directly by a simple mathematical operation on the total number of laser points and the first number. The total obstacle score of the first laser point pair may also be calculated from a preset weight for the total number of laser points, a preset weight for the first number, the total number of laser points, and the first number.
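A brief sketch of steps A2a through A2d for one first laser point pair; the step length, the lateral tolerance for deciding that a point lies "on" a translation segment, and the finite segment length are all assumed parameters, and the "first number" is read here as the longest run of adjacent empty segments:

```python
import math

def obstacle_counts(pair, cloud, step=0.05, tol=0.025, ahead=1.5):
    """Translate the perpendicular bisector of the pair's line segment
    across that segment, count the laser points lying on each translated
    segment, and find the longest run of adjacent segments with 0 points.
    """
    (cx, cy), (dx, dy) = pair
    ux, uy = dx - cx, dy - cy
    seg_len = math.hypot(ux, uy)
    ux, uy = ux / seg_len, uy / seg_len      # unit vector along C-D
    nx, ny = -uy, ux                         # direction of the bisector
    counts = []
    for i in range(int(seg_len / step) + 1):
        fx, fy = cx + ux * i * step, cy + uy * i * step   # segment foot
        c = 0
        for px, py in cloud:
            along = (px - fx) * nx + (py - fy) * ny   # progress along segment
            across = (px - fx) * ux + (py - fy) * uy  # lateral offset from it
            if 0.0 <= along <= ahead and abs(across) <= tol:
                c += 1
        counts.append(c)
    total = sum(counts)                      # total number of laser points
    best = run = 0
    for c in counts:                         # first number: longest run of
        run = run + 1 if c == 0 else 0       # adjacent segments with 0 points
        best = max(best, run)
    return total, best
```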
As an alternative embodiment of the present application, step A2e may also be implemented by the following embodiment. Step A2e specifically includes the following steps. Referring to fig. 11, fig. 11 shows a specific schematic flowchart in a robot control method provided by the present application, including:
and step A2ea, obtaining the translation step length distance of equidistant translation, and obtaining the passing width according to the translation step length distance and the first number.
Since the spacing between adjacent translation segments is fixed, the translation step distance can be multiplied by the first number to obtain the passing width.
And step A2eb, determining the total obstacle value of the first laser point pair according to the total laser point number and the passing width.
The total obstacle score of the first laser point pair can be obtained directly by a simple mathematical operation on the total number of laser points and the passing width. It may also be calculated from a preset weight for the total number of laser points, a preset weight for the passing width, the total number of laser points, and the passing width.
As an alternative embodiment of the present application, step A2eb may also be implemented by the following embodiment. Referring to fig. 12, fig. 12 shows a specific schematic flowchart in a robot control method provided by the present application, including:
and step B1, determining a first obstacle value of the first laser point pair according to the total number of the laser points.
And taking the total number of the laser points as a first barrier score of the first laser point pair.
And step B2, determining a second obstacle value of the first laser point pair according to the passing width.
The width threshold is subtracted from the passing width to obtain the second obstacle score of the first laser point pair.
Step B3, calculating the total obstacle score of the first laser point pair according to the following formula:
E = a × C - b × (d - D)
where E represents the total obstacle score of the first laser point pair, a and b represent preset parameters, C represents the first obstacle score, (d - D) represents the second obstacle score, d represents the passing width, and D represents the width threshold.
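This is a direct transcription of the formula; the parameter values below are assumptions for illustration:

```python
def total_obstacle_score(total_points, passing_width,
                         a=1.0, b=1.0, width_threshold=0.5):
    """E = a*C - b*(d - D), where C is the first obstacle score (the total
    number of laser points), d the passing width and D the width threshold.
    The values of a, b and width_threshold are assumed, not from the patent.
    """
    C = total_points
    return a * C - b * (passing_width - width_threshold)
```

A lower E thus favors pairs with few obstructing laser points and a wide passage, consistent with step A3 selecting the pair with the lowest total obstacle score.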
Steps A2a to B3 are executed in turn for each first laser point pair, yielding a plurality of total obstacle scores.
Step A3, screening out the first laser point pair with the lowest total obstacle score, and taking the two laser points of the screened pair as the second scanning information associated with the second area.
The robot takes the two laser points of the first laser point pair with the lowest total obstacle score as the second scanning information associated with the second area. The second scanning information includes, but is not limited to, any one or a combination of the distance, coordinate values, and serial numbers of the laser points.
And step 1022, determining a traveling path of the robot in the second area according to the second scanning information.
The robot may calculate the travel path in the second area from the line segment formed by the positions of the two laser points in the second scanning information, for example by taking the perpendicular bisector of that segment as the travel path. The robot may also calculate the travel path from the triangle formed by the positions of the two laser points and the position of the robot, for example by calculating a median line of the triangle and taking the median line as the travel path.
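A brief sketch of the triangle variant; heading toward the midpoint of the two laser points is exactly the median drawn from the robot's vertex (an interpretation, since the patent does not specify which median is meant):

```python
import math

def median_heading(robot_pos, p, q):
    """Heading along the median of the triangle (robot, p, q) drawn from
    the robot's vertex, i.e. toward the midpoint of laser points p and q.
    """
    mx, my = (p[0] + q[0]) / 2.0, (p[1] + q[1]) / 2.0
    return math.atan2(my - robot_pos[1], mx - robot_pos[0])
```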
As an optional embodiment of the present application, step 1022 may also be implemented as follows. Referring to fig. 13, fig. 13 shows a detailed flowchart of a robot control method provided by the present application, including:
step C1, the positions of the two laser points in the second scanning information in the first area are obtained.
And obtaining the positions of the two laser points in the first area according to the coordinate values of the two laser points in the second scanning information.
And step C2, determining the travel path of the robot in the second area according to the acquired positions of the two laser points.
The robot determines its travel path in the second area from the line segment formed by the positions of the two laser points. The travel path may be the perpendicular bisector of that segment, or a straight line at a similar angle to the perpendicular bisector.
As an alternative embodiment of the present application, step C2 can also be implemented by the following embodiments. Referring to fig. 14, fig. 14 shows a specific schematic flowchart in a robot control method provided by the present application, including:
step C2a, connecting the positions of the two laser points to obtain a first line segment;
and step C2b, drawing a first perpendicular bisector of the first line segment, and determining the travel path of the robot in the second area according to the first perpendicular bisector.
The first perpendicular bisector is taken as the travel path of the robot in the second area.
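A brief sketch of step C2b; of the two opposite directions perpendicular to the segment, the one consistent with the robot's advancing direction is assumed to be chosen, and the look-ahead length is an assumed parameter:

```python
import math

def bisector_path(p, q, length=2.0):
    """Travel path along the first perpendicular bisector of the first
    line segment: start at the midpoint of p-q and head perpendicular
    to the segment; returns the start and end points of the path.
    """
    mx, my = (p[0] + q[0]) / 2.0, (p[1] + q[1]) / 2.0
    ux, uy = q[0] - p[0], q[1] - p[1]
    norm = math.hypot(ux, uy)
    nx, ny = -uy / norm, ux / norm   # one of the two unit normals; pick the
                                     # sign matching the advancing direction
    return (mx, my), (mx + nx * length, my + ny * length)
```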
Step 104, controlling the robot to travel based on the travel path.
To better explain the technical solution of the present application, please refer to fig. 15 and fig. 16, each of which shows a schematic diagram of a travel path provided by the present application. As shown in fig. 15 and fig. 16, the robot obtains the travel path from the perpendicular bisector of the line segment formed by the positions of the laser points in the second scanning information, and escapes along that travel path.
In this embodiment, laser scanning is performed on the first area where the robot is located to obtain first scanning information; a second area in which the robot can travel and a corresponding travel path are determined from within the first area based on the first scanning information; and the robot is controlled to travel based on the travel path. Compared with the traditional escape method, this scheme does not require the robot to make repeated trial-and-error attempts: the first scanning information is obtained through a single scan, the travel path within the second area is derived from the first scanning information, and the robot escapes from its trapped position by following the travel path, thereby improving the escape efficiency of the robot.
Fig. 17 shows a schematic diagram of a robot control apparatus provided by the present application, which includes:
the scanning unit 171 is configured to perform laser scanning on a first area where the robot is located to obtain first scanning information;
a first determining unit 172, configured to determine, based on the first scanning information, a second area in which the robot can travel from within the first area;
a second determining unit 173, configured to determine the travel path of the robot in the second area according to the first scanning information;
a control unit 174, configured to control the robot to travel based on the travel path.
With the above robot control apparatus, laser scanning is performed on the first area where the robot is located to obtain first scanning information; a second area in which the robot can travel and a corresponding travel path are determined from within the first area based on the first scanning information; and the robot is controlled to travel based on the travel path. Compared with the traditional escape method, this scheme does not require the robot to make repeated trial-and-error attempts: the first scanning information is obtained through a single scan, the travel path within the second area is derived from the first scanning information, and the robot escapes from its trapped position by following the travel path, thereby improving the escape efficiency of the robot.
It should be understood that, the sequence numbers of the steps in the foregoing embodiments do not imply an execution sequence, and the execution sequence of each process should be determined by its function and inherent logic, and should not constitute any limitation to the implementation process of the embodiments of the present application.
Fig. 18 is a schematic diagram of a robot according to an embodiment of the present application. As shown in fig. 18, the robot 18 of this embodiment includes: a scanning radar 181, a motion module 182, a processor 183, a memory 184, and a computer program 185 stored in the memory 184 and executable on the processor 183, for example a program for the robot control method. The processor 183, when executing the computer program 185, implements the steps in each of the robot control method embodiments described above, such as steps 101 to 104 shown in fig. 1. Alternatively, when executing the computer program 185, the processor 183 implements the functions of the units in the apparatus embodiments described above, for example the functions of the units 171 to 174 shown in fig. 17.
Illustratively, the computer program 185 may be divided into one or more units, which are stored in the memory 184 and executed by the processor 183 to complete the present application. The one or more units may be a series of computer program instruction segments capable of performing specific functions, the instruction segments describing the execution of the computer program 185 in the robot 18. For example, the computer program 185 may be divided into the following units, each with the specific functions below:
the scanning unit is used for carrying out laser scanning on a first area where the robot is located to obtain first scanning information;
a determining unit, configured to determine, based on the first scanning information, a second area where the robot can travel and a corresponding travel path from within the first area;
and the control unit is used for controlling the robot to travel based on the travel path.
The robot may include, but is not limited to, a processor 183 and a memory 184. Those skilled in the art will appreciate that fig. 18 is merely an example of the robot 18 and does not limit the robot 18; the robot may include more or fewer components than shown, a combination of certain components, or different components. For example, the robot may also include input/output devices, network access devices, buses, and the like.
The scanning radar 181 may be a laser radar, an infrared radar, or the like. The laser radar can be a single line radar or a multi-line radar.
The motion module 182 is used to control the robot motion.
The Processor 183 may be a Central Processing Unit (CPU), other general purpose Processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other Programmable logic device, discrete Gate or transistor logic, discrete hardware components, etc. A general purpose processor may be a microprocessor or the processor may be any conventional processor or the like.
The memory 184 may be an internal storage unit of the robot 18, such as a hard disk or memory of the robot 18. The memory 184 may also be an external storage device of the robot 18, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) card, or a Flash Card provided on the robot 18. Further, the memory 184 may include both an internal storage unit and an external storage device of the robot 18. The memory 184 is used to store the computer program and the other programs and data required by the robot, and may also be used to temporarily store data that has been output or is to be output.
It should be noted that, for the information interaction, execution process, and other contents between the above-mentioned devices/units, the specific functions and technical effects thereof are based on the same concept as those of the embodiment of the method of the present application, and specific reference may be made to the part of the embodiment of the method, which is not described herein again.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-mentioned division of the functional units and modules is illustrated, and in practical applications, the above-mentioned function distribution may be performed by different functional units and modules according to needs, that is, the internal structure of the apparatus is divided into different functional units or modules to perform all or part of the above-mentioned functions. Each functional unit and module in the embodiments may be integrated in one processing unit, or each unit may exist alone physically, or two or more units are integrated in one unit, and the integrated unit may be implemented in a form of hardware, or in a form of software functional unit. In addition, specific names of the functional units and modules are only for convenience of distinguishing from each other, and are not used for limiting the protection scope of the present application. The specific working processes of the units and modules in the system may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
The embodiments of the present application further provide a computer-readable storage medium, where a computer program is stored, and when the computer program is executed by a processor, the computer program implements the steps in the above-mentioned method embodiments.
The embodiments of the present application provide a computer program product, which when running on a mobile terminal, enables the mobile terminal to implement the steps in the above method embodiments when executed.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on such understanding, all or part of the processes in the methods of the embodiments described above can be implemented by a computer program, which can be stored in a computer-readable storage medium and, when executed by a processor, implements the steps of the method embodiments described above. The computer program comprises computer program code, which may be in the form of source code, object code, an executable file, or some intermediate form. The computer-readable medium may include at least: any entity or device capable of carrying the computer program code to the photographing device/robot, a recording medium, a computer memory, a read-only memory (ROM), a random access memory (RAM), an electrical carrier signal, a telecommunication signal, and a software distribution medium, such as a USB flash disk, a removable hard disk, a magnetic disk, or an optical disk. In certain jurisdictions, computer-readable media may not include electrical carrier signals or telecommunication signals, in accordance with legislation and patent practice.
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and reference may be made to the related descriptions of other embodiments for parts that are not described or illustrated in a certain embodiment.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus/network device and method may be implemented in other ways. For example, the above-described apparatus/network device embodiments are merely illustrative, and for example, the division of the modules or units is only one logical division, and there may be other divisions when actually implementing, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not implemented. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units.
It should also be understood that the term "and/or" as used in this specification and the appended claims refers to and includes any and all possible combinations of one or more of the associated listed items.
As used in this specification and the appended claims, the term "if" may be interpreted contextually as "when", "upon" or "in response to" determining "or" in response to monitoring ". Similarly, the phrase "if it is determined" or "if [ a described condition or event ] is monitored" may be interpreted depending on the context to mean "upon determining" or "in response to determining" or "upon monitoring [ a described condition or event ]" or "in response to monitoring [ a described condition or event ]".
Furthermore, in the description of the present application and the appended claims, the terms "first," "second," "third," and the like are used for distinguishing between descriptions and not necessarily for describing or implying relative importance.
Reference throughout this specification to "one embodiment" or "some embodiments," or the like, means that a particular feature, structure, or characteristic described in connection with the embodiment is included in one or more embodiments of the present application. Thus, appearances of the phrases "in one embodiment," "in some embodiments," "in other embodiments," or the like, in various places throughout this specification are not necessarily all referring to the same embodiment, but rather "one or more but not all embodiments" unless specifically stated otherwise. The terms "comprising," "including," "having," and variations thereof mean "including, but not limited to," unless expressly specified otherwise.
The above-mentioned embodiments are only used for illustrating the technical solutions of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications and substitutions do not substantially depart from the spirit and scope of the embodiments of the present application and are intended to be included within the scope of the present application.

Claims (12)

1. A robot control method, characterized in that the method comprises:
performing laser scanning on a first area where the robot is located to obtain first scanning information;
determining, from within the first area based on the first scanning information, a second area in which the robot can travel and a corresponding travel path;
and controlling the robot to travel based on the travel path.
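For orientation only, the three claimed steps can be read as a simple scan-plan-drive loop. The sketch below is illustrative rather than the patent's implementation; the decomposition into scan, plan, and drive callables is an assumption:

```python
def control_robot(scan, plan, drive):
    """One pass of the claimed method, with the three steps injected
    as callables (hypothetical stand-ins, not from the patent text)."""
    first_scan = scan()                    # laser-scan the first area
    second_area, path = plan(first_scan)   # pick the drivable second area and travel path
    drive(path)                            # control the robot to travel along the path
```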
2. The method of claim 1, wherein determining, from within the first area based on the first scanning information, the second area in which the robot can travel and the corresponding travel path comprises:
screening out, from the first scanning information, second scanning information associated with the second area;
and determining the travel path of the robot within the second area according to the second scanning information.
3. The method of claim 2, wherein the first scanning information includes laser points on different sides of a travel direction, and wherein screening out, from the first scanning information, the second scanning information associated with the second area comprises:
combining the laser points on different sides of the travel direction in pairs to obtain a plurality of first laser point pairs, wherein each first laser point pair includes two laser points on different sides of the travel direction;
calculating a total obstacle score for each first laser point pair, wherein the total obstacle score is positively correlated with the number of obstacles between the two laser points in the first laser point pair;
and screening out the first laser point pair with the lowest total obstacle score, and taking the two laser points in the screened first laser point pair as the second scanning information associated with the second area.
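One plausible reading of claim 3, with the scoring of claims 9 to 11 injected as a callable. The exhaustive left-right pairing and the helper names are assumptions for illustration:

```python
from itertools import product

def second_scan_info(left_points, right_points, total_obstacle_score):
    """Pair every laser point on one side of the travel direction with
    every point on the other side, then keep the pair whose total
    obstacle score is lowest; its two points form the second scanning
    information."""
    pairs = product(left_points, right_points)
    return min(pairs, key=lambda pair: total_obstacle_score(*pair))
```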
4. The method of claim 3, wherein determining the travel path of the robot within the second area according to the second scanning information comprises:
acquiring the positions, within the first area, of the two laser points in the second scanning information;
connecting the positions of the two laser points to obtain a first line segment;
and drawing a first perpendicular bisector of the first line segment, and determining the travel path of the robot within the second area according to the first perpendicular bisector.
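The geometry of claim 4 reduces to the midpoint of the first line segment plus the direction of its perpendicular bisector. A minimal sketch under that reading, assuming 2-D point tuples:

```python
import math

def travel_path(p1, p2):
    """Return the midpoint of the segment joining the two laser points
    and the unit direction of its perpendicular bisector; the travel
    path follows this line through the gap."""
    (x1, y1), (x2, y2) = p1, p2
    midpoint = ((x1 + x2) / 2.0, (y1 + y2) / 2.0)
    dx, dy = x2 - x1, y2 - y1
    length = math.hypot(dx, dy)
    heading = (-dy / length, dx / length)  # segment direction rotated 90 degrees
    return midpoint, heading
```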
5. The method of claim 3, wherein combining the laser points on different sides of the travel direction in pairs to obtain the plurality of first laser point pairs comprises:
combining the laser points on different sides of the travel direction in pairs to obtain a plurality of second laser point pairs, wherein each second laser point pair includes two laser points on different sides of the travel direction;
calculating a first distance between the two laser points in each second laser point pair;
and screening out the plurality of first laser point pairs from the second laser point pairs based on the first distance.
6. The method of claim 5, wherein the laser points on one side of the travel direction are marked as first laser points and the laser points on the other side of the travel direction are marked as second laser points; and screening out the plurality of first laser point pairs from the second laser point pairs based on the first distance comprises:
screening out, from all the second laser point pairs to which a single first laser point belongs, only the laser point pair with the smallest first distance, and taking the screened laser point pair as a first laser point pair.
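Claims 5 and 6 together amount to a nearest-neighbour filter: among all candidate pairings of a given first laser point, only the closest second laser point survives. A sketch under that reading (math.dist requires Python 3.8+):

```python
import math

def first_pairs_nearest(first_points, second_points):
    """For each first laser point, keep only the pairing with the
    smallest first distance, per the screening of claims 5-6."""
    return [
        (p, min(second_points, key=lambda q: math.dist(p, q)))
        for p in first_points
    ]
```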
7. The method of claim 5, wherein screening out the plurality of first laser point pairs from the second laser point pairs based on the first distance comprises:
screening out a plurality of third laser point pairs from the second laser point pairs based on the first distance;
screening out, from the plurality of third laser point pairs, a fourth laser point pair with the smallest first distance;
determining a screening range according to the first distance of the fourth laser point pair and a preset value;
and taking, among the plurality of third laser point pairs, the laser point pairs whose first distances fall within the screening range as the first laser point pairs.
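Claim 7 then widens the filter: instead of a single winner, it keeps every third laser point pair whose gap lies within a band around the narrowest gap. The sketch below assumes the screening range is [d_min, d_min + preset]; the claim itself only says the range is derived from the fourth pair's first distance and a preset value:

```python
import math

def first_pairs_in_range(third_pairs, preset):
    """Keep the third laser point pairs whose first distance falls
    within the assumed screening range [d_min, d_min + preset]."""
    d_min = min(math.dist(p, q) for p, q in third_pairs)  # fourth pair's distance
    return [(p, q) for p, q in third_pairs
            if math.dist(p, q) <= d_min + preset]
```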
8. The method of claim 7, wherein the laser points on one side of the travel direction are marked as first laser points and the laser points on the other side of the travel direction are marked as second laser points; and screening out the plurality of third laser point pairs from the second laser point pairs based on the first distance comprises:
screening out, from all the second laser point pairs to which a single first laser point belongs, only the laser point pair with the smallest first distance, and taking the screened laser point pair as a third laser point pair.
9. The method of claim 3, wherein, in calculating the total obstacle score for each first laser point pair, calculating the total obstacle score for a single first laser point pair comprises:
connecting the positions of the two laser points in the first laser point pair to obtain a second line segment, and drawing a second perpendicular bisector of the second line segment;
translating the second perpendicular bisector along the second line segment at equal intervals to obtain a plurality of translation line segments parallel to the second perpendicular bisector;
counting the total number of laser points in the first area contained on all the translation line segments;
screening out, from the translation line segments, a plurality of adjacent translation line segments each containing zero laser points, and counting a first number of the screened line segments;
and determining the total obstacle score of the first laser point pair according to the total number of laser points and the first number.
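Claim 9's scoring can be pictured as slicing the corridor between the two points: slide the perpendicular bisector along the second line segment in equal steps, count the laser points falling into each slice, and measure the empty stretch. The sketch below buckets points by their projection onto the segment; reading the "first number" as the longest run of adjacent empty slices is an assumption (claim 10 converts it into a passing width):

```python
import math

def slice_counts(p, q, scan_points, step):
    """Bucket scan points by their projection onto segment p-q; each
    bucket stands for one translation line segment."""
    (x1, y1), (x2, y2) = p, q
    length = math.dist(p, q)
    ux, uy = (x2 - x1) / length, (y2 - y1) / length  # unit vector along p->q
    counts = [0] * max(1, int(length / step))
    for (x, y) in scan_points:
        t = (x - x1) * ux + (y - y1) * uy  # signed distance along the segment
        if 0.0 <= t < length:
            counts[min(int(t / step), len(counts) - 1)] += 1
    return counts

def total_and_first_number(counts):
    """Total laser points over all slices, plus the longest run of
    adjacent empty slices (the assumed 'first number')."""
    best = run = 0
    for c in counts:
        run = run + 1 if c == 0 else 0
        best = max(best, run)
    return sum(counts), best
```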
10. The method of claim 9, wherein determining the total obstacle score of the first laser point pair according to the total number of laser points and the first number comprises:
acquiring the translation step distance of the equidistant translation, and obtaining a passing width according to the translation step distance and the first number;
and determining the total obstacle score of the first laser point pair according to the total number of laser points and the passing width.
11. The method of claim 10, wherein determining the total obstacle score of the first laser point pair according to the total number of laser points and the passing width comprises:
determining a first obstacle score of the first laser point pair according to the total number of laser points;
determining a second obstacle score of the first laser point pair according to the passing width;
and calculating the total obstacle score of the first laser point pair according to the following formula:
E = a × C − b × (d − D)
where E represents the total obstacle score of the first laser point pair, a and b represent preset parameters, C represents the first obstacle score, (d − D) represents the second obstacle score, d represents the passing width, and D represents a width threshold.
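Putting claims 10 and 11 together: the passing width converts the empty-slice run back into distance, and the two scores combine linearly. A numeric sketch with illustrative values for a, b, and the width threshold D (none are specified in the patent):

```python
def total_obstacle_score(total_points, first_number, step,
                         a=1.0, b=1.0, width_threshold=0.6):
    """E = a*C - b*(d - D): fewer laser points in the corridor and a
    wider passing width both lower the score."""
    C = total_points               # first obstacle score, read as the point count
    d = step * first_number        # passing width per claim 10
    return a * C - b * (d - width_threshold)
```

With a = b = 1, an empty corridor 1.0 m wide against a 0.6 m threshold scores −0.4, so it beats any pair whose corridor contains laser points; the lowest-scoring pair is the one selected in claim 3.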
12. A robot, comprising a scanning radar, a motion module, a memory, a processor, and a computer program stored in the memory and executable on the processor, characterized in that the processor, when executing the computer program, implements the steps of the method according to any one of claims 1 to 11.
CN202011176294.5A 2020-10-28 2020-10-28 Robot control method and robot Pending CN112306061A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011176294.5A CN112306061A (en) 2020-10-28 2020-10-28 Robot control method and robot

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011176294.5A CN112306061A (en) 2020-10-28 2020-10-28 Robot control method and robot

Publications (1)

Publication Number Publication Date
CN112306061A true CN112306061A (en) 2021-02-02

Family

ID=74330429

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011176294.5A Pending CN112306061A (en) 2020-10-28 2020-10-28 Robot control method and robot

Country Status (1)

Country Link
CN (1) CN112306061A (en)

Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2000121730A (en) * 1998-10-19 2000-04-28 Honda Motor Co Ltd Obstacle detecting device for vehicle
US20140052296A1 (en) * 2012-08-16 2014-02-20 Samsung Techwin Co., Ltd. Robot system and method for driving the same
CN203950203U (en) * 2014-06-11 2014-11-19 江苏数字鹰科技发展有限公司 Can be at the unmanned plane of disturbance in judgement thing distance at night
KR20170018674A (en) * 2015-08-10 2017-02-20 한국과학기술연구원 Vehicle where shadow region of laser distance sensor is eliminated
CN108303089A (en) * 2017-12-08 2018-07-20 浙江国自机器人技术有限公司 Based on three-dimensional laser around barrier method
CN108318895A (en) * 2017-12-19 2018-07-24 深圳市海梁科技有限公司 Obstacle recognition method, device and terminal device for automatic driving vehicle
CN108375373A (en) * 2018-01-30 2018-08-07 深圳市同川科技有限公司 Robot and its air navigation aid, navigation device
WO2018218680A1 (en) * 2017-06-02 2018-12-06 华为技术有限公司 Obstacle detection method and device
CN109884656A (en) * 2017-12-06 2019-06-14 北京万集科技股份有限公司 For realizing the laser radar and distance measuring method of scanning field of view subregion
CN110377038A (en) * 2019-07-15 2019-10-25 深圳优地科技有限公司 A kind of robot evacuation running method, device and robot
CN111707279A (en) * 2020-05-19 2020-09-25 上海有个机器人有限公司 Matching evaluation method, medium, terminal and device of laser point cloud and map

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Zhang Yongbo; Li Bijun; Chen Cheng: "Application of Laser Point Clouds in Driverless Path Detection" (激光点云在无人驾驶路径检测中的应用), Bulletin of Surveying and Mapping (测绘通报), no. 11 *

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113768420A (en) * 2021-09-18 2021-12-10 安克创新科技股份有限公司 Sweeper and control method and device thereof

Similar Documents

Publication Publication Date Title
EP3650884A2 (en) Method and apparatus for determining relative pose, device and medium
WO2022105764A1 (en) Goods storage method and apparatus, and robot, warehousing system and storage medium
US11815602B2 (en) Grid occupancy mapping using error range distribution
US20230286750A1 (en) Method and device for storing goods, robot, warehousing system and storage medium
CN112513679B (en) Target identification method and device
EP3699890A2 (en) Information processing method and apparatus, and storage medium
US10838424B2 (en) Charging station identifying method and robot
CN110815202B (en) Obstacle detection method and device
CN110850859A (en) Robot and obstacle avoidance method and obstacle avoidance system thereof
US11719799B2 (en) Method for determining a collision free space
CN112306061A (en) Robot control method and robot
CN113561179B (en) Robot control method, robot control device, robot, storage medium, and program product
CN111505652A (en) Map establishing method, device and operation equipment
CN105760023A (en) Scanning method and device for infrared emitting diode touch screen
CN111275087A (en) Data processing method and device, electronic equipment and motor vehicle
CN112731355B (en) Method, device, terminal and medium for calculating laser radar installation angle deviation
CN117289300A (en) Point cloud correction method, laser radar and robot
CN116501070A (en) Recharging method, robot and storage medium
CN116872194A (en) Robot control method and device, readable storage medium and robot
CN113768420B (en) Sweeper and control method and device thereof
CN115177186A (en) Sweeping method, sweeping device, sweeping robot and computer readable storage medium
JP7265027B2 (en) Processing device and point cloud reduction method
CN114779207A (en) Noise data identification method, device and storage medium
CN110728288B (en) Corner feature extraction method based on three-dimensional laser point cloud and application thereof
CN114049393A (en) Robot map scanning method, device, equipment and medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination