CN117140536B - Robot control method and device and robot - Google Patents

Robot control method and device and robot

Info

Publication number
CN117140536B
CN117140536B (granted publication of application CN202311417891.6A)
Authority
CN
China
Prior art keywords
robot
yaw angle
missing
laser radar
detection
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202311417891.6A
Other languages
Chinese (zh)
Other versions
CN117140536A (en)
Inventor
陶永
张宇帆
高赫
刘海涛
薛蛟
万嘉昊
段练
韩栋明
陈硕
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beihang University
Original Assignee
Beihang University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beihang University filed Critical Beihang University
Priority to CN202311417891.6A priority Critical patent/CN117140536B/en
Publication of CN117140536A publication Critical patent/CN117140536A/en
Application granted granted Critical
Publication of CN117140536B publication Critical patent/CN117140536B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B25: HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J: MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00: Programme-controlled manipulators
    • B25J9/16: Programme controls
    • B25J9/1602: Programme controls characterised by the control system, structure, architecture
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B25: HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J: MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J19/00: Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Automation & Control Theory (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The application provides a robot control method, a robot control device and a robot. In the robot control method, when a missing portion is detected in the detection data sequence of the first laser radar, it is determined that the robot has encountered a cliff. The yaw angle of the robot is then calculated periodically from the first missing sequence currently absent from the first laser radar and the second missing sequence currently absent from the lateral laser radar, the lateral laser radar being the third laser radar or the fourth laser radar; the yaw angle represents the included angle between the detection edge on which the detection domain of the first laser radar lies and the edge of the cliff. When the yaw angle converges, the robot is controlled to stop moving forward; the robot is then controlled to rotate in place according to the yaw angle until the yaw angle equals a specified value, and the robot is controlled to stop moving. The robot control method, robot control device and robot can realize edge detection and accurate alignment on special-shaped structural members.

Description

Robot control method and device and robot
Technical Field
The present disclosure relates to the field of robot control technologies, and in particular, to a robot control method, a device, and a robot.
Background
Currently, infrastructure such as bridges, tunnels and oil platforms keeps growing in scale, which brings new challenges to maintenance work. To further improve operation quality and efficiency, intelligent operation-and-maintenance schemes based on equipment such as unmanned aerial vehicles, magnetic-adsorption robots and negative-pressure-adsorption robots are receiving attention.
As new maintenance tools cutting into a maintenance process that is still mainly manual, such intelligent equipment must be reliable and easy to use. Tall and oversized structures pose new challenges to that reliability and usability.
In the scenario of a magnetic-adsorption robot inspecting bridge steel structures, the robot mainly undertakes tasks such as turn-over inspection, anti-falling early warning, and defect identification and repair. In key areas of the main structure, such as bolt areas and shallow edges, edge recognition and alignment control are difficult, operations are time-consuming, and the real-time safety of the task is easily affected. It is therefore necessary to study a robot control method that realizes edge detection and accurate alignment on special-shaped structural members.
Disclosure of Invention
In view of the above, the present application provides a robot control method, apparatus and robot to realize edge detection and precise alignment of a special-shaped structural member.
Specifically, the application is realized by the following technical scheme:
the first aspect of the present application provides a robot control method, where the method is applied to a robot, a first laser radar is arranged at a front part of the robot, a second laser radar is arranged at a rear part of the robot, a third laser radar is arranged at a left part of the robot, a fourth laser radar is arranged at a right part of the robot, and a detection domain of the first laser radar, a detection domain of the second laser radar, a detection domain of the third laser radar, and a detection domain of the fourth laser radar enclose a whole detection domain which is a rectangular detection domain; the method comprises the following steps:
determining that the robot encounters a cliff when the absence of the detection data sequence of the first lidar is detected;
according to the first missing sequence of the first laser radar which is missing currently and the second missing sequence of the lateral laser radar which is missing currently, the yaw angle of the robot is calculated periodically; wherein the lateral lidar is the third lidar or the fourth lidar; the yaw angle represents an included angle between a detection edge where a detection domain of the first laser radar is located and the edge of the cliff;
Controlling the robot to stop moving forward when the yaw angle converges;
and controlling the robot to rotate in situ according to the yaw angle until the yaw angle is equal to a specified value, and controlling the robot to stop moving.
The second aspect of the present application provides a robot control device, where the device is applied to a robot, a first laser radar is arranged at a front part of the robot, a second laser radar is arranged at a rear part of the robot, a third laser radar is arranged at a left part of the robot, a fourth laser radar is arranged at a right part of the robot, and an overall detection domain enclosed by the detection domains of the first laser radar, the second laser radar, the third laser radar and the fourth laser radar is a rectangular detection domain; the device comprises: the device comprises a detection module, a calculation module and a control module; wherein,
the detection module is used for determining that the first laser radar meets a cliff when detecting that a detection data sequence of the first laser radar is missing;
the calculation module is used for periodically calculating the yaw angle of the robot according to the first missing sequence of the first laser radar which is currently missing and the second missing sequence of the lateral laser radar which is currently missing; wherein the lateral lidar is the third lidar or the fourth lidar; the yaw angle represents an included angle between a detection edge where a detection domain of the first laser radar is located and the edge of the cliff;
The control module is used for controlling the robot to stop moving forwards when the yaw angle converges;
the control module is also used for periodically controlling the robot to rotate in situ according to the yaw angle until the yaw angle is equal to a specified value, and controlling the robot to stop moving.
A third aspect of the present application provides a robot comprising a memory, a processor and a computer program stored on the memory and executable on the processor, the processor implementing the steps of any of the methods provided in the first aspect of the present application when the program is executed.
A fourth aspect of the present application provides a computer readable storage medium having stored thereon a computer program which when executed by a processor performs the steps of any of the methods provided in the first aspect of the present application.
According to the robot control method, the robot control device and the robot provided by the application, laser radars are arranged at the front, rear, left and right of the robot so that the overall detection domain enclosed by their detection domains is a rectangular detection domain. During operation, the detection data sequences of the laser radars are monitored; when the detection data sequence of the first laser radar is found to be missing, it is determined that the robot has encountered a cliff. The yaw angle of the robot is then calculated periodically from the first missing sequence of the first laser radar and the second missing sequence of the lateral laser radar; when the yaw angle converges, the robot is controlled to stop moving forward, and the robot is then controlled to rotate in place according to the yaw angle until the yaw angle equals a specified value, at which point the robot is controlled to stop moving. Here the lateral laser radar is the third laser radar or the fourth laser radar, and the yaw angle represents the included angle between the detection edge on which the detection domain of the first laser radar lies and the edge of the cliff. In this way, whether the robot has met a cliff can be monitored in real time through the lidar detection data sequences, and when it has, the robot is brought to a stop with its front facing the cliff, realizing edge detection and accurate alignment on special-shaped structural members and keeping the robot reliable and easy to use on tall structural members.
Drawings
Fig. 1 is a flowchart of a first embodiment of a robot control method provided in the present application;
FIG. 2 is a top view of a robot shown in an exemplary embodiment of the present application;
FIG. 3 is a schematic diagram illustrating calibration of a lidar according to an exemplary embodiment of the present application;
FIG. 4 is a schematic diagram of detection of a lidar according to an exemplary embodiment of the present application;
FIG. 5 is a schematic view of a robot walking plane shown in an exemplary embodiment of the present application;
fig. 6 is a flowchart of a second embodiment of a robot control method provided in the present application;
fig. 7 is a flowchart of a third embodiment of a robot control method provided in the present application;
FIG. 8 is a flow chart illustrating one example embodiment of the present application for calculating an expected angular velocity at a next time;
FIG. 9 is a schematic diagram of a fuzzy subset membership function according to an exemplary embodiment of the present application;
FIG. 10 is a control schematic of a robot shown in an exemplary embodiment of the present application;
FIG. 11 is a graph showing yaw angle as a function of time according to an exemplary embodiment of the present application;
FIG. 12 is a schematic diagram of an implementation of alignment control shown in an exemplary embodiment of the present application;
FIG. 13 is a graph showing yaw angle and angular velocity as a function of time, according to an exemplary embodiment of the present application;
Fig. 14 is a hardware configuration diagram of a robot in which the robot control device provided in the present application is located;
fig. 15 is a schematic structural diagram of a first embodiment of a robot control device provided in the present application.
Detailed Description
Reference will now be made in detail to exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, the same numbers in different drawings refer to the same or similar elements, unless otherwise indicated. The implementations described in the following exemplary examples are not representative of all implementations consistent with the present application.
The terminology used in the present application is for the purpose of describing particular embodiments only and is not intended to be limiting of the present application. As used in this application, the singular forms "a", "an" and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It should also be understood that the term "and/or" as used herein refers to and encompasses any or all possible combinations of one or more of the associated listed items.
It should be understood that although the terms first, second, third, etc. may be used herein to describe various information, the information should not be limited by these terms. These terms are only used to distinguish information of the same type from one another. For example, first information may also be referred to as second information and, similarly, second information may also be referred to as first information without departing from the scope of the present application. Depending on the context, the word "if" as used herein may be interpreted as "when", "upon", or "in response to determining".
The application provides a robot control method, a robot control device and a robot, which are used for realizing edge detection and accurate alignment of special-shaped structural parts.
Specific examples are given below to describe the technical solutions of the present application in detail.
Fig. 1 is a flowchart of a first embodiment of a robot control method provided in the present application. Referring to fig. 1, the method provided in this embodiment is applied to a robot, and includes:
and S101, when detecting that the detection data sequence of the first laser radar is missing, determining that the robot encounters a cliff.
Specifically, the method provided in this embodiment is applied to a robot, which may be a magnetic-adsorption robot. The method suits a robot multi-surface inspection scenario: how the robot recognizes the edge of the cliff and then turns over to the other surface while facing the cliff is an important part of robot control in that scenario. The control method provided by this embodiment realizes edge detection and automatic alignment, so that the robot can cross to the other surface in the direction facing the cliff.
Before describing the method provided by the application, a simple description is given to the robot:
specifically, fig. 2 is a top view of a robot according to an exemplary embodiment of the present application. Referring to fig. 2, a first laser radar is disposed at the front of the robot, a second laser radar is disposed at the rear of the robot, a third laser radar is disposed at the left of the robot, and a fourth laser radar is disposed at the right of the robot.
Specifically, the lidar senses the environment around the robot by emitting a laser beam and receiving the reflected laser beam.
Further, after the lidars are calibrated, the detection domain of the first laser radar, the detection domain of the second laser radar, the detection domain of the third laser radar and the detection domain of the fourth laser radar enclose an overall detection domain that is a rectangular detection domain.
Specifically, the detection field of the lidar refers to the range of the area that the lidar can effectively detect. The rectangular detection domain preserves a safety margin in the robot's forward and backward travel directions while discarding one in the left and right lateral directions, which matches the differential-drive operating characteristics of the robot.
The following describes the laser radar calibration process in a simple way:
specifically, when the laser radars are calibrated, firstly, the arrangement pose of each laser radar is acquired, and the arrangement pose comprises the arrangement heights and angles of the first laser radar, the second laser radar, the third laser radar and the fourth laser radar; secondly, determining the relative position coordinates of each laser radar; finally, specific values of the calibration parameters are determined (in this embodiment, the field angle and the angle resolution of each lidar are mainly determined).
It should be noted that the first laser radar and the second laser radar are symmetrically arranged, and the third laser radar and the fourth laser radar are symmetrically arranged.
Specifically, the third and fourth lidars are arranged on the horizontal centerline of the robot (e.g., the third lidar is arranged at the center point of the robot's left side), and their arrangement height h_L is known. Further, the third and fourth lidars are mounted vertically, i.e., their arrangement included angle with the horizontal plane is 90°, while the arrangement included angle between the first lidar and the horizontal plane is θ_F (known). Further, the field angle and the angular resolution of the first lidar take default values.
The calibration process of the lidar is described below taking the first and third lidars as examples. Fig. 3 is a schematic diagram illustrating calibration of a lidar according to an exemplary embodiment of the present application. Specifically, referring to fig. 3, the arrangement angle and arrangement height of the first lidar must satisfy the envelope of the safety area and define the guard zones of the forward and backward directions (in the left and right directions the domain is tight against the two sides of the robot); the arrangement height can be calculated according to the following formula:

h_F = l_F · tan(θ_F)

where l_F is the horizontal distance of the travel-direction alert zone; h_F is the arrangement (vertical) height of the first lidar; and θ_F is the arrangement included angle between the first lidar and the horizontal plane.
Further, for the third lidar, the field angle may be calculated according to the following formula:

θ_left = 2 · arctan((l_0 + h_F · cot(θ_F)) / h_L)

where h_F is the arrangement height of the first lidar; h_L is the arrangement height of the third lidar; θ_F is the arrangement included angle between the first lidar and the horizontal plane; l_0 is half the width of the robot chassis (the third lidar is arranged at the center in the width direction); and θ_left is the field angle of the third lidar (in fig. 3, θ_right is the field angle of the fourth lidar).
Thus, the calibration of the laser radar can be completed, and after the calibration is completed, the plurality of laser radar detection domains can be converged into a closed rectangular detection domain; the rectangular detection domain reserves the safety margin of the front and back advancing directions of the robot, discards the safety margin of the left and right traversing directions, and is more in line with the differential speed operation characteristics of the robot.
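The mounting-height relation above can be checked numerically. A minimal sketch, assuming a degree-based interface (the function name is illustrative and the relation h_F = l_F · tan(θ_F) is reconstructed from the variable definitions in the description, not quoted from the patent):

```python
import math

def first_lidar_height(l_f, theta_f_deg):
    """Mounting height h_F for the forward lidar, assuming its center ray
    must land at the forward guard-zone distance l_f when the lidar is
    tilted theta_f_deg degrees from the horizontal: h_F = l_f * tan(theta_f).
    (Reconstructed relation; names are illustrative.)"""
    return l_f * math.tan(math.radians(theta_f_deg))
```

With a 45° tilt the height equals the guard-zone distance, which gives a quick sanity check on the units; steeper tilts require a higher mount for the same guard distance.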
The method provided in this application is described in detail below:
specifically, fig. 4 is a schematic diagram of detection by a lidar according to an exemplary embodiment of the present application. Referring to fig. 4, once the angular resolution of the lidar is fixed, rays are emitted at fixed, equally divided angular increments; in the example shown in fig. 4, i laser beams are emitted. Accordingly, if no cliff is encountered, all of the laser beams return, and the number of data points contained in the detection data sequence equals the number of emitted laser beams.
Further, the absence of the detection data sequence refers to the situation that some data points in the detection data sequence cannot be acquired, that is, the number of actually returned detection data sequence points is smaller than the number of laser beams emitted by the laser radar.
Specifically, when the robot detects that the number of data points contained in the detection data sequence returned by the first laser radar is smaller than the number of laser beams emitted by the laser radar, the robot is determined to encounter the cliff.
In combination with the above example: if the first lidar emits 200 laser beams, the returned detection data sequence should contain 200 data points. If at some moment the actually returned sequence contains only 199 data points, the detection data sequence is missing a point, and it is determined that the robot has encountered the cliff.
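The cliff test above reduces to comparing the returned point count against the emitted beam count, and the sequence numbers of the lost points are what later feed the yaw-angle calculation. A sketch with hypothetical function names (the patent does not name these operations):

```python
def encounters_cliff(emitted_beams, returned_points):
    """A cliff is declared when fewer data points return than beams were emitted."""
    return len(returned_points) < emitted_beams

def missing_sequence(emitted_beams, returned_indices):
    """Sequence numbers of the beams whose data points did not return."""
    return sorted(set(range(emitted_beams)) - set(returned_indices))

# 200 beams emitted, the point with sequence number 57 never returns:
returned = [i for i in range(200) if i != 57]
```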
S102, periodically calculating the yaw angle of the robot according to the first missing sequence of the first laser radar which is currently missing and the second missing sequence of the lateral laser radar which is currently missing; wherein the lateral lidar is the third lidar or the fourth lidar; and the yaw angle represents an included angle between a detection edge where the detection domain of the first laser radar is located and the edge of the cliff.
Specifically, the lateral laser radar refers to a third laser radar or a fourth laser radar, and the yaw angle refers to an included angle between a detection edge where a detection domain of the first laser radar is located and an edge of the cliff.
Specifically, when the robot encounters the cliff, first, the corner point of the detection data sequence of the first lidar disappears first, further, as the robot continues to move, the detection data sequence of the lateral lidar also gradually disappears.
Specifically, when calculating the yaw angle of the robot, the yaw angle may be calculated periodically, where the period is set according to the actual requirement, and in this embodiment, this is not limited. For example, in one embodiment, the period is 1/30 seconds.
For example, in one embodiment, fig. 5 is a schematic view of a robot walking plane according to an exemplary embodiment of the present application. Referring to fig. 5, in the example shown in fig. 5, the robot travels in the direction indicated by the arrow, at a certain moment, the robot encounters a cliff, the detection data sequence corresponding to the lidar is missing, referring to the diagram corresponding to the right yaw in fig. 5, the line segment BC is the first missing sequence of the first lidar missing, the line segment AB is the second missing sequence of the lateral lidar missing, and the angle BCA is the yaw angle.
The specific calculation principle of the yaw angle will be described in detail in the following embodiments, and will not be described here.
And S103, controlling the robot to stop moving forwards when the yaw angle converges.
Specifically, yaw angle convergence means that the yaw angle is maintained around a certain value, i.e., the amount of change in the calculated yaw angle is smaller than a preset value in consecutive n periods. Where n is set according to practical needs, for example, in one embodiment, n is 10. In addition, the preset value is set according to actual needs, and in this embodiment, specific values of the preset value are not limited. For example, the preset value is 0.05.
In combination with the above example, for example, in one embodiment, if the yaw angles calculated for 10 consecutive periods are all around 15 °, that is, the change amounts of the yaw angles calculated for 10 consecutive periods are all smaller than 0.05, the robot is controlled to stop moving forward.
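The convergence rule described above (change below a preset value for n consecutive periods) can be kept in a small rolling-window helper. A sketch: n = 10 and eps = 0.05 follow the example values in the text, while the class and method names are assumptions:

```python
from collections import deque

class YawConvergenceDetector:
    """Declares convergence once the change between successive yaw samples
    has stayed below `eps` for `n` consecutive periods."""
    def __init__(self, n=10, eps=0.05):
        self.n, self.eps = n, eps
        self.deltas = deque(maxlen=n)  # most recent |yaw change| values
        self.prev = None

    def update(self, yaw):
        if self.prev is not None:
            self.deltas.append(abs(yaw - self.prev))
        self.prev = yaw
        # converged only when the window is full and every change is small
        return len(self.deltas) == self.n and all(d < self.eps for d in self.deltas)
```

Feeding it yaw samples hovering around 15° (changes of 0.02° per period) makes it report convergence after the window fills, matching the worked example.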
S104, controlling the robot to rotate in situ according to the yaw angle until the yaw angle is equal to a specified value, and controlling the robot to stop moving.
Specifically, the specified value is 90 °, which characterizes the direction of advance of the robot perpendicular to the edge of the cliff, i.e. the robot will turn over the cliff with the front face facing the cliff.
For example, the robot may be controlled to rotate in place at a certain angular velocity, and the yaw angle is periodically calculated during the rotation of the robot, and the robot is controlled to stop moving when the yaw angle is equal to a specified value at a certain time. Further, referring to the description of the application scenario, after the robot stops moving, the robot can be controlled to turn over the cliff to reach the other side.
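The rotate-and-stop step above can be sketched as a closed loop against the robot's sensing and actuation interfaces. SimRobot is a toy stand-in plant used only to make the sketch runnable (the real robot would supply the yaw measurement and angular-velocity command), and the fixed 0.2 rad/s rate and 0.5° tolerance are illustrative assumptions:

```python
import math

class SimRobot:
    """Toy stand-in for the real robot: integrates the commanded angular
    velocity into a yaw angle once per control period (dt = 1/30 s)."""
    def __init__(self, yaw_deg=15.0, dt=1.0 / 30.0):
        self.yaw_deg, self.omega, self.dt = yaw_deg, 0.0, dt

    def read_yaw_deg(self):
        self.yaw_deg += math.degrees(self.omega) * self.dt  # integrate rotation
        return self.yaw_deg

    def set_angular_velocity(self, omega):
        self.omega = omega

def align_to_cliff(read_yaw_deg, set_angular_velocity, target=90.0, tol=0.5):
    """Rotate in place until the yaw angle reaches the specified value
    (90 degrees, i.e. the front faces the cliff), then stop."""
    while True:
        yaw = read_yaw_deg()
        if abs(yaw - target) <= tol:
            set_angular_velocity(0.0)  # control the robot to stop moving
            return yaw
        # spin at a fixed rate, signed toward the target
        set_angular_velocity(math.copysign(0.2, target - yaw))
```

Because the per-period step (about 0.38° at 0.2 rad/s) is smaller than the tolerance window, the loop is guaranteed to land inside it and stop.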
According to the robot control method provided by this embodiment, laser radars are arranged at the front, rear, left and right of the robot so that the overall detection domain enclosed by their detection domains is a rectangular detection domain. During operation, the detection data sequences of the laser radars are monitored; when the detection data sequence of the first laser radar is found to be missing, it is determined that the robot has encountered a cliff. The yaw angle of the robot is then calculated periodically from the first missing sequence of the first laser radar and the second missing sequence of the lateral laser radar; when the yaw angle converges, the robot is controlled to stop moving forward, and the robot is then controlled to rotate in place according to the yaw angle until the yaw angle equals a specified value, at which point the robot is controlled to stop moving. Here the lateral laser radar is the third laser radar or the fourth laser radar, and the yaw angle represents the included angle between the detection edge on which the detection domain of the first laser radar lies and the edge of the cliff. In this way, whether the robot has met a cliff can be monitored in real time through the lidar detection data sequences, and when it has, the robot is brought to a stop with its front facing the cliff, realizing edge detection and accurate alignment on special-shaped structural members and keeping the robot reliable and easy to use on tall structural members.
Fig. 6 is a flowchart of a second embodiment of a robot control method provided in the present application. Referring to fig. 6, in the method provided in this embodiment, the calculating, according to the first missing sequence of the current missing of the first lidar and the second missing sequence of the current missing of the lateral lidar, a yaw angle of the robot includes:
s601, calculating a first length of a first missing part according to a sequence number corresponding to the first missing sequence; wherein the first missing portion is a portion of the first lidar that is missing on a forward detection edge of the rectangular detection domain.
Specifically, the specific implementation process of this step may include the following steps:
(1) And for any pair of adjacent two sequence points in the missing part, calculating the detection length of two adjacent laser beams corresponding to the two adjacent sequence points according to the sequence numbers of the two adjacent sequence points.
Specifically, with continued reference to fig. 4, the robot operating environment is simplified here to bounded horizontal and vertical planes with no inclined component. Referring to fig. 4, the detection length of the two adjacent laser beams corresponding to two adjacent sequence points is calculated according to the following formula:

Δl_i = d · [tan((i+1)·α − θ/2) − tan(i·α − θ/2)]

where Δl_i is the detection length of the two adjacent laser beams corresponding to the i-th sequence point and the (i+1)-th sequence point; d is the distance return value of the center ray of the lidar; α is the angular resolution of the lidar; and θ is the field angle of the lidar, so that the (i+1)-th ray lies at the angle (i+1)·α within the fan.
(2) And calculating the length of the missing part according to all the calculated detection lengths.
Specifically, the length of the missing portion is equal to the sum of all the calculated detection lengths; in other words, the length of the missing portion can be calculated according to the following formula:

L = Σ_{i=p}^{q−1} Δl_i

where p is the initial sequence number of the missing portion; q is the terminating sequence number of the missing portion; and L is the length of the missing portion.
In connection with the above example, in the example shown in fig. 5, BC is the first missing portion.
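The per-beam detection lengths and their sum can be sketched as follows. The flat-plane geometric form used here (ray angles measured from the center ray, range d taken from the center ray) is a reconstruction consistent with the variable definitions, not the patent's typeset formula, and the function names are illustrative:

```python
import math

def beam_gap(i, d, alpha, theta):
    """Footprint spacing on a flat plane between rays i and i+1 of a fan
    with field angle theta and angular resolution alpha; d is the range
    returned by the center ray. (Reconstructed geometric form.)"""
    a0 = i * alpha - theta / 2.0        # ray i, measured from the center ray
    a1 = (i + 1) * alpha - theta / 2.0  # ray i+1
    return d * (math.tan(a1) - math.tan(a0))

def missing_length(p, q, d, alpha, theta):
    """Length of the missing portion spanning sequence numbers p..q:
    the sum of the gaps of all adjacent ray pairs in [p, q)."""
    return sum(beam_gap(i, d, alpha, theta) for i in range(p, q))
```

Because the per-pair gaps telescope, the sum equals d · (tan(q·α − θ/2) − tan(p·α − θ/2)), which is a convenient cross-check on an implementation.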
S602, calculating a second length of a second missing part according to the sequence number corresponding to the second missing sequence; wherein the second missing portion is a portion of the lateral lidar that is missing on a lateral detection edge corresponding to the rectangular detection domain.
Specifically, the specific implementation procedure and implementation principle of this step are similar to S601, and will not be described here again.
In combination with the above example, in the example shown in fig. 5, AB is the second missing portion.
S603, calculating the yaw angle according to the first length and the second length.
Specifically, the yaw angle is calculated according to the following formula:

ρ = arctan(L1/L2)

wherein ρ is the yaw angle; L1 is the first length; L2 is the second length.
In combination with the above example, in the example shown on the left in FIG. 5, the yaw angle is ρ, obtained from the lengths of the missing portions BC and AB.
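A sketch of S603. The patent's formula image is not reproduced, so the exact form is an assumption: a right-triangle reading of FIG. 5 suggests ρ = arctan(L1/L2), which tends toward 90° as the forward edge becomes fully missing and the lateral loss shrinks (robot facing the cliff):

```python
import math

def yaw_angle(l1, l2):
    """Yaw angle rho (degrees) from the first missing length L1 (forward
    detection edge) and the second missing length L2 (lateral detection edge).
    atan2 keeps the result well defined even when L2 approaches zero."""
    return math.degrees(math.atan2(l1, l2))
```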
according to the robot control method provided by the embodiment, the first length of the first missing part is calculated according to the sequence number corresponding to the first missing sequence, the second length of the second missing part is calculated according to the sequence number corresponding to the second missing sequence, and the yaw angle is calculated according to the first length and the second length. Wherein the first missing portion is a portion of the first lidar that is missing on a forward detection edge of the rectangular detection domain; the second missing portion is a portion of the lateral lidar that corresponds to a missing portion on a lateral detection edge on the rectangular detection domain. In this way, the yaw angle can be calculated through the missing part detected by the laser radar, and the robot is further controlled to rotate in situ based on the yaw angle, and finally the front surface of the robot faces the cliff.
Fig. 7 is a flowchart of a third embodiment of a robot control method provided in the present application. Referring to fig. 7, the method provided in this embodiment, based on the above embodiment, includes:
S701, calculating the expected angular velocity of the robot at the next moment according to the first yaw angle calculated in the current period, the second yaw angle calculated in the period before the current period, the third yaw angle calculated in the period before that, and the actual angular velocity of the robot at the current moment.
Specifically, fig. 8 is a flowchart illustrating a method for calculating an expected angular velocity at a next time according to an exemplary embodiment of the present application. Referring to fig. 8, the step of calculating the expected angular velocity of the robot at the next moment may include:
s801, inputting the first yaw angle and the actual angular velocity to a fuzzy controller in a motion controller, and determining a proportional parameter increment, an integral parameter increment and a differential parameter increment of a PID controller in the motion controller by the fuzzy controller according to the first yaw angle and the actual angular velocity by using a preset fuzzy rule.
The motion controller referred to herein includes a fuzzy controller and a PID controller.
In specific implementation, the first yaw angle and the actual angular velocity are input to a motion controller, a fuzzy controller in the motion controller fuzzifies the first yaw angle and the actual angular velocity, and fuzzy reasoning is carried out by using a preset fuzzy rule to obtain a proportional parameter increment, an integral parameter increment and a differential parameter increment.
Further, the preset fuzzy rule is set according to actual needs and is not limited in this embodiment. For example, in one embodiment, the fuzzy rule may be set according to the following principles:
(1) When the first yaw angle is large and the actual angular velocity is small, the proportional parameter k_p should be set large to speed up the response; further, to avoid saturation of the integral term, the integral parameter k_i should be set small, and the differential parameter k_d should also be set small. When the first yaw angle is large and the angular velocity is also large, the proportional parameter k_p and the integral parameter k_i should remain unchanged so that the yaw angle still approaches the specified value quickly, while the differential parameter k_d should be increased substantially to prevent a large overshoot.

(2) When the first yaw angle is medium: if the actual angular velocity is also medium, the proportional parameter k_p and the integral parameter k_i should remain unchanged, while the differential parameter k_d, whose value strongly affects the system at this stage, should be set to a moderate value; if the actual angular velocity is large, the differential parameter k_d should be increased to avoid excessive overshoot; if the actual angular velocity is small, the proportional parameter k_p should be increased slightly so that the yaw angle reaches the specified value quickly, while the differential parameter k_d is reduced to keep the system stable.

(3) If the current yaw angle is already close to the specified value but the actual angular velocity is still large, the proportional parameter k_p should be decreased rapidly to slow the rotation toward the specified value, and the differential parameter k_d should be increased to prevent the yaw angle from overshooting the specified value too far; if the actual angular velocity is already small, no further adjustment is necessary.
In combination with the above principles, Table 1 shows the fuzzification principle of an exemplary embodiment of the present application, FIG. 9 is a schematic diagram of the fuzzy subset membership functions, and Table 2 is the preset fuzzy rule table.
TABLE 1 principle of fuzzification
TABLE 2 Fuzzy rule table for ΔK_p
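Since Table 2 itself is not reproduced here, the following sketch only illustrates the mechanism of a fuzzy rule lookup for the proportional-parameter increment; the rule entries, labels, and thresholds are invented for illustration and are not the patent's Table 2:

```python
# Hypothetical fuzzy-rule table for the proportional-parameter increment.
# Inputs are fuzzified to three labels: S (small), M (medium), B (big).
RULES_DKP = {
    ("B", "S"): +0.4,   # yaw angle big, rotation slow  -> raise kp strongly
    ("B", "B"):  0.0,   # yaw angle big, already fast   -> keep kp
    ("M", "S"): +0.1,   # medium angle, slow            -> slight push
    ("M", "M"):  0.0,
    ("M", "B"): -0.1,
    ("S", "S"):  0.0,   # near target and slow          -> no adjustment
    ("S", "B"): -0.4,   # near target but still fast    -> cut kp quickly
}

def fuzzify(value, small, big):
    """Crisp three-level fuzzifier: 'S' below `small`, 'B' above `big`, else 'M'."""
    return "S" if value < small else ("B" if value > big else "M")

def delta_kp(yaw_angle_deg, angular_velocity):
    """Increment for the proportional parameter, looked up from the rule table.
    The thresholds (10/45 degrees, 0.2/1.0 rad/s) are illustrative only."""
    e = fuzzify(abs(yaw_angle_deg), 10.0, 45.0)      # fuzzified yaw angle
    ec = fuzzify(abs(angular_velocity), 0.2, 1.0)    # fuzzified angular velocity
    return RULES_DKP.get((e, ec), 0.0)
```

A full fuzzy controller would use overlapping membership functions (as in FIG. 9) and defuzzification rather than this crisp lookup; the table-driven structure is the point being illustrated.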
S802, updating the proportional parameter, the integral parameter and the differential parameter of the PID controller according to the proportional parameter increment, the integral parameter increment and the differential parameter increment.
Specifically, the proportional parameter of the PID controller in the motion controller is updated to the current proportional parameter plus the proportional parameter increment, the integral parameter is updated to the current integral parameter plus the integral parameter increment, and the differential parameter is updated to the current differential parameter plus the differential parameter increment. In other words, the parameters are updated according to the following formulas:

k_p(j+1) = k_pj + ΔK_p
k_i(j+1) = k_ij + ΔK_i
k_d(j+1) = k_dj + ΔK_d

wherein k_p(j+1) is the value of the proportional parameter at time j+1; k_pj is the value of the proportional parameter at time j; ΔK_p is the proportional parameter increment; k_i(j+1) is the value of the integral parameter at time j+1; k_ij is the value of the integral parameter at time j; ΔK_i is the integral parameter increment; k_d(j+1) is the value of the differential parameter at time j+1; k_dj is the value of the differential parameter at time j; and ΔK_d is the differential parameter increment.
In combination with the above example, suppose the proportional parameter of the PID controller at time 0 is K_p0, the integral parameter is K_i0, and the differential parameter is K_d0, and the increments are ΔK_p, ΔK_i, and ΔK_d; after updating, the proportional parameter of the PID controller at time 1 is K_p0 + ΔK_p, the integral parameter is K_i0 + ΔK_i, and the differential parameter is K_d0 + ΔK_d.
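The update in S802 is a plain element-wise addition; a minimal sketch (the parameter names are illustrative):

```python
def update_pid_gains(kp_j, ki_j, kd_j, d_kp, d_ki, d_kd):
    """PID gains at time j+1: each gain at time j plus its fuzzy-supplied
    increment (k_p(j+1) = k_pj + dK_p, and likewise for k_i and k_d)."""
    return kp_j + d_kp, ki_j + d_ki, kd_j + d_kd
```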
S803, inputting the first yaw angle, the second yaw angle, the third yaw angle, and the actual angular velocity to the PID controller, to calculate and output an expected angular velocity from the input data and the updated parameters by the PID controller.
The first yaw angle is the yaw angle calculated in the current period, the second yaw angle is the yaw angle calculated in the period before the current period, and the third yaw angle is the yaw angle calculated in the period before that. The expected angular velocity is the angular velocity at which the robot should rotate at the next moment.
Specifically, the expected angular velocity can be calculated according to the following formula:

ω(j) = ω(j−1) + k_pj·[α(j) − α(j−1)] + k_ij·α(j) + k_dj·[α(j) − 2α(j−1) + α(j−2)]

wherein ω(j) is the angular velocity for the current period; ω(j−1) is the angular velocity of the period before the current period; k_pj is the proportional parameter of the PID controller in the current period; k_ij is the integral parameter of the PID controller in the current period; k_dj is the differential parameter of the PID controller in the current period; α(j) is the yaw angle calculated in the current period, i.e., the first yaw angle; α(j−1) is the yaw angle calculated in the period before the current period, i.e., the second yaw angle; and α(j−2) is the yaw angle calculated in the period before that, i.e., the third yaw angle.
S702, controlling the robot to rotate in situ according to the expected angular speed.
S703, when the next period comes, judging whether the yaw angle calculated in the current period is equal to the specified value.
Specifically, the specified value is 90°, indicating that the advancing direction of the robot is perpendicular to the edge of the cliff.
S704, if not, repeating the step of calculating the expected angular velocity of the robot at the next moment until the yaw angle is equal to the specified value.
In combination with the above example, if the yaw angle calculated in the current period is, for example, 60°, which is not equal to 90°, the step of calculating the expected angular velocity for the next moment continues until the calculated yaw angle equals 90°; at that point the robot is controlled to stop moving, and is then made to roll over the cliff with its front facing the cliff.
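The S701–S704 loop can be sketched as follows. The robot-interface callbacks, the gains, and the use of the yaw-angle error (specified value minus measured yaw) inside the PID terms are all illustrative assumptions, not the patent's implementation:

```python
def align_to_cliff(read_yaw, command, specified=90.0, tol=0.5, max_iters=2000):
    """Edge-alignment loop (sketch): each period, compute an expected angular
    velocity with an incremental PID on the yaw-angle error and command an
    in-place rotation, until the yaw angle equals the specified value.

    read_yaw() returns the yaw angle in degrees; command(w) sets the robot's
    angular velocity. Both are hypothetical interface callbacks."""
    kp, ki, kd = 2.0, 0.2, 0.05                 # illustrative gains
    e_j1 = e_j2 = specified - read_yaw()        # seed the two past errors
    w = 0.0
    for _ in range(max_iters):
        e_j = specified - read_yaw()
        if abs(e_j) <= tol:                     # specified value reached
            command(0.0)                        # stop the robot
            return True
        # incremental PID: w(j) = w(j-1) + kp*de + ki*e + kd*d2e
        w += kp * (e_j - e_j1) + ki * e_j + kd * (e_j - 2.0 * e_j1 + e_j2)
        command(w)                              # rotate in place
        e_j1, e_j2 = e_j, e_j1
    return False
```

With a simple integrator plant (yaw increases by w·dt per period) these gains settle from 60° to within 0.5° of 90° well inside the iteration budget.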
The robot control method provided by this embodiment calculates the expected angular velocity of the robot at the next moment from the first yaw angle calculated in the current period, the second yaw angle calculated in the previous period, the third yaw angle calculated in the period before that, and the actual angular velocity at the current moment, and controls the robot to rotate in place at that expected angular velocity. When the next period arrives, it is judged whether the yaw angle calculated in the current period equals the specified value; if not, the step of calculating the expected angular velocity is repeated until it does. By continuously recalculating the yaw angle and the expected angular velocity until the yaw angle equals the specified value, the robot can be controlled to cross a cliff with its front facing the cliff, giving it reliability and usability when encountering high-rise structural members.
A more specific example is given below to illustrate the present solution in detail:
in order to verify the effect of the method provided by the embodiment, the embodiment establishes a robot and a test scene in Webots simulation software, and the simulation environment is consistent with the real environment and has the basic condition for carrying out the robot test.
First, calibration is performed. The purpose of calibration is to calibrate the envelope precision of the front, left, and right lidars, mainly by adjusting the field angle and resolution parameters of each lidar. The parameters after calibration are shown in Table 3:
table 3 parameters of lidar
Based on the simulation scenario described above, FIG. 10 is a control schematic diagram of a robot according to an exemplary embodiment of the present application. Referring to FIG. 10, the robot is controlled to advance: in sub-graph (a) of FIG. 10 the robot moves forward and no abnormality appears in its whole detection domain; when the robot reaches the situation shown in sub-graph (b), the detection data sequence of the first lidar is missing, and calculation of the yaw angle begins; while the yaw angle is being calculated the robot continues to move forward, and in sub-graphs (c), (d), and (e) the overlap between the whole detection domain and the cliff gradually grows while the calculated yaw angle gradually stabilizes (i.e., the yaw angle converges). The yaw angle calculated throughout this process is shown in FIG. 11 (a graph of yaw angle over time according to an exemplary embodiment of the present application).
Further, as can be seen from FIG. 11, during the periodic calculation the yaw angle is not fixed at the beginning and shows an overall trend of first decreasing continuously and then stabilizing. Analysis shows that the inter-frame variance is large during the decreasing phase and those values are unusable; after stabilization, the values can be used as input for the subsequent control. The statistical results for the calculated yaw angle are shown in Table 4:
table 4 statistical results of yaw angle
Further, as described above, after the yaw angle converges the robot is controlled to stop moving forward; at this point an edge-alignment operation (bringing the yaw angle to the specified value so that the front of the robot faces the edge of the cliff) must be performed before the subsequent jump operation (as described above, a jump to the other face). In this embodiment, alignment control therefore begins once the yaw angle has stabilized at 17°.
Specifically, in the process of alignment control, the robot is controlled to rotate in situ according to the yaw angle until the yaw angle is equal to a specified value, and the robot is controlled to stop moving.
Specifically, in this embodiment, experiments were performed under the same parameter conditions with the motion controller described herein (comprising the fuzzy controller and the PID controller) and with a conventional PID controller (involving only the PID controller).
Fig. 12 is a schematic diagram of an implementation of alignment control as illustrated in an exemplary embodiment of the present application.
Referring to FIG. 12, in sub-graph (a) the robot is controlled to rotate in the direction that reduces the yaw-angle deviation from the specified value; in sub-graph (c), under the conventional PID controller, the robot passes beyond the stable yaw-angle value, producing overshoot; in sub-graph (d) the overshoot reaches its maximum and the robot turns back in the direction of decreasing yaw angle; the oscillation repeats until, in sub-graph (f), the yaw angle reaches the specified value and the robot stops. The yaw angle and angular velocity calculated during this process are shown in FIG. 13 (a graph of yaw angle and angular velocity over time according to an exemplary embodiment of the present application, where (a) shows yaw angle over time and (b) shows angular velocity over time).
As can be seen from FIG. 13, the motion controller achieves smaller overshoot and shorter settling time than the conventional PID controller. Quantitatively, under the same parameter conditions the rise time of the motion controller is about 57% shorter than that of the conventional PID controller, reflecting the fast response of the system, and the settling time is about 47% shorter, meaning the system reaches steady state more quickly. This verifies that the motion controller of the present invention achieves faster and more stable speed adjustment than a conventional PID controller, so that the robot settles stably at the specified yaw angle. The remaining statistical indicators of the controllers are shown in Table 5:
Table 5 analysis of the performance of the controller
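For reference, the rise time and settling time quoted above can be computed from a sampled step response as follows; the 10%–90% rise-time and ±2% settling-band definitions are common conventions, assumed here since the patent does not state which definitions it uses:

```python
def rise_time(t, y, y_final, lo=0.1, hi=0.9):
    """Time for the response to climb from 10% to 90% of its final value."""
    t10 = next(ti for ti, yi in zip(t, y) if yi >= lo * y_final)
    t90 = next(ti for ti, yi in zip(t, y) if yi >= hi * y_final)
    return t90 - t10

def settling_time(t, y, y_final, band=0.02):
    """Earliest time after which the response stays within +/-2% of final."""
    for k in range(len(y)):
        if all(abs(yi - y_final) <= band * abs(y_final) for yi in y[k:]):
            return t[k]
    return None
```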
Corresponding to the embodiments of the robot control method described above, the present application also provides embodiments of the robot control device.
The embodiment of the robot control device can be applied to a robot. The apparatus embodiments may be implemented by software, or may be implemented by hardware or a combination of hardware and software. Taking software implementation as an example, the device in a logic sense is formed by reading corresponding computer program instructions in a nonvolatile memory into a memory through a processor of a robot where the device is located for operation. In terms of hardware, as shown in fig. 14, a hardware configuration diagram of a robot in which a robot control device is located is shown in fig. 14, and in addition to a processor, a memory, a network interface, and a nonvolatile memory shown in fig. 14, the robot in which the device is located in the embodiment generally includes other hardware according to an actual function of the robot control device, which is not described herein again.
Fig. 15 is a schematic structural diagram of a first embodiment of a robot control device provided in the present application. Referring to fig. 15, the apparatus provided in this embodiment is applied to a robot, where a first laser radar is disposed at a front portion of the robot, a second laser radar is disposed at a rear portion of the robot, a third laser radar is disposed at a left portion of the robot, a fourth laser radar is disposed at a right portion of the robot, and a detection domain of the first laser radar, a detection domain of the second laser radar, a detection domain of the third laser radar, and a detection domain of the fourth laser radar enclose a whole detection domain that is a rectangular detection domain; the device comprises: a detection module 1610, a calculation module 1620, and a control module 1630; wherein,
The detection module 1610 is configured to determine that the robot encounters a cliff when detecting that the detection data sequence of the first lidar is missing;
the calculating module 1620 is configured to periodically calculate a yaw angle of the robot according to the first missing sequence that the first lidar is currently missing and the second missing sequence that the lateral lidar is currently missing; wherein the lateral lidar is the third lidar or the fourth lidar; the yaw angle represents an included angle between a detection edge where a detection domain of the first laser radar is located and the edge of the cliff;
the control module 1630 is used for controlling the robot to stop moving forwards when the yaw angle converges;
the control module 1630 is further configured to periodically control the robot to rotate in place according to the yaw angle until the yaw angle is equal to a specified value, and control the robot to stop moving.
According to the robot control device, lidars are arranged at the front, rear, left, and right of the robot so that their detection domains enclose a rectangular whole detection domain. During operation, the detection data sequences of the lidars are monitored; when the detection data sequence of the first lidar is found to be missing, it is determined that the robot has encountered a cliff. The yaw angle of the robot is then calculated periodically from the first missing sequence of the first lidar and the second missing sequence of the lateral lidar, where the lateral lidar is the third lidar or the fourth lidar, and the yaw angle represents the included angle between the detection edge of the first lidar's detection domain and the edge of the cliff. When the yaw angle converges, the robot is controlled to stop moving forward, and is then rotated in place according to the yaw angle until the yaw angle equals the specified value, at which point the robot is stopped. In this way, whether the robot has met a cliff can be monitored in real time through the lidar detection data sequences, and when a cliff is met the robot is stopped with its front facing the cliff, enabling edge detection and accurate alignment for special-shaped structural members and giving the robot reliability and usability when encountering high-rise structural members.
Optionally, the control module 1630 is specifically configured to:
calculating the expected angular velocity of the robot at the next moment according to the first yaw angle calculated in the current period, the second yaw angle calculated in the period before the current period, the third yaw angle calculated in the period before that, and the actual angular velocity of the robot at the current moment;
controlling the robot to rotate in situ according to the expected angular speed;
judging whether the yaw angle calculated in the current period is equal to the specified value or not when the next period comes;
if not, repeating the step of calculating the expected angular velocity of the robot at the next moment until the yaw angle is equal to a specified value.
Optionally, the control module 1630 is specifically configured to:
inputting the first yaw angle and the actual angular velocity to a fuzzy controller in a motion controller, so that the fuzzy controller can determine the proportional parameter increment, the integral parameter increment and the differential parameter increment of a PID controller in the motion controller according to the first yaw angle and the actual angular velocity by utilizing a preset fuzzy rule;
updating the proportional parameter, the integral parameter and the differential parameter of the PID controller according to the proportional parameter increment, the integral parameter increment and the differential parameter increment;
The first yaw angle, the second yaw angle, the third yaw angle, and the actual angular velocity are input to the PID controller to calculate and output an expected angular velocity from the input data and the updated parameters by the PID controller.
Optionally, the calculating module 1620 is specifically configured to:
calculating a first length of a first missing part according to the sequence number corresponding to the first missing sequence; wherein the first missing portion is a portion of the first lidar that is missing on a forward detection edge of the rectangular detection domain;
calculating a second length of a second missing part according to the sequence number corresponding to the second missing sequence; wherein the second missing portion is a portion of the lateral lidar that is missing on a lateral detection edge corresponding to the rectangular detection domain;
and calculating the yaw angle according to the first length and the second length.
Optionally, the calculating module 1620 is specifically configured to: for any pair of adjacent two sequence points in the missing part, calculating the detection length of two adjacent laser beams corresponding to the two adjacent sequence points according to the sequence numbers of the two adjacent sequence points, and calculating the length of the missing part according to all the calculated detection lengths.
With continued reference to fig. 14, the present application further provides a robot comprising a memory, a processor and a computer program stored on the memory and executable on the processor, the processor implementing the steps of any of the methods provided in the first aspect of the present application when executing the program.
The present application also provides a computer readable storage medium having stored thereon a computer program which when executed by a processor performs the steps of any of the methods provided in the first aspect of the present application.
The implementation process of the functions and roles of each unit in the above device is specifically shown in the implementation process of the corresponding steps in the above method, and will not be described herein again.
For the device embodiments, reference is made to the description of the method embodiments for the relevant points, since they essentially correspond to the method embodiments. The apparatus embodiments described above are merely illustrative, wherein the elements illustrated as separate elements may or may not be physically separate, and the elements shown as elements may or may not be physical elements, may be located in one place, or may be distributed over a plurality of network elements. Some or all of the modules may be selected according to actual needs to achieve the purposes of the present application. Those of ordinary skill in the art will understand and implement the present invention without undue burden.
The foregoing description of the preferred embodiments of the present invention is not intended to limit the invention to the precise form disclosed, and any modifications, equivalents, improvements and alternatives falling within the spirit and principles of the present invention are intended to be included within the scope of the present invention.

Claims (9)

1. The robot control method is characterized in that the method is applied to a robot, a first laser radar is arranged at the front part of the robot, a second laser radar is arranged at the rear part of the robot, a third laser radar is arranged at the left part of the robot, a fourth laser radar is arranged at the right part of the robot, and the whole detection domains enclosed by the detection domains of the first laser radar, the second laser radar, the third laser radar and the fourth laser radar are rectangular detection domains; the method comprises the following steps:
determining that the robot encounters a cliff when the absence of the detection data sequence of the first lidar is detected;
according to the first missing sequence of the first laser radar which is missing currently and the second missing sequence of the lateral laser radar which is missing currently, the yaw angle of the robot is calculated periodically; wherein the lateral lidar is the third lidar or the fourth lidar; the yaw angle represents an included angle between a detection edge where a detection domain of the first laser radar is located and the edge of the cliff;
Controlling the robot to stop moving forward when the yaw angle converges;
controlling the robot to rotate in situ according to the yaw angle until the yaw angle is equal to a specified value, and controlling the robot to stop moving;
the controlling the robot to rotate in place according to the yaw angle includes:
calculating an expected angular velocity at the next moment of the robot according to the first yaw angle calculated in the current period, the second yaw angle calculated in the previous period of the current period, the third yaw angle calculated in the previous period of the previous period and the actual angular velocity at the current moment of the robot;
controlling the robot to rotate in situ according to the expected angular speed;
judging whether the yaw angle calculated in the current period is equal to the specified value or not when the next period comes;
if not, repeating the step of calculating the expected angular velocity of the robot at the next moment until the yaw angle is equal to a specified value;
the calculating the expected angular velocity of the robot at the next moment according to the first yaw angle calculated in the current period, the second yaw angle calculated in the previous period of the current period, the third yaw angle calculated in the previous period of the previous period, and the actual angular velocity of the robot at the current moment includes:
Inputting the first yaw angle and the actual angular velocity to a fuzzy controller in a motion controller, so that the fuzzy controller can determine the proportional parameter increment, the integral parameter increment and the differential parameter increment of a PID controller in the motion controller according to the first yaw angle and the actual angular velocity by utilizing a preset fuzzy rule;
updating the proportional parameter, the integral parameter and the differential parameter of the PID controller according to the proportional parameter increment, the integral parameter increment and the differential parameter increment;
the first yaw angle, the second yaw angle, the third yaw angle, and the actual angular velocity are input to the PID controller to calculate and output an expected angular velocity from the input data and the updated parameters by the PID controller.
2. The method of claim 1, wherein calculating a yaw angle of the robot from a first missing sequence that the first lidar is currently missing and a second missing sequence that the lateral lidar is currently missing comprises:
calculating a first length of a first missing part according to the sequence number corresponding to the first missing sequence; wherein the first missing portion is a portion of the first lidar that is missing on a forward detection edge of the rectangular detection domain;
Calculating a second length of a second missing part according to the sequence number corresponding to the second missing sequence; wherein the second missing portion is a portion of the lateral lidar that is missing on a lateral detection edge corresponding to the rectangular detection domain;
and calculating the yaw angle according to the first length and the second length.
3. The method of claim 2, wherein calculating the length of the missing portion based on the sequence number corresponding to the missing sequence comprises:
for any pair of adjacent two sequence points in the missing part, calculating the detection length of two adjacent laser beams corresponding to the two adjacent sequence points according to the sequence numbers of the two adjacent sequence points;
and calculating the length of the missing part according to all the calculated detection lengths.
4. A method according to claim 3, wherein calculating the detection lengths of the adjacent two laser beams corresponding to the adjacent two sequence points according to the sequence numbers of the adjacent two sequence points comprises:
and calculating the detection length of the two adjacent laser beams corresponding to the two adjacent sequence points according to the following formula:

l_i = d·[tan((i+1)α − θ/2) − tan(iα − θ/2)]

wherein l_i is the detection length of the two adjacent laser beams corresponding to the ith sequence point and the (i+1)th sequence point; d is the distance return value of the central ray of the lidar; α is the angular resolution of the lidar; and θ is the field angle of the lidar.
5. The robot control device is characterized in that the device is applied to a robot, a first laser radar is arranged at the front part of the robot, a second laser radar is arranged at the rear part of the robot, a third laser radar is arranged at the left part of the robot, a fourth laser radar is arranged at the right part of the robot, and the whole detection domains surrounded by the detection domains of the first laser radar, the second laser radar, the third laser radar and the fourth laser radar are rectangular detection domains; the device comprises: the device comprises a detection module, a calculation module and a control module; wherein,
the detection module is used for determining that the first laser radar has encountered a cliff upon detecting that the detection data sequence of the first laser radar is missing;
the calculation module is used for periodically calculating the yaw angle of the robot according to a first missing sequence currently missing from the first laser radar and a second missing sequence currently missing from the lateral laser radar; wherein the lateral laser radar is the third laser radar or the fourth laser radar; the yaw angle represents an included angle between the detection edge on which the detection domain of the first laser radar is located and the edge of the cliff;
The control module is used for controlling the robot to stop moving forwards when the yaw angle converges;
the control module is also used for periodically controlling the robot to rotate in situ according to the yaw angle until the yaw angle is equal to a specified value, and controlling the robot to stop moving;
the control module is specifically configured to calculate an expected angular velocity at a next moment of the robot according to a first yaw angle calculated in a current period, a second yaw angle calculated in a period previous to the current period, a third yaw angle calculated in a period previous to the previous period, and an actual angular velocity at a current moment of the robot;
the control module is also specifically used for controlling the robot to rotate in situ according to the expected angular speed;
the control module is further specifically configured to determine, when the next period arrives, whether the yaw angle calculated in the current period is equal to the specified value;
the control module is further specifically configured to repeatedly perform the step of calculating the expected angular velocity of the robot at the next moment when the yaw angle calculated in the current period is not equal to the specified value, until the yaw angle is equal to the specified value;
The computing module is specifically configured to input the first yaw angle and the actual angular velocity to a fuzzy controller in a motion controller, so that the fuzzy controller determines a proportional parameter increment, an integral parameter increment and a differential parameter increment of a PID controller in the motion controller according to the first yaw angle and the actual angular velocity by using a preset fuzzy rule;
the calculation module is further specifically configured to update a proportional parameter, an integral parameter and a derivative parameter of the PID controller according to the proportional parameter increment, the integral parameter increment and the derivative parameter increment;
the calculation module is further specifically configured to input the first yaw angle, the second yaw angle, the third yaw angle, and the actual angular velocity to the PID controller, so that the PID controller calculates and outputs an expected angular velocity according to the input data and the updated parameters.
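The motion controller of claim 5 (a fuzzy rule base tuning the gains of a PID controller that maps three successive yaw angles and the measured angular velocity to an expected angular velocity) can be sketched as follows. The single tuning rule and the assignment of the three yaw angles to incremental-PID terms are illustrative assumptions, not the patent's actual rule table:

```python
from dataclasses import dataclass

@dataclass
class FuzzyPID:
    kp: float
    ki: float
    kd: float

    def fuzzy_tune(self, yaw: float, omega: float) -> None:
        # Stand-in for the fuzzy controller: a real implementation maps
        # (yaw angle, angular velocity) through a preset rule table to
        # gain increments (dKp, dKi, dKd).  Here one illustrative rule
        # grows the gains in proportion to the yaw-angle magnitude.
        scale = min(abs(yaw), 1.0)
        self.kp += 0.01 * scale
        self.ki += 0.001 * scale
        self.kd += 0.005 * scale

    def expected_omega(self, yaw_k: float, yaw_k1: float, yaw_k2: float,
                       omega: float) -> float:
        # Incremental PID over the yaw-angle sequence (current period,
        # previous period, the period before that); the increment is
        # added to the measured angular velocity to give the expected
        # angular velocity for the next moment.
        du = (self.kp * (yaw_k - yaw_k1)
              + self.ki * yaw_k
              + self.kd * (yaw_k - 2.0 * yaw_k1 + yaw_k2))
        return omega + du
```

The incremental form needs only the last three yaw samples, which matches the claim's use of the current period and the two preceding periods without storing a full error history.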
6. The apparatus according to claim 5, wherein the calculating module is specifically configured to calculate a first length of the first missing portion according to a sequence number corresponding to the first missing sequence; wherein the first missing portion is a portion of the first lidar that is missing on a forward detection edge of the rectangular detection domain;
The calculating module is further specifically configured to calculate a second length of the second missing portion according to the sequence number corresponding to the second missing sequence; wherein the second missing portion is a portion of the lateral lidar that is missing on a lateral detection edge corresponding to the rectangular detection domain;
the calculating module is further specifically configured to calculate the yaw angle according to the first length and the second length.
7. The apparatus according to claim 6, wherein the calculating module is specifically configured to calculate, for any pair of two adjacent sequence points in the missing portion, detection lengths of two adjacent laser beams corresponding to the two adjacent sequence points according to sequence numbers of the two adjacent sequence points, and calculate, according to all the calculated detection lengths, a length of the missing portion.
8. A robot comprising a memory, a processor and a computer program stored on the memory and executable on the processor, characterized in that the processor implements the steps of the method according to any of claims 1-4 when the program is executed.
9. A computer readable storage medium, on which a computer program is stored, characterized in that the program, when being executed by a processor, implements the steps of the method according to any of claims 1-4.
CN202311417891.6A 2023-10-30 2023-10-30 Robot control method and device and robot Active CN117140536B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311417891.6A CN117140536B (en) 2023-10-30 2023-10-30 Robot control method and device and robot

Publications (2)

Publication Number Publication Date
CN117140536A CN117140536A (en) 2023-12-01
CN117140536B true CN117140536B (en) 2024-01-09

Family

ID=88910426

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311417891.6A Active CN117140536B (en) 2023-10-30 2023-10-30 Robot control method and device and robot

Country Status (1)

Country Link
CN (1) CN117140536B (en)

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH11222882A (en) * 1998-02-05 1999-08-17 Komatsu Ltd Dangerous zone monitoring device
KR100904769B1 (en) * 2008-08-01 2009-06-25 (주)다사로봇 Detecting device of obstacle and method thereof
KR20180024326A (en) * 2016-08-29 2018-03-08 엘지전자 주식회사 Moving Robot and controlling method
CN110968099A (en) * 2019-12-17 2020-04-07 小狗电器互联网科技(北京)股份有限公司 Robot trapped detection method and robot
WO2021213737A1 (en) * 2020-04-22 2021-10-28 Siemens Aktiengesellschaft Automatic navigation system for fire fighting robot
CN114114367A (en) * 2021-11-17 2022-03-01 中南大学 AGV outdoor positioning switching method, computer device and program product
JP2022039906A (en) * 2020-08-28 2022-03-10 中国計量大学 Multi-sensor combined calibration device and method
CN114237243A (en) * 2021-12-16 2022-03-25 杭州图灵视频科技有限公司 Anti-falling method and device for mobile robot, electronic equipment and storage medium
WO2022082843A1 (en) * 2020-10-19 2022-04-28 垒途智能教科技术研究院江苏有限公司 Multi-sensor integrated unmanned vehicle detection and obstacle avoidance system and obstacle avoidance method
CN114859380A (en) * 2021-05-14 2022-08-05 汤恩智能科技(苏州)有限公司 Cliff detection method, driving device and storage medium

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20220128995A1 (en) * 2020-10-22 2022-04-28 Waymo Llc Velocity estimation and object tracking for autonomous vehicle applications


Similar Documents

Publication Publication Date Title
US11921510B1 (en) Approach for consolidating observed vehicle trajectories into a single representative trajectory
US9499197B2 (en) System and method for vehicle steering control
US9625911B2 (en) System and method for avoiding obstacle for autonomous vehicle
US9927816B2 (en) System and method for operating a follower vehicle in a vehicle platoon
US10860035B2 (en) Travel history storage method, method for producing travel path model, method for estimating local position, and travel history storage device
US20200174482A1 (en) Online bidirectional trajectory planning method in state-time space, recording medium storing program for executing same, and computer program stored in recording medium for executing same
CN113741454B (en) Multi-agent path planning method and system based on search
CN112254727B (en) TEB-based path planning method and device
CN110789530B (en) Four-wheel independent steering-independent driving vehicle trajectory tracking method and system
US11415996B2 (en) Positioning system for a mobile unit, vehicle and method for operating a positioning system
CN112092825B (en) Lane keeping method based on machine learning
JP2018167939A (en) Controlling system of crane and controlling method of the same
CN110609494A (en) ASRV-based anti-congestion simulation control method and system for multiple four-way shuttles on planned path
CN117140536B (en) Robot control method and device and robot
AU2016100586A4 (en) System and method for operating a follower vehicle in a vehicle platoon
US20210046925A1 (en) Enhanced vehicle operation
Philippe et al. Safe and online MPC for managing safety and comfort of autonomous vehicles in urban environment
Kim et al. Study on vehicle lateral control for backward driving
Lombard et al. Lateral control of an unmaned car using GNSS positionning in the context of connected vehicles
AU2023201142B1 (en) Method for controlling underground unmanned vehicle and device
JP2020044957A (en) Steering control system
Ma et al. Path following based on model predictive control for automatic parking system
CN112925323B (en) Rule-based mobile robot speed adjusting method and system
Seppänen et al. Comparison of Semi-autonomous Mobile Robot Control Strategies in Presence of Large Delay Fluctuation
CN114089730A (en) Robot motion planning method and automatic guided vehicle

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant