CN114347024A - Robot speed adjusting method and device, robot and storage medium - Google Patents
Robot speed adjusting method and device, robot and storage medium
- Publication number
- CN114347024A CN114347024A CN202111674308.0A CN202111674308A CN114347024A CN 114347024 A CN114347024 A CN 114347024A CN 202111674308 A CN202111674308 A CN 202111674308A CN 114347024 A CN114347024 A CN 114347024A
- Authority
- CN
- China
- Prior art keywords
- robot
- pedestrian
- speed
- relative motion
- ahead
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1656—Programme controls characterised by programming, planning systems for manipulators
- B25J9/1664—Programme controls characterised by programming, planning systems for manipulators characterised by motion, path, trajectory planning
- B25J9/1674—Programme controls characterised by safety, monitoring, diagnostic
- B25J9/1679—Programme controls characterised by the tasks executed
- B25J9/1689—Teleoperation
- B25J9/1694—Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
- B25J9/1697—Vision controlled systems
Landscapes
- Engineering & Computer Science (AREA)
- Robotics (AREA)
- Mechanical Engineering (AREA)
- Manipulator (AREA)
- Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
Abstract
The application relates to a robot speed adjusting method and device, a robot, and a storage medium. The method comprises the following steps: predicting the motion trail of the robot based on the robot orientation and the robot speed at the same moment, and predicting the first encounter time of the robot and the pedestrian ahead based on the robot orientation, the robot speed, the pedestrian orientation and the pedestrian speed at the same moment, so as to predict the reachable region of the pedestrian ahead; determining a relative motion coefficient between each pedestrian ahead and the robot according to the robot orientation, the robot speed, the pedestrian orientation, the pedestrian speed and the distance between the robot and the pedestrian ahead at the same moment; when an intersection exists between the motion trail and the reachable region, determining the degree of risk of collision between the robot and the pedestrian ahead according to the area of the intersection; selecting the target relative motion coefficient corresponding to the maximum degree of risk from the relative motion coefficients; and adjusting the speed of the robot according to the target relative motion coefficient. The method improves the motion safety of the robot in high-pedestrian-traffic environments.
Description
Technical Field
The present application relates to the field of robot technology, and in particular, to a method and an apparatus for adjusting a speed of a robot, a computer device, and a storage medium.
Background
With the development of robotics, service robots have emerged. They can be classified into professional service robots and personal or household service robots, are mainly engaged in maintenance, repair, transportation, cleaning, security, rescue, monitoring and similar work, and have a wide range of applications. However, a traditional service robot moves poorly in a high-pedestrian-traffic environment, so its motion safety cannot be guaranteed.
Disclosure of Invention
In view of the above, it is necessary to provide a robot speed adjusting method, apparatus, computer device and storage medium for solving the above technical problems.
A robot comprising a memory and a processor, the memory having stored therein executable program code, wherein the processor is configured to invoke and execute the executable program code to perform the following steps:
acquiring, in real time, the robot orientation and the robot speed of the robot during motion, and acquiring the pedestrian orientation and the pedestrian speed corresponding to a pedestrian in front of the robot;
predicting a motion trail of the robot based on the robot position and the robot speed at the same moment, predicting first meeting time of the robot and the pedestrian ahead based on the robot position, the robot speed, the pedestrian position and the pedestrian speed at the same moment, and predicting a reachable region of the pedestrian ahead according to the first meeting time;
determining a relative motion coefficient between the pedestrian ahead and the robot according to the robot orientation, the robot speed, the pedestrian orientation, the pedestrian speed and the distance between the robot and the pedestrian ahead at the same moment;
when an intersection exists between the motion trail and the reachable region, determining the collision risk degree of the robot and the front pedestrian according to the area of the intersection;
selecting a target relative motion coefficient corresponding to the maximum risk degree from the relative motion coefficients;
and adjusting the speed of the robot according to the target relative motion coefficient.
In one embodiment of the robot, the robot orientation comprises a robot position and a robot direction, and the pedestrian orientation comprises a pedestrian position and a pedestrian direction;
the predicting the first encounter time of the robot and the pedestrian ahead based on the robot orientation, the robot speed, the pedestrian orientation and the pedestrian speed at the same moment, and predicting the reachable region of the pedestrian ahead according to the first encounter time, comprises:
calculating a first meeting time of the robot and the pedestrian ahead based on the robot position, the robot speed, the pedestrian position and the pedestrian speed at the same moment;
calculating a direction included angle between the direction of the pedestrian and the predicted direction of the pedestrian ahead in the target time period; the target time period is a time period from a current time to the first encounter time;
calculating a speed vector of the front pedestrian in the prediction direction based on the direction included angle and the speed of the pedestrian at the same moment;
and predicting the reachable region of the pedestrian ahead according to the speed vector and the first meeting time.
In one embodiment, the determining the relative motion coefficient between the pedestrian in front and the robot according to the robot orientation, the robot speed, the pedestrian orientation, the pedestrian speed and the distance between the robot and the pedestrian in front at the same time comprises:
calculating the distance and the direction vector between the robot and the front pedestrian according to the position of the robot and the position of the pedestrian at the same moment; the pointing vectors include a first vector of the robot pointing to the pedestrian ahead and a second vector of the pedestrian ahead pointing to the robot;
calculating an included angle between the robot direction and the first vector to obtain a first included angle;
calculating an included angle between the pedestrian direction and the second vector to obtain a second included angle;
and calculating the relative motion coefficient between the front pedestrian and the robot according to the distance, the robot speed, the first included angle, the pedestrian speed and the second included angle at the same moment.
In one embodiment thereof, the determining the degree of risk of collision of the robot with the pedestrian in front according to the area of intersection comprises:
calculating the area of the intersection;
weighting the area based on a first preset coefficient to obtain a weighted area;
weighting the distance based on a second preset coefficient to obtain a weighted distance;
and determining the collision risk degree of the robot and the front pedestrian based on the weighted area and the weighted distance.
In one embodiment, the speed adjustment of the robot according to the target relative motion coefficient comprises:
determining a velocity parameter based on the target relative motion coefficient;
and adjusting the speed of the robot based on the speed parameter and the maximum speed of the robot.
In one embodiment of the robot, the pedestrian in front comprises a pedestrian in front facing away from the robot, and the steps further comprise:
predicting a second encounter time of a pedestrian in front of and facing away from the robot with the robot;
predicting a reachable area of the pedestrian in front of and facing away from the robot according to the second meeting time;
and if an intersection exists between the motion trail and the reachable region of the pedestrian in front facing away from the robot, playing a prompt voice.
A method of adjusting robot speed, the method comprising:
acquiring, in real time, the robot orientation and the robot speed of the robot during motion, and acquiring the pedestrian orientation and the pedestrian speed corresponding to a pedestrian in front of the robot;
predicting a motion trail of the robot based on the robot position and the robot speed at the same moment, predicting first meeting time of the robot and the pedestrian ahead based on the robot position, the robot speed, the pedestrian position and the pedestrian speed at the same moment, and predicting a reachable region of the pedestrian ahead according to the first meeting time;
determining a relative motion coefficient between the pedestrian ahead and the robot according to the robot orientation, the robot speed, the pedestrian orientation, the pedestrian speed and the distance between the robot and the pedestrian ahead at the same moment;
when an intersection exists between the motion trail and the reachable region, determining the collision risk degree of the robot and the front pedestrian according to the area of the intersection;
selecting a target relative motion coefficient corresponding to the maximum risk degree from the relative motion coefficients;
and adjusting the speed of the robot according to the target relative motion coefficient.
An apparatus for adjusting a speed of a robot, the apparatus comprising:
the acquisition module is used for acquiring, in real time, the robot orientation and the robot speed of the robot during motion, and acquiring the pedestrian orientation and the pedestrian speed corresponding to a pedestrian in front of the robot;
the prediction module is used for predicting the motion trail of the robot based on the robot position and the robot speed at the same moment, predicting first meeting time of the robot and the front pedestrian based on the robot position, the robot speed, the pedestrian position and the pedestrian speed at the same moment, and predicting the reachable region of the front pedestrian according to the first meeting time;
the determining module is used for determining a relative motion coefficient between the pedestrian in front and the robot according to the robot direction, the robot speed, the pedestrian direction, the pedestrian speed and the distance between the robot and the pedestrian in front at the same moment;
a risk determination module, used for determining, when an intersection exists between the motion trail and the reachable region, the degree of risk of collision between the robot and the pedestrian ahead according to the area of the intersection;
a selecting module, configured to select, from the relative motion coefficients, a target relative motion coefficient corresponding to a maximum risk degree;
and the adjusting module is used for adjusting the speed of the robot according to the target relative motion coefficient.
In one of the embodiments, the robot orientation includes a robot position and a robot direction; the pedestrian orientation comprises a pedestrian position and a pedestrian direction;
the prediction module is further used for calculating a first meeting time of the robot and the front pedestrian based on the robot position, the robot speed, the pedestrian position and the pedestrian speed at the same moment; calculating a direction included angle between the direction of the pedestrian and the predicted direction of the pedestrian ahead in the target time period; the target time period is a time period from a current time to the first encounter time; calculating a speed vector of the front pedestrian in the prediction direction based on the direction included angle and the speed of the pedestrian at the same moment; and predicting the reachable region of the pedestrian ahead according to the speed vector and the first meeting time.
In one embodiment, the determining module is further configured to calculate a distance and a direction vector between the robot and the pedestrian ahead according to the robot position and the pedestrian position at the same time; the pointing vectors include a first vector of the robot pointing to the pedestrian ahead and a second vector of the pedestrian ahead pointing to the robot; calculating an included angle between the robot direction and the first vector to obtain a first included angle; calculating an included angle between the pedestrian direction and the second vector to obtain a second included angle; and calculating the relative motion coefficient between the front pedestrian and the robot according to the distance, the robot speed, the first included angle, the pedestrian speed and the second included angle at the same moment.
In one embodiment, the risk determination module is further configured to calculate an area of the intersection; weighting the area based on a first preset coefficient to obtain a weighted area; weighting the distance based on a second preset coefficient to obtain a weighted distance; and determining the collision risk degree of the robot and the front pedestrian based on the weighted area and the weighted distance.
In one embodiment, the adjusting module is further configured to determine a speed parameter based on the target relative motion coefficient; and adjusting the speed of the robot based on the speed parameter and the maximum speed of the robot.
In one embodiment thereof, the apparatus further comprises:
the prompting module is used for predicting a second encounter time between a pedestrian in front facing away from the robot and the robot; predicting the reachable region of that pedestrian according to the second encounter time; and playing a prompt voice if an intersection exists between the motion trail and the reachable region of the pedestrian in front facing away from the robot.
A computer-readable storage medium, on which a computer program is stored which, when executed by a processor, causes the processor to carry out the steps of the above-mentioned method of adjusting the speed of a robot.
The robot speed adjusting method, apparatus, computer device and storage medium predict the motion trail of the robot based on the robot orientation and the robot speed, predict the first encounter time of the robot and the pedestrian ahead based on the robot orientation, the robot speed, the pedestrian orientation and the pedestrian speed, and predict the reachable region of the pedestrian ahead according to the first encounter time. They determine a relative motion coefficient between the pedestrian ahead and the robot according to the robot orientation, the robot speed, the pedestrian orientation, the pedestrian speed and the distance between the robot and the pedestrian ahead, thereby establishing the relative motion relationship between the robot and the pedestrian ahead. When an intersection exists between the motion trail and the reachable region, the degree of risk of collision between the robot and the pedestrian ahead is determined according to the area of the intersection; that is, the degree of overlap between the robot's motion trail and the predicted region the pedestrian ahead may reach, together with the distance between the robot and the pedestrian ahead, is used to effectively predict the degree of risk of collision. The target relative motion coefficient corresponding to the maximum degree of risk is then selected from the relative motion coefficients, and the speed of the robot is adjusted according to the target relative motion coefficient. By measuring the relative relationship and degree of risk between the robot and different pedestrians, the speed of the robot is adjusted comprehensively, which effectively improves the motion safety of the robot in high-pedestrian-traffic environments.
Drawings
FIG. 1 is a diagram of an exemplary embodiment of a method for adjusting a speed of a robot;
FIG. 2 is a schematic flow chart illustrating a method for adjusting the speed of a robot according to one embodiment;
FIG. 3 is a flowchart illustrating the steps of predicting the reach area of a pedestrian ahead in one embodiment;
FIG. 4 is a flow chart illustrating the steps of determining the relative motion coefficient between the pedestrian in front and the robot in one embodiment;
FIG. 5 is a block diagram showing an example of a device for adjusting the speed of a robot;
FIG. 6 is a block diagram showing the structure of a device for adjusting the speed of a robot according to another embodiment;
FIG. 7 is a diagram illustrating an internal structure of a computer device according to an embodiment.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit the present application.
The method for adjusting the speed of the robot can be applied to the application environment shown in fig. 1. In this application environment, a robot 102 and a server 104 are included.
The server 104 may be an independent physical server or a service node in a blockchain system; a peer-to-peer (P2P) network is formed among the service nodes in the blockchain system, and the P2P protocol is an application-layer protocol running on top of the Transmission Control Protocol (TCP).
In addition, the server 104 may also be a server cluster composed of a plurality of physical servers, and may be a cloud server providing basic cloud computing services such as a cloud service, a cloud database, cloud computing, a cloud function, cloud storage, a web service, cloud communication, a middleware service, a domain name service, a security service, a Content Delivery Network (CDN), a big data and artificial intelligence platform, and the like.
The robot 102 and the server 104 may be connected through communication connection manners such as bluetooth, USB (Universal Serial Bus), or network, which is not limited herein.
In one embodiment, as shown in fig. 2, a method for adjusting robot speed is provided. The method can be applied to a robot and is described here using the robot in fig. 1 as an example; it includes the following steps:
s202, acquiring the robot direction and the robot speed of the robot in the moving process in real time, and acquiring the pedestrian direction and the pedestrian speed corresponding to the pedestrian in front of the robot.
The robot orientation includes a robot position and a robot direction. The pedestrian orientation includes a pedestrian position and a pedestrian direction; a position refers to coordinates, for example (2, 3), and the pedestrian direction refers to the direction or heading of the pedestrian's motion. The pedestrians in front include pedestrians in front facing the robot, i.e., pedestrians whose direction of motion is toward (approaching) the robot, and pedestrians in front facing away from the robot, i.e., pedestrians whose direction of motion is away from the robot. The pedestrians referred to in this application are pedestrians within the visible range of the robot; both pedestrians in front facing the robot and pedestrians in front facing away from the robot are within that range. The visible range of the robot may refer to the range that can be captured by the camera mounted on the robot; one or more cameras may be mounted on the robot, and if there is only one, it is usually mounted directly at the front of the robot.
In one embodiment, before S202, the robot determines whether a pedestrian is a pedestrian in front facing the robot or a pedestrian in front facing away from the robot: based on the robot direction and the pedestrian direction, the robot obtains a robot vector (v3) for the robot direction and a pedestrian vector (v4) for the pedestrian direction, and calculates the included angle β between the robot vector (v3) and the pedestrian vector (v4).
When π/2 < β ≤ π, the pedestrian is a pedestrian in front facing the robot; when 0 ≤ β ≤ π/2, the pedestrian is a pedestrian in front facing away from the robot.
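For illustration, the following Python sketch (an assumption about one possible implementation; the function name and tuple representation are hypothetical) computes the included angle β between the two direction vectors and applies the thresholds above:

```python
import math

def facing_label(robot_dir, pedestrian_dir):
    """Classify a pedestrian as facing the robot or facing away from it.

    robot_dir and pedestrian_dir are (dx, dy) direction vectors
    (v3 and v4 in the description above).
    """
    dot = robot_dir[0] * pedestrian_dir[0] + robot_dir[1] * pedestrian_dir[1]
    norm = math.hypot(*robot_dir) * math.hypot(*pedestrian_dir)
    beta = math.acos(max(-1.0, min(1.0, dot / norm)))  # included angle, 0..pi
    # pi/2 < beta <= pi: walking toward the robot; 0 <= beta <= pi/2: walking away
    return "facing_robot" if beta > math.pi / 2 else "facing_away"

# Example: robot heading +x, pedestrian heading -x -> walking toward the robot
print(facing_label((1.0, 0.0), (-1.0, 0.0)))  # facing_robot
```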
Specifically, the robot acquires the robot orientation and the robot speed, and the pedestrian orientation and the pedestrian speed corresponding to the pedestrian ahead. The robot can also receive the direction and speed instruction, and process the direction and speed of the robot in the moving process and the direction and speed of the pedestrian corresponding to the pedestrian in front of the robot.
S204, predicting the motion trail of the robot based on the robot orientation and the robot speed at the same moment, predicting the first encounter time of the robot and the pedestrian ahead based on the robot orientation, the robot speed, the pedestrian orientation and the pedestrian speed at the same moment, and predicting the reachable region of the pedestrian ahead according to the first encounter time.
The motion trajectory of the robot is a trajectory formed by predicting an area (for example, an area with an area size equal to a floor space size of the robot) through which the robot will pass by using the current robot orientation and robot speed. The first encounter time is an encounter time between the robot and the pedestrian ahead, and for example, the encounter time is 5 seconds, which means that the robot and the pedestrian are predicted to encounter after 5 seconds. The reachable region refers to a region that a pedestrian may reach.
In one embodiment, predicting the motion trail of the robot includes: the robot first determines a road region of a certain length ahead, then predicts, according to the robot direction and the robot speed, the areas within that road region where the robot may appear, and generates the corresponding motion trail from those areas, as sketched below.
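A minimal sketch of one way this could be realized; the fixed look-ahead distance, sampling step and circular footprint are assumptions not specified in the text:

```python
def predict_trajectory(robot_pos, robot_dir, robot_speed,
                       lookahead_m=5.0, step_s=0.2, footprint_radius=0.3):
    """Return the motion trail as a list of (center, radius) footprint regions.

    The robot is assumed to keep its current direction and speed; the trail
    covers a fixed look-ahead distance along the road ahead.
    """
    if robot_speed <= 0.0:
        return [(tuple(robot_pos), footprint_radius)]
    regions = []
    t = 0.0
    while robot_speed * t <= lookahead_m:
        cx = robot_pos[0] + robot_dir[0] * robot_speed * t
        cy = robot_pos[1] + robot_dir[1] * robot_speed * t
        regions.append(((cx, cy), footprint_radius))
        t += step_s
    return regions
```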
And S206, determining the relative motion coefficient between the pedestrian in front and the robot according to the robot direction, the robot speed, the pedestrian direction, the pedestrian speed and the distance between the robot and the pedestrian in front at the same moment.
The relative motion coefficient (RMI) measures the relative motion relationship between the pedestrian ahead and the robot by combining the robot orientation, the robot speed, the pedestrian orientation, the pedestrian speed and the distance between the robot and the pedestrian ahead.
And S208, when an intersection exists between the motion trail and the reachable area, determining the collision risk degree of the robot and the pedestrian ahead according to the area of the intersection.
The risk degree of the collision is a measure of the possible degree of the collision between the robot and the pedestrian in front, and the greater the risk degree is, the higher the possible degree of the collision between the robot and the pedestrian in front is.
Specifically, when an intersection exists between the motion trail and the reachable region, the robot calculates the area of the intersection, weights the area by a first preset coefficient to obtain a weighted area, weights the distance by a second preset coefficient to obtain a weighted distance, and determines the degree of risk of collision between the robot and the pedestrian ahead based on the weighted area and the weighted distance.
For example, suppose the degree of risk of collision is defined as the weighted area plus the weighted distance, the area of the intersection is 2 square meters, the first preset coefficient is 0.7, the distance is 1 meter, and the second preset coefficient is 0.3.
The weighted area is then 0.7 × 2 = 1.4.
The weighted distance is 0.3 × 1 = 0.3.
Therefore, the degree of risk of collision between the robot and the pedestrian ahead is 1.4 + 0.3 = 1.7.
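The worked example can be reproduced with a short sketch; the weighted-sum form follows the description above, and the default coefficients simply reuse the example values:

```python
def collision_risk(intersection_area, distance, w_area=0.7, w_dist=0.3):
    """Degree of risk = weighted intersection area + weighted distance."""
    return w_area * intersection_area + w_dist * distance

print(collision_risk(2.0, 1.0))  # 0.7*2 + 0.3*1 = 1.7
```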
S210, selecting a target relative motion coefficient corresponding to the maximum risk degree from the relative motion coefficients.
The target relative motion coefficient refers to the relative motion coefficient of the front pedestrian corresponding to the front pedestrian with the maximum collision risk degree with the robot. The target relative motion coefficient is used for speed adjustment.
Specifically, the robot selects the maximum risk degree from all the risk degrees of collision between pedestrians in front and the robot, and determines the relative motion coefficient of the pedestrian in front as the target relative motion coefficient according to the pedestrian in front corresponding to the maximum risk degree.
And S212, adjusting the speed of the robot according to the target relative motion coefficient.
Specifically, the robot determines a speed parameter based on the target relative motion coefficient, and adjusts the speed based on the speed parameter and the maximum speed of the robot, for example by adjusting the robot's current speed.
The target relative motion coefficient may be denoted target_RMI, the speed parameter may be denoted as a speed ratio (speed_ratio), and the maximum speed that can be reached may be denoted v_max.
In one embodiment, the robot calculates the speed ratio as speed_ratio = 1/target_RMI. The adjusted result is that the maximum speed the robot may reach becomes v_max = v_max × speed_ratio.
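Combining S210 and S212, a minimal sketch (function and field names are hypothetical) selects the pedestrian ahead with the largest degree of risk and scales the speed limit by 1/target_RMI as described above:

```python
def adjust_speed(pedestrians_ahead, v_max):
    """Select the target RMI (largest degree of risk) and rescale the speed.

    pedestrians_ahead: list of dicts, each with a 'risk' (degree of risk of
    collision) and an 'rmi' (relative motion coefficient) entry.
    Returns the adjusted maximum speed v_max * speed_ratio, where
    speed_ratio = 1 / target_RMI as in the embodiment above.
    """
    if not pedestrians_ahead:
        return v_max
    target = max(pedestrians_ahead, key=lambda p: p["risk"])
    speed_ratio = 1.0 / target["rmi"]
    return v_max * speed_ratio

# Two pedestrians ahead; the riskier one (risk 1.7, RMI 2.5) drives the limit
print(adjust_speed([{"risk": 1.7, "rmi": 2.5}, {"risk": 0.9, "rmi": 1.2}], 1.5))  # 0.6
```

In practice the ratio would likely be clamped so the result never exceeds the robot's rated maximum speed; the text does not specify this, so the sketch leaves it out.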
In one embodiment, the method further comprises: the robot predicts a second encounter time between a pedestrian in front facing away from the robot and the robot; predicts the reachable region of that pedestrian according to the second encounter time; and, if an intersection exists between the motion trail and the reachable region of the pedestrian in front facing away from the robot, plays a prompt voice.
In one embodiment, the step of predicting the reachable region of the pedestrian in front facing away from the robot comprises: the robot calculates the second encounter time based on the robot position, the robot speed, and the position and speed of the pedestrian in front facing away from the robot; calculates a rear direction included angle between the direction of the pedestrian in front facing away from the robot and the predicted direction of that pedestrian in the target time period, the target time period being the period from the current time to the second encounter time; calculates, based on the rear direction included angle and the speed of that pedestrian, the velocity vector of the pedestrian in front facing away from the robot in the predicted direction; and predicts the reachable region of the pedestrian in front facing away from the robot according to the velocity vector and the second encounter time.
Here, a position refers to coordinates; that is, the robot position may refer to the robot coordinates, for example (2, 3). The current time refers to the time at which the robot orientation is acquired during motion. The second encounter time refers to the encounter time between the robot and the pedestrian in front facing away from the robot; for example, an encounter time of 5 seconds means that the robot and the pedestrian are predicted to meet after 5 seconds.
Specifically, the second encounter time (T) may be calculated from R_p, HB_p, R_v and HB_v, where R_p refers to the robot position, HB_p refers to the position of the pedestrian in front facing away from the robot, R_v refers to the robot speed, and HB_v refers to the speed of the pedestrian in front facing away from the robot.
In one embodiment, when more than one camera is mounted on the robot, the range the cameras can capture includes the area behind the robot (the direction opposite to the robot's direction of travel). When a pedestrian is a rear pedestrian, the robot predicts a third encounter time between the rear pedestrian and the robot, predicts the reachable region of the rear pedestrian according to the third encounter time, and, if an intersection exists between the motion trail and the reachable region of the rear pedestrian, plays a prompt voice or adjusts the speed of the robot.
A rear pedestrian refers to a pedestrian behind the robot, i.e., a pedestrian on the side facing away from the robot's direction of travel. The third encounter time is the time at which the robot is predicted to meet the rear pedestrian.
In the above embodiment, the motion trail of the robot is predicted based on the robot orientation and the robot speed, the first encounter time of the robot and the pedestrian ahead is predicted based on the robot orientation, the robot speed, the pedestrian orientation and the pedestrian speed, and the reachable region of the pedestrian ahead is predicted according to the first encounter time. The relative motion coefficient between the pedestrian ahead and the robot is determined according to the robot orientation, the robot speed, the pedestrian orientation, the pedestrian speed and the distance between the robot and the pedestrian ahead, which establishes the relative motion relationship between the robot and the pedestrian ahead. When an intersection exists between the motion trail and the reachable region, the degree of risk of collision between the robot and the pedestrian ahead is determined according to the area of the intersection; the degree of overlap between the robot's motion trail and the predicted region the pedestrian ahead may reach, together with the distance between the robot and the pedestrian ahead, thus effectively predicts the degree of risk of collision. The target relative motion coefficient corresponding to the maximum degree of risk is selected from the relative motion coefficients, and the speed of the robot is adjusted according to the target relative motion coefficient. By measuring the relative relationship and degree of risk between different pedestrians and the robot, the speed of the robot is adjusted comprehensively, which effectively improves the motion safety of the robot in high-pedestrian-traffic environments.
In one embodiment, as shown in fig. 3, the step of predicting the reachable area of the pedestrian ahead comprises:
and S302, calculating the first meeting time of the robot and the front pedestrian based on the robot position, the robot speed, the pedestrian position and the pedestrian speed at the same moment.
Here, a position refers to coordinates; that is, the robot position may refer to the robot coordinates, for example (2, 3), and the pedestrian ahead refers to a pedestrian within the visible range of the robot. The first encounter time is the encounter time between the robot and the pedestrian ahead; for example, an encounter time of 5 seconds means that the robot and the pedestrian are predicted to meet after 5 seconds.
Specifically, the first encounter time (t) may be calculated from R_p, H_p, R_v and H_v, where R_p refers to the robot position, H_p refers to the position of the pedestrian ahead, R_v refers to the robot speed, and H_v refers to the speed of the pedestrian ahead.
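The encounter-time expression itself appears as a figure in the original filing and is not reproduced in this text, so the sketch below uses one plausible reading as an assumption: the time equals the current distance divided by the closing speed along the robot-to-pedestrian line.

```python
import math

def encounter_time(r_p, r_v, h_p, h_v):
    """Approximate time until the robot and the pedestrian ahead meet.

    r_p, h_p: (x, y) positions; r_v, h_v: (vx, vy) velocities.
    Assumption: t = distance / closing speed, where the closing speed is the
    component of the relative velocity along the robot-to-pedestrian line.
    """
    dx, dy = h_p[0] - r_p[0], h_p[1] - r_p[1]
    dist = math.hypot(dx, dy)
    rel_vx, rel_vy = r_v[0] - h_v[0], r_v[1] - h_v[1]
    closing = (rel_vx * dx + rel_vy * dy) / dist if dist > 0 else 0.0
    return dist / closing if closing > 0 else float("inf")  # inf: never meet

print(encounter_time((0, 0), (1.0, 0.0), (5, 0), (0.0, 0.0)))  # 5.0 seconds
```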
S304, calculating a direction included angle between the direction of the pedestrian and the predicted direction of the pedestrian in front in the target time interval; the target period is the period from the current time to the first encounter time.
The pedestrian direction refers to a direction in which a pedestrian is facing ahead at present. The predicted direction refers to a direction in which a pedestrian ahead is likely to face during the target period. The direction included angle is an included angle formed by clockwise or anticlockwise rotating to a prediction direction by taking the pedestrian direction as an initial direction. The range of the included angle of direction is 0 to 2 pi.
In one embodiment, the robot takes the current pedestrian direction as the starting direction, and the included angle obtained by clockwise turning to the predicted direction is the direction included angle.
In one embodiment, the robot takes the current pedestrian direction as the starting direction, and the included angle obtained by turning counterclockwise to the predicted direction is the direction included angle.
And S306, calculating the speed vector of the pedestrian ahead in the prediction direction based on the direction included angle and the pedestrian speed at the same moment.
Wherein the velocity vector refers to a unit velocity vector of a preceding pedestrian in the prediction direction.
In one embodiment, the unit velocity vector (v) is calculated as:
v=1/(e^(d_theta+H_v))
In the above formula, d_theta refers to the direction included angle (0 ≤ d_theta ≤ 2π), and H_v refers to the speed of the pedestrian ahead, in meters per second.
And S308, predicting the reachable region of the pedestrian ahead according to the speed vector and the first meeting time.
Specifically, the robot multiplies the velocity vector for every predicted direction by the first encounter time to obtain, over the direction included angle range 0 to 2π, all the points the pedestrian ahead can reach, and takes the region enclosed by these reachable points around the pedestrian ahead as the reachable region of the pedestrian ahead.
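A sketch of S304–S308 under the velocity formula above, discretising the direction included angle over 0 to 2π; the sampling resolution and the polygon representation of the region are assumptions:

```python
import math

def reachable_region(h_p, h_dir_angle, h_speed, t_encounter, n_dirs=36):
    """Return polygon vertices bounding where the pedestrian ahead may reach.

    h_p: pedestrian position (x, y); h_dir_angle: current heading in radians;
    h_speed: pedestrian speed H_v (m/s); t_encounter: first encounter time (s).
    Speed magnitude in each predicted direction: v = 1 / e^(d_theta + H_v),
    as in the embodiment above.
    """
    points = []
    for k in range(n_dirs):
        d_theta = 2.0 * math.pi * k / n_dirs      # direction included angle
        v = 1.0 / math.exp(d_theta + h_speed)     # unit velocity magnitude
        heading = h_dir_angle + d_theta           # predicted direction
        points.append((h_p[0] + v * t_encounter * math.cos(heading),
                       h_p[1] + v * t_encounter * math.sin(heading)))
    return points
```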
In this embodiment, the first encounter time between the robot and the pedestrian ahead is calculated; the direction included angle between the pedestrian direction and each predicted direction of the pedestrian ahead in the target time period is calculated; the velocity vector of the pedestrian ahead in each predicted direction is calculated based on the direction included angle and the pedestrian speed; and the reachable region of the pedestrian ahead is predicted from the velocity vectors and the first encounter time. By calculating the first encounter time, the time at which the robot will meet the pedestrian ahead is predicted, and by then computing all the points the pedestrian ahead can reach within the first encounter time, the reachable region of the pedestrian ahead is predicted. This effectively improves the prediction of the region the pedestrian ahead can reach and lays the foundation for the subsequent effective measurement of the degree of risk.
In one embodiment, as shown in fig. 4, the step of determining the relative motion coefficient between the pedestrian in front and the robot comprises:
s402, calculating the distance and the direction vector between the robot and the pedestrian in front according to the position of the robot and the position of the pedestrian at the same moment; the pointing vectors include a first vector where the robot points to the pedestrian ahead, and a second vector where the pedestrian ahead points to the robot.
In one embodiment, when the robot position (coordinates) is (x1, y1) and the pedestrian position (coordinates) is (x2, y2), the distance (d) between the robot and the pedestrian ahead may be d = sqrt((x1 - x2)^2 + (y1 - y2)^2).
in one embodiment, the calculation formula for the first vector (vec1) may be:
vec1=H_p-R_p
where H_p is the pedestrian position and R_p is the robot position.
S404, calculating an included angle between the robot direction and the first vector to obtain a first included angle.
Wherein, the value range of the first included angle is 0 to pi.
In one embodiment, S404 includes: the robot calculates robot vectors based on the robot position and the robot direction, and calculates the formula of the included angle (theta) according to the vectors: θ ═ acos (v1 × v2/| v1| × | v2|), where v1 is the robot vector and v2 is the first vector. A first angle is obtained.
In one embodiment, S404 includes: and the robot measures and calculates an included angle between the direction of the robot and the first vector to obtain a first included angle.
And S406, calculating an included angle between the pedestrian direction and the second vector to obtain a second included angle.
Wherein the value range of the second included angle is 0 to pi.
In one embodiment, S406 includes: the robot computes the pedestrian vector based on the pedestrian position and the pedestrian direction, and calculates the included angle (θ) according to the vector angle formula θ = acos((v1 · v2) / (|v1| · |v2|)), where v1 is the pedestrian vector and v2 is the second vector, thereby obtaining the second included angle.
And S408, calculating the relative motion coefficient between the pedestrian ahead and the robot according to the distance, the robot speed, the first included angle, the pedestrian speed and the second included angle at the same moment.
In one embodiment, S408 includes: the robot calculates the relative motion coefficient (RMI) between the pedestrian in front and the robot according to the formula:
RMI = (2 + R_v*cos(angle1) + H_v*cos(angle2)) / dis
In the above formula, R_v refers to the robot speed, angle1 refers to the first included angle, H_v refers to the pedestrian speed, angle2 refers to the second included angle, and dis refers to the distance between the pedestrian ahead and the robot.
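Steps S402–S408 combine into a single computation; the sketch below follows the angle and RMI formulas given above, with positions and directions represented as 2-D vectors (names are hypothetical):

```python
import math

def angle_between(v1, v2):
    """theta = acos((v1 . v2) / (|v1| |v2|)), in the range 0..pi."""
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    norm = math.hypot(*v1) * math.hypot(*v2)
    return math.acos(max(-1.0, min(1.0, dot / norm)))

def relative_motion_coefficient(r_p, r_dir, r_speed, h_p, h_dir, h_speed):
    """RMI = (2 + R_v*cos(angle1) + H_v*cos(angle2)) / dis (dis assumed > 0)."""
    vec1 = (h_p[0] - r_p[0], h_p[1] - r_p[1])   # robot -> pedestrian ahead
    vec2 = (r_p[0] - h_p[0], r_p[1] - h_p[1])   # pedestrian ahead -> robot
    dis = math.hypot(*vec1)
    angle1 = angle_between(r_dir, vec1)          # first included angle
    angle2 = angle_between(h_dir, vec2)          # second included angle
    return (2 + r_speed * math.cos(angle1) + h_speed * math.cos(angle2)) / dis

# Robot and pedestrian walking straight toward each other, 4 m apart -> RMI = 1.0
print(relative_motion_coefficient((0, 0), (1, 0), 1.0, (4, 0), (-1, 0), 1.0))
```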
In the above embodiment, the distance and the direction vector between the robot and the pedestrian in front are calculated; calculating an included angle between the direction of the robot and the first vector to obtain a first included angle; calculating an included angle between the direction of the pedestrian and the second vector to obtain a second included angle; and calculating the relative motion coefficient between the pedestrian in front and the robot according to the distance, the speed of the robot, the first included angle, the speed of the pedestrian and the second included angle. According to the motion information of the robot and the front pedestrian, a parameter reflecting the relative motion between the robot and the front pedestrian, namely a relative motion coefficient is established, and the accurate adjustment of the speed of the robot is ensured.
It should be understood that although the steps in the flowcharts of figs. 2-4 are shown in the order indicated by the arrows, they are not necessarily performed in that order. Unless explicitly stated otherwise herein, there is no strict order restriction on these steps, and they may be performed in other orders. Moreover, at least some of the steps in figs. 2-4 may include multiple sub-steps or stages, which are not necessarily completed at the same moment but may be performed at different moments, and which are not necessarily performed sequentially but may be performed in turn or alternately with other steps or with at least part of the sub-steps or stages of other steps.
In one embodiment, as shown in fig. 5, there is provided a robot speed adjusting apparatus, which may be a part of a computer device using a software module or a hardware module, or a combination of the two, the apparatus specifically includes: an obtaining module 502, a predicting module 504, a determining module 506, a risk determining module 508, a selecting module 510, and an adjusting module 512, wherein:
the acquiring module 502 is configured to acquire a robot orientation and a robot speed of the robot in a moving process in real time, and acquire a pedestrian orientation and a pedestrian speed corresponding to a pedestrian in front of the robot.
The prediction module 504 is configured to predict a motion trajectory of the robot based on the robot position and the robot speed at the same time, predict a first meeting time of the robot and a pedestrian ahead based on the robot position, the robot speed, the pedestrian position, and the pedestrian speed at the same time, and predict a reachable region of the pedestrian ahead according to the first meeting time.
And a determining module 506, configured to determine a relative motion coefficient between the pedestrian ahead and the robot according to the robot orientation, the robot speed, the pedestrian orientation, the pedestrian speed, and the distance between the robot and the pedestrian ahead at the same time.
And the risk determining module 508 is configured to determine a risk degree of collision between the robot and a pedestrian ahead according to an area of an intersection when the motion trajectory and the reachable area have the intersection.
The selecting module 510 is configured to select a target relative motion coefficient corresponding to the maximum risk degree from the relative motion coefficients.
And an adjusting module 512, configured to adjust a speed of the robot according to the target relative motion coefficient.
In one embodiment, the robot orientation includes a robot position and a robot direction; the pedestrian orientation comprises a pedestrian position and a pedestrian direction; the prediction module 504 is further configured to calculate a first meeting time of the robot and a pedestrian ahead based on the robot position, the robot speed, the pedestrian position and the pedestrian speed at the same time; calculating a direction included angle between the direction of the pedestrian and the predicted direction of the pedestrian in front in the target time period; the target time period is a time period from the current time to the first encounter time; calculating a speed vector of a front pedestrian in the prediction direction based on the same time direction included angle and the speed of the pedestrian; and predicting the reachable region of the pedestrian ahead according to the speed vector and the first meeting time.
In one embodiment, the determining module 506 is further configured to calculate a distance and a direction vector between the robot and a pedestrian ahead according to the robot position and the pedestrian position at the same time; the directional vectors comprise a first vector of the robot pointing to the front pedestrian and a second vector of the front pedestrian pointing to the robot; calculating an included angle between the direction of the robot and the first vector to obtain a first included angle; calculating an included angle between the direction of the pedestrian and the second vector to obtain a second included angle; and calculating the relative motion coefficient between the pedestrian in front and the robot according to the same moment distance, the speed of the robot, the first included angle, the speed of the pedestrian and the second included angle.
In one embodiment, the risk determination module 508 is further configured to calculate an area of intersection; weighting the area based on a first preset coefficient to obtain a weighted area; weighting the distance based on a second preset coefficient to obtain a weighted distance; and determining the risk degree of the collision between the robot and the pedestrian in front based on the weighted area and the weighted distance.
In one embodiment, the adjusting module 512 is further configured to determine a velocity parameter based on the target relative motion coefficient; and adjusting the speed of the robot based on the speed parameter and the maximum speed of the robot.
In one embodiment, as shown in fig. 6, the apparatus further comprises: a prompt module 514, wherein:
the prompting module 514 is used for predicting a second meeting time of the pedestrian and the robot, wherein the front of the pedestrian is opposite to the robot; predicting an accessible area of the pedestrian in front of the robot and away from the robot according to the second meeting time; and if an intersection exists between the motion trail and the reachable area of the pedestrian in front of the robot, the pedestrian is back to the robot, and a prompt voice is played.
In the above embodiment, the motion trail of the robot is predicted based on the robot orientation and the robot speed, the first encounter time of the robot and the pedestrian ahead is predicted based on the robot orientation, the robot speed, the pedestrian orientation and the pedestrian speed, and the reachable region of the pedestrian ahead is predicted according to the first encounter time. The relative motion coefficient between the pedestrian ahead and the robot is determined according to the robot orientation, the robot speed, the pedestrian orientation, the pedestrian speed and the distance between the robot and the pedestrian ahead, which establishes the relative motion relationship between the robot and the pedestrian ahead. When an intersection exists between the motion trail and the reachable region, the degree of risk of collision between the robot and the pedestrian ahead is determined according to the area of the intersection; the degree of overlap between the robot's motion trail and the predicted region the pedestrian ahead may reach, together with the distance between the robot and the pedestrian ahead, thus effectively predicts the degree of risk of collision. The target relative motion coefficient corresponding to the maximum degree of risk is selected from the relative motion coefficients, and the speed of the robot is adjusted according to the target relative motion coefficient. By measuring the relative relationship and degree of risk between different pedestrians and the robot, the speed of the robot is adjusted comprehensively, which effectively improves the motion safety of the robot in high-pedestrian-traffic environments.
For the specific limitation of the robot speed adjusting device, reference may be made to the above limitation on the robot speed adjusting method, which is not described herein again. The modules in the robot speed adjusting device can be wholly or partially realized by software, hardware and a combination thereof. The modules can be embedded in a hardware form or independent from a processor in the computer device, and can also be stored in a memory in the computer device in a software form, so that the processor can call and execute operations corresponding to the modules.
In one embodiment, a robot is provided, the internal structure of which may be as shown in fig. 7. The computer device includes a processor, a memory, a communication interface, a display screen, and an input device connected by a system bus. Wherein the processor of the computer device is configured to provide computing and control capabilities. The memory of the computer device comprises a nonvolatile storage medium and an internal memory. The non-volatile storage medium stores an operating system and a computer program. The internal memory provides an environment for the operation of an operating system and computer programs in the non-volatile storage medium. The communication interface of the computer device is used for carrying out wired or wireless communication with an external terminal, and the wireless communication can be realized through WIFI, an operator network, NFC (near field communication) or other technologies. The computer program is executed by a processor to implement a method of adjusting the speed of a robot. The display screen of the computer equipment can be a liquid crystal display screen or an electronic ink display screen, and the input device of the computer equipment can be a touch layer covered on the display screen, a key, a track ball or a touch pad arranged on the shell of the computer equipment, an external keyboard, a touch pad or a mouse and the like.
Those skilled in the art will appreciate that the architecture shown in fig. 7 is merely a block diagram of some of the structures associated with the disclosed aspects and is not intended to limit the computing devices to which the disclosed aspects apply, as particular computing devices may include more or less components than those shown, or may combine certain components, or have a different arrangement of components.
In one embodiment, there is provided a robot comprising a memory and a processor, the memory having stored therein executable program code, wherein the processor is configured to call and execute the executable program code and implement the following steps:
acquiring, in real time, the robot orientation and the robot speed of the robot during motion, and acquiring the pedestrian orientation and the pedestrian speed corresponding to a pedestrian in front of the robot;
predicting the motion trail of the robot based on the robot position and the robot speed at the same moment, predicting first meeting time of the robot and the front pedestrian based on the robot position, the robot speed, the pedestrian position and the pedestrian speed at the same moment, and predicting the reachable region of the front pedestrian according to the first meeting time;
determining a relative motion coefficient between a pedestrian in front and the robot according to the robot direction, the robot speed, the pedestrian direction, the pedestrian speed and the distance between the robot and the pedestrian in front at the same moment;
when an intersection exists between the motion trail and the reachable area, determining the risk degree of collision between the robot and the pedestrian in front according to the area of the intersection;
selecting a target relative motion coefficient corresponding to the maximum risk degree from the relative motion coefficients;
and adjusting the speed of the robot according to the target relative motion coefficient.
In one embodiment, the robot orientation includes a robot position and a robot direction; the pedestrian orientation comprises a pedestrian position and a pedestrian direction;
predicting the first meeting time of the robot and the pedestrian in front based on the robot position, the robot speed, the pedestrian position and the pedestrian speed at the same moment, and predicting the reachable region of the pedestrian in front according to the first meeting time comprises the following steps:
calculating first meeting time of the robot and the front pedestrian based on the robot position, the robot speed, the pedestrian position and the pedestrian speed at the same moment;
calculating a direction included angle between the direction of the pedestrian and the predicted direction of the pedestrian in front in the target time period; the target time period is a time period from the current time to the first encounter time;
calculating a speed vector of the pedestrian ahead in the predicted direction based on the direction included angle and the pedestrian speed at the same moment;
and predicting the reachable region of the pedestrian ahead according to the speed vector and the first meeting time.
In one embodiment, the robot, determining the relative motion coefficient between the pedestrian in front and the robot according to the robot orientation, the robot speed, the pedestrian orientation, the pedestrian speed and the distance between the robot and the pedestrian in front at the same time comprises:
calculating the distance and the direction vector between the robot and the pedestrian in front according to the position of the robot and the position of the pedestrian at the same moment; the directional vectors comprise a first vector of the robot pointing to the front pedestrian and a second vector of the front pedestrian pointing to the robot;
calculating an included angle between the direction of the robot and the first vector to obtain a first included angle;
calculating an included angle between the direction of the pedestrian and the second vector to obtain a second included angle;
and calculating the relative motion coefficient between the pedestrian ahead and the robot according to the distance, the robot speed, the first included angle, the pedestrian speed and the second included angle at the same moment.
In one embodiment, the robot, determining the degree of risk of collision of the robot with the pedestrian ahead according to the area of intersection, comprises:
calculating the area of the intersection;
weighting the area based on a first preset coefficient to obtain a weighted area;
weighting the distance based on a second preset coefficient to obtain a weighted distance;
and determining the risk degree of the collision between the robot and the pedestrian in front based on the weighted area and the weighted distance.
In one embodiment, the robot, the speed adjustment of the robot according to the target relative motion coefficient comprises:
determining a speed parameter based on the target relative motion coefficient;
and adjusting the speed of the robot based on the speed parameter and the maximum speed of the robot.
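A sketch of this mapping is shown below, assuming a simple monotonic rule in which a larger target coefficient yields a smaller speed parameter; the specific function is not taken from the disclosure.

```python
# Sketch: map the target relative motion coefficient to a speed parameter in (0, 1]
# and scale the robot's maximum speed by it. The mapping itself is an assumption.
def adjust_speed(target_coefficient: float, max_speed: float) -> float:
    speed_parameter = 1.0 / (1.0 + max(target_coefficient, 0.0))  # larger coefficient -> slower
    return speed_parameter * max_speed

new_speed = adjust_speed(1.6, max_speed=1.5)  # about 0.58 m/s for an example coefficient of 1.6
```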
In one embodiment, the steps performed by the robot further comprise:
predicting a second encounter time between the robot and a pedestrian whose front faces away from the robot;
predicting a reachable region of the pedestrian whose front faces away from the robot according to the second encounter time;
and if an intersection exists between the motion trail and the reachable region of the pedestrian whose front faces away from the robot, playing a prompt voice.
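The back-facing case can be sketched as follows; the voice-prompt hook and the reuse of the intersection test are assumptions about how such a step might be wired up, not details given in the disclosure.

```python
# Sketch: when the pedestrian's front faces away from the robot, play a prompt voice
# rather than relying on the pedestrian noticing the robot. play_voice is a
# hypothetical callback; the overlap flag comes from whatever intersection check
# the planner already provides.
def warn_back_facing_pedestrian(trail_overlaps_region: bool, play_voice) -> None:
    if trail_overlaps_region:
        play_voice("Excuse me, a robot is approaching from behind.")

# Example wiring with a trivial stand-in for the voice interface:
warn_back_facing_pedestrian(True, play_voice=print)
```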
In one embodiment, a computer-readable storage medium is provided, in which a computer program is stored; the computer program, when executed by a processor, implements the steps of the above method embodiments.
In one embodiment, a computer program product or computer program is provided, which includes computer instructions stored in a computer-readable storage medium. A processor of a computer device reads the computer instructions from the computer-readable storage medium and executes them, causing the computer device to perform the steps of the above method embodiments.
It will be understood by those skilled in the art that all or part of the processes of the methods in the above embodiments can be implemented by a computer program instructing related hardware. The computer program can be stored in a non-volatile computer-readable storage medium and, when executed, can include the processes of the above method embodiments. Any reference to memory, storage, a database or other media used in the embodiments provided herein can include at least one of non-volatile and volatile memory. Non-volatile memory can include Read-Only Memory (ROM), magnetic tape, floppy disk, flash memory, optical storage and the like. Volatile memory can include Random Access Memory (RAM) or external cache memory. By way of illustration and not limitation, RAM is available in many forms, such as Static Random Access Memory (SRAM) and Dynamic Random Access Memory (DRAM).
The technical features of the above embodiments can be combined arbitrarily. For brevity, not every possible combination of these technical features is described; nevertheless, as long as a combination of technical features contains no contradiction, it should be considered within the scope of this specification.
The above embodiments express only several implementations of the present application, and their description is relatively specific and detailed, but they should not be construed as limiting the scope of the invention. It should be noted that a person skilled in the art can make several variations and modifications without departing from the concept of the present application, and these all fall within the protection scope of the present application. Therefore, the protection scope of this patent shall be subject to the appended claims.
Claims (10)
1. A robot comprising a memory and a processor, the memory having executable program code stored therein, wherein the processor is configured to invoke and execute the executable program code to perform the following steps:
acquiring, in real time, a robot orientation and a robot speed of the robot during motion, and acquiring a pedestrian orientation and a pedestrian speed corresponding to a pedestrian ahead of the robot;
predicting a motion trail of the robot based on the robot orientation and the robot speed at the same moment, predicting a first encounter time of the robot and the pedestrian ahead based on the robot orientation, the robot speed, the pedestrian orientation and the pedestrian speed at the same moment, and predicting a reachable region of the pedestrian ahead according to the first encounter time;
determining a relative motion coefficient between the pedestrian ahead and the robot according to the robot orientation, the robot speed, the pedestrian orientation, the pedestrian speed and the distance between the robot and the pedestrian ahead at the same moment;
when an intersection exists between the motion trail and the reachable region, determining a risk degree of collision between the robot and the pedestrian ahead according to the area of the intersection;
selecting a target relative motion coefficient corresponding to the maximum risk degree from the relative motion coefficients;
and adjusting the speed of the robot according to the target relative motion coefficient.
2. The robot of claim 1, wherein the robot orientation comprises a robot position and a robot direction, and the pedestrian orientation comprises a pedestrian position and a pedestrian direction;
and wherein predicting the first encounter time of the robot and the pedestrian ahead based on the robot orientation, the robot speed, the pedestrian orientation and the pedestrian speed at the same moment, and predicting the reachable region of the pedestrian ahead according to the first encounter time comprises:
calculating the first encounter time of the robot and the pedestrian ahead based on the robot position, the robot speed, the pedestrian position and the pedestrian speed at the same moment;
calculating a direction included angle between the pedestrian direction and a predicted direction of the pedestrian ahead within a target time period, the target time period being the period from the current time to the first encounter time;
calculating a speed vector of the pedestrian ahead in the predicted direction based on the direction included angle and the pedestrian speed at the same moment;
and predicting the reachable region of the pedestrian ahead according to the speed vector and the first encounter time.
3. The robot of claim 2, wherein determining the relative motion coefficient between the pedestrian ahead and the robot according to the robot orientation, the robot speed, the pedestrian orientation, the pedestrian speed and the distance between the robot and the pedestrian ahead at the same moment comprises:
calculating the distance and direction vectors between the robot and the pedestrian ahead according to the robot position and the pedestrian position at the same moment; the direction vectors comprise a first vector pointing from the robot to the pedestrian ahead and a second vector pointing from the pedestrian ahead to the robot;
calculating an included angle between the robot direction and the first vector to obtain a first included angle;
calculating an included angle between the pedestrian direction and the second vector to obtain a second included angle;
and calculating the relative motion coefficient between the pedestrian ahead and the robot according to the distance at the same moment, the robot speed, the first included angle, the pedestrian speed and the second included angle.
4. The robot of claim 2, wherein determining the risk degree of collision between the robot and the pedestrian ahead according to the area of the intersection comprises:
calculating the area of the intersection;
weighting the area based on a first preset coefficient to obtain a weighted area;
weighting the distance based on a second preset coefficient to obtain a weighted distance;
and determining the risk degree of collision between the robot and the pedestrian ahead based on the weighted area and the weighted distance.
5. The robot of claim 1, wherein adjusting the speed of the robot according to the target relative motion coefficient comprises:
determining a velocity parameter based on the target relative motion coefficient;
and adjusting the speed of the robot based on the speed parameter and the maximum speed of the robot.
6. The robot of any one of claims 1 to 5, wherein the pedestrian ahead comprises a pedestrian whose front faces away from the robot, and the steps further comprise:
predicting a second encounter time between the robot and the pedestrian whose front faces away from the robot;
predicting a reachable region of the pedestrian whose front faces away from the robot according to the second encounter time;
and if an intersection exists between the motion trail and the reachable region of the pedestrian whose front faces away from the robot, playing a prompt voice.
7. A method of adjusting a speed of a robot, the method comprising:
acquiring, in real time, a robot orientation and a robot speed of the robot during motion, and acquiring a pedestrian orientation and a pedestrian speed corresponding to a pedestrian ahead of the robot;
predicting a motion trail of the robot based on the robot orientation and the robot speed at the same moment, predicting a first encounter time of the robot and the pedestrian ahead based on the robot orientation, the robot speed, the pedestrian orientation and the pedestrian speed at the same moment, and predicting a reachable region of the pedestrian ahead according to the first encounter time;
determining a relative motion coefficient between the pedestrian ahead and the robot according to the robot orientation, the robot speed, the pedestrian orientation, the pedestrian speed and the distance between the robot and the pedestrian ahead at the same moment;
when an intersection exists between the motion trail and the reachable region, determining a risk degree of collision between the robot and the pedestrian ahead according to the area of the intersection;
selecting a target relative motion coefficient corresponding to the maximum risk degree from the relative motion coefficients;
and adjusting the speed of the robot according to the target relative motion coefficient.
8. An apparatus for adjusting a speed of a robot, the apparatus comprising:
an acquisition module, configured to acquire, in real time, the robot orientation and the robot speed of the robot during motion, and to acquire the pedestrian orientation and the pedestrian speed corresponding to a pedestrian ahead of the robot;
a prediction module, configured to predict the motion trail of the robot based on the robot orientation and the robot speed at the same moment, predict the first encounter time of the robot and the pedestrian ahead based on the robot orientation, the robot speed, the pedestrian orientation and the pedestrian speed at the same moment, and predict the reachable region of the pedestrian ahead according to the first encounter time;
a determining module, configured to determine the relative motion coefficient between the pedestrian ahead and the robot according to the robot orientation, the robot speed, the pedestrian orientation, the pedestrian speed and the distance between the robot and the pedestrian ahead at the same moment;
a risk determination module, configured to determine, when an intersection exists between the motion trail and the reachable region, the risk degree of collision between the robot and the pedestrian ahead according to the area of the intersection;
a selecting module, configured to select, from the relative motion coefficients, a target relative motion coefficient corresponding to the maximum risk degree;
and an adjusting module, configured to adjust the speed of the robot according to the target relative motion coefficient.
9. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out the steps of the method as claimed in claim 7.
10. A computer-readable storage medium, in which a computer program is stored which, when being executed by a processor, carries out the steps of the method as claimed in claim 7.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202111674308.0A CN114347024B (en) | 2021-12-31 | 2021-12-31 | Robot speed adjusting method and device, robot and storage medium |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202111674308.0A CN114347024B (en) | 2021-12-31 | 2021-12-31 | Robot speed adjusting method and device, robot and storage medium |
Publications (2)
Publication Number | Publication Date |
---|---|
CN114347024A true CN114347024A (en) | 2022-04-15 |
CN114347024B CN114347024B (en) | 2024-08-16 |
Family
ID=81106081
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202111674308.0A Active CN114347024B (en) | 2021-12-31 | 2021-12-31 | Robot speed adjusting method and device, robot and storage medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN114347024B (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN116189380A (en) * | 2022-12-26 | 2023-05-30 | 湖北工业大学 | Man-machine safety interaction method, system, device and medium for mechanical equipment |
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103909926A (en) * | 2014-03-31 | 2014-07-09 | 长城汽车股份有限公司 | Vehicle lateral collision prevention method, device and system |
EP2952301A1 (en) * | 2014-06-05 | 2015-12-09 | Aldebaran Robotics | Humanoid robot with collision avoidance and trajectory recovery capabilities |
CN110213739A (en) * | 2019-06-11 | 2019-09-06 | 北京航空航天大学 | Multi-mode communication method and device towards the unmanned transportation system of mine car |
CN110843781A (en) * | 2019-11-27 | 2020-02-28 | 长安大学 | Vehicle curve automatic control method based on driver behavior |
CN113608528A (en) * | 2021-07-12 | 2021-11-05 | 千里眼(广州)人工智能科技有限公司 | Robot scheduling method, device, robot and storage medium |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN116189380A (en) * | 2022-12-26 | 2023-05-30 | 湖北工业大学 | Man-machine safety interaction method, system, device and medium for mechanical equipment |
CN116189380B (en) * | 2022-12-26 | 2024-08-02 | 湖北工业大学 | Man-machine safety interaction method, system, device and medium for mechanical equipment |
Also Published As
Publication number | Publication date |
---|---|
CN114347024B (en) | 2024-08-16 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6996009B2 (en) | Self-supervised training for depth estimation models using depth hints | |
US12005892B2 (en) | Simulating diverse long-term future trajectories in road scenes | |
US9175972B2 (en) | Route planning | |
CN113264066B (en) | Obstacle track prediction method and device, automatic driving vehicle and road side equipment | |
CN110364006A (en) | The vehicle interflow of machine learning enhancing | |
JP2021503134A (en) | Unsupervised learning of image depth and egomotion prediction neural networks | |
JP2022539250A (en) | Agent Trajectory Prediction Using Anchor Trajectories | |
CN109901139A (en) | Laser radar scaling method, device, equipment and storage medium | |
JP2020140704A (en) | Abnormality mapping by vehicle micro cloud | |
CN108801254B (en) | Repositioning method and robot | |
JP2020140704A5 (en) | ||
CN112540609B (en) | Path planning method, device, terminal equipment and storage medium | |
WO2021262837A1 (en) | Systems and methods for fine adjustment of roof models | |
JP2020507857A (en) | Agent navigation using visual input | |
CN114347024B (en) | Robot speed adjusting method and device, robot and storage medium | |
KR102383567B1 (en) | Method and system for localization based on processing visual information | |
CN110248157B (en) | Method and equipment for scheduling on duty | |
WO2024051437A1 (en) | Path planning method, and related apparatus and system | |
CN115793715B (en) | Unmanned aerial vehicle auxiliary flight method, system, device and storage medium | |
CN115435795B (en) | Vehicle positioning method based on looking-around image and semantic map | |
Xu et al. | An efficient multi‐sensor fusion and tracking protocol in a vehicle‐road collaborative system | |
CN113075713A (en) | Vehicle relative pose measuring method, system, equipment and storage medium | |
JP2024515248A (en) | Panoptic Segmentation Prediction for Augmented Reality | |
CN114663503A (en) | Three-dimensional position prediction from images | |
Rebling et al. | Development and testing of driver assistance systems with unity |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | |
SE01 | Entry into force of request for substantive examination | |
GR01 | Patent grant | |