CN111474930B - Tracking control method, device, equipment and medium based on visual positioning

Info

Publication number
CN111474930B
Authority
CN
China
Prior art keywords
robot
curvature
preset
determining
point
Prior art date
Legal status
Active
Application number
CN202010284664.0A
Other languages
Chinese (zh)
Other versions
CN111474930A (en)
Inventor
刘方圆
郭若楠
朱明明
刘星
张弥
Current Assignee
Beijing Sineva Technology Co ltd
Original Assignee
Beijing Sineva Technology Co ltd
Priority date: 2020-04-13
Filing date: 2020-04-13
Publication date: 2023-07-18
Application filed by Beijing Sineva Technology Co ltd
Priority to CN202010284664.0A
Publication of CN111474930A
Application granted
Publication of CN111474930B
Active legal status
Anticipated expiration


Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D 1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D 1/02 Control of position or course in two dimensions
    • G05D 1/021 Control of position or course in two dimensions specially adapted to land vehicles
    • G05D 1/0212 Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory
    • G05D 1/0214 Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory in accordance with safety or protection criteria, e.g. avoiding hazardous areas
    • G05D 1/0223 Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory involving speed control of the vehicle
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P 90/00 Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P 90/02 Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]

Landscapes

  • Engineering & Computer Science (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Manipulator (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The embodiments of the invention provide a tracking control method, device, equipment and storage medium based on visual positioning, which are used to improve the positioning accuracy and independent-operation capability during robot tracking control. The method comprises the following steps: while the robot is controlled to run along the track in a pre-stored track file, the position information of the robot's current position is acquired based on a visual positioning technique; a reference target position for adjusting the robot's angular speed is determined based on the position information of the robot's current position, of the previous waypoint and of the next waypoint, and the distance between the robot's current position and the reference target position is determined; when the distance is determined to be smaller than a first preset distance threshold, the curvature between the robot's current position and the reference target position is determined, and the angular speed at which the robot runs is determined based on the curvature and a preset linear speed; the robot is then controlled to run at the preset linear speed and the determined angular speed.

Description

Tracking control method, device, equipment and medium based on visual positioning
Technical Field
The present invention relates to the field of unmanned vehicle driving, and in particular, to a tracking control method, device, equipment and storage medium based on visual positioning.
Background
With the development of industry and the upgrading of manufacturing, robots are becoming more intelligent and informatized, while also developing toward high reliability and high precision. Tracking robots, i.e. automated guided vehicles (AGVs), are widely used in industrial automation and will be an important part of the automated factory of the future. Tracking is one driving mode, in which the vehicle automatically follows a predetermined route. Existing tracking driving usually only follows the predetermined route and makes no provision for the driving speed; in view of safety, an obstacle-avoidance function should also be added to the tracking scheme.
Prior-art AGVs avoid obstacles in the following two ways. One proposal combines ultrasonic obstacle avoidance with infrared tracking; it requires at least two sensors, ultrasonic sensors detect soft surfaces poorly, and, because their accuracy is low, they are easily disturbed and produce false alarms. Another proposal performs the tracking operation with an image recognition device, but this scheme needs environment information to be preset and relies on an external characteristic environment.
In summary, existing tracking and obstacle-avoidance techniques have low accuracy, are easily disturbed, depend on an external characteristic environment, and have poor independent-operation capability.
Disclosure of Invention
The embodiments of the invention provide a tracking control method, device, equipment and storage medium based on visual positioning, which are used to improve the positioning accuracy and independent-operation capability during robot tracking control.
In a first aspect, an embodiment of the present invention provides a tracking control method based on visual positioning, including:
when the robot is controlled to run along the track in a pre-stored track file, acquiring the position information of the robot's current position based on a visual positioning technique, the pre-stored track file containing the position information of each preset waypoint on the motion track;
determining a reference target position for adjusting the angular speed of the robot based on the position information of the robot's current position, of the previous waypoint and of the next waypoint, and determining the distance between the robot's current position and the reference target position;
when the distance is determined to be smaller than a first preset distance threshold, determining the curvature between the robot's current position and the reference target position, and determining the angular speed at which the robot runs based on the curvature and a preset linear speed;
controlling the robot to run at the preset linear speed and the determined angular speed.
In the tracking control method based on visual positioning provided by the embodiment of the invention, when the robot is controlled to run along the track in a pre-stored track file, the position information of the robot's current position is acquired by a visual positioning technique and the position information of each preset waypoint on the motion track is read from the pre-stored track file, so no additional recognition device is needed to preset environment information. A reference target position for adjusting the robot's angular speed is then determined based on the position information of the current position, of the previous waypoint and of the next waypoint, and the distance between the current position and the reference target position is determined. When this distance is determined to be smaller than a first preset distance threshold, the curvature between the current position and the reference target position is determined and the angular speed at which the robot runs is determined based on the curvature and a preset linear speed; finally, the robot is controlled to run at the preset linear speed and the determined angular speed. Compared with the prior art, the scheme does not depend on an external characteristic environment, the robot can complete the tracking control task independently, and the positioning accuracy and independent-operation capability during robot tracking control are improved.
In one possible embodiment, determining the reference target position for adjusting the angular speed of the robot based on the position information of the robot's current position, of the previous waypoint and of the next waypoint includes:
if the distance between the previous waypoint and the next waypoint is smaller than a second preset distance threshold, determining the next waypoint as the reference target position;
if the distance between the previous waypoint and the next waypoint is greater than or equal to the second preset distance threshold, determining, from the intersection points of the robot's search boundary with the path between the previous waypoint and the next waypoint, the intersection point closer to the next waypoint as the reference target position.
In one possible embodiment, determining the angular speed at which the robot runs based on the curvature and the preset linear speed includes:
determining, in a preset correspondence between curvature intervals and linear speeds, the target curvature interval in which the curvature lies, and determining the target linear speed corresponding to the target curvature interval;
determining the product of the curvature and the target linear speed as the angular speed at which the robot runs.
In the tracking control method based on visual positioning provided by the embodiment of the invention, the control commands are additionally smoothed during robot tracking control: the curvature range is divided into different intervals, and the linear speed and the angular speed are adjusted according to the interval in which the curvature lies, so that the vehicle body steers smoothly and sharp turns or excessive steering angles are avoided.
In one possible embodiment, the tracking control method based on visual positioning further includes:
when the distance between the robot's current position and the reference target position is greater than or equal to the first preset distance threshold, controlling the robot to keep running at its current speed.
In one possible embodiment, the tracking control method based on visual positioning further includes:
acquiring images of the robot's running path;
counting the number of point-cloud points per unit area within a preset range on the robot's running path in the images;
and when the number of point-cloud points is determined to be greater than a preset number threshold, controlling the robot to stop running.
In the tracking control method based on visual positioning provided by the embodiment of the invention, images of the robot's running path are collected during tracking control, and whether an obstacle exists is judged from the number of point-cloud points per unit area within the effective range in the images: if the count exceeds the threshold, an obstacle is judged to be present and a stop command is issued until the obstacle disappears; if the count is below the threshold, there is no obstacle. Compared with the prior art, adding this point-cloud obstacle-avoidance function to the tracking control process improves the running safety of the robot.
In a second aspect, an embodiment of the present invention provides a tracking control device based on visual positioning, including:
the acquisition unit is used for acquiring the position information of the robot's current position based on a visual positioning technique when the robot is controlled to run along the track in a pre-stored track file, wherein the pre-stored track file contains the position information of each preset waypoint on the motion track;
the analysis unit is used for determining a reference target position for adjusting the angular speed of the robot based on the position information of the robot's current position, of the previous waypoint and of the next waypoint, and determining the distance between the robot's current position and the reference target position;
the processing unit is used for determining the curvature between the robot's current position and the reference target position when the distance is determined to be smaller than a first preset distance threshold, and determining the angular speed at which the robot runs based on the curvature and a preset linear speed;
and the control unit is used for controlling the robot to run at the preset linear speed and the determined angular speed.
In one possible embodiment, the analysis unit is specifically configured to:
if the distance between the previous waypoint and the next waypoint is smaller than a second preset distance threshold, determine the next waypoint as the reference target position;
and if the distance between the previous waypoint and the next waypoint is greater than or equal to the second preset distance threshold, determine, from the intersection points of the robot's search boundary with the path between the previous and next waypoints, the intersection point closer to the next waypoint as the reference target position.
In a possible embodiment, the processing unit is specifically configured to:
determine, in a preset correspondence between curvature intervals and linear speeds, the target curvature interval in which the curvature lies, and determine the target linear speed corresponding to the target curvature interval;
and determine the product of the curvature and the target linear speed as the angular speed at which the robot runs.
In one possible embodiment, the tracking control device based on visual positioning further includes a unit for:
controlling the robot to keep running at its current speed when the distance between the robot's current position and the reference target position is greater than or equal to the first preset distance threshold.
In one possible embodiment, the tracking control device based on visual positioning further includes a unit for:
acquiring images of the robot's running path;
counting the number of point-cloud points per unit area within a preset range on the robot's running path in the images;
and controlling the robot to stop running when the number of point-cloud points is determined to be greater than a preset number threshold.
In a third aspect, an embodiment of the present invention further provides tracking control equipment based on visual positioning, including: at least one processor, at least one memory, and computer program instructions stored in the memory which, when executed by the processor, implement the tracking control method based on visual positioning provided by the first aspect of the present invention.
In a fourth aspect, an embodiment of the present invention further provides a computer-readable storage medium having computer program instructions stored thereon which, when executed by a processor, implement the tracking control method based on visual positioning provided by the first aspect of the present invention.
Drawings
The above, as well as additional purposes, features, and advantages of exemplary embodiments of the present invention will become readily apparent from the following detailed description when read in conjunction with the accompanying drawings. Several embodiments of the present invention are illustrated by way of example, and not by way of limitation, in the figures of the accompanying drawings and in which:
FIG. 1 is a schematic flow chart of a tracking control method based on visual positioning according to an embodiment of the present invention;
FIG. 2 is a schematic flow chart of another tracking control method based on visual positioning according to an embodiment of the present invention;
FIG. 3 is a schematic structural diagram of a tracking control device based on visual positioning according to an embodiment of the present invention;
FIG. 4 is a schematic structural diagram of tracking control equipment based on visual positioning according to an embodiment of the present invention.
Detailed Description
The embodiments of the present invention will be described below with reference to the accompanying drawings, and it should be understood that the embodiments described herein are for illustration and explanation of the present invention only, and are not intended to limit the present invention.
In view of the fact that the tracking and obstacle-avoidance techniques in the prior art have low accuracy, are easily disturbed and, because they rely on an external characteristic environment, have poor independent-operation capability, the embodiments of the invention provide a tracking control scheme based on visual positioning for improving the positioning accuracy and independent-operation capability during robot tracking control.
The following describes the scheme provided by the embodiment of the invention in detail with reference to the accompanying drawings.
As shown in FIG. 1, an embodiment of the present invention provides a tracking control method based on visual positioning, which may include the following steps:
Step 101: when the robot is controlled to run along the track in a pre-stored track file, the position information of the robot's current position is acquired based on a visual positioning technique; the pre-stored track file contains the position information of each preset waypoint on the motion track.
It should be noted that, besides the position information of each preset waypoint on the track, the pre-stored track file also contains the yaw angle information between adjacent waypoints; when the robot reaches any waypoint, it is controlled to turn toward the next waypoint according to this yaw angle information.
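By way of illustration only, such a track file could be represented as a list of waypoints, each carrying its position and the yaw angle toward the following waypoint. The sketch below assumes a JSON layout and field names (x, y, yaw_to_next) that are not specified in this disclosure.

```python
# Minimal sketch of a track-file loader. The JSON layout and field names are
# assumptions made for illustration; the disclosure does not fix a file format.
import json
import math
from dataclasses import dataclass
from typing import List

@dataclass
class Waypoint:
    x: float            # waypoint position in the map frame (meters, assumed units)
    y: float
    yaw_to_next: float  # yaw angle toward the following waypoint (radians)

def load_track_file(path: str) -> List[Waypoint]:
    """Load waypoints from an assumed JSON file of the form
    [{"x": ..., "y": ..., "yaw_to_next": ...}, ...]."""
    with open(path, "r", encoding="utf-8") as f:
        raw = json.load(f)
    return [Waypoint(p["x"], p["y"], p["yaw_to_next"]) for p in raw]

def yaw_between(a: Waypoint, b: Waypoint) -> float:
    """Yaw angle from waypoint a to waypoint b, e.g. when the track file is created."""
    return math.atan2(b.y - a.y, b.x - a.x)
```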
Step 102: a reference target position for adjusting the robot's angular speed is determined based on the position information of the robot's current position, of the previous waypoint and of the next waypoint, and the distance between the robot's current position and the reference target position is determined.
In a specific implementation, if the distance between the previous waypoint and the next waypoint is smaller than a second preset distance threshold, the next waypoint is determined as the reference target position; if that distance is greater than or equal to the second preset distance threshold, then among the intersection points of the robot's search boundary with the path between the previous and next waypoints, the intersection point closer to the next waypoint is determined as the reference target position.
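A minimal sketch of this selection rule is given below, assuming that the robot's search boundary is a circle of radius search_radius centred on the robot and that positions are planar points; the circle-segment intersection routine, the fallback when no intersection exists and all parameter names are illustrative assumptions rather than the claimed implementation.

```python
# Sketch of reference-target selection (assumptions: circular search boundary,
# 2-D positions, Euclidean distances). Not taken verbatim from the disclosure.
import math
from typing import List, Tuple

Point = Tuple[float, float]

def circle_segment_intersections(center: Point, r: float, a: Point, b: Point) -> List[Point]:
    """Intersection points of the circle (center, r) with the segment a-b."""
    ax, ay = a[0] - center[0], a[1] - center[1]
    dx, dy = b[0] - a[0], b[1] - a[1]
    A = dx * dx + dy * dy
    B = 2.0 * (ax * dx + ay * dy)
    C = ax * ax + ay * ay - r * r
    disc = B * B - 4.0 * A * C
    if A == 0.0 or disc < 0.0:
        return []
    sq = math.sqrt(disc)
    pts = []
    for t in ((-B - sq) / (2.0 * A), (-B + sq) / (2.0 * A)):
        if 0.0 <= t <= 1.0:  # keep only intersections lying on the segment
            pts.append((a[0] + t * dx, a[1] + t * dy))
    return pts

def select_reference_target(robot: Point, prev_wp: Point, next_wp: Point,
                            second_threshold: float, search_radius: float) -> Point:
    # Previous and next waypoints close together: aim directly at the next waypoint.
    if math.dist(prev_wp, next_wp) < second_threshold:
        return next_wp
    # Otherwise intersect the search boundary with the prev->next path and keep
    # the intersection point closer to the next waypoint.
    hits = circle_segment_intersections(robot, search_radius, prev_wp, next_wp)
    if not hits:
        return next_wp  # assumed fallback: no intersection inside the segment
    return min(hits, key=lambda p: math.dist(p, next_wp))
```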
Step 103: when the distance is determined to be smaller than a first preset distance threshold, the curvature between the robot's current position and the reference target position is determined, and the angular speed at which the robot runs is determined based on the curvature and a preset linear speed.
In a specific implementation, the target curvature interval in which the curvature lies is determined in a preset correspondence between curvature intervals and linear speeds, the target linear speed corresponding to that interval is determined, and the product of the curvature and the target linear speed is determined as the angular speed at which the robot runs.
When the target curvature interval in which the curvature lies is within a preset range around zero, the curvature is set to zero and the corresponding angular speed is therefore also zero, which prevents the robot from adjusting its heading too frequently while running.
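The interval lookup and the near-zero dead band described above might look like the following sketch; the interval boundaries, linear speeds and dead-band width are example values assumed for illustration, not values given in this disclosure.

```python
# Sketch of the curvature-interval lookup and dead band. All numeric values are
# placeholders; the disclosure describes the structure, not the numbers.
CURVATURE_TABLE = [           # (upper bound on |curvature|, target linear speed in m/s)
    (0.2, 0.8),               # gentle curves: run faster
    (0.6, 0.5),
    (float("inf"), 0.3),      # sharp curves: slow down
]
CURVATURE_DEAD_BAND = 0.02    # |curvature| below this is treated as zero

def speeds_from_curvature(curvature: float, default_speed: float = 0.8):
    """Return (linear_speed, angular_speed) for a given path curvature."""
    if abs(curvature) < CURVATURE_DEAD_BAND:
        # Near-zero curvature: angular speed stays at zero so the heading is not
        # adjusted too frequently while the robot runs.
        return default_speed, 0.0
    for max_curvature, linear_speed in CURVATURE_TABLE:
        if abs(curvature) <= max_curvature:
            # Angular speed is the product of the curvature and the target linear speed.
            return linear_speed, curvature * linear_speed
    return default_speed, curvature * default_speed  # unreachable with the inf sentinel
```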
In one possible embodiment, when the distance between the robot's current position and the reference target position is determined to be greater than or equal to the first preset distance threshold, the robot is controlled to keep running at its current speed.
Step 104: the robot is controlled to run at the preset linear speed and the determined angular speed.
In a specific implementation, a point-cloud obstacle-avoidance function is added to the robot tracking control process: images of the robot's running path are collected; the number of point-cloud points per unit area within a preset range on the running path is counted in the images; and when this number is determined to be greater than a preset number threshold, the robot is controlled to stop running.
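A hedged sketch of this point-cloud check follows; the region in front of the robot, the density threshold and the assumption that the point cloud is already projected into the robot frame are all illustrative choices, not requirements of the disclosure.

```python
# Sketch of the obstacle check: count point-cloud points inside a preset region
# ahead of the robot and report an obstacle when the density exceeds a threshold.
# Region bounds and threshold are assumed example values.
from typing import Iterable, Tuple

def obstacle_detected(points: Iterable[Tuple[float, float]],
                      x_range: Tuple[float, float] = (0.0, 1.0),    # meters ahead
                      y_range: Tuple[float, float] = (-0.3, 0.3),   # meters to each side
                      density_threshold: float = 200.0              # points per square meter
                      ) -> bool:
    """Return True if the point density inside the preset region exceeds the threshold."""
    area = (x_range[1] - x_range[0]) * (y_range[1] - y_range[0])
    count = sum(1 for x, y in points
                if x_range[0] <= x <= x_range[1] and y_range[0] <= y <= y_range[1])
    return count / area > density_threshold
```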
The following describes in detail a specific procedure of a tracking control scheme based on visual positioning according to an embodiment of the present invention with reference to the accompanying drawings.
FIG. 2 shows a tracking control scheme based on visual positioning according to an embodiment of the present invention, which may include the following steps:
step 201, creating and storing a track file based on a visual positioning technology, and acquiring the position information of each preset road point and the yaw angle between the adjacent road points according to the track file.
When the robot reaches any road point, the direction of the robot is controlled to run with the backward road point according to the yaw angle information between the adjacent road points in the track file, and when the reached road point is determined to be the last road point in the track file, the robot is completely tracked and controlled, and the running is finished.
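As a sketch only: one way to realise the turn toward the next waypoint is to rotate in place until the heading reported by visual positioning matches the stored yaw angle. The rotate-in-place strategy, the get_pose and send_velocity interfaces and the tolerances below are assumptions, not part of the disclosure.

```python
# Sketch of turning toward the next waypoint using the stored yaw angle.
import math

def normalize_angle(a: float) -> float:
    """Wrap an angle to the interval (-pi, pi]."""
    while a <= -math.pi:
        a += 2.0 * math.pi
    while a > math.pi:
        a -= 2.0 * math.pi
    return a

def turn_toward_next(get_pose, send_velocity, yaw_to_next: float,
                     yaw_tolerance: float = 0.05, turn_speed: float = 0.3) -> None:
    """Rotate in place until the robot heading matches the yaw stored in the track file."""
    while True:
        _, _, yaw = get_pose()                     # (x, y, yaw) from visual positioning
        error = normalize_angle(yaw_to_next - yaw)
        if abs(error) < yaw_tolerance:
            send_velocity(0.0, 0.0)                # heading reached: stop turning
            return
        send_velocity(0.0, turn_speed if error > 0 else -turn_speed)
```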
Step 202: the position information of the robot's current position is acquired based on a visual positioning technique, and the distance between the robot's current position and the next waypoint is detected.
Step 203: it is determined whether the distance between the robot's current position and the next waypoint is greater than or equal to a second preset distance threshold; if so, step 204 is executed, otherwise step 205 is executed.
Step 204: among the intersection points of the robot's search boundary with the path between the previous waypoint and the next waypoint of the current position, the intersection point closer to the next waypoint is determined as the reference target position, and step 206 is executed.
Step 205: the next waypoint after the robot's current position is determined as the reference target position, and step 206 is executed.
Step 206: the distance between the robot's current position and the reference target position is detected; when the distance is smaller than a first preset distance threshold, step 207 is executed, otherwise step 208 is executed.
Step 207: the curvature between the robot's current position and the reference target position is determined; in the preset correspondence between curvature intervals and linear speeds, the target linear speed corresponding to the target curvature interval in which the curvature lies is determined; the product of the curvature and the target linear speed is determined as the angular speed at which the robot runs; the robot is controlled to run at the corresponding preset linear speed and the determined angular speed, and step 209 is executed.
In a specific implementation, when the target curvature interval in which the curvature lies is within a preset range around zero, the curvature is set to zero and the corresponding angular speed is therefore also zero, which prevents the robot from adjusting its heading too frequently while running.
Step 208: the robot is controlled to keep running in its current state, and step 209 is executed.
Step 209: images of the robot's running path are collected, and the number of point-cloud points per unit area within a preset range on the running path is counted in the images.
Step 210: it is determined whether the number of point-cloud points is greater than a preset number threshold; if so, step 211 is executed, otherwise the flow returns to step 206.
Step 211: the robot stops running, and the flow returns to step 210 to judge from the number of point-cloud points whether the obstacle has disappeared.
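For orientation only, the sketch below strings steps 202 through 211 together using the helper functions sketched earlier (select_reference_target, speeds_from_curvature, obstacle_detected). The pure-pursuit-style curvature formula and the hardware interfaces passed in as callables (get_pose, get_points, send_velocity) are assumptions; the disclosure does not fix either.

```python
# Illustrative per-segment control loop combining the earlier sketches.
import math
import time

def path_curvature(robot, yaw, target):
    """Pure-pursuit-style curvature toward the reference target (an assumed formula;
    the disclosure does not spell out how the curvature is computed)."""
    dx, dy = target[0] - robot[0], target[1] - robot[1]
    lateral = -math.sin(yaw) * dx + math.cos(yaw) * dy   # target offset to the robot's left
    d2 = dx * dx + dy * dy
    return 0.0 if d2 == 0.0 else 2.0 * lateral / d2

def run_segment(get_pose, get_points, send_velocity, prev_wp, next_wp,
                first_threshold=0.5, second_threshold=0.5, search_radius=0.4,
                reach_tolerance=0.05, period=0.1):
    """Drive one waypoint-to-waypoint segment of the track (assumed parameter values)."""
    while True:
        x, y, yaw = get_pose()                            # step 202: visual positioning
        robot = (x, y)
        if math.dist(robot, next_wp) < reach_tolerance:
            return                                        # waypoint reached; caller advances
        target = select_reference_target(robot, prev_wp, next_wp,
                                         second_threshold, search_radius)  # steps 203-205
        if math.dist(robot, target) < first_threshold:                      # step 206
            kappa = path_curvature(robot, yaw, target)
            v, w = speeds_from_curvature(kappa)
            send_velocity(v, w)                           # step 207: adjust the speeds
        # else: step 208, keep the current speed (no new command is sent)
        while obstacle_detected(get_points()):            # steps 209-211: stop until clear
            send_velocity(0.0, 0.0)
            time.sleep(period)
        time.sleep(period)
```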
As shown in FIG. 3, an embodiment of the present invention further provides a tracking control device based on visual positioning, including:
an obtaining unit 31, configured to acquire, based on a visual positioning technique, the position information of the robot's current position when the robot is controlled to run along the track in a pre-stored track file, where the pre-stored track file contains the position information of each preset waypoint on the motion track;
an analysis unit 32, configured to determine a reference target position for adjusting the robot's angular speed based on the position information of the robot's current position, of the previous waypoint and of the next waypoint, and to determine the distance between the robot's current position and the reference target position;
a processing unit 33, configured to determine the curvature between the robot's current position and the reference target position when the distance is determined to be smaller than a first preset distance threshold, and to determine the angular speed at which the robot runs based on the curvature and a preset linear speed;
and a control unit 34, configured to control the robot to run at the preset linear speed and the determined angular speed.
In one possible embodiment, the analysis unit 32 is specifically configured to:
if the distance between the previous waypoint and the next waypoint is smaller than a second preset distance threshold, determine the next waypoint as the reference target position;
and if the distance between the previous waypoint and the next waypoint is greater than or equal to the second preset distance threshold, determine, from the intersection points of the robot's search boundary with the path between the previous and next waypoints, the intersection point closer to the next waypoint as the reference target position.
In one possible implementation, the processing unit 33 is specifically configured to:
determine, in a preset correspondence between curvature intervals and linear speeds, the target curvature interval in which the curvature lies, and determine the target linear speed corresponding to the target curvature interval;
and determine the product of the curvature and the target linear speed as the angular speed at which the robot runs.
In one possible embodiment, the tracking control device based on visual positioning further includes a unit for:
controlling the robot to keep running at its current speed when the distance between the robot's current position and the reference target position is greater than or equal to the first preset distance threshold.
In one possible embodiment, the tracking control device based on visual positioning further includes a unit for:
acquiring images of the robot's running path;
counting the number of point-cloud points per unit area within a preset range on the robot's running path in the images;
and controlling the robot to stop running when the number of point-cloud points is determined to be greater than a preset number threshold.
Based on the same conception as the foregoing embodiments, an embodiment of the invention also provides tracking control equipment based on visual positioning.
As shown in FIG. 4, an embodiment of the present invention further provides tracking control equipment 40 based on visual positioning, including: at least one processor 41, at least one memory 42, and computer program instructions stored in the memory 42 which, when executed by the processor 41, implement the tracking control method based on visual positioning provided in the embodiments of the present invention.
In an exemplary embodiment, a storage medium is also provided, such as the memory 42, comprising instructions executable by the processor 41 of the tracking control equipment based on visual positioning to perform the above-described method.
Alternatively, the storage medium may be a non-transitory computer-readable storage medium, for example a ROM, Random Access Memory (RAM), CD-ROM, magnetic tape, floppy disk, optical data storage device, or the like.
It will be appreciated by those skilled in the art that embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to the application. It will be understood that each flow and/or block of the flowchart illustrations and/or block diagrams, and combinations of flows and/or blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
It will be apparent to those skilled in the art that various modifications and variations can be made in the present application without departing from the spirit or scope of the application. Thus, if such modifications and variations of the present application fall within the scope of the claims and the equivalents thereof, the present application is intended to cover such modifications and variations.

Claims (12)

1. A tracking control method based on visual positioning, comprising:
when a robot is controlled to run along a track in a pre-stored track file, acquiring position information of the robot's current position based on a visual positioning technique, wherein the pre-stored track file contains position information of each preset waypoint on a motion track;
determining a reference target position for adjusting an angular speed of the robot based on the position information of the robot's current position, position information of a previous waypoint of the current position and position information of a next waypoint, and determining a distance between the robot's current position and the reference target position;
when the distance is determined to be smaller than a first preset distance threshold, determining a curvature between the robot's current position and the reference target position, determining an angular speed at which the robot runs based on the curvature and a preset linear speed, and setting the curvature and the angular speed corresponding to the curvature to zero when the target curvature interval in which the curvature lies is determined to be within a preset range around zero;
and controlling the robot to run at the preset linear speed and the angular speed.
2. The method according to claim 1, wherein determining the reference target position for adjusting the angular speed of the robot based on the position information of the robot's current position, the position information of the previous waypoint and the position information of the next waypoint comprises:
if the distance between the previous waypoint and the next waypoint is smaller than a second preset distance threshold, determining the next waypoint as the reference target position;
and if the distance between the previous waypoint and the next waypoint is greater than or equal to the second preset distance threshold, determining, from the intersection points of the robot's search boundary with the path between the previous waypoint and the next waypoint, the intersection point closer to the next waypoint as the reference target position.
3. The method of claim 1, wherein determining the angular speed at which the robot runs based on the curvature and the preset linear speed comprises:
determining, in a preset correspondence between curvature intervals and linear speeds, the target curvature interval in which the curvature lies, and determining the target linear speed corresponding to the target curvature interval;
and determining the product of the curvature and the target linear speed as the angular speed at which the robot runs.
4. The method according to claim 1, wherein the method further comprises:
and when the distance between the robot's current position and the reference target position is greater than or equal to the first preset distance threshold, controlling the robot to keep running at its current speed.
5. The method according to claim 1, wherein the method further comprises:
acquiring images of the robot's running path;
counting the number of point-cloud points per unit area within a preset range on the robot's running path in the images;
and when the number of point-cloud points is determined to be greater than a preset number threshold, controlling the robot to stop running.
6. A tracking control device based on visual positioning, comprising:
an acquisition unit, used for acquiring position information of the robot's current position based on a visual positioning technique when the robot is controlled to run along the track in a pre-stored track file, wherein the pre-stored track file contains position information of each preset waypoint on the motion track;
an analysis unit, used for determining a reference target position for adjusting an angular speed of the robot based on the position information of the robot's current position, position information of a previous waypoint of the current position and position information of a next waypoint, and determining a distance between the robot's current position and the reference target position;
a processing unit, used for determining a curvature between the robot's current position and the reference target position when the distance is determined to be smaller than a first preset distance threshold, determining an angular speed at which the robot runs based on the curvature and a preset linear speed, and setting the curvature and the angular speed corresponding to the curvature to zero when the target curvature interval in which the curvature lies is determined to be within a preset range around zero;
and a control unit, used for controlling the robot to run at the preset linear speed and the angular speed.
7. The device according to claim 6, wherein the analysis unit is specifically configured to:
if the distance between the previous waypoint and the next waypoint is smaller than a second preset distance threshold, determine the next waypoint as the reference target position;
and if the distance between the previous waypoint and the next waypoint is greater than or equal to the second preset distance threshold, determine, from the intersection points of the robot's search boundary with the path between the previous waypoint and the next waypoint, the intersection point closer to the next waypoint as the reference target position.
8. The apparatus according to claim 6, wherein the processing unit is specifically configured to:
determine, in a preset correspondence between curvature intervals and linear speeds, the target curvature interval in which the curvature lies, and determine the target linear speed corresponding to the target curvature interval;
and determine the product of the curvature and the target linear speed as the angular speed at which the robot runs.
9. The apparatus of claim 6, wherein the apparatus further comprises:
a unit for controlling the robot to keep running at its current speed when the distance between the robot's current position and the reference target position is greater than or equal to the first preset distance threshold.
10. The apparatus of claim 6, wherein the apparatus further comprises:
a unit for acquiring images of the robot's running path,
counting the number of point-cloud points per unit area within a preset range on the robot's running path in the images,
and controlling the robot to stop running when the number of point-cloud points is determined to be greater than a preset number threshold.
11. Tracking control equipment based on visual positioning, comprising: at least one processor, at least one memory, and computer program instructions stored in the memory which, when executed by the processor, implement the tracking control method based on visual positioning of any one of claims 1-5.
12. A computer-readable storage medium having computer program instructions stored thereon which, when executed by a processor, implement the tracking control method based on visual positioning of any one of claims 1-5.
CN202010284664.0A 2020-04-13 2020-04-13 Tracking control method, device, equipment and medium based on visual positioning Active CN111474930B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010284664.0A CN111474930B (en) 2020-04-13 2020-04-13 Tracking control method, device, equipment and medium based on visual positioning

Publications (2)

Publication Number / Publication Date
CN111474930A (en) 2020-07-31
CN111474930B (en) 2023-07-18

Family

ID=71751562

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010284664.0A Active CN111474930B (en) 2020-04-13 2020-04-13 Tracking control method, device, equipment and medium based on visual positioning

Country Status (1)

Country Link
CN (1) CN111474930B (en)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112068570A (en) * 2020-09-18 2020-12-11 拉扎斯网络科技(上海)有限公司 Robot movement control method and device and robot
CN113753042B (en) * 2020-10-30 2023-06-30 北京京东乾石科技有限公司 Unmanned vehicle speed limiting method and device, unmanned vehicle and storage medium
CN112711252B (en) * 2020-12-08 2024-05-24 深圳市优必选科技股份有限公司 Mobile robot, path tracking method thereof and computer readable storage medium
CN113031591B (en) * 2021-02-24 2023-04-07 丰疆智能(深圳)有限公司 Exception handling method and device for material pushing robot, server and storage medium
CN113050657B (en) * 2021-03-29 2021-09-17 紫清智行科技(北京)有限公司 Waypoint processing method and system for automatic driving tracking
CN113176782B (en) * 2021-05-21 2022-10-04 福建盛海智能科技有限公司 Autonomous path-changing tracking method and unmanned vehicle
CN114281085B (en) * 2021-12-29 2023-06-06 福建汉特云智能科技有限公司 Robot tracking method and storage medium

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105867379A (en) * 2016-04-13 2016-08-17 上海物景智能科技有限公司 Method and system for controlling motion of robot
CN107703973A (en) * 2017-09-11 2018-02-16 广州视源电子科技股份有限公司 Trace tracking method, device
CN110187706A (en) * 2019-05-28 2019-08-30 上海钛米机器人科技有限公司 A kind of speed planning method, apparatus, electronic equipment and storage medium
CN110799989A (en) * 2019-04-20 2020-02-14 深圳市大疆创新科技有限公司 Obstacle detection method, equipment, movable platform and storage medium
CN110807806A (en) * 2020-01-08 2020-02-18 中智行科技有限公司 Obstacle detection method and device, storage medium and terminal equipment

Also Published As

Publication number Publication date
CN111474930A (en) 2020-07-31

Similar Documents

Publication Publication Date Title
CN111474930B (en) Tracking control method, device, equipment and medium based on visual positioning
EP3566821B1 (en) Robot movement control method, and robot
CN110673115B (en) Combined calibration method, device, equipment and medium for radar and integrated navigation system
US11809194B2 (en) Target abnormality determination device
US10363960B2 (en) Method and device for assisting a maneuvering process of a motor vehicle
CN111830979B (en) Track optimization method and device
US10369993B2 (en) Method and device for monitoring a setpoint trajectory to be traveled by a vehicle for being collision free
CN111216792B (en) Automatic driving vehicle state monitoring system and method and automobile
CN109947089B (en) Automatic guided vehicle attitude control method and device and automatic guided vehicle
JP2018527689A (en) Virtual line following method and modification method for autonomous vehicles
CN110580040A (en) Object tracking in blind zones
US20220119007A1 (en) Method and Device for Operating a Robot with Improved Object Detection
CN114200945B (en) Safety control method of mobile robot
JP7156924B2 (en) Lane boundary setting device, lane boundary setting method
CN113228131B (en) Method and system for providing ambient data
CN113771839B (en) Automatic parking decision planning method and system
WO2019031168A1 (en) Mobile body and method for control of mobile body
CN114879704B (en) Robot obstacle-avoiding control method and system
CN109689459B (en) Vehicle travel control method and travel control device
CN111445723A (en) Vehicle path identification
CN111966089A (en) Method for estimating speed of dynamic obstacle by using cost map in mobile robot
CN113885525A (en) Path planning method and system for automatically driving vehicle to get rid of trouble, vehicle and storage medium
US20230249675A1 (en) Method for ascertaining a replacement trajectory, computer program product, parking assistance system and vehicle
CN110913335B (en) Automatic guided vehicle sensing and positioning method and device, server and automatic guided vehicle
CN116279428A (en) Method and system for automatic parking and obstacle avoidance

Legal Events

PB01: Publication
SE01: Entry into force of request for substantive examination
GR01: Patent grant