CN113791627B - Robot navigation method, equipment, medium and product

Robot navigation method, equipment, medium and product

Info

Publication number
CN113791627B
CN113791627B
Authority
CN
China
Prior art keywords
coordinate
information
robot
target
current
Prior art date
Legal status
Active
Application number
CN202111352081.8A
Other languages
Chinese (zh)
Other versions
CN113791627A (en)
Inventor
曹学为
鲁涛
程道一
Current Assignee
Institute of Automation of Chinese Academy of Science
Original Assignee
Institute of Automation of Chinese Academy of Science
Priority date
Filing date
Publication date
Application filed by Institute of Automation of Chinese Academy of Science filed Critical Institute of Automation of Chinese Academy of Science
Priority to CN202111352081.8A
Publication of CN113791627A
Application granted
Publication of CN113791627B

Classifications

    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0212Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory
    • G05D1/0221Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory involving a learning process
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/20Instruments for performing navigational calculations
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/86Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0255Control of position or course in two dimensions specially adapted to land vehicles using acoustic signals, e.g. ultrasonic signals
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0257Control of position or course in two dimensions specially adapted to land vehicles using a radar

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Electromagnetism (AREA)
  • Acoustics & Sound (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
  • Navigation (AREA)
  • Manipulator (AREA)

Abstract

The invention provides a robot navigation method, device, medium and product, wherein the method comprises the following steps: acquiring the main road information of the current floor, the initial coordinate of the current floor and the target coordinate of the current floor; determining the distance between the initial coordinate and the target coordinate according to the initial coordinate and the target coordinate; judging whether the distance meets a first preset condition, and if so, generating a global path through the A* algorithm based on the initial coordinate and the target coordinate; otherwise, generating the global path through the Dijkstra algorithm and the A* algorithm, and making the robot move from the initial coordinate to the target coordinate according to the global path.

Description

Robot navigation method, equipment, medium and product
Technical Field
The invention relates to the field of intelligent autonomous mobile robots, in particular to a robot navigation method, equipment, a medium and a product.
Background
In recent years, with the continuous development of artificial intelligence theory, autonomous mobile robot technology has matured rapidly and is widely applied in fields such as intelligent industry, military, medical treatment and service. At the same time, the tasks of mobile robots have become more complex, and their working environments have changed from the original single deterministic environment to uncertain environments. Research on autonomous intelligent control of mobile robots in complex environments has therefore attracted extensive attention in both academia and industry, and the navigation control method, as one of its key technologies, has become a research hotspot of current intelligent robotics. For example, in the complex environment of existing hospitals, medicines are still mainly picked up, placed and transported manually, which wastes precious time. Current intelligent mobile robots, meanwhile, have not been extensively optimized for their navigation paths: they can move autonomously, but their walking performance is not satisfactory.
Disclosure of Invention
The invention provides a robot navigation method, equipment, a medium and a product, which can realize autonomous navigation of a robot, realize uninterrupted transportation, improve the generation efficiency of a global path, enable the robot to obtain an optimal path in a complex building environment, and ensure that the robot can have good trafficability in the actual walking process.
In a first aspect, the present invention provides a robot navigation method, including: acquiring the main road information of the current floor, the starting coordinate of the current floor and the target coordinate of the current floor; determining the distance between the starting coordinate and the target coordinate according to the starting coordinate and the target coordinate; judging whether the distance meets a first preset condition, wherein the first preset condition is used for determining the method for generating a global path, and if so, generating the global path through the A* algorithm based on the starting coordinate and the target coordinate; otherwise, determining a starting projection point coordinate through the trunk information and the starting coordinate, determining a target projection point coordinate through the trunk information and the target coordinate, and generating the global path through the Dijkstra algorithm and the A* algorithm based on the trunk information, the starting coordinate, the starting projection point coordinate, the target coordinate and the target projection point coordinate, so that the robot moves from the starting coordinate to the target coordinate according to the global path.
Further, said moving the robot from the start coordinate to the target coordinate according to the global path comprises: acquiring a current coordinate, wherein the current coordinate comprises the start coordinate, and acquiring a corresponding look-ahead point, wherein the look-ahead point is selected on the global path according to a second preset condition; acquiring a projection point coordinate, wherein the projection point coordinate is the projection of the current coordinate on the global path, and determining a corresponding distance and a corresponding deviation angle according to the projection point coordinate and the current coordinate; determining a control coefficient according to the distance and the deviation angle, acquiring a preset maximum turning differential speed, and determining a left wheel speed and a right wheel speed according to the control coefficient and the maximum turning differential speed; and moving to the look-ahead point according to the left wheel speed and the right wheel speed, wherein the look-ahead point comprises the target coordinate.
Further, the robot comprises a computer for human-computer interaction and a microcomputer for logic calculation; and the interaction protocol of the computer and the microcomputer comprises instruction bytes, data segments, check codes and end marks.
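A hypothetical sketch of such a frame is shown below. The text only states that a frame contains an instruction byte, a data segment, a check code and an end mark; the field widths, byte values and XOR check used here are assumptions for illustration.

```python
# Hypothetical sketch of the computer <-> microcomputer frame described above.
# Field widths, byte values and the XOR check code are assumptions.

END_MARK = 0x0D  # assumed end-of-frame byte

def build_frame(instruction: int, data: bytes) -> bytes:
    """Pack one frame: [instruction][data...][check][end]."""
    body = bytes([instruction]) + data
    check = 0
    for b in body:          # simple XOR check code (assumed)
        check ^= b
    return body + bytes([check, END_MARK])

def parse_frame(frame: bytes):
    """Validate and split a frame; returns (instruction, data) or raises ValueError."""
    if len(frame) < 3 or frame[-1] != END_MARK:
        raise ValueError("bad end mark")
    body, check = frame[:-2], frame[-2]
    expected = 0
    for b in body:
        expected ^= b
    if check != expected:
        raise ValueError("check code mismatch")
    return body[0], body[1:]

# Example: a hypothetical "set wheel speeds" instruction carrying two speed bytes.
frame = build_frame(0x21, bytes([120, 118]))
assert parse_frame(frame) == (0x21, bytes([120, 118]))
```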
Further, the acquiring the information of the main road of the current floor includes: inquiring the information of the main road corresponding to the current floor, and if the information of the main road corresponding to the current floor exists, acquiring the information of the main road of the current floor; otherwise, acquiring at least one local map information through a laser radar; splicing the at least one piece of local map information through an SIFT image matching algorithm to obtain an overall map; and receiving at least one endpoint coordinate set, and generating corresponding trunk road information according to the at least one endpoint coordinate set and the whole map, wherein the endpoint coordinate set comprises two endpoint coordinates.
Further, the splicing the at least one local map information through a SIFT image matching algorithm to obtain an overall map includes: acquiring a spliced map through human-computer interaction, wherein the spliced map is obtained by splicing at least one piece of local map information by a user; and obtaining the integral map through the SIFT image matching algorithm based on the spliced map.
Further, the robot comprises a camera and at least one box layer, and the at least one box layer corresponds to the information of the at least one box layer; and after the robot moves from the starting coordinate to the target coordinate according to the global path, the robot further comprises: the camera scans an identification code, the identification code comprises user information, box layer information and time information, and the identification code is used for opening the box layer corresponding to the box layer information.
Further, the robot comprises an encoder, a left wheel, a right wheel, a laser radar, an ultrasonic sensor and a communication connecting line; and before the acquiring of the main road information of the current floor, the starting coordinate of the current floor and the target coordinate of the current floor, the method further comprises: carrying out fault detection on the encoder, and if the encoder has a fault, stopping motion and giving an alarm; carrying out fault detection on the laser radar, and if the laser radar has a fault, stopping motion and giving an alarm; carrying out fault detection on the ultrasonic sensor, and if the ultrasonic sensor has a fault, stopping motion and giving an alarm; and carrying out fault detection on the communication connecting line, and if the communication connecting line has a fault, stopping motion and giving an alarm, or giving an alarm after moving to the target coordinate. The carrying out fault detection on the encoder comprises: subtracting the pulse data of the left wheel at a first moment from the pulse data of the left wheel at a second moment, both fed back by the encoder, to obtain a left wheel difference value, and subtracting the pulse data of the right wheel at the first moment from the pulse data of the right wheel at the second moment to obtain a right wheel difference value; determining the maximum difference corresponding to the first moment and the second moment according to the time difference between the first moment and the second moment, a preset wheel diameter and a preset maximum linear velocity difference; and determining that the encoder has a fault if the maximum difference is smaller than the absolute value of the difference between the left wheel difference value and the right wheel difference value; and/or acquiring the current left wheel speed and the current right wheel speed fed back by the encoder, and judging whether the current left wheel speed and the current right wheel speed meet a third preset condition, wherein if they do not meet the third preset condition, the encoder has a fault. The carrying out fault detection on the laser radar comprises: acquiring a first current position fed back by the laser radar and a second current position fed back by the encoder, and taking the difference between the first current position and the second current position, wherein if the difference does not meet a fourth preset condition, the laser radar has a fault. The carrying out fault detection on the ultrasonic sensor comprises: acquiring a data packet fed back by the ultrasonic sensor, wherein if the data packet does not meet a fifth preset condition, the ultrasonic sensor has a fault. The carrying out fault detection on the communication connecting line comprises: acquiring data corresponding to the communication connecting line, wherein if the acquisition fails, the communication connecting line has a fault.
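A minimal sketch of the encoder consistency check described above follows. The text defines only the comparison between the left/right pulse differences and a maximum difference derived from the elapsed time, the preset wheel diameter and the preset maximum linear velocity difference; the concrete pulses-per-revolution value and thresholds are assumptions.

```python
import math

# Sketch of the encoder pulse-difference fault check; parameter values are assumptions.
PULSES_PER_REV = 1000          # e.g. the 1000-line encoder in the hardware description
WHEEL_DIAMETER = 0.2032        # assumed 8-inch wheel, in metres
MAX_LINEAR_VEL_DIFF = 0.5      # preset maximum linear velocity difference, m/s (assumed)

def encoder_fault(left_pulses_t1, left_pulses_t2,
                  right_pulses_t1, right_pulses_t2,
                  t1, t2) -> bool:
    """Return True if the left/right pulse differences diverge more than allowed."""
    left_diff = left_pulses_t2 - left_pulses_t1
    right_diff = right_pulses_t2 - right_pulses_t1
    dt = t2 - t1
    # Maximum pulse-count difference the two wheels may legitimately accumulate over dt.
    max_pulse_diff = MAX_LINEAR_VEL_DIFF * dt / (math.pi * WHEEL_DIAMETER) * PULSES_PER_REV
    return abs(left_diff - right_diff) > max_pulse_diff

# Example: over 0.1 s the two wheels drift apart by 400 pulses -> reported as a fault.
print(encoder_fault(0, 1000, 0, 600, t1=0.0, t2=0.1))
```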
In a second aspect, the present invention further provides an electronic device, including a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor implements the steps of the robot navigation method according to any one of the above methods when executing the computer program.
In a third aspect, the invention also provides a non-transitory computer readable storage medium having stored thereon a computer program which, when executed by a processor, performs the steps of the robot navigation method as described in any of the above.
In a fourth aspect, the present invention also provides a computer program product comprising a computer program, wherein the computer program is configured to, when executed by a processor, implement the steps of the robot navigation method according to any one of the above.
According to the robot navigation method, device, medium and product, the main road information of the current floor, the initial coordinate of the current floor and the target coordinate of the current floor are acquired; the distance between the initial coordinate and the target coordinate is determined according to the initial coordinate and the target coordinate; whether the distance meets a first preset condition is judged, wherein the first preset condition is used for determining the method for generating the global path, and if so, the global path is generated through the A* algorithm based on the initial coordinate and the target coordinate; otherwise, the initial projection point coordinate is determined through the trunk information and the initial coordinate, the target projection point coordinate is determined through the trunk information and the target coordinate, and the global path is generated through the Dijkstra algorithm and the A* algorithm based on the trunk information, the initial coordinate, the initial projection point coordinate, the target coordinate and the target projection point coordinate, so that the robot moves from the initial coordinate to the target coordinate according to the global path. It can be seen that the method realizes autonomous path planning of the robot and uninterrupted transportation; generating the global path through the Dijkstra algorithm and the A* algorithm improves the generation efficiency of the global path, so that the robot can obtain the optimal path in a complex building environment and has good trafficability in the actual walking process; and when applied to transportation, the method can free the manpower otherwise spent on article transportation.
Drawings
In order to more clearly illustrate the technical solutions of the present invention or the prior art, the drawings needed for the description of the embodiments or the prior art will be briefly described below, and it is obvious that the drawings in the following description are some embodiments of the present invention, and those skilled in the art can also obtain other drawings according to the drawings without creative efforts.
FIG. 1 is a schematic flow diagram of some embodiments of a robot navigation method provided in accordance with the present invention;
FIG. 2 is a schematic flow chart diagram of further embodiments of a robot navigation method provided in accordance with the present invention;
FIG. 3-1 is a schematic diagram of a sub-map of a robot navigation method provided in accordance with the present invention;
FIG. 3-2 is a schematic diagram of a sub-map stitching process of the robot navigation method provided in accordance with the present invention;
FIG. 3-3 is a schematic diagram of sub-map stitching completion of the robot navigation method provided in accordance with the present invention;
FIG. 3-4 is an overall map diagram of the robot navigation method provided in accordance with the present invention;
FIG. 3-5 is a schematic diagram of the main road information of the robot navigation method provided according to the present invention;
FIG. 3-6 is a schematic diagram of the location of the look-ahead point;
FIG. 3-7 is a schematic diagram of motion control of the robot at an angle off the target path;
FIG. 3-8 is a schematic view of the steering state of the robot;
FIG. 3-9 is a global path diagram of a robot navigation method provided in accordance with the present invention;
FIG. 3-10 is a schematic diagram of the global path planning steps of the robot navigation method provided in accordance with the present invention;
FIG. 3-11 is a schematic diagram of the medicine box ejection flow of the medicine delivery robot;
FIG. 3-12 is a schematic diagram of the positions of the robot's ultrasonic sensors;
FIG. 3-13 is a schematic diagram of the hardware architecture of the robot;
FIG. 3-14 is a software architecture diagram of the robot;
FIG. 3-15 is a schematic workflow diagram of the medicine delivery robot;
FIG. 4 is a schematic structural diagram of some embodiments of a robotic navigation device provided in accordance with the present invention;
fig. 5 is a schematic structural diagram of an electronic device provided in accordance with the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention clearer, the technical solutions of the present invention will be clearly and completely described below with reference to the accompanying drawings, and it is obvious that the described embodiments are some, but not all embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
It should be noted that, for convenience of description, only the portions related to the related invention are shown in the drawings. The embodiments and features of the embodiments of the present invention may be combined with each other without conflict.
It should be noted that the terms "first", "second", and the like in the present invention are only used for distinguishing different devices, modules or units, and are not used for limiting the order or interdependence relationship of the functions performed by the devices, modules or units.
It is noted that references to "a", "an", and "the" in the present invention are intended to be illustrative rather than limiting, and those skilled in the art will understand that they mean "one or more" unless the context clearly dictates otherwise.
The names of messages or information exchanged between devices in the embodiments of the present invention are for illustrative purposes only, and are not intended to limit the scope of the messages or information.
The present invention will be described in detail below with reference to the embodiments with reference to the attached drawings.
Referring to fig. 1, fig. 1 is a flowchart illustrating a robot navigation method according to some embodiments of the present invention. As shown in fig. 1, the method comprises the steps of:
step 101, obtaining information of a main road of a current floor, an initial coordinate of the current floor and a target coordinate of the current floor.
In some embodiments, the main road information may be composed of the corridors of the current floor, and the start coordinate and the target coordinate may be the global positioning coordinates of the robot on the current floor, obtained by means such as laser radar or ultrasonic (also called ultrasound) sensing. As an example, the navigation method of the robot may also be applied in scenes such as outdoor streets, sports stadiums, supermarkets and indoor medicine delivery.
And 102, determining the distance between the starting coordinate and the target coordinate according to the starting coordinate and the target coordinate.
In some embodiments, a straight-line distance between the start coordinate and the target coordinate may be calculated, and a trunk distance between the start coordinate and the target coordinate may also be calculated as needed.
And 103, judging whether the distance meets a first preset condition, wherein the first preset condition is used for determining a method for generating the global path, and if so, generating the global path through an A-x algorithm based on the initial coordinate and the target coordinate.
In some embodiments, the first preset condition may be that the distance is less than (or greater than or equal to) a preset threshold, or lies within a threshold range; if the distance is less than (or greater than or equal to) the preset threshold or within the threshold range, the global path is generated through the A* algorithm based on the start coordinate and the target coordinate. The A* (A-Star) algorithm is the most effective direct search method for solving the shortest path in a static road network, and is also an effective algorithm for many other search problems. The closer the distance estimate in the algorithm is to the actual value, the faster the final search. A suitable algorithm may also be selected as needed.
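As a concrete illustration of this short-distance branch, the sketch below runs a plain grid A* search with a Euclidean heuristic when the start-target distance satisfies an assumed first preset condition (here, "less than a threshold"); the grid representation and the threshold value are assumptions.

```python
import heapq, math

DISTANCE_THRESHOLD = 10.0  # assumed first preset condition, in grid cells

def astar(grid, start, goal):
    """grid: 2D list, 0 = free, 1 = obstacle; start/goal: (row, col)."""
    def h(p):  # Euclidean heuristic; the closer it is to the true cost, the faster the search
        return math.hypot(p[0] - goal[0], p[1] - goal[1])
    open_set = [(h(start), 0.0, start, None)]
    came_from, g_best = {}, {start: 0.0}
    while open_set:
        _, g, node, parent = heapq.heappop(open_set)
        if node in came_from:
            continue                      # already expanded with a better cost
        came_from[node] = parent
        if node == goal:
            path = []
            while node is not None:       # walk back through the parent links
                path.append(node)
                node = came_from[node]
            return path[::-1]
        r, c = node
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nxt = (r + dr, c + dc)
            if 0 <= nxt[0] < len(grid) and 0 <= nxt[1] < len(grid[0]) and grid[nxt[0]][nxt[1]] == 0:
                ng = g + 1.0
                if ng < g_best.get(nxt, float("inf")):
                    g_best[nxt] = ng
                    heapq.heappush(open_set, (ng + h(nxt), ng, nxt, node))
    return None

def plan_short_range(grid, start, goal):
    """Short-distance branch: plan directly with A* when the distance is below the threshold."""
    if math.hypot(start[0] - goal[0], start[1] - goal[1]) < DISTANCE_THRESHOLD:
        return astar(grid, start, goal)
    return None  # long-range case is handled by the Dijkstra + A* combination described later
```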
And step 104, otherwise, determining an initial projection point coordinate through the trunk information and the initial coordinate, determining a target projection point coordinate through the trunk information and the target coordinate, and generating a global path through the Dijkstra algorithm and the A* algorithm based on the trunk information, the initial coordinate, the initial projection point coordinate, the target coordinate and the target projection point coordinate, so that the robot moves from the initial coordinate to the target coordinate according to the global path.
In some embodiments, the start (target) projection point coordinate may be the foot of the perpendicular dropped from the start (target) coordinate onto the main road closest to the start (target) coordinate. The Dijkstra algorithm is a typical single-source shortest path algorithm for calculating the shortest paths from one node to all other nodes. Its main characteristic is that it expands outward layer by layer with the starting point as the center until the end point is reached. The Dijkstra algorithm is a very representative shortest path algorithm; it is usually implemented in one of two ways, either with permanent and temporary labels or with OPEN and CLOSE tables. An appropriate algorithm may also be selected according to particular needs.
According to the robot navigation method disclosed by some embodiments of the invention, the main road information of the current floor, the initial coordinate of the current floor and the target coordinate of the current floor are acquired; the distance between the initial coordinate and the target coordinate is determined according to the initial coordinate and the target coordinate; whether the distance meets a first preset condition is judged, wherein the first preset condition is used for determining the method for generating the global path, and if so, the global path is generated through the A* algorithm based on the initial coordinate and the target coordinate; otherwise, the initial projection point coordinate is determined through the trunk information and the initial coordinate, the target projection point coordinate is determined through the trunk information and the target coordinate, and the global path is generated through the Dijkstra algorithm and the A* algorithm based on the trunk information, the initial coordinate, the initial projection point coordinate, the target coordinate and the target projection point coordinate, so that the robot moves from the initial coordinate to the target coordinate according to the global path. It can be seen that the method realizes autonomous path planning of the robot and uninterrupted transportation; generating the global path through the Dijkstra algorithm and the A* algorithm improves the generation efficiency of the global path, so that the robot can obtain the optimal path in a complex building environment and has good trafficability in the actual walking process; and when applied to transportation, the method can free the manpower otherwise spent on article transportation.
Referring to fig. 2, fig. 2 is a flowchart illustrating a robot navigation method according to another embodiment of the present invention. As shown in fig. 2, the method comprises the steps of:
step 201, inquiring the main road information corresponding to the current floor, and if the main road information exists, acquiring the main road information of the current floor.
In some embodiments, the robot may access a designated address via a network to query the main road information, and may also query a local database to obtain the main road information. Before querying whether the main road information is available, the robot can automatically check itself for faults, so that problems are found promptly and safe operation is ensured.
And step 202, otherwise, acquiring at least one piece of local map information through a laser radar.
In some embodiments, the method for obtaining the local map information is not limited by the present invention. The local map information represents a portion of a building environment global map on a current floor. According to the actual detection range of the robot, the size of the local map information can be adjusted. If the detection range of the robot can cover the building environment global of the current floor, the global map information can be directly obtained through the laser radar.
And 203, splicing at least one piece of local map information through an SIFT image matching algorithm to obtain an overall map.
In some optional implementation manners, the splicing at least one piece of local map information by using a SIFT image matching algorithm to obtain an overall map includes: acquiring a spliced map through human-computer interaction, wherein the spliced map is obtained by splicing at least one piece of local map information by a user; and obtaining the whole map by an SIFT image matching algorithm based on the spliced map. The map construction and main road generation mode based on the man-machine interaction mode improves the path generation rate.
The SIFT (scale-invariant feature transform) matching algorithm is a computer vision algorithm used to detect and describe local features in an image: it searches for extreme points across spatial scales and extracts their position, scale and rotation invariants. The description and detection of local image features help to identify objects; SIFT features are based on local appearance interest points on the object and are independent of the size and rotation of the image. Their tolerance to illumination changes, noise and slight changes of viewing angle is also quite high. Because of these characteristics, they are highly distinctive and relatively easy to retrieve, so objects are easily identified and rarely misidentified even in large feature databases. The detection rate under partial occlusion of the object is also quite high when using SIFT features. With current computer hardware and a small feature database, the recognition speed can approach real time. SIFT features carry a large amount of information and are suitable for fast and accurate matching in massive databases. The method for splicing the at least one piece of local map information may also be selected as appropriate, which is not limited by the present invention. As an example, local map information may also be spliced together manually to obtain the whole map, or the whole map may be obtained directly through the SIFT image matching algorithm.
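A minimal stitching sketch based on SIFT matching is shown below, assuming OpenCV (cv2.SIFT_create is available in opencv-python 4.4 and later). It only illustrates the matching step described above: detect SIFT keypoints in two local maps, match them with a ratio test, estimate a homography and paste the second map onto the first. The file names and thresholds are assumptions, and the naive overlay at the end is purely illustrative.

```python
import cv2
import numpy as np

def stitch_pair(map_a: np.ndarray, map_b: np.ndarray) -> np.ndarray:
    """Warp map_b into the frame of map_a using SIFT feature matching."""
    sift = cv2.SIFT_create()
    kp_a, des_a = sift.detectAndCompute(map_a, None)
    kp_b, des_b = sift.detectAndCompute(map_b, None)

    matcher = cv2.BFMatcher()
    matches = matcher.knnMatch(des_b, des_a, k=2)
    # Lowe ratio test (0.75 is an assumed threshold).
    good = [m[0] for m in matches if len(m) == 2 and m[0].distance < 0.75 * m[1].distance]

    src = np.float32([kp_b[m.queryIdx].pt for m in good]).reshape(-1, 1, 2)
    dst = np.float32([kp_a[m.trainIdx].pt for m in good]).reshape(-1, 1, 2)
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)

    h, w = map_a.shape[:2]
    warped = cv2.warpPerspective(map_b, H, (w, h))
    # Naive overlay for illustration only; a real merge would resolve
    # free / occupied / unknown cells explicitly.
    return np.maximum(map_a, warped)

# Usage (hypothetical file names for two saved sub-maps):
# map_a = cv2.imread("submap_001.pgm", cv2.IMREAD_GRAYSCALE)
# map_b = cv2.imread("submap_002.pgm", cv2.IMREAD_GRAYSCALE)
# overall = stitch_pair(map_a, map_b)
```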
As an example, the map stitching mode based on human-computer interaction may refer to the following steps:
1. For floors with a large spatial range, the map required for navigation is large, and in a building environment the corridors are generally straight and have few feature points, so accumulated deviation can occur when a single-line laser radar builds a large map in one pass. Therefore, in the map creating mode, each local environment in the overall environment can be mapped separately (i.e., at least one piece of local map information is created); each created local map is saved in a subfolder of the program, and the subfolder names are sorted by time information. A constructed sub-map is shown in fig. 3-1.
2. After the splicing is started, the program searches for the map splicing under the two folders with the earliest time, as shown in fig. 3-2, the map shown by the black and white part is the first sub-map, the part outlined by the white line is the second sub-map, and the second sub-map can be manually translated and rotated to enable the common parts of the second sub-map and the first sub-map to be superposed; feature point matching can also be carried out on the two maps by adopting an SIFT image matching algorithm, and the two maps are connected after the same feature points in the two pictures are matched; or the second sub-map can be translated and rotated manually, and then the two maps are connected together by adopting an SIFT image matching algorithm. The resulting stitched map is shown in fig. 3-3.
3. For the case of multiple sub-maps, after the start of stitching, the sub-maps may be sequentially stitched according to the time sequence. The situation of the spliced whole map is shown in fig. 3-4. After the sub-maps are spliced, the sub-maps can be stored under the folders of the corresponding floors. The robot can also perform feature point matching with the spliced whole map according to the floor information and the local map information, find the current position of the robot in the map, and perform initial positioning (i.e., obtain the initial coordinates).
And 204, receiving at least one endpoint coordinate set, and generating corresponding trunk road information according to the at least one endpoint coordinate set and the whole map, wherein the endpoint coordinate set comprises two endpoint coordinates.
In some embodiments, a connection line of two endpoint coordinates may be determined as a main road by receiving the two endpoint coordinates, and obstacle detection may also be performed at the same time, where if there is no obstacle in the connection line of the two endpoint coordinates, the connection line is determined as a main road, and if there is an obstacle, the connection line of the two endpoint coordinates is excluded. The plurality of endpoint coordinate sets may determine a corresponding plurality of arterial roads, thereby obtaining arterial road information. As an example, the main road information may be identified by an image processing technique based on the whole map. The method for obtaining the information of the main road is not limited by the invention.
As an example, the method for generating the main road based on the human-computer interaction mode may refer to the following:
step (1), loading a map generated by human-computer interaction (namely acquiring an integral map), setting the length L of the medicine delivery robot, the width W of the medicine delivery robot and a safety threshold value
Figure 302204DEST_PATH_IMAGE001
Step (2), clicking two points (namely two end point coordinates) of the map, which need to generate the main road information, and generating a straight line taking the clicked two points as end points in the map
Figure 835953DEST_PATH_IMAGE002
And obtaining a main road path.
And (3) generating two end point coordinates of the main road route in a map coordinate system
Figure 421656DEST_PATH_IMAGE003
And
Figure 149702DEST_PATH_IMAGE004
length of path
Figure 453645DEST_PATH_IMAGE002
And angle
Figure 525506DEST_PATH_IMAGE005
And (4) forming.
And (4) performing collision safety detection on the generated path range of the main trunk road, expanding the outer contour of the medicine-feeding robot, and setting the side length of a circumscribed square as the detection range by using the circumscribed square of a rectangular circumscribed circle of the medicine-feeding robot
Figure 965715DEST_PATH_IMAGE006
Then, then
Figure 861733DEST_PATH_IMAGE007
The detection area range of the main road thus generated is as long as
Figure 387392DEST_PATH_IMAGE008
Wide is
Figure 528524DEST_PATH_IMAGE009
And (3) judging whether an obstacle point exists in the area, if no obstacle point exists in the rectangular range, storing the main road path, and if an obstacle point exists in the range, clearing the selected two points of information and prompting to reselect the main road information (or the endpoint coordinate set).
And (5) repeating the steps (2), (3) and (4) at the position of the main road needing to be generated on the map (namely, receiving at least one endpoint coordinate set, and generating corresponding main road information according to the at least one endpoint coordinate set and the whole map). And after the selection of the required main trunk is finished, storing the selected main trunk information, and storing the generated main trunk information into a subfolder in a txt text form (the storage mode and the method are not limited by the invention) for the subsequent generation of the global path call. The trunk information may refer to line segments as shown in fig. 3-5.
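As a rough illustration of the collision safety detection in step (4), the sketch below sweeps a square detection window along the candidate main road in an occupancy grid. The grid encoding (occupied == 1), the window side-length formula and the sampling step are assumptions, since the original inline formulas are not reproduced here.

```python
import math
import numpy as np

def corridor_is_clear(grid: np.ndarray, p1, p2,
                      robot_length: float, robot_width: float,
                      safety: float, resolution: float) -> bool:
    """grid: occupancy grid (1 = obstacle); p1, p2: main-road endpoints in cell coords."""
    # Side of the circumscribed square of the footprint's circumscribed circle,
    # enlarged by the safety threshold (assumed formula), converted to cells.
    side = (math.hypot(robot_length, robot_width) + 2 * safety) / resolution
    half = int(math.ceil(side / 2))

    length = math.hypot(p2[0] - p1[0], p2[1] - p1[1])
    steps = max(1, int(length))                 # sample roughly every cell along the line
    for i in range(steps + 1):
        t = i / steps
        r = int(round(p1[0] + t * (p2[0] - p1[0])))
        c = int(round(p1[1] + t * (p2[1] - p1[1])))
        window = grid[max(0, r - half): r + half + 1,
                      max(0, c - half): c + half + 1]
        if np.any(window == 1):
            return False                        # obstacle point inside the swept corridor
    return True

# If corridor_is_clear(...) is False, the two selected points are cleared and the
# operator is prompted to pick the main road endpoints again, as in step (4).
```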
Step 205, obtaining the starting coordinate of the current floor and the target coordinate of the current floor.
And step 206, determining the distance between the starting coordinate and the target coordinate according to the starting coordinate and the target coordinate.
In some embodiments, specific implementations of step 205 and step 206 and technical effects brought by the same may refer to step 101 and step 102 in the embodiment corresponding to fig. 1, and are not described herein again.
And step 207, judging whether the distance meets a first preset condition, wherein the first preset condition is used for determining a method for generating the global path, and if so, generating the global path through an A-x algorithm based on the initial coordinate and the target coordinate.
In some embodiments, the detection may refer to the method in step (4) of step 204: collision detection may be performed on the generated path, and if a collision is found, the current global path plan is discarded and measures such as a voice prompt, an error display or a prompt to re-plan are taken; if no collision is found, the global path is generated.
And step 208, otherwise, determining an initial projection point coordinate through the trunk information and the initial coordinate, determining a target projection point coordinate through the trunk information and the target coordinate, and generating a global path through the Dijkstra algorithm and the A* algorithm based on the trunk information, the initial coordinate, the initial projection point coordinate, the target coordinate and the target projection point coordinate, so that the robot moves from the initial coordinate to the target coordinate according to the global path.
In some embodiments, the global optimal path can be generated by using the a-algorithm and the Dijkstra algorithm, and the speed of finding the global path can be increased. On the other hand, the generated global path aims at good trafficability of the robot in a specific building environment.
In some alternative implementations, moving the robot from the start coordinate to the target coordinate according to the global path includes: acquiring a current coordinate, wherein the current coordinate comprises the start coordinate, and acquiring a corresponding look-ahead point, wherein the look-ahead point is selected on the global path according to a second preset condition; acquiring a projection point coordinate, wherein the projection point coordinate is the projection of the current coordinate on the global path, and determining a corresponding distance and a corresponding deviation angle according to the projection point coordinate and the current coordinate; determining a control coefficient according to the distance and the deviation angle, acquiring a preset maximum turning differential speed, and determining the left wheel speed and the right wheel speed according to the control coefficient and the maximum turning differential speed; and moving to the look-ahead point according to the left wheel speed and the right wheel speed, wherein the look-ahead point comprises the target coordinate. That is, based on a fuzzy control algorithm, the robot is moved from the start coordinate to the target coordinate according to the global path, so that the robot has good trafficability in the actual walking process.
As an example, in a robot drug delivery scenario, based on a fuzzy control algorithm, moving the robot from the start coordinates to the target coordinates according to a global path may refer to the following steps:
Step (11): acquire the actual position of the medicine delivery robot (or robot) in the map in real time through the laser radar (namely, acquire the current coordinate). The target path of the medicine delivery robot may be the global path planned by the path planning module. Calculate the coordinate of the projection point of the current position of the medicine delivery robot on the target path (namely, acquire the projection point coordinate), and calculate the position of the look-ahead point on the target path from the position of the projection point; the look-ahead point setting is shown in fig. 3-6.
Step (12): calculate the deviation of the medicine delivery robot from the target path, namely the distance between the medicine delivery robot and the target path and the angle by which the medicine delivery robot deviates from the target path (namely, determine the corresponding distance and deviation angle according to the projection point coordinate and the current coordinate), as shown in fig. 3-7.
Step (13): as shown in fig. 3-8, for a wheeled robot, sending different speed values to the left and right wheels of the medicine delivery robot controls its heading, and the difference between the left wheel speed and the right wheel speed determines the heading: when the left wheel is faster the medicine delivery robot turns to the right, and when the right wheel is faster it turns to the left, while the magnitude of the difference determines the turning radius and turning speed of the robot. The heading control of the robot involves a preset maximum turning differential speed and a control coefficient mu (namely, the left wheel speed and the right wheel speed are determined according to the control coefficient and the maximum turning differential speed).
Step (14): send the speed values to the left and right wheels of the robot, so that the robot continuously moves from its current position towards the look-ahead point on the path ahead of it. Based on the fuzzy control motion algorithm, the distance deviation and the angle deviation calculated from the current position of the medicine delivery robot are assigned corresponding control values according to the ranges in which they fall, the corresponding left and right wheel speeds are calculated from these values, and the medicine delivery robot is commanded to move towards the look-ahead point at the obtained speeds; this process (the robot moves, the look-ahead point is continuously updated and approaches the last point of the global path) is repeated until the last point of the target path is reached.
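The following Python sketch illustrates one possible form of this control loop. The piecewise mapping stands in for the fuzzy rules (the actual membership functions and ranges are not given in the text), and the base speed, maximum turning differential speed, look-ahead distance and sign conventions are assumptions.

```python
import math

BASE_SPEED = 0.5          # m/s, assumed cruise speed
MAX_TURN_DIFF = 0.3       # preset maximum turning differential speed, m/s (assumed)
LOOKAHEAD = 0.8           # m, assumed look-ahead distance along the global path

def control_coefficient(dist_dev: float, angle_dev: float) -> float:
    """Map distance/angle deviation ranges to a coefficient in [-1, 1] (fuzzy-rule stand-in)."""
    score = 0.6 * max(-1.0, min(1.0, dist_dev / 0.5)) \
          + 0.4 * max(-1.0, min(1.0, angle_dev / math.radians(45)))
    return max(-1.0, min(1.0, score))

def wheel_speeds(dist_dev: float, angle_dev: float):
    """Positive deviation (robot left of the path) speeds up the left wheel -> turn right."""
    mu = control_coefficient(dist_dev, angle_dev)
    diff = mu * MAX_TURN_DIFF
    return BASE_SPEED + diff / 2, BASE_SPEED - diff / 2   # (left, right)

def follow_path(get_pose, project_onto_path, pick_lookahead_point, send_speeds, at_goal):
    """One possible control loop; the callbacks are placeholders for the robot's own modules."""
    while not at_goal():
        x, y, heading = get_pose()                       # from lidar localisation
        proj, path_heading = project_onto_path(x, y)     # projection point on the global path
        dx, dy = x - proj[0], y - proj[1]
        # Signed lateral deviation: positive when the robot is left of the path direction.
        dist_dev = dy * math.cos(path_heading) - dx * math.sin(path_heading)
        # Heading deviation normalised to [-pi, pi).
        angle_dev = (heading - path_heading + math.pi) % (2 * math.pi) - math.pi
        left, right = wheel_speeds(dist_dev, angle_dev)
        target = pick_lookahead_point(proj, LOOKAHEAD)   # look-ahead point further along the path
        send_speeds(left, right, target)
```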
In some optional implementations, generating the global path through the Dijkstra algorithm and the A* algorithm based on the main road information, the start coordinate, the start projection point coordinate, the target coordinate and the target projection point coordinate includes: determining first path information through the A* algorithm based on the start coordinate and the start projection point coordinate; determining second path information through the A* algorithm based on the target coordinate and the target projection point coordinate; determining third path information through the Dijkstra algorithm based on the main road information, the start projection point coordinate and the target projection point coordinate; and splicing the first path information, the second path information and the third path information to obtain the global path.
As an example, determining the first path information through the A* algorithm based on the start coordinate and the start projection point coordinate may proceed as follows: the start coordinate of the robot is used as the planning start coordinate of the A* algorithm, the coordinate of the projection point on the nearest main road reachable from the start coordinate (i.e., the start projection point coordinate) is found and used as the planning target coordinate of the A* algorithm, and the first path information is searched for between the planning start coordinate and the start projection point coordinate based on the A* algorithm. If the path cannot be planned (for example, a collision is found in the first path information using the collision detection method of step (4) in step 204), the global path planning is terminated; if the path can be planned, the returned first path information can be stored in array a.
As an example, determining the second path information through the A* algorithm based on the target coordinate and the target projection point coordinate may proceed as follows: the target coordinate of the robot is used as the planning start coordinate of the A* algorithm, the coordinate of the projection point on the nearest main road reachable from the target coordinate (i.e., the target projection point coordinate) is found and used as the planning target coordinate of the A* algorithm, and a path is searched for between the target coordinate and the target projection point coordinate based on the A* algorithm. If the path cannot be planned (for example, a collision is found in the second path information using the collision detection method of step (4) in step 204), the path planning is terminated; if the path can be planned, the returned second path information can be stored in array b.
As an example, determining the third path information through the Dijkstra algorithm based on the arterial road information, the start projection point coordinate and the target projection point coordinate may proceed as follows: according to the start projection point coordinate and the target projection point coordinate of the robot, the Dijkstra algorithm searches the nodes of the arterial road information to obtain the path information of the shortest main road route connecting the two projection points, and the obtained path information (i.e., the third path information) may be stored in array c.
As an example, the first path information, the second path information and the third path information are spliced to obtain the global path: the path information in array c may be appended after the path information in array a, then the path information in array b is appended to array a, and the path information finally contained in array a is the global path; the global path may refer to fig. 3-9 (the dotted line portion is the global path). The steps of generating the global path may refer to fig. 3-10.
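A compact sketch of this long-range branch is given below: A* connects the start and target coordinates to their projection points, Dijkstra finds the shortest route through the main road nodes, and the three segments are concatenated in the order a, c, b. The graph representation and the astar helper (from the grid A* sketch earlier) are assumptions.

```python
import heapq

def dijkstra(graph, source, target):
    """graph: {node: [(neighbor, cost), ...]}; returns the node list from source to target."""
    dist, prev = {source: 0.0}, {}
    pq = [(0.0, source)]
    while pq:
        d, u = heapq.heappop(pq)
        if u == target:
            break
        if d > dist.get(u, float("inf")):
            continue
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v], prev[v] = nd, u
                heapq.heappush(pq, (nd, v))
    if target not in dist:
        return None
    path, node = [], target
    while node != source:
        path.append(node)
        node = prev[node]
    path.append(source)
    return path[::-1]

def plan_long_range(grid, trunk_graph, start, start_proj, goal, goal_proj):
    """Array a (start -> projection), array c (main road route), array b (projection -> target)."""
    a = astar(grid, start, start_proj)
    b = astar(grid, goal_proj, goal)
    if a is None or b is None:
        return None                                      # terminate global planning on failure
    c = dijkstra(trunk_graph, start_proj, goal_proj)     # shortest route over main road nodes
    if c is None:
        return None
    # Junction points are duplicated at the seams; a real implementation would deduplicate.
    return a + c + b
```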
In some optional implementations, the robot moves from the start coordinate to the target coordinate according to the global path as follows: in the process of moving from the start coordinate to the target coordinate according to the global path, if the robot encounters an obstacle, a local path is generated, and after the robot bypasses the obstacle according to the local path, it continues to move from its current position to the target coordinate according to the global path. Generating the local path comprises: acquiring a local target coordinate, wherein the local target coordinate is selected on the global path according to a preset condition used for determining the local target coordinate; and generating the local path according to the coordinate of the current position and the local target coordinate based on an artificial potential field algorithm.
As an example, when a drug delivery robot (or called robot) walks on a planned global path, if an obstacle needs to be avoided, a local path planning needs to be performed again, and the local path planning method is as follows:
step one, after the medicine delivery robot stops, taking the current point position as a start point coordinate, and taking a point on the global path at a certain distance in the forward direction of the point position as a target point.
And step two, obtaining a local path by adopting an artificial potential field method according to a current frame map of the current laser radar.
Step three, the medicine delivery robot moves according to the obtained local path, and if it is stopped by an obstacle during the movement, step one is repeated until the target point position on the global path is reached;
and step four, after the target point position on the global path is reached, the global path is continuously taken as a navigation path to move until the target coordinate of the current floor is reached.
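A minimal artificial potential field sketch for this local re-planning is given below: an attractive force pulls towards the local target point chosen on the global path, repulsive forces push away from the obstacle points of the current laser radar frame, and the robot steps along the resulting force direction. The gain values, influence radius and step size are assumptions.

```python
import math

K_ATT, K_REP = 1.0, 0.5
INFLUENCE_RADIUS = 1.0   # m, obstacles further away exert no repulsion (assumed)
STEP = 0.05              # m, path discretisation step (assumed)

def potential_field_path(start, goal, obstacles, max_steps=2000):
    """start/goal: (x, y); obstacles: list of (x, y) points from the current lidar frame."""
    path, pos = [start], list(start)
    for _ in range(max_steps):
        dx, dy = goal[0] - pos[0], goal[1] - pos[1]
        d_goal = math.hypot(dx, dy)
        if d_goal < STEP:
            path.append(goal)
            return path
        # Attractive force towards the local target point on the global path.
        fx, fy = K_ATT * dx / d_goal, K_ATT * dy / d_goal
        # Repulsive forces from nearby obstacle points.
        for ox, oy in obstacles:
            d = math.hypot(pos[0] - ox, pos[1] - oy)
            if 1e-6 < d < INFLUENCE_RADIUS:
                rep = K_REP * (1.0 / d - 1.0 / INFLUENCE_RADIUS) / (d * d)
                fx += rep * (pos[0] - ox) / d
                fy += rep * (pos[1] - oy) / d
        norm = math.hypot(fx, fy)
        if norm < 1e-6:
            break                      # local minimum; a real planner would escape or re-plan
        pos[0] += STEP * fx / norm
        pos[1] += STEP * fy / norm
        path.append(tuple(pos))
    return path
```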
In some optional implementations, the robot includes a camera and at least one box layer, and the at least one box layer corresponds to at least one piece of box layer information; and after the robot moves from the start coordinate to the target coordinate according to the global path, the method further comprises: scanning an identification code through the camera, wherein the identification code comprises user information, box layer information and time information, and the identification code is used for opening the box layer corresponding to the box layer information. In consideration of user information security and prevention of abnormal use, a mobile phone APP registration function is provided: a specific two-dimensional code is generated from the registration information, the code scanning time and the code of the medicine box being operated, and after the robot body end recognizes the two-dimensional code, the corresponding medicine box is opened only when all of this information is correct, which guarantees safe use.
As an example, the identification code may be displayed by a mobile phone app including an iOS end and an android end, if the application scenario of the robot is indoor drug delivery, the box opening process of the drug delivery robot after delivering the drug may refer to fig. 3-11, and the steps are as follows:
and (21) opening the mobile phone APP, connecting a server of a computer side, and logging in user information.
Step (22), the program matches according to the stored user information and the user information transmitted by the mobile phone end, and if the matching is successful, the next medicine box opening instruction can be accepted; otherwise the feedback does not have this account information.
And (23) after login is successful, selecting a corresponding medicine box layer (the medicine box layer corresponds to the current floor), generating a specific two-dimensional code value, wherein the code value information comprises information of 'user name _ layer number _ timestamp', and aligning the generated two-dimensional code to a code scanning camera of the medicine delivery robot.
And (24) the camera processing part analyzes the recognized two-dimensional code data (namely, scanning an identification code through a camera, wherein the identification code comprises user information, box layer information and time information) to obtain the information of the medicine box layer to be opened.
And (25) the program sends out a corresponding opening instruction according to the obtained information of the medicine box layer to be opened, and the medicine box is opened.
And (26) after the corresponding medicine box layer is opened, selecting an ending instruction at the mobile phone end, generating a two-dimensional code of the ending instruction, and aligning the generated two-dimensional code to a code scanning camera of the medicine delivery robot.
And (27) ending the medicine box opening process.
And after the medicine delivery robot reaches the target coordinates of the current floor, selecting the corresponding medicine box layer (the medicine box layer corresponds to the current floor), opening the medicine box layer corresponding to the current floor by referring to the steps, and finishing medicine taking.
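A small sketch of how the robot side might validate the scanned code value, whose payload is "user name _ layer number _ timestamp" as described above, is shown below. The separator handling, timestamp format, expiry window and the registered-user table are assumptions.

```python
import time

MAX_CODE_AGE_S = 120                        # assumed validity window for a scanned code
registered_users = {"nurse_zhang": {1, 2}}  # hypothetical user -> allowed medicine box layers

def handle_scanned_code(payload: str):
    """Return the medicine box layer to open, or raise ValueError if any field is wrong."""
    try:
        # rsplit keeps user names containing underscores intact.
        user, layer_str, ts_str = payload.rsplit("_", 2)
        layer, ts = int(layer_str), float(ts_str)
    except ValueError:
        raise ValueError("malformed code value")
    if user not in registered_users:
        raise ValueError("no such account")              # step (22): user match fails
    if layer not in registered_users[user]:
        raise ValueError("layer not allowed for this user")
    if time.time() - ts > MAX_CODE_AGE_S:
        raise ValueError("code expired")
    return layer                                          # step (25): send the open command

# Example: a code generated by the phone app a moment ago for layer 2.
print(handle_scanned_code(f"nurse_zhang_2_{time.time()}"))
```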
As can be seen from fig. 2, the robot obtains the overall map information from the local map information and then obtains the main road information; collision detection is carried out when the main road information is generated, which helps to improve the trafficability of the robot.
As an example, still taking the drug delivery robot as an example, the drug delivery robot may include a vehicle body, a left hub motor, a right hub motor, an ultrasonic obstacle avoidance detector, a drug storage box, a laser radar, a microcomputer, a WINCE industrial tablet computer, a camera, a driver, an odometer, a wireless communication module, a charging unit and a battery. The left hub motor and the right hub motor are driven by the driver and are mounted at the bottom of the vehicle body; the driver is connected to the microcomputer through the CAN bus and is directly controlled by the microcomputer; the odometer is connected with the driver; the ultrasonic obstacle avoidance detector is arranged around the vehicle body, with seven detection units in total; the laser radar is located between the chassis and the medicine chest and is connected with the microcomputer through a processing unit; the WINCE industrial tablet computer is connected with the microcomputer through a 232 serial port; and the wireless communication module is attached to the industrial tablet computer.
The medicine delivery robot can adopt the principle of an AGV (automated guided vehicle): the laser radar module is responsible for building the map information and detecting the running direction and position of the medicine delivery robot, the odometer is used for calculating the running speed of the medicine delivery robot, and this information is transmitted to the microcomputer to calculate the position and posture of the medicine delivery robot in the map; the ultrasonic sensors are responsible for emergency obstacle avoidance while the medicine delivery robot is running. After the map is built and the robot is put into delivery operation, the medicine delivery robot runs to the medicine receiving position, the medical staff obtains a real-time two-dimensional code through the APP on a mobile phone and opens the medicine box by scanning the two-dimensional code with the code scanning camera, then selects the destination department on the WINCE industrial tablet computer on the vehicle, and the medicine delivery robot automatically runs to the designated department on the designated floor. After arriving at the designated department, a voice prompt reminds the medical staff to take the medicine, and after the medicine is taken away, the robot automatically runs back to the pharmacy to wait for the next delivery.
In addition, each wheel can use a brushless gear motor matched with an eight-inch tire, which is light and easy to control. The speeds of the left wheel and the right wheel are controlled separately by the driver, which receives the motion signals sent by the main control microcomputer according to the specific running condition of the medicine delivery robot and adopts a PID algorithm so that the walking and steering of the medicine delivery robot are stable.
The ultrasonic sensors of the obstacle avoidance detector have an effective detection distance of 5 m, which ensures a sufficient braking distance. The seven ultrasonic sensors around the body (their positions on the medicine delivery robot are shown in fig. 3-12) ensure that obstacles around the vehicle body can be detected rapidly while the medicine delivery robot is driving fast, and the sensor data are sent to the control panel through a serial port.
The laser radar adopted by the robot can be a single-line laser radar with an effective detection distance of 16 meters, a mapping positioning error of 5 cm and a sampling frequency of 16000 samples per second; the laser radar information is sent to the microcomputer for processing through the network port, which ensures that the medicine delivery robot can clearly acquire its own position information in the building environment.
The robot can be provided with two code scanning cameras connected to the microcomputer through USB serial ports. The first code scanning camera is used for scanning the code that opens the medicine box, protecting the privacy and safety of the medicines; the second code scanning camera is used for accurately positioning the robot at the elevator entrance and the charging position according to two-dimensional codes, ensuring that the posture in which the medicine delivery robot enters the elevator and the posture in which it charges are ideal.
The robot can adopt a WINCE tablet computer to perform data interaction with the microcomputer through a 232 serial port. The WINCE tablet computer is responsible for human-computer interaction and data transfer, sending the human-computer interaction data and the data transmitted by the control board to the microcomputer, which processes and analyzes the data.
The robot can adopt a wireless communication module connected with the WINCE tablet computer through a serial port; it is responsible for interaction with external communication parties, including the wireless communication module at the elevator end, the remote server, and the mobile phone end.
The speed encoder can be a 1000-line encoder that outputs 1000 pulses per revolution. It is connected with the output shaft of the motor wheel and used for detecting the linear speed and the movement direction of the medicine delivery robot, and the obtained data are sent to the microcomputer through the driver.
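As an illustration only, the conversion from encoder pulses to wheel linear speed could look like the sketch below; the wheel diameter, reduction ratio and sampling period are assumed values, not taken from the patent.

```python
import math

# Illustrative conversion from 1000-line encoder pulses to linear speed.
PULSES_PER_REV = 1000
WHEEL_DIAMETER_M = 0.2032   # assumed value for an 8-inch tire
REDUCTION_RATIO = 1.0       # assumed; use the actual gear ratio if any
SAMPLE_PERIOD_S = 0.05      # assumed sampling period

def linear_speed(pulse_delta: int) -> float:
    """Linear speed in m/s from the pulses accumulated in one sample period."""
    revs = pulse_delta / (PULSES_PER_REV * REDUCTION_RATIO)
    return revs * math.pi * WHEEL_DIAMETER_M / SAMPLE_PERIOD_S

# e.g. 125 pulses in 50 ms on an 8-inch wheel -> roughly 1.6 m/s
print(round(linear_speed(125), 2))
```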
The medicine storage box can be set up with four layers of drawers, each with its own electronic lock. When an instruction to open a designated layer is received, the corresponding electronic lock is opened and the drawer pops out, ensuring the security and privacy of the medicines.
The charging unit is mainly responsible for the workflow during charging, including detecting the remaining battery level, the on/off state during charging, the state of charge during charging, and the like.
The control board is connected with the WINCE tablet computer through a serial port and is responsible for processing the data of the ultrasonic detectors, controlling the on/off state of the charging unit, controlling the drawer layers of the medicine storage box, state queries, and the like.
The voice module is connected with the microcomputer and is responsible for voice prompts during traveling, charging, entering and exiting the elevator, and other processes.
The microcomputer is an Intel industrial computer; it exchanges control information and data information with the WINCE tablet computer, processes data such as the motor states from the driver, processes the data of the laser radar, the camera and the ultrasonic sensors, and is responsible for the whole workflow of the medicine delivery robot. The hardware architecture of the medicine delivery robot can be seen in fig. 3-13.
The software architecture of the drug delivery robot may be found in fig. 3-14.
As shown in fig. 3-14, the software architecture of the drug delivery robot may include an ultrasonic data processing portion, a lidar data processing portion, a speed feedback portion, a communication control portion, a map processing portion, a main road generation portion, a path planning portion, a state logic control portion, a motion control portion, and a data storage portion.
The ultrasonic data processing part filters the data fed back by the seven ultrasonic sensors around the medicine delivery robot and removes abnormal data segments from each frame of ultrasonic group data, ensuring the accuracy of the data and assisting obstacle avoidance.
The lidar data processing part applies a Kalman filtering algorithm to the positioning information uploaded by the radar module sensor, eliminating abnormal values from the obtained data and extracting the useful positioning information.
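Purely as an illustration (the patent does not fix the filter model), a scalar Kalman filter with a simple innovation gate on one coordinate of the lidar position stream might look like the following; all noise parameters and the gate width are assumptions.

```python
# Minimal sketch: constant-position scalar Kalman filter with an innovation gate
# that treats wildly inconsistent lidar readings as outliers and skips them.

class ScalarKalman:
    def __init__(self, x0=0.0, p0=1.0, q=0.01, r=0.05, gate=3.0):
        self.x, self.p = x0, p0      # state estimate and its variance
        self.q, self.r = q, r        # process and measurement noise variances
        self.gate = gate             # outlier gate, in standard deviations

    def update(self, z):
        p_pred = self.p + self.q                       # predict
        innov = z - self.x
        if abs(innov) > self.gate * (p_pred + self.r) ** 0.5:
            self.p = p_pred
            return self.x                              # drop the outlier
        k = p_pred / (p_pred + self.r)                 # update
        self.x = self.x + k * innov
        self.p = (1.0 - k) * p_pred
        return self.x

kf_x = ScalarKalman()
for z in [0.00, 0.02, 0.05, 9.99, 0.08]:   # 9.99 simulates an abnormal value
    print(round(kf_x.update(z), 3))
```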
The speed feedback part is used for calculating the linear speed, the angular speed, the angle and the direction of the motion of the medicine delivery robot through data fed back by the encoder.
The communication control part interacts with the ultrasonic data processing part, the lidar data processing part, the speed feedback part, the data storage part, the map processing part and the state logic control part respectively; it is used for checking or correcting the communication state and the working mode, detecting and controlling the laser radar, and reading, processing and sending the motion information, control information, positioning information and map information.
The map processing part is used for processing the map data transmitted by the communication control module. Because the radar is a single-line lidar, deviations can occur when mapping over a large area, so the map processing module integrates multiple local small-range maps into large-area map data using a processing mode based on human-computer interaction.
The main road generating part is used for selecting, in a human-computer interaction mode, main roads such as wide regions and corridors in the map according to the map information obtained by the map processing part, and taking these main roads as the road information that is preferentially selected in global path planning.
The path planning part generates global path information and real-time local path information on the basis of the main roads generated by the main road module, using the target position selected through human-computer interaction and the map information provided by the laser radar.
The motion control part judges the error by which the medicine delivery robot deviates from the set route during navigation, based on the linear speed and angle fed back by the speed feedback part and the positioning information fed back by the lidar data processing part, so as to adjust the speed difference between the wheels and keep the medicine delivery robot moving on the set route.
The state logic control part is respectively in communication connection with the communication control part, the path planning part and the motion control part, and is used for managing the working mode of the system program and commanding the communication control part, the path planning part and the motion control part to enter the specified working mode.
In the initialization mode, the workflow of state initialization and equipment self-checking of the medicine delivery robot is executed. At the initial power-on stage, the state logic control part reads the parameters of the communication control module, initializes the main control equipment and peripheral equipment of the medicine delivery robot according to a certain time sequence, and sets the initial state parameters, equipment running modes and the like. After initialization is completed, a one-by-one self-check is carried out on the important modules; whether each module works normally can be checked through communication or IO feedback. In the initialization mode, when an initialization setting failure or a self-check failure of an important module is found, alarm information is sent and entry into the normal navigation mode is refused; the normal navigation mode can be started only after maintenance personnel solve the problem and clear the alarm.
When the system runs in an unknown environment for the first time, the medicine delivery robot can enter a map building mode, in which it builds a map of the space where it operates, calibrates the elevator door and the charging position, and, after processing the built map, generates and stores the trunk road information. In the navigation mode, the normal navigation workflow of the medicine delivery robot is executed. The specific workflow may refer to fig. 3-15.
In some alternative implementations, the robot includes a computer for human-computer interaction and a microcomputer for logic computation, and the interaction protocol between the computer and the microcomputer comprises instruction bytes, data segments, check codes and end marks. For data exchange, different commands are distinguished by different command heads, and a check code and a specific tail are appended to the data, so that different commands can be parsed conveniently and abnormal robot behavior caused by command parsing failures is avoided.
As an example, the key point of the communication data is the serial data interaction between the WINCE tablet computer and the microcomputer, where the serial data interaction protocol is a custom protocol with the format: instruction byte + data segment + check code + 0xff 0xff 0xff 0xff (where 0xff 0xff 0xff 0xff may be the end mark). The data interaction includes various motion control instructions and process control instructions, and map interaction may also be performed. The self-defined exchange data format ensures the safety and stability of the robot.
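For illustration, a sketch of packing and parsing such a frame is given below; the single-byte sum used as the check code is an assumption, since the patent does not specify the checksum algorithm.

```python
# Illustrative framing/parsing of the custom serial protocol described above:
# instruction byte + data segment + check code + 0xFF 0xFF 0xFF 0xFF terminator.
# The check code here is an assumed single-byte sum.

END_MARK = b"\xff\xff\xff\xff"

def build_frame(instruction: int, data: bytes) -> bytes:
    body = bytes([instruction]) + data
    checksum = sum(body) & 0xFF
    return body + bytes([checksum]) + END_MARK

def parse_frame(frame: bytes):
    if not frame.endswith(END_MARK):
        raise ValueError("missing end mark")
    body, checksum = frame[:-5], frame[-5]
    if (sum(body) & 0xFF) != checksum:
        raise ValueError("check code mismatch")
    return body[0], body[1:]      # (instruction byte, data segment)

frame = build_frame(0x01, b"\x10\x20")   # 0x01 is a hypothetical motion command
print(parse_frame(frame))
```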
In some alternative implementations, the robot may include an encoder, left and right wheels, a lidar, an ultrasonic sensor, and a communication connecting line; and before acquiring the information of the main road of the current floor, the initial coordinate of the current floor and the target coordinate of the current floor, the method further comprises the following steps: carrying out fault detection on the encoder, and if the encoder has a fault, stopping the motion and giving an alarm; carrying out fault detection on the laser radar, and if the laser radar has a fault, stopping moving and giving an alarm; carrying out fault detection on the ultrasonic sensor, and if it has a fault, stopping the movement and giving an alarm; carrying out fault detection on the communication connecting line, and if it has a fault, stopping moving and giving an alarm, or giving an alarm after moving to the target coordinate. Performing fault detection on the encoder comprises: subtracting the pulse data of the first moment and the second moment of the left wheel fed back by the encoder to obtain a left wheel difference value, and subtracting the pulse data of the first moment and the second moment of the right wheel fed back by the encoder to obtain a right wheel difference value; determining the maximum difference value corresponding to the first moment and the second moment according to the difference between the first moment and the second moment, the preset diameter of the wheel and the preset maximum linear velocity difference; and, if the maximum difference value is smaller than the absolute value of the difference of the left wheel difference value and the right wheel difference value, the encoder has a fault; and/or acquiring the current left wheel speed and the current right wheel speed fed back by the encoder, and judging whether the current left wheel speed and the current right wheel speed meet a third preset condition, wherein if they do not meet the third preset condition, the encoder has a fault. Performing fault detection on the laser radar includes: acquiring a first current position fed back by the laser radar and a second current position fed back by the encoder, and taking the difference between the first current position and the second current position, wherein if the difference does not meet a fourth preset condition, the laser radar has a fault. Fault detection of the ultrasonic sensor comprises: acquiring a data packet fed back by the ultrasonic sensor, wherein if the data packet does not meet a fifth preset condition, the ultrasonic sensor has a fault. Fault detection of the communication connecting line comprises: acquiring the data corresponding to the communication connecting line, wherein if the acquisition fails, the communication connecting line has a fault. The robot thus has a fault self-checking function, supports automatic fault repair, and improves the safety of equipment use.
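A minimal sketch of the encoder consistency check is shown below; the wheel track, maximum turning angular velocity, reduction ratio, wheel diameter and sampling period are illustrative assumptions, not values from the patent.

```python
import math

# Sketch of the encoder fault check: the maximum physically possible difference
# between consecutive left/right pulse deltas is derived from assumed parameters;
# a larger observed difference is judged to be an encoder fault.

PULSES_PER_REV = 1000
WHEEL_DIAMETER_M = 0.2032      # assumed 8-inch tire
WHEEL_TRACK_M = 0.40           # assumed distance between the two motor wheels
MAX_TURN_RATE = 1.0            # assumed maximum angular velocity, rad/s
REDUCTION_RATIO = 1.0          # assumed
DT = 0.05                      # assumed sampling period, s

def max_pulse_delta_difference() -> float:
    # maximum left/right linear speed difference = wheel track * max angular velocity
    max_speed_diff = WHEEL_TRACK_M * MAX_TURN_RATE
    # convert to pulses accumulated over one sampling period
    pulses_per_meter = PULSES_PER_REV * REDUCTION_RATIO / (math.pi * WHEEL_DIAMETER_M)
    return max_speed_diff * DT * pulses_per_meter

def encoder_fault(left_delta: int, right_delta: int) -> bool:
    """True if the left/right pulse deltas differ more than physically possible."""
    return abs(left_delta - right_delta) > max_pulse_delta_difference()

print(encoder_fault(120, 118))   # plausible pair -> False
print(encoder_fault(120, -500))  # inconsistent pair -> True (encoder judged faulty)
```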
As an example, the robot fault handling part (i.e. robot self-test) specifically includes the following parts:
in order to ensure the stability of the medicine delivery robot, whether the functions of all parts of the robot are abnormal can be monitored in real time: on one hand, whether a problem occurs in a sensor is detected, and on the other hand, whether a communication connecting line between parts of the robot is disconnected or damaged is detected;
the sensors can include the encoder, the laser radar and the ultrasonic sensors; the communication connecting lines mainly include the CAN connecting line between the microcomputer and the driver, the network port connecting line between the microcomputer and the laser radar, the 232 serial port connecting line between the microcomputer and the WINCE tablet, the USB connecting lines between the microcomputer and the camera and voice module, the 232 serial port connecting line between the wireless communication module and the WINCE tablet, and the 485 serial port connecting line between the control board and the WINCE tablet:
1. Detection method for problems of the encoder itself:
the program acquires the left and right wheel encoder data fed back over the CAN bus in real time and calculates the difference between consecutive moments for the left wheel and the difference between consecutive moments for the right wheel;
according to the wheel track between the two motor wheels of the medicine delivery robot and the maximum rotation angular velocity set in the program, the maximum linear velocity difference of the left and right wheels can be calculated, and from the reduction ratio n of the motor wheels and the wheel diameter, the maximum allowed difference between two consecutive frame differences of the left and right wheel encoders can be calculated;
if the absolute value of the difference between the left wheel difference value and the right wheel difference value exceeds this maximum difference, it is judged that the encoder has a fault;
meanwhile, the program compares, in real time, the current linear velocity value commanded for the medicine delivery robot with the velocity value calculated from the encoder, and if the deviation between the encoder value and the target value is outside the threshold range, the encoder is also judged to be faulty.
2. Detection method for problems of the laser radar:
the program parses the data packets uploaded by the laser radar and obtains the position of the medicine delivery robot from them in real time; when the self-positioning position obtained by the laser radar during delivery differs greatly from the position calculated by the encoder, it can be judged that a problem has occurred in the lidar sensor;
3. Detection method for the ultrasonic sensors:
and (3) analyzing the program according to the data packet uploaded by the ultrasonic sensor, judging the data of each ultrasonic sensor, and judging that the corresponding sensor has a problem if the value of one ultrasonic sensor is changed from a normal value to 0 in the running process of the robot, the data of continuous frames is kept to be 0, and the value of the sensor is not changed when the robot continues to move.
4. Detection method for disconnection or damage of a communication connecting line:
and when the program cannot receive the data of the corresponding connecting line, the program tries to reopen the serial port and receive the data, the operation is repeated for five times, and if the data cannot be received or the data is abnormal, the problem at the connecting line is judged.
For different fault problems, the program of the medicine delivery robot gives corresponding feedback:
6. For encoder faults, laser radar faults, faults of the CAN connecting line between the microcomputer and the driver, and faults of the network port connecting line between the microcomputer and the laser radar, the medicine delivery robot remains stationary if it is already still; in the navigation motion state it stops running immediately, prompts the fault information and waits for personnel to resolve the fault.
7. For a fault of the CAN connecting line between the microcomputer and the driver, in any mode the program immediately cuts off the power of the driver and motor parts and waits for personnel to solve the problem;
8. For a fault of the 232 serial port connecting line between the microcomputer and the WINCE tablet computer, or of the 485 serial port connecting line between the control board and the WINCE tablet computer, the medicine delivery robot remains stationary if it is still and prompts the alarm information; during medicine delivery, it waits until the delivery process is finished, then prompts the alarm information and waits for personnel to solve the problem.
9. For a fault of the USB connecting line between the microcomputer and the camera, or of the 232 serial port connecting line between the wireless communication module and the WINCE tablet computer, the medicine delivery robot cannot complete the processes of entering or exiting the elevator and charging; if the current delivery does not require entering or exiting an elevator, or that process has already been completed, the robot stops moving after the delivery is finished, otherwise it stops moving immediately and reports error information.
Referring to fig. 4, fig. 4 is a schematic structural diagram of some embodiments of a robot navigation device according to the present invention, and as an implementation of the methods shown in the above figures, the present invention further provides some embodiments of a robot navigation device, which correspond to the embodiments of the methods shown in fig. 1, and which can be applied to various electronic devices.
As shown in fig. 4, the robotic navigation device 400 of some embodiments includes a first processing module 401, a second processing module 402, a third processing module 403, and a fourth processing module 404: the first processing module 401 is configured to obtain information of a trunk of a current floor, an initial coordinate of the current floor, and a target coordinate of the current floor; a second processing module 402, configured to determine a distance between the start coordinate and the target coordinate according to the start coordinate and the target coordinate; a third processing module 403, configured to determine whether the distance satisfies a first preset condition, where the first preset condition is used to determine a method for generating a global path, and if so, generate the global path through an A* algorithm based on the initial coordinate and the target coordinate; and a fourth processing module 404, configured to determine, if not, an initial projection point coordinate through the trunk information and the initial coordinate, determine a target projection point coordinate through the trunk information and the target coordinate, and generate, based on the trunk information, the initial coordinate, the initial projection point coordinate, the target coordinate, and the target projection point coordinate, a global path through a Dijkstra algorithm and an A* algorithm, so that the robot moves from the initial coordinate to the target coordinate according to the global path.
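For illustration only, the two-branch global path generation could be sketched as follows; the occupancy-grid and trunk-road-graph representations, the helper names, and the projection function are assumptions, since the patent does not fix a concrete data structure.

```python
import heapq, math

def astar(grid, start, goal):
    """Plain A* on a 4-connected occupancy grid (0 = free); returns a list of cells."""
    def h(a, b):
        return abs(a[0] - b[0]) + abs(a[1] - b[1])
    open_set = [(h(start, goal), 0, start, None)]
    came_from, g_cost = {}, {start: 0}
    while open_set:
        _, g, node, parent = heapq.heappop(open_set)
        if node in came_from:
            continue
        came_from[node] = parent
        if node == goal:
            path = []
            while node is not None:
                path.append(node)
                node = came_from[node]
            return path[::-1]
        x, y = node
        for nx, ny in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if 0 <= nx < len(grid) and 0 <= ny < len(grid[0]) and grid[nx][ny] == 0:
                ng = g + 1
                if ng < g_cost.get((nx, ny), math.inf):
                    g_cost[(nx, ny)] = ng
                    heapq.heappush(open_set, (ng + h((nx, ny), goal), ng, (nx, ny), node))
    return []

def dijkstra(graph, src, dst):
    """Shortest node sequence on the trunk-road graph {node: {neighbor: cost}}."""
    dist, prev, pq = {src: 0}, {}, [(0, src)]
    while pq:
        d, u = heapq.heappop(pq)
        if u == dst:
            break
        for v, w in graph.get(u, {}).items():
            if d + w < dist.get(v, math.inf):
                dist[v], prev[v] = d + w, u
                heapq.heappush(pq, (d + w, v))
    path, node = [], dst
    while node != src:
        path.append(node)
        node = prev[node]
    return [src] + path[::-1]

def plan_global_path(grid, trunk_graph, start, target, project, threshold):
    """If start and target are close enough, use A* directly; otherwise splice
    start->projection (A*), projection->projection (Dijkstra on trunk roads)
    and projection->target (A*)."""
    if math.dist(start, target) <= threshold:
        return astar(grid, start, target)
    p_start, p_target = project(start), project(target)
    first = astar(grid, start, p_start)
    third = dijkstra(trunk_graph, p_start, p_target)
    second = astar(grid, p_target, target)
    return first + third + second
```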
In an optional implementation manner of some embodiments, the fourth processing module 404 is further configured to obtain a current coordinate, where the current coordinate includes a start coordinate, and obtain a corresponding look-ahead point, where the look-ahead point is selected on the global path according to a second preset condition, and the look-ahead point includes a target coordinate; acquiring a projection point coordinate, wherein the projection point coordinate is the projection of the current coordinate on the global path, and determining a corresponding distance and a corresponding deviation angle according to the projection point coordinate and the current coordinate; determining a control coefficient according to the distance and the deviation angle, acquiring a preset maximum turning differential speed, and determining the speed of a left wheel and the speed of a right wheel according to the control coefficient and the maximum turning differential speed; and moving to a forward looking point according to the left wheel speed and the right wheel speed, wherein the forward looking point comprises a target coordinate.
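A minimal sketch of one such control cycle is given below; the way the lateral distance and deviation angle are combined into a single control coefficient, and the gains, are assumptions, since the patent only states that a control coefficient and a preset maximum turning differential speed determine the two wheel speeds.

```python
import math

# Sketch of the look-ahead motion step described above (assumed coefficient form).
def motion_step(current, heading, projection, lookahead, base_speed,
                max_turn_diff, k_dist=1.5, k_angle=2.0):
    """Return (left_speed, right_speed) for one control cycle."""
    # lateral distance from the current position to its projection on the path
    lateral = math.dist(current, projection)
    # signed deviation angle between the robot heading and the look-ahead direction
    deviation = math.atan2(lookahead[1] - current[1], lookahead[0] - current[0]) - heading
    deviation = math.atan2(math.sin(deviation), math.cos(deviation))  # wrap to [-pi, pi]
    # control coefficient in [-1, 1]; its sign follows the deviation angle
    coeff = k_angle * deviation + math.copysign(k_dist * min(lateral, 1.0), deviation)
    coeff = max(-1.0, min(1.0, coeff))
    turn = coeff * max_turn_diff / 2.0
    return base_speed - turn, base_speed + turn   # positive coeff -> turn left

# e.g. the path and look-ahead point lie to the left: the right wheel speeds up
print(motion_step(current=(0.0, 0.0), heading=0.0, projection=(0.0, 0.3),
                  lookahead=(1.0, 0.5), base_speed=0.6, max_turn_diff=0.4))
```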
In an alternative implementation of some embodiments, the robot includes a computer for human-computer interaction and a microcomputer for logic computation; and the interaction protocol of the computer and the microcomputer comprises instruction bytes, data segments, check codes and end marks.
In an optional implementation manner of some embodiments, the first processing module 401 is further configured to query the information of the main road corresponding to the current floor, and if the information of the main road corresponding to the current floor exists, obtain the information of the main road of the current floor; otherwise, acquiring at least one local map information through a laser radar; splicing at least one piece of local map information through an SIFT image matching algorithm to obtain an overall map; and receiving at least one endpoint coordinate set, and generating corresponding trunk road information according to the at least one endpoint coordinate set and the whole map, wherein the endpoint coordinate set comprises two endpoint coordinates.
In an optional implementation manner of some embodiments, the first processing module 401 is further configured to obtain a mosaic map through human-computer interaction, where the mosaic map is obtained by stitching at least one piece of local map information by a user; and obtaining the whole map by an SIFT image matching algorithm based on the spliced map.
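Assuming an OpenCV-based implementation (the patent does not name a library), stitching two local map images with SIFT matching and a homography could be sketched as follows; the ratio-test threshold, minimum match count and output canvas size are illustrative assumptions.

```python
import cv2
import numpy as np

def stitch_maps(map_a, map_b, ratio=0.75, min_matches=10):
    """Align map_b onto map_a using SIFT feature matches and overlay them."""
    sift = cv2.SIFT_create()
    kp_a, des_a = sift.detectAndCompute(map_a, None)
    kp_b, des_b = sift.detectAndCompute(map_b, None)

    matcher = cv2.BFMatcher()
    raw = matcher.knnMatch(des_b, des_a, k=2)
    good = [m for m, n in raw if m.distance < ratio * n.distance]   # Lowe's ratio test
    if len(good) < min_matches:
        raise RuntimeError("not enough SIFT matches to align the local maps")

    src = np.float32([kp_b[m.queryIdx].pt for m in good]).reshape(-1, 1, 2)
    dst = np.float32([kp_a[m.trainIdx].pt for m in good]).reshape(-1, 1, 2)
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)

    h, w = map_a.shape[:2]
    warped = cv2.warpPerspective(map_b, H, (w * 2, h * 2))
    warped[0:h, 0:w] = np.maximum(warped[0:h, 0:w], map_a)   # naive overlay of the two maps
    return warped

# usage sketch: local maps loaded as grayscale images (file names are hypothetical)
# overall = stitch_maps(cv2.imread("floor_part1.png", 0), cv2.imread("floor_part2.png", 0))
```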
In an optional implementation of some embodiments, the robot includes a camera and at least one box layer, the at least one box layer corresponding to the at least one box layer information; and, the apparatus 400 further comprises: and the scanning module is used for scanning the identification code through the camera, the identification code comprises user information, box layer information and time information, and the identification code is used for opening a box layer corresponding to the box layer information.
In an optional implementation of some embodiments, the robot includes an encoder, left and right wheels, a lidar, an ultrasonic sensor, and a communication link; and, the apparatus 400 further comprises a fault detection module to: carrying out fault detection on the encoder, and if the encoder has a fault, stopping the motion and giving an alarm; carrying out fault detection on the laser radar, and if the laser radar has a fault, stopping moving and giving an alarm; carrying out fault detection on the ultrasonic sensor, and if the fault occurs, stopping the movement and sending an alarm; carrying out fault detection on the communication connecting line; if the fault exists, stopping moving, and giving an alarm or giving an alarm after moving to a target coordinate; and the fault detection module is further configured to: the pulse data of the first moment and the second moment of the left wheel fed back by the encoder are differenced to obtain a left wheel difference value, and the pulse data of the first moment and the second moment of the right wheel fed back by the encoder are differenced to obtain a right wheel difference value; determining the maximum difference value corresponding to the first moment and the second moment according to the difference value between the first moment and the second moment, the preset diameter of the wheel and the preset maximum linear velocity difference value; if the maximum difference value is smaller than the absolute value of the difference value of the left wheel difference value and the right wheel difference value, the encoder has a fault; and/or acquiring the current left wheel speed and the current right wheel speed fed back by the encoder, and judging whether the current left wheel speed and the current right wheel speed meet a third preset condition or not, wherein if the current left wheel speed and the current right wheel speed do not meet the third preset condition, the encoder has a fault; and/or the fault detection module is further configured to: acquiring a first current position fed back by the laser radar and a second current position fed back by the encoder, and making a difference between the first current position and the second current position, wherein if the difference between the first current position and the second current position does not meet a fourth preset condition, the laser radar has a fault; and/or fault detection of the ultrasonic sensor, comprising: acquiring a data packet fed back by the ultrasonic sensor, wherein if the data packet does not meet a fifth preset condition, the ultrasonic sensor has a fault; and/or the fault detection module is further configured to: and acquiring data corresponding to the communication connecting line, wherein if the acquisition fails, the communication connecting line has a fault.
It is understood that the modules recited in the apparatus 400 correspond to the steps in the method described with reference to fig. 1. Thus, the operations, features and advantages of the method described above are also applicable to the apparatus 400 and the modules and units included therein, and are not described herein again.
Fig. 5 illustrates a physical structure diagram of an electronic device, which may include, as shown in fig. 5: a processor (processor) 510, a communication interface (Communications Interface) 520, a memory (memory) 530 and a communication bus 540, wherein the processor 510, the communication interface 520 and the memory 530 communicate with each other via the communication bus 540. Processor 510 may invoke logic instructions in memory 530 to perform a robot navigation method comprising: acquiring information of a main road of a current floor, an initial coordinate of the current floor and a target coordinate of the current floor; determining the distance between the initial coordinate and the target coordinate according to the initial coordinate and the target coordinate; judging whether the distance meets a first preset condition, wherein the first preset condition is used for determining a method for generating the global path, and if so, generating the global path through an A* algorithm based on the initial coordinate and the target coordinate; otherwise, determining an initial projection point coordinate through the trunk information and the initial coordinate, determining a target projection point coordinate through the trunk information and the target coordinate, and generating a global path through a Dijkstra algorithm and an A* algorithm based on the trunk information, the initial coordinate, the initial projection point coordinate, the target coordinate and the target projection point coordinate, so that the robot moves from the initial coordinate to the target coordinate according to the global path.
Furthermore, the logic instructions in the memory 530 may be implemented in the form of software functional units and stored in a computer readable storage medium when the software functional units are sold or used as independent products. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the above method according to the embodiments of the present invention. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and other various media capable of storing program codes.
In another aspect, the present invention also provides a computer program product comprising a computer program stored on a non-transitory computer-readable storage medium, the computer program comprising program instructions which, when executed by a computer, enable the computer to perform the robot navigation method provided by the above methods, the method comprising: acquiring information of a main road of a current floor, an initial coordinate of the current floor and a target coordinate of the current floor; determining the distance between the initial coordinate and the target coordinate according to the initial coordinate and the target coordinate; judging whether the distance meets a first preset condition, wherein the first preset condition is used for determining a method for generating the global path, and if so, generating the global path through an A* algorithm based on the initial coordinate and the target coordinate; otherwise, determining an initial projection point coordinate through the trunk information and the initial coordinate, determining a target projection point coordinate through the trunk information and the target coordinate, and generating a global path through a Dijkstra algorithm and an A* algorithm based on the trunk information, the initial coordinate, the initial projection point coordinate, the target coordinate and the target projection point coordinate, so that the robot moves from the initial coordinate to the target coordinate according to the global path.
In yet another aspect, the present invention also provides a non-transitory computer readable storage medium having stored thereon a computer program which, when executed by a processor, performs the robot navigation method provided above, the method comprising: acquiring information of a main road of a current floor, an initial coordinate of the current floor and a target coordinate of the current floor; determining the distance between the initial coordinate and the target coordinate according to the initial coordinate and the target coordinate; judging whether the distance meets a first preset condition, wherein the first preset condition is used for determining a method for generating the global path, and if so, generating the global path through an A* algorithm based on the initial coordinate and the target coordinate; otherwise, determining an initial projection point coordinate through the trunk information and the initial coordinate, determining a target projection point coordinate through the trunk information and the target coordinate, and generating a global path through a Dijkstra algorithm and an A* algorithm based on the trunk information, the initial coordinate, the initial projection point coordinate, the target coordinate and the target projection point coordinate, so that the robot moves from the initial coordinate to the target coordinate according to the global path.
The above-described embodiments of the apparatus are merely illustrative, and the units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of the present embodiment. One of ordinary skill in the art can understand and implement it without inventive effort.
Through the above description of the embodiments, those skilled in the art will clearly understand that each embodiment can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware. With this understanding in mind, the above-described technical solutions may be embodied in the form of a software product, which can be stored in a computer-readable storage medium, such as ROM/RAM, magnetic disk, optical disk, etc., and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device, etc.) to execute the various embodiments or some parts of the above-described methods of the embodiments.
Finally, it should be noted that: the above examples are only intended to illustrate the technical solution of the present invention, but not to limit it; although the present invention has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; and such modifications or substitutions do not depart from the spirit and scope of the corresponding technical solutions of the embodiments of the present invention.

Claims (9)

1. A method of robot navigation, comprising:
acquiring information of a main road of a current floor, an initial coordinate of the current floor and a target coordinate of the current floor;
determining the distance between the starting coordinate and the target coordinate according to the starting coordinate and the target coordinate;
judging whether the distance meets a first preset condition, wherein the first preset condition is used for determining a method for generating a global path, and if so, generating the global path through an A* algorithm based on the initial coordinate and the target coordinate;
otherwise, determining an initial projection point coordinate through the trunk information and the initial coordinate, determining a target projection point coordinate through the trunk information and the target coordinate, generating a global path through a Dijkstra algorithm and an A* algorithm based on the trunk information, the initial coordinate, the initial projection point coordinate, the target coordinate and the target projection point coordinate, and enabling the robot to move from the initial coordinate to the target coordinate according to the global path;
the first preset condition is a preset threshold range, when the distance is within the threshold range, the first preset condition is met, and a global path is generated through an A* algorithm based on the initial coordinate and the target coordinate;
wherein the generating the global path based on the arterial road information, the start coordinate, the start projection point coordinate, the target coordinate and the target projection point coordinate through the Dijkstra algorithm and the A* algorithm includes:
determining first path information through an A* algorithm based on the initial coordinates and the initial projection point coordinates; determining second path information through an A* algorithm based on the target coordinates and the target projection point coordinates; determining third path information through a Dijkstra algorithm based on the main road information, the initial projection point coordinates and the target projection point coordinates; and splicing the first path information, the second path information and the third path information to obtain a global path.
2. The robot navigation method of claim 1, wherein the moving the robot from the start coordinate to the target coordinate according to the global path comprises:
acquiring a current coordinate, wherein the current coordinate comprises the initial coordinate, and acquiring a corresponding look-ahead point, wherein the look-ahead point is selected on the global path according to a second preset condition, and the look-ahead point comprises the target coordinate;
acquiring a projection point coordinate, wherein the projection point coordinate is the projection of the current coordinate on the global path, and determining a corresponding distance and a corresponding deviation angle according to the projection point coordinate and the current coordinate;
determining a control coefficient according to the distance and the deviation angle, acquiring a preset maximum turning differential speed, and determining the speed of a left wheel and the speed of a right wheel according to the control coefficient and the maximum turning differential speed;
and moving to the forward-looking point according to the left wheel speed and the right wheel speed, wherein the forward-looking point comprises the target coordinate.
3. The robot navigation method of claim 1, wherein the robot includes a computer for human-computer interaction and a microcomputer for logic calculation; and the number of the first and second groups,
the interaction protocol of the computer and the microcomputer comprises instruction bytes, data segments, check codes and end marks.
4. The robot navigation method of claim 1, wherein the obtaining of the arterial road information of the current floor comprises:
inquiring the information of the main road corresponding to the current floor, and if the information of the main road corresponding to the current floor exists, acquiring the information of the main road of the current floor;
otherwise, acquiring at least one local map information through a laser radar;
splicing the at least one piece of local map information through an SIFT image matching algorithm to obtain an overall map;
and receiving at least one endpoint coordinate set, and generating corresponding trunk road information according to the at least one endpoint coordinate set and the whole map, wherein the endpoint coordinate set comprises two endpoint coordinates.
5. The method according to claim 4, wherein the stitching the at least one local map information by SIFT image matching algorithm to obtain an overall map comprises:
acquiring a spliced map through human-computer interaction, wherein the spliced map is obtained by splicing at least one piece of local map information by a user;
and obtaining the integral map through the SIFT image matching algorithm based on the spliced map.
6. The robot navigation method of claim 1, wherein the robot includes a camera and at least one box layer, the at least one box layer corresponding to at least one box layer information; and the number of the first and second groups,
after the robot moves from the starting coordinate to the target coordinate according to the global path, the robot further comprises:
the camera scans an identification code, the identification code comprises user information, box layer information and time information, and the identification code is used for opening the box layer corresponding to the box layer information.
7. The robot navigation method of claim 1, wherein the robot includes an encoder, left and right wheels, a lidar, an ultrasonic sensor, and a communication link; and before the obtaining of the information of the main road of the current floor, the starting coordinate of the current floor and the target coordinate of the current floor, the method further comprises the following steps:
carrying out fault detection on the encoder, and if the encoder has a fault, stopping the motion and giving an alarm;
carrying out fault detection on the laser radar, and if the laser radar has a fault, stopping moving and giving an alarm;
carrying out fault detection on the ultrasonic sensor, and if the ultrasonic sensor has a fault, stopping moving and giving an alarm;
carrying out fault detection on the communication connecting line; if the fault exists, stopping moving, and giving an alarm or giving an alarm after moving to the target coordinate; and the number of the first and second groups,
the detecting the fault of the encoder comprises:
subtracting the pulse data of the first moment and the second moment of the left wheel fed back by the encoder to obtain a left wheel difference value, and subtracting the pulse data of the first moment and the second moment of the right wheel fed back by the encoder to obtain a right wheel difference value;
determining the maximum difference corresponding to the first moment and the second moment according to the difference between the first moment and the second moment, the preset diameter of the wheel and the preset maximum linear velocity difference;
if the maximum difference value is smaller than the absolute value of the difference value of the left wheel difference value and the right wheel difference value, the encoder has a fault; and/or acquiring the current left wheel speed and the current right wheel speed fed back by the encoder, and judging whether the current left wheel speed and the current right wheel speed meet a third preset condition, wherein if the current left wheel speed and the current right wheel speed do not meet the third preset condition, the encoder has a fault; and/or
The fault detection of the laser radar comprises:
acquiring a first current position fed back by the laser radar and a second current position fed back by the encoder, and making a difference between the first current position and the second current position, wherein if the difference between the first current position and the second current position does not meet a fourth preset condition, the laser radar has a fault; and/or
The fault detection of the ultrasonic sensor comprises:
acquiring a data packet fed back by the ultrasonic sensor, wherein if the data packet does not meet a fifth preset condition, the ultrasonic sensor has a fault; and/or
The performing fault detection on the communication connection line includes:
and acquiring data corresponding to the communication connecting line, wherein if the acquisition fails, the communication connecting line has a fault.
8. An electronic device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, characterized in that the steps of the robot navigation method according to any of claims 1 to 7 are implemented when the processor executes the program.
9. A non-transitory computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out the steps of the robot navigation method according to any one of claims 1 to 7.
CN202111352081.8A 2021-11-16 2021-11-16 Robot navigation method, equipment, medium and product Active CN113791627B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111352081.8A CN113791627B (en) 2021-11-16 2021-11-16 Robot navigation method, equipment, medium and product

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111352081.8A CN113791627B (en) 2021-11-16 2021-11-16 Robot navigation method, equipment, medium and product

Publications (2)

Publication Number Publication Date
CN113791627A CN113791627A (en) 2021-12-14
CN113791627B true CN113791627B (en) 2022-02-11

Family

ID=78955395

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111352081.8A Active CN113791627B (en) 2021-11-16 2021-11-16 Robot navigation method, equipment, medium and product

Country Status (1)

Country Link
CN (1) CN113791627B (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114371713A (en) * 2022-01-12 2022-04-19 深圳鹏行智能研究有限公司 Path planning method of foot type robot, electronic device and storage medium
CN114851197B (en) * 2022-05-16 2023-08-04 华北电力大学(保定) Calandria cable robot and control method thereof
CN115083209B (en) * 2022-07-26 2022-11-04 广州市德赛西威智慧交通技术有限公司 Vehicle-road cooperation method and system based on visual positioning
CN116100576B (en) * 2023-04-13 2023-07-07 广东美的制冷设备有限公司 Mobile robot and safety monitoring module thereof
CN117516530A (en) * 2023-09-28 2024-02-06 中国科学院自动化研究所 Robot target navigation method and device

Citations (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH11249734A (en) * 1998-03-03 1999-09-17 Mitsubishi Heavy Ind Ltd Autonomous guidance device
KR20010068706A (en) * 2000-01-07 2001-07-23 전기호 A method for motion detection of object
JP2001337723A (en) * 2000-05-25 2001-12-07 Geo Tec Electronics Gmbh Planning of running path for a machine and operating method for the machine on the path
CN101865696A (en) * 2010-04-21 2010-10-20 深圳市凯立德计算机系统技术有限公司 Navigation device and path planning method and navigation method thereof
CN103092204A (en) * 2013-01-18 2013-05-08 浙江大学 Mixed robot dynamic path planning method
CN105955273A (en) * 2016-05-25 2016-09-21 速感科技(北京)有限公司 Indoor robot navigation system and method
CN106527432A (en) * 2016-11-04 2017-03-22 浙江大学 Indoor mobile robot cooperative system based on fuzzy algorithm and two-dimensional code self correction
CN106647754A (en) * 2016-12-20 2017-05-10 安徽农业大学 Path planning method for orchard tracked robot
CN106774310A (en) * 2016-12-01 2017-05-31 中科金睛视觉科技(北京)有限公司 A kind of robot navigation method
CN106774347A (en) * 2017-02-24 2017-05-31 安科智慧城市技术(中国)有限公司 Robot path planning method, device and robot under indoor dynamic environment
CN108994838A (en) * 2018-08-21 2018-12-14 智久(厦门)机器人科技有限公司上海分公司 The relationship calculation method and system of robot location and planning path
CN109612483A (en) * 2018-12-06 2019-04-12 熵智科技(深圳)有限公司 A kind of Laser-guided automatic transporter path generating method
CN111338359A (en) * 2020-04-30 2020-06-26 武汉科技大学 Mobile robot path planning method based on distance judgment and angle deflection
CN111736611A (en) * 2020-07-06 2020-10-02 中国计量大学 Mobile robot path planning method based on A-star algorithm and artificial potential field algorithm
CN111780777A (en) * 2020-07-13 2020-10-16 江苏中科智能制造研究院有限公司 Unmanned vehicle route planning method based on improved A-star algorithm and deep reinforcement learning
CN111857149A (en) * 2020-07-29 2020-10-30 合肥工业大学 Autonomous path planning method combining A-algorithm and D-algorithm
CN113465604A (en) * 2021-05-31 2021-10-01 杭州易现先进科技有限公司 Method and system for cross-floor navigation


Also Published As

Publication number Publication date
CN113791627A (en) 2021-12-14

Similar Documents

Publication Publication Date Title
CN113791627B (en) Robot navigation method, equipment, medium and product
AU2019404207B2 (en) Collaborative autonomous ground vehicle
EP3371670B1 (en) Device and method for autonomous localisation
Bauer et al. The autonomous city explorer: Towards natural human-robot interaction in urban environments
US20190278273A1 (en) Odometry system and method for tracking traffic lights
Kümmerle et al. A navigation system for robots operating in crowded urban environments
BR112020024333A2 (en) track vehicles in a warehouse environment
CN107807652A (en) Merchandising machine people, the method for it and controller and computer-readable medium
US11688081B2 (en) Method of performing simultaneous localization and mapping with respect to a salient object in an image
Yoshida et al. A sensor platform for outdoor navigation using gyro-assisted odometry and roundly-swinging 3D laser scanner
JP7259274B2 (en) Information processing device, information processing method, and program
WO2015017691A1 (en) Time-dependent navigation of telepresence robots
CN103459099A (en) Interfacing with mobile telepresence robot
CN106647738A (en) Method and system for determining docking path of automated guided vehicle, and automated guided vehicle
Lidoris et al. The autonomous city explorer (ACE) project—mobile robot navigation in highly populated urban environments
US11835960B2 (en) System and method for semantically identifying one or more of an object and a location in a robotic environment
Garrote et al. An RRT-based navigation approach for mobile robots and automated vehicles
US11635759B2 (en) Method of moving robot in administrator mode and robot of implementing method
WO2021208015A1 (en) Map construction and positioning method, client, mobile robot, and storage medium
Pérez et al. Enhanced monte carlo localization with visual place recognition for robust robot localization
CN114415650A (en) System and method for assisting a delivery robot
CN214632899U (en) Intelligent guide walking stick
CN112015187A (en) Semantic map construction method and system for intelligent mobile robot
Lee et al. Cyber Physical Autonomous Mobile Robot (CPAMR) framework in the context of industry 4.0
Fung et al. Development of a hospital service robot for transporting task

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant