CN113733091B - Outdoor high-precision autonomous navigation system of mobile robot - Google Patents

Outdoor high-precision autonomous navigation system of mobile robot Download PDF

Info

Publication number
CN113733091B
CN113733091B CN202111088955.3A CN202111088955A CN113733091B CN 113733091 B CN113733091 B CN 113733091B CN 202111088955 A CN202111088955 A CN 202111088955A CN 113733091 B CN113733091 B CN 113733091B
Authority
CN
China
Prior art keywords
robot
image
coordinate
moving
direct
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202111088955.3A
Other languages
Chinese (zh)
Other versions
CN113733091A (en
Inventor
孙鹏
魏星
杨长春
张效广
叶晓东
赵江海
张志华
孔令成
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Institute of Advanced Manufacturing Technology
Original Assignee
Institute of Advanced Manufacturing Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Institute of Advanced Manufacturing Technology filed Critical Institute of Advanced Manufacturing Technology
Priority to CN202111088955.3A priority Critical patent/CN113733091B/en
Publication of CN113733091A publication Critical patent/CN113733091A/en
Application granted granted Critical
Publication of CN113733091B publication Critical patent/CN113733091B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Images

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1602Programme controls characterised by the control system, structure, architecture
    • B25J9/161Hardware, e.g. neural networks, fuzzy logic, interfaces, processor
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J19/00Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
    • B25J19/02Sensing devices
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1656Programme controls characterised by programming, planning systems for manipulators
    • B25J9/1661Programme controls characterised by programming, planning systems for manipulators characterised by task planning, object-oriented languages
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1656Programme controls characterised by programming, planning systems for manipulators
    • B25J9/1664Programme controls characterised by programming, planning systems for manipulators characterised by motion, path, trajectory planning

Abstract

The invention discloses an outdoor high-precision autonomous navigation system of a mobile robot, which comprises a mechanical system and a software system; the mechanical system obtains outdoor scene information and outdoor object information through scanning and three-dimensional reproduction by a 3D scanner; obtaining displacement information of the mobile robot by the visual odometer through image information processing; receiving and transmitting instructions and data between the 3D scanner and the GPS positioning device by the server; performing format conversion and resolution adjustment on the acquired images of the 3D scanner and the visual odometer by a processor; calculating the cost consumed by the mobile robot in the moving process by the operation module; the software system comprises a robot operating system ROS; the current target location is determined through optimal path planning, the robot is moved from the current location to the current target location, and the autonomous navigation is realized through cyclic execution until the robot reaches the final target location; the invention has the advantages of accurate acquisition of environmental information, stable autonomous movement process, low cost and obvious optimization effect.

Description

Outdoor high-precision autonomous navigation system for mobile robot
Technical Field
The invention belongs to the field of autonomous navigation of mobile robots, and particularly relates to a high-precision autonomous navigation system of a mobile robot, which is suitable for an outdoor unstructured environment.
Background
The autonomous navigation mainly comprises a local navigation part and a global navigation part, and in order to establish a reliable autonomous navigation system, the robot needs to position itself, construct a local map according to the surrounding environment information acquired by the sensor and complete the planning of a global path. Although various mobile robots suitable for indoor and outdoor exist at present, most mobile robots in indoor environments adopt complicated equipment with relatively low precision such as an infrared sensor or magnetic stripe navigation, and the navigation effect is poor; in some indoor environments, mobile robots employ vision or lidar sensors, and the navigation system thereof has disadvantages of devices for detecting the environment, such as: a detection device composed of a monocular vision sensor cannot accurately obtain the depth information of an object; the detection device consisting of the double laser radar sensors can only acquire the information of obstacles in two planes in space, but cannot acquire the information of surrounding obstacles and the like, so that the navigation precision of the robot is influenced; in addition, although the indoor robot using the inertial navigation technology may guide the indoor robot to move forward in a specific direction by measuring the acceleration and then integrating to obtain the corresponding speed and displacement, the error of the integration accumulation may increase with time.
With the wide range of robot application scenarios, in the face of unknown environments, unstructured outdoor environments where various obstacles that may occur cannot be predicted, such as: when the mobile robot works in the environments such as mars, forests, deserts and the like, data acquired by the sensor is incomplete and difficult to process; in the standard iterative closest point algorithm, only geometric information is considered when the closest point is searched iteratively, and the ICP cannot independently realize accurate point cloud registration performance in a dynamic environment and an unstructured environment.
Disclosure of Invention
The invention aims to overcome the defects in the prior art, provides the outdoor high-precision autonomous navigation system for the mobile robot, realizes high-precision autonomous navigation of the mobile robot in an outdoor environment, ensures that the robot acquires environment information more accurately and the autonomous movement process of the robot is more stable, and has low implementation cost and obvious optimization effect.
The invention adopts the following technical scheme for solving the technical problems:
the invention relates to an outdoor high-precision autonomous navigation system of a mobile robot, which is characterized by comprising a mechanical system and a software system;
the mechanical system comprises a GPS positioning device, a 3D scanner, a visual odometer, a server, a processor and an operation module; in the mechanical system: acquiring outdoor scene information and outdoor object information by scanning and three-dimensional reproduction through the 3D scanner; obtaining displacement information of the mobile robot by the visual odometer through image information processing; receiving and transmitting instructions and data between the 3D scanner and the GPS positioning device by the server; format converting and resolution adjusting, by the processor, the acquired images of the 3D scanner and the visual odometer; calculating the cost consumed by the mobile robot in the moving process by the operation module; the software system comprises a robot operating system ROS;
recording an initial position coordinate before the robot starts moving as a coordinate O, and storing the initial position coordinate O into a close _ list set in a server; aiming at a moving robot, carrying out optimal path planning according to the following steps:
step 1, obtaining a coordinate set A of N direct places N which are adjacent to a place where a robot is located and can directly reach by using the GPS positioning equipment, wherein N is 1,2, N. Storing the coordinate set A into an open _ list set in a server, recording the coordinates of the current position of the robot as coordinates B, and storing the coordinates B into a close _ list set in the server;
step 2, aiming at the coordinate B and the coordinate set A stored in the server, the operation module calculates and obtains the cost F required to be paid when the robot moves from the coordinate B to each direct place n in the coordinate set A in a one-to-one correspondence mode n From N costs F n Middle screening out lowest cost F min
Step 3, if corresponding to the lowest cost F min Is 1, i.e. corresponds to the lowest cost F min If the only direct place is the only direct place, taking the only direct place as the current target place W, and entering the step 7; otherwise, entering step 4;
step 4, corresponding to the lowest cost F min Is more than 1, i.e. corresponds to the lowest cost F min Is not the only direct location,the current target location W is determined as follows: will correspond to the lowest cost F min M through sites of (A) is recorded as W j J 1, 2.., M; the operation module respectively obtains the movement of the robot from the initial position coordinate O to each direct position W through calculation j Each price G paid out j From M costs G j Middle screening out the lowest cost G min
Step 5, if corresponding to the lowest cost G min Is 1, i.e. corresponds to the lowest cost G min If the only direct place is the only direct place, taking the only direct place as the current target place W, and entering the step 7; otherwise, entering step 6;
step 6, if corresponding to the lowest cost G min Is more than 1, i.e. corresponds to the lowest cost F min Is not the only direct location, then corresponding to the lowest cost G min Any one of the direct points of (1) is taken as a current target point W, and the step 7 is carried out;
7, moving the robot from the current position to the current target position W, and finishing the movement of the current step when the robot reaches the current target position W; and then, taking the coordinates of the current target location W as the coordinates of the current location where the robot moves next, returning to the step 1 to continue moving next until the robot reaches the final target location, and ending the path planning.
The outdoor high-precision autonomous navigation system of the mobile robot is also characterized in that the direct place is determined as follows: and setting a robot moving step distance U, and regarding a place with a distance of U from the robot as an adjacent and direct place.
The outdoor high-precision autonomous navigation system of the mobile robot is also characterized in that the moving step U of the robot is obtained as follows: converting a three-dimensional image about an outdoor scene, which is obtained by scanning of a 3D scanner and stored in a server, into a two-dimensional image by using a visual odometer, establishing a two-dimensional rectangular coordinate system XOY by taking the gravity center of the two-dimensional image as an origin, and calculating and obtaining the moving distance of the robot in a T period as a moving step U aiming at image conversion in the T period.
The outdoor high-precision autonomous navigation system of the mobile robot is also characterized in that: the image transformation is image scaling transformation, and the robot moving step distance U is calculated and obtained according to the following mode:
recording the coordinate point of the image in a two-dimensional rectangular coordinate system as P i [x p ,y p ]The scaling transformation of the image takes place at the robot movement (t) i ,t j ) Within a time interval, obtaining the moving speed v of the image by using a measuring device 1 Then, the bit shift S of the image occurring in the unit period is calculated by equation (5-1):
Figure BDA0003266799450000031
in a unit period, an image observed by a robot measuring device is subjected to a scaling transformation due to the movement of the robot, and in the image subjected to the scaling transformation, a coordinate point P is present i [x p ,y p ]Corresponding transformation to coordinate point P i ‘[x p' ,y p' ]According to the similarity theorem, the coordinate point P i [x p ,y p ]The multiple λ by which the scaling transformation occurs in the unit period is calculated by equation (5-2):
Figure BDA0003266799450000032
in the formula (5-2):
x p ' and y p Is a coordinate point P i ‘[x p' ,y p' ]The coordinate values of (a); x is the number of p And y p As a coordinate point P i [x p ,y p ]The coordinate values of (a);
the displacement L of the robot in a unit time period is shown as the formula (5-3):
Figure BDA0003266799450000033
in the formula (5-3):
Figure BDA0003266799450000034
is the ratio of the amount of displacement occurring due to the image scaling transformation within a unit period to the amount of displacement moved by the robot within the unit period,
Figure BDA0003266799450000035
then, the robot moving step U is calculated by equation (5-4):
Figure BDA0003266799450000036
the moving step U is that the robot is at (t) i ,t j ) The amount of displacement occurring within the period of time of (t) i ,t j ) The time period is composed of a finite number of unit periods.
The outdoor high-precision autonomous navigation system of the mobile robot is also characterized in that: the image transformation is carried out to generate translation transformation, and the robot moving step distance U is calculated according to the translation transformation generated to the image as follows: the image translation transformation occurs in the unit time interval of the robot movement, and the speed v of the image movement is measured by the measuring equipment 1 Then, the displacement S of the image occurring in the unit period is calculated by equation (5-5):
Figure BDA0003266799450000041
the displacement L of the robot in a unit time period is shown as the formula (5-6):
L=υS (5-6)
in the formula (5-6):
upsilon is the ratio of the image translation distance in a unit time interval to the robot displacement, and upsilon is not equal to 0 and 1;
then, the robot moving step U is calculated by equation (5-7):
Figure BDA0003266799450000042
the moving step U is that the robot is at (t) i ,t j ) The amount of displacement occurring within the period of time of (t) i ,t j ) The time period is composed of a finite number of unit periods.
The outdoor high-precision autonomous navigation system of the mobile robot is also characterized in that: the cost F paid by the robot for moving from the current position to the current target position is as follows:
Figure BDA0003266799450000043
in the formula (6-1): the road surface influence factor is represented by gamma, and different types of road surfaces are represented by e; by gamma e The road surface influence factors of the different road surfaces are shown, and the four different road surfaces are characterized by e-I, e-II, e-III and e-IV.
The outdoor high-precision autonomous navigation system of the mobile robot is also characterized in that: the motion state of the robot in the outdoor environment is influenced by the road roughness C and the road gradient D; and comprises the following components:
Figure BDA0003266799450000044
in the formula (7-1): theta is an inclination angle formed by the road surface and the horizontal plane; h is the horizontal displacement of the robot movement and is measured by a visual odometer; z is the height of the slope changing point relative to the horizontal plane; the road surface influence factor γ is calculated by the formula (7-2):
Figure BDA0003266799450000045
in the formula (7-2): the coefficient k is a physical quantity for representing the intensity of shaking generated by the robot due to external influence when the robot moves; the road roughness C is different along with different types of roads, and the road is divided into the following four types according to the equivalent shear wave velocity of the soil layer and the thickness of the field covering layer:
class I: offshore sea surface, island, coast, lake and desert areas;
class II: the method is characterized by comprising the following steps of (1) referring to fields, villages, jungles, hills and sparse villages and urban suburbs;
class III: the urban district with dense building groups is pointed;
and IV: it refers to an urban area with dense building groups and high houses.
Compared with the prior art, the invention has the beneficial effects that:
1. according to the invention, the 3D scanner is used for acquiring external environment information, the mobile robot is subjected to high-precision autonomous navigation in an outdoor environment through optimal path planning, the environment information acquired by the robot is more accurate, the autonomous movement process of the robot is more stable, the realization cost is low, and the optimization effect is obvious.
2. The invention can provide reference for different types of robots such as service robots, guide robots and the like to realize similar functions in outdoor environment.
Drawings
FIG. 1 is a block diagram of the system of the present invention;
FIG. 2 is a flow chart of a path planning algorithm in accordance with the present invention;
Detailed Description
The outdoor high-precision autonomous navigation system of the mobile robot comprises a mechanical system and a software system.
Referring to fig. 1, the mechanical system in this embodiment includes a GPS positioning device, a 3D scanner, a visual odometer, a server, and a processor, and is provided with an operation module; in a mechanical system: acquiring outdoor scene information and outdoor object information by a 3D scanner through scanning and three-dimensional reproduction; obtaining displacement information of the mobile robot by the visual odometer through image information processing; receiving and transmitting instructions and data between the 3D scanner and the GPS positioning device by the server; carrying out format conversion and resolution adjustment on images acquired by the 3D scanner and the visual odometer by the processor; and calculating the cost consumed by the mobile robot in the moving process by the operation module.
As shown in fig. 1, the robot further includes a map building and navigating module, the robot first obtains global map information through the GPS module and then stores the global map information in the server, and moves from an initial location to a target location after knowing a specific position of the target location through the global map information. In the moving process, the robot judges the specific position of the robot in the global map by acquiring surrounding environment information through the 3D scanner carried by the robot, and continuously updates local map information according to the environment information acquired by the 3D scanner and the position information of the robot, so that path planning is effectively carried out to reach a target place with the minimum cost.
In the embodiment, the software system comprises a robot operating system ROS, an initial position coordinate before the robot starts to move is recorded as a coordinate O, for the coordinate O, firstly, the robot receives a navigation positioning signal sent by a satellite by using a carried GPS positioning device, and the GPS positioning device is a terminal with a built-in GPS module and a mobile communication module; secondly, a GPS module arranged in the GPS positioning equipment converts the received navigation positioning signal into data which can be identified and processed by a computer and then obtains the longitude and latitude of the current location of the robot; finally, the GPS positioning equipment uploads the longitude and latitude corresponding to the current position of the robot to a server through a mobile communication module, and the server analyzes the longitude and latitude information into a specific three-dimensional coordinate and then stores the coordinate O into a close _ list set; for a moving robot, as shown in the flow of fig. 2, optimal path planning is performed according to the following steps:
step 1, the robot communicates with a satellite group by using GPS positioning equipment, receives signals composed of position and time data sent by the satellite group, calculates and obtains a coordinate set of N direct places N which are adjacent to the current place where the robot is located and can be directly reached based on the data, and records the coordinate set as a coordinate set A, wherein N is 1,2, … … N; and storing the coordinate set A into an open _ list set in the server, recording the coordinates of the current position of the robot as coordinates B, and storing the coordinates B of the current position into a close _ list set in the server.
Step 2, aiming at the coordinate B and the coordinate set A stored in the server, the calculation module calculates and obtains the cost F required to be paid when the robot moves from the coordinate B to each direct position n in the coordinate set A in a one-to-one correspondence mode n From N costs F n Middle screening out lowest cost F min
Step 3, if corresponding to the lowest cost F min Is 1, i.e. corresponds to the lowest cost F min If the only direct place is the only direct place, taking the only direct place as the current target place W, and entering the step 7; otherwise, go to step 4.
Step 4, corresponding to the lowest cost F min Is more than 1, i.e. corresponds to the lowest cost F min Is not the only direct location, the current target location W is determined as follows:
will correspond to the lowest cost F min M through sites of (A) is recorded as W j J is 1,2 …, M; the operation module respectively obtains the movement of the robot from the initial position coordinate O to each direct position W through calculation j Each price G paid out j From M costs G j Middle screening out the lowest cost G min
Step 5, if the number of the direct sites corresponding to the lowest cost is 1, the lowest cost G is corresponding to min If the only direct place is the only direct place, taking the only direct place as the current target place W, and entering the step 7; otherwise, go to step 6.
Step 6, if corresponding to the lowest cost G min Is more than 1, i.e. corresponds to the lowest cost F min Is not the only direct location, corresponds to the lowest cost G min Any one of the through points of (1) is taken as the current target point W, and proceeds to step 7.
And 7, moving the robot from the current position to the current target position W, and finishing the movement of the current step when the robot reaches the current target position W. And then, storing the coordinates of the current target location W into a close _ list set in the server, taking the current target location W as the coordinates of the current location where the robot moves next, returning to the step 1 to continue moving next until the robot reaches the final target location, and finishing path planning.
In this embodiment, the robot is guided to move from the initial location to the target location by path planning, so that the robot can avoid, for example: the ground is damaged seriously, the magnetic field intensity is too big, the air humidity is too high, the barrier distributes the area of relatively abominable environment such as intensive to can reduce the probability that causes the loss to the mechanical component that the constitution robot needs and the electronic equipment that the robot self carried, guarantee that the software and hardware equipment of robot self keeps good operating condition relatively. In specific implementation, the corresponding technical measures also include:
the direct site is determined as follows: and setting a robot moving step distance U, and regarding a place with a distance of U from the robot as an adjacent and direct place.
The robot moving step U is obtained as follows: and converting a three-dimensional image about the outdoor scene, which is obtained by scanning the 3D scanner and stored in the server, into a two-dimensional image by using a visual odometer, and establishing a two-dimensional rectangular coordinate system XOY by taking the gravity center of the two-dimensional image as an origin. The method comprises the following steps that a 3D scanner synchronously moves along with a robot in the moving process of the robot, in the moving process, the 3D scanner projects laser to the surface of an object in an outdoor scene in a point, line or array mode, the position of the object is judged according to object reflection light, coordinate information of the object is obtained, and the coordinate information is uploaded to a server; the 3D scanner can not only obtain geometric information of an object based on the angle of reflected light, but also convert the stereoscopic information of an outdoor scene into data that can be directly processed by a computer. Based on the above, the moving distance of the robot in the T period can be calculated and obtained as the moving step U according to the transformation situation of the two-dimensional image in the T period.
The image transformation has two different forms, one is that the image is subjected to scaling transformation, and the other is that the image is subjected to translation transformation;
in specific implementation, the robot moving step U is calculated and obtained as follows for the scaling transformation of the image: images are displayed onThe coordinate point in the two-dimensional rectangular coordinate system is marked as P i [x p ,y p ]The scaling transformation of the image takes place at the robot movement (t) i ,t j ) By measuring the rate v at which the image moves during the time interval 1 Then, the displacement S of the image occurring in the unit period is calculated by equation (5-1):
Figure BDA0003266799450000071
in a unit period, an image observed by a robot measuring device is subjected to a scaling transformation due to the movement of the robot, and in the image subjected to the scaling transformation, a coordinate point P is present i [x p ,y p ]Corresponding transformation to coordinate point P i ‘[x p′ ,y p′ ]According to the similarity theorem, the coordinate point P i [x p ,y p ]The multiple λ by which the scaling transformation occurs in the unit period can be calculated by equation (5-2):
Figure BDA0003266799450000072
in the formula (5-2):
x p′ and y p′ As a coordinate point P i ‘[x p′ ,y p′ ]The coordinate values of (a); x is the number of p And y p As a coordinate point P i [x p ,y p ]The coordinate values of (a);
the displacement L of the robot in a unit time period is shown as the formula (5-3):
Figure BDA0003266799450000081
in the formula (5-3):
Figure BDA0003266799450000082
is the ratio of the amount of displacement occurring due to the image scaling transformation within a unit period of time to the amount of displacement moved by the robot within the unit period of time,
Figure BDA0003266799450000083
Then, the robot moving step U is calculated by equation (5-4):
Figure BDA0003266799450000084
the moving step U is that the robot is at (t) i ,t j ) Amount of displacement occurring within a period of time of (t) i ,t j ) The time period is composed of a finite number of unit periods.
And calculating the moving step distance U of the robot according to the following mode aiming at the image translation transformation: the image translation transformation occurs in the unit time interval of the robot movement, and the speed v of the image movement is measured by the measuring equipment 1 Then, the displacement S of the image occurring in the unit period is calculated by equation (5-5):
Figure BDA0003266799450000085
the displacement L of the robot in a unit time period is shown as the formula (5-6):
L=υS (5-6)
in the formula (5-6): upsilon is the ratio of the image translation distance in a unit time interval to the robot displacement, and upsilon is not equal to 0 and l;
then, the robot moving step U is calculated by equation (5-7):
Figure BDA0003266799450000086
the moving step U is that the robot is at (t) i ,t j ) The amount of displacement occurring within the period of time of (t) i ,t j ) The time period is composed of a finite number of unit periods.
The cost F paid by the robot to move from the current position to the current target position W is as follows:
Figure BDA0003266799450000087
in the formula (5-8): the road surface influence factor is expressed by gamma, different types of road surfaces are expressed by e, and gamma e Namely, the road surface influence factors are expressed as road surface influence factors of different types of road surfaces, and the total four different types of road surfaces are characterized by e-I, e-II, e-III and e-IV respectively; the road surface influence factor γ is used to characterize the influence of physical properties of the road surface itself on the motion state of the robot, such as: the motion state of the robot in the outdoor environment is influenced by the road roughness C and the road gradient D; and comprises the following components:
Figure BDA0003266799450000088
in the formula (5-9): theta is an inclination angle formed by the road surface and the horizontal plane; h is the horizontal displacement of the robot movement and is measured by a visual odometer; z is the height of the slope changing point relative to the horizontal plane;
the road surface influence factor γ is calculated by the formula (7-2):
Figure BDA0003266799450000091
in the formula (5-10): the coefficient k is a physical quantity representing the intensity of shaking generated by the robot when the robot moves under the influence of the outside; the road surface gradient D is a physical quantity representing the degree of steepness of a ground surface unit, and the expression mode is as shown in a formula (5-9); the road roughness C is different along with different types of roads, and the road is divided into the following four types according to the equivalent shear wave velocity of the soil layer and the thickness of the field covering layer:
class I: offshore sea surface, island, coast, lake and desert areas;
class II: the method is characterized by comprising the following steps of (1) referring to fields, villages, jungles, hills and sparse villages and urban suburbs;
class III: the urban district with dense building groups is pointed;
and IV: it refers to an urban area with dense building groups and high houses.
The road surface roughness C refers to the ability of the corners of the road surface aggregate to hinder the movement of the robot, and is generally related to the road surface friction coefficient and the road surface structure depth, and causes relatively stronger influence on the motion state of the robot along with the increase of the grade corresponding to the road surface type, so that the robot pays higher cost in the process of reaching the target place.
The robot in the invention takes a 3D scanner as a tool for capturing environment information, so that the robot can accurately identify the distribution places of obstacles in the surrounding environment in the moving process to effectively avoid, and the stability of the self movement is ensured; and reasonable path planning is carried out according to the self positioning and the final target location, and the external interference on the moving process is reduced.

Claims (7)

1. An outdoor high-precision autonomous navigation system of a mobile robot is characterized in that: comprises a mechanical system and a software system;
the mechanical system comprises a GPS positioning device, a 3D scanner, a visual odometer, a server, a processor and an operation module; in the mechanical system: acquiring outdoor scene information and outdoor object information by scanning and three-dimensional reproduction through the 3D scanner; obtaining displacement information of the mobile robot by the visual odometer through image information processing; receiving and transmitting instructions and data between the 3D scanner and the GPS positioning device by the server; format converting and resolution adjusting, by the processor, the acquired images of the 3D scanner and the visual odometer; calculating the cost consumed by the mobile robot in the moving process by the operation module; the software system comprises a robot operating system ROS;
recording an initial position coordinate before the robot starts moving as a coordinate O, and storing the initial position coordinate O into a close _ list set in a server; aiming at the moving robot, the optimal path planning is carried out according to the following steps:
step 1, obtaining a coordinate set A of N direct places N adjacent to the current place where the robot is located and capable of directly reaching by using the GPS positioning equipment, wherein N is 1,2 and … … N; storing the coordinate set A into an open _ list set in a server, recording the coordinates of the current position of the robot as coordinates B, and storing the coordinates B into a close _ list set in the server;
step 2, aiming at the coordinate B and the coordinate set A stored in the server, the calculation module calculates and obtains the cost F required to be paid when the robot moves from the coordinate B to each direct position n in the coordinate set A in a one-to-one correspondence mode n From N costs F n Middle screening out lowest cost F min
Step 3, if corresponding to the lowest cost F min Is 1, i.e. corresponds to the lowest cost F min If the only direct place is the only direct place, taking the only direct place as the current target place W, and entering step 7; otherwise, entering step 4;
step 4, corresponding to the lowest cost F min Is more than 1, i.e. corresponds to the lowest cost F min Is not the only direct location, the current target location W is determined as follows:
will correspond to the lowest cost F mjn M through sites of (A) is recorded as W j J is 1,2 …, M; the operation module respectively obtains the movement of the robot from the initial position coordinate O to each direct position W through calculation j Each price G paid out j From M costs G j Middle screening out the lowest cost G min
Step 5, if corresponding to the lowest cost G min Is 1, i.e. corresponds to the lowest cost G min If the only direct place is the only direct place, taking the only direct place as the current target place W, and entering the step 7; otherwise, entering step 6;
step 6, if corresponding to the lowest cost G min Is more than 1, i.e. corresponds to the lowest cost F min Is not the only direct location, then corresponding to the lowest cost G min Any one of the direct points of (1) is taken as a current target point W, and the step 7 is carried out;
7, moving the robot from the current position to the current target position W, and finishing the movement of the current step when the robot reaches the current target position W; and then, taking the coordinates of the current target location W as the coordinates of the current location where the robot moves next, returning to the step 1 to continue moving next until the robot reaches the final target location, and ending the path planning.
2. The outdoor high-precision autonomous navigation system of a mobile robot according to claim 1, characterized in that: the direct site is determined as follows: and setting a robot moving step distance U, and regarding a place with a distance of U from the robot as an adjacent and direct place.
3. The outdoor high-precision autonomous navigation system of a mobile robot according to claim 2, characterized in that: the robot moving step U is obtained as follows: converting a three-dimensional image about an outdoor scene, which is obtained by scanning of a 3D scanner and stored in a server, into a two-dimensional image by using a visual odometer, establishing a two-dimensional rectangular coordinate system XOY by taking the gravity center of the two-dimensional image as an origin, and calculating and obtaining the moving distance of the robot in a T period as a moving step U aiming at image conversion in the T period.
4. The outdoor high-precision autonomous navigation system of a mobile robot according to claim 3, characterized in that: the image transformation is carried out image scaling transformation, and the robot moving step distance U is obtained by calculating the scaling transformation aiming at the image as follows: recording the coordinate point of the image in a two-dimensional rectangular coordinate system as P i [x p ,y p ]The scaling transformation of the image takes place at the robot movement (t) i ,t j ) Within a time interval, obtaining the moving speed v of the image by using a measuring device 1 Then, the displacement S of the image occurring in the unit period is calculated by the equation (5-1):
Figure FDA0003266799440000021
in a unit period, an image observed by a robot measuring device is subjected to a scaling transformation due to the movement of the robot, and in the image subjected to the scaling transformation, a coordinate point P is present i [x p ,y p ]Corresponding transformation to coordinate point P i ‘[x p' ,y p' ]According to the similarity theorem, the coordinate point P i [x p ,y p ]The multiple λ by which the scaling transformation occurs in the unit period is calculated by equation (5-2):
Figure FDA0003266799440000022
in the formula (5-2):
x p ' and y p Is a coordinate point P i ‘[x p' ,y p' ]The coordinate values of (a); x is the number of p And y p As a coordinate point P i [x p ,y p ]The coordinate values of (a);
the displacement L of the robot in a unit time period is shown as the formula (5-3):
Figure FDA0003266799440000023
in the formula (5-3):
Figure FDA0003266799440000024
is the ratio of the amount of displacement occurring due to the image scaling transformation within a unit period to the amount of displacement moved by the robot within the unit period,
Figure FDA0003266799440000025
then, the robot moving step U is calculated by equation (5-4):
Figure FDA0003266799440000031
the moving step U is that the robot is at (t) i ,t j ) The amount of displacement occurring within the period of time of (t) i ,t j ) The time period is composed of a finite number of unit periods.
5. The outdoor high-precision autonomous navigation system of a mobile robot according to claim 3, characterized in that: the image transformation is carried out to generate translation transformation, and the robot moving step distance U is calculated according to the translation transformation generated to the image as follows: the image translation transformation occurs in the unit time interval of the robot movement, and the speed v of the image movement is measured by the measuring equipment 1 Then, the displacement S of the image occurring in the unit period is calculated by equation (5-5):
Figure FDA0003266799440000032
the displacement L of the robot in a unit time period is shown as the formula (5-6):
L=υS (5-6)
in the formula (5-6):
upsilon is the ratio of the image translation distance in a unit time interval to the robot displacement, and upsilon is not equal to 0 and 1;
then, the robot moving step U is calculated by equation (5-7):
Figure FDA0003266799440000033
the moving step U is that the robot is at (t) i ,t j ) The amount of displacement occurring within the period of time of (t) i ,t j ) The time period is composed of a finite number of unit periods.
6. The outdoor high-precision autonomous navigation system of a mobile robot according to claim 1, characterized in that: the cost F paid by the robot for moving from the current position to the current target position is as follows:
Figure FDA0003266799440000034
in the formula (6-1): the road surface influence factor is expressed by gamma, different types of road surfaces are expressed by e, and the road surface influence factor is expressed by gamma e The road surface influence factors of the different road surfaces are shown, and the four different road surfaces are characterized by e-I, e-II, e-III and e-IV.
7. The outdoor high-precision autonomous navigation system of a mobile robot according to claim 6, characterized in that: the motion state of the robot in the outdoor environment is influenced by the road roughness C and the road gradient D; and comprises the following components:
Figure FDA0003266799440000041
in the formula (7-1): theta is an inclination angle formed by the road surface and the horizontal plane, H is the horizontal displacement of the robot movement and is measured by the visual odometer, and Z is the height of the slope change point relative to the horizontal plane;
the road surface influence factor γ is calculated by the formula (7-2):
Figure FDA0003266799440000042
in the formula (7-2): the coefficient k is a physical quantity for representing the intensity of shaking generated by the robot due to external influence when the robot moves; the road roughness C is different along with different types of roads, and the road is divided into the following four types according to the equivalent shear wave velocity of the soil layer and the thickness of the field covering layer:
class I: offshore sea surface, island, coast, lake and desert areas;
class II: the method is characterized by comprising the following steps of (1) referring to fields, villages, jungles, hills and sparse villages and urban suburbs;
class III: the urban district with dense building groups is pointed;
and IV: it refers to an urban area with dense building groups and high houses.
CN202111088955.3A 2021-09-16 2021-09-16 Outdoor high-precision autonomous navigation system of mobile robot Active CN113733091B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111088955.3A CN113733091B (en) 2021-09-16 2021-09-16 Outdoor high-precision autonomous navigation system of mobile robot

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111088955.3A CN113733091B (en) 2021-09-16 2021-09-16 Outdoor high-precision autonomous navigation system of mobile robot

Publications (2)

Publication Number Publication Date
CN113733091A CN113733091A (en) 2021-12-03
CN113733091B true CN113733091B (en) 2022-08-30

Family

ID=78739407

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111088955.3A Active CN113733091B (en) 2021-09-16 2021-09-16 Outdoor high-precision autonomous navigation system of mobile robot

Country Status (1)

Country Link
CN (1) CN113733091B (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105116902A (en) * 2015-09-09 2015-12-02 北京进化者机器人科技有限公司 Mobile robot obstacle avoidance navigation method and system
CN106607907A (en) * 2016-12-23 2017-05-03 西安交通大学 Mobile vision robot and measurement and control method thereof

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8271132B2 (en) * 2008-03-13 2012-09-18 Battelle Energy Alliance, Llc System and method for seamless task-directed autonomy for robots
KR101761313B1 (en) * 2010-12-06 2017-07-25 삼성전자주식회사 Robot and method for planning path of the same
US9969081B2 (en) * 2012-07-27 2018-05-15 Alberto Daniel Lacaze Method and system for the directed control of robotic assets

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105116902A (en) * 2015-09-09 2015-12-02 北京进化者机器人科技有限公司 Mobile robot obstacle avoidance navigation method and system
WO2017041730A1 (en) * 2015-09-09 2017-03-16 北京进化者机器人科技有限公司 Method and system for navigating mobile robot to bypass obstacle
CN106607907A (en) * 2016-12-23 2017-05-03 西安交通大学 Mobile vision robot and measurement and control method thereof

Also Published As

Publication number Publication date
CN113733091A (en) 2021-12-03

Similar Documents

Publication Publication Date Title
Barber et al. Geometric validation of a ground-based mobile laser scanning system
CN109709801A (en) A kind of indoor unmanned plane positioning system and method based on laser radar
Zhao et al. Reconstructing textured CAD model of urban environment using vehicle-borne laser range scanners and line cameras
US20150153444A1 (en) System and methods for data point detection and spatial modeling
JP2009294214A (en) Method and system for navigation based on topographic structure
CN1149916A (en) Method for collection, analysis, measurement and storage of geographical data
Toth R&D of mobile LIDAR mapping and future trends
CN101782642B (en) Method and device for absolutely positioning measurement target by multi-sensor fusion
Scaioni et al. Technical aspects related to the application of sfm photogrammetry in high mountain
Khoshelham et al. Vehicle positioning in the absence of GNSS signals: Potential of visual-inertial odometry
Chellappa et al. On the positioning of multisensor imagery for exploitation and target recognition
CN110986888A (en) Aerial photography integrated method
Grejner-Brzezinska et al. From Mobile Mapping to Telegeoinformatics
IL267309B2 (en) Terrestrial observation device having location determination functionality
CN113733091B (en) Outdoor high-precision autonomous navigation system of mobile robot
Ellum et al. Land-based integrated systems for mapping and GIS applications
CN116027351A (en) Hand-held/knapsack type SLAM device and positioning method
Chan Feature-Based Boresight Self-Calibration of a Mobile Mapping System
Iannucci et al. Cross-Modal Localization: Using automotive radar for absolute geolocation within a map produced with visible-light imagery
Ishii et al. Autonomous UAV flight using the Total Station Navigation System in Non-GNSS Environments
Niu et al. Directly georeferencing terrestrial imagery using MEMS-based INS/GNSS integrated systems
Wei Multi-sources fusion based vehicle localization in urban environments under a loosely coupled probabilistic framework
Chen et al. Panoramic epipolar image generation for mobile mapping system
Li et al. Terrestrial mobile mapping towards real-time geospatial data collection
Tamimi et al. Performance Assessment of a Mini Mobile Mapping System: Iphone 14 pro Installed on a e-Scooter

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant