CN112799403A - Space-time ultrasonic navigation data acquisition method and system for robot - Google Patents


Info

Publication number
CN112799403A
CN112799403A
Authority
CN
China
Prior art keywords
robot
data
ultrasonic
control instruction
data acquisition
Prior art date
Legal status
Pending
Application number
CN202011644928.5A
Other languages
Chinese (zh)
Inventor
秦家虎
王帅
高炤
张展鹏
Current Assignee
University of Science and Technology of China USTC
Original Assignee
University of Science and Technology of China USTC
Priority date
Filing date
Publication date
Application filed by University of Science and Technology of China USTC filed Critical University of Science and Technology of China USTC
Priority to CN202011644928.5A priority Critical patent/CN112799403A/en
Publication of CN112799403A publication Critical patent/CN112799403A/en
Pending legal-status Critical Current

Classifications

    • G: PHYSICS
    • G05: CONTROLLING; REGULATING
    • G05D: SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00: Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02: Control of position or course in two dimensions
    • G05D1/021: Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0231: Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D1/0246: Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means
    • G05D1/0251: Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means extracting 3D information from a plurality of images taken from different locations, e.g. stereo vision
    • G05D1/0212: Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory
    • G05D1/0221: Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory involving a learning process
    • G05D1/0223: Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory involving speed control of the vehicle
    • G05D1/0255: Control of position or course in two dimensions specially adapted to land vehicles using acoustic signals, e.g. ultrasonic signals
    • G05D1/0276: Control of position or course in two dimensions specially adapted to land vehicles using signals provided by a source external to the vehicle

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Electromagnetism (AREA)
  • Acoustics & Sound (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
  • Manipulator (AREA)

Abstract

The invention provides a space-time ultrasonic navigation data acquisition method for a robot equipped with at least one ultrasonic sensing device. The method comprises the following steps: S1, acquiring a robot motion control instruction and a data acquisition control instruction; S2, moving according to the robot motion control instruction, and acquiring the state information and control information of the robot in real time during motion to obtain navigation data over continuous time periods; S3, controlling the ultrasonic sensing device to acquire ultrasonic data over continuous time periods according to the data acquisition control instruction; and S4, performing space-time data aggregation on the navigation data and the ultrasonic data to obtain space-time ultrasonic navigation data. The invention constructs a space-time ultrasonic navigation data set using ultrasonic sensors and trains a neural network controller on the space-time ultrasonic navigation data, enabling autonomous navigation of the robot in an indoor environment.

Description

Space-time ultrasonic navigation data acquisition method and system for robot
Technical Field
The invention relates to the technical field of autonomous navigation of robots, in particular to a method and a system for acquiring space-time ultrasonic navigation data of a robot.
Background
At present, various mobile robots that run autonomously without manual remote control, such as restaurant food-delivery robots, express logistics robots, household floor-sweeping robots, hotel guide robots and hospital disinfection robots, all rely on autonomous navigation technology to realize autonomous motion. Robot autonomous navigation means that the robot starts from its current position and, by planning a driving path, autonomously reaches a destination without colliding with surrounding objects. Specifically, autonomous navigation generally involves sub-problems such as mapping, localization, perception, planning and control.
With the development of artificial intelligence technology represented by deep learning, such techniques are gradually being applied to the various sub-problems of robot autonomous navigation, for example: mapping and localizing the robot's working environment with graph neural networks; perceiving surrounding targets through object detection and tracking to avoid collisions; planning the robot's motion path in real time through deep reinforcement learning; and building a neural network controller with a deep neural network model to realize motion control of the robot.
Data is the cornerstone of deep learning: only by acquiring sufficient data and organizing it into a standard data set can a deep learning model be trained and optimized to complete the relevant robot navigation task.
Data sets for robot autonomous navigation can be roughly classified by the type of sensor used; currently the most popular are image data sets acquired by cameras and point cloud data sets acquired by lidar. Image and lidar data have the advantage of providing rich information about the surrounding environment: images provide 2-dimensional information (pixel values), and lidar provides three-dimensional information (three-dimensional point clouds). Their disadvantage is that the sensors are expensive.
Because of their simple structure and low price, ultrasonic sensors were the main sensors for robots in the last century. With the development of the times, however, they ceased to be used as primary sensors because of the limited information they provide (a single ultrasonic device returns only a distance value in a certain direction), and today they are used only sparingly, for example to implement obstacle-avoidance functions.
Disclosure of Invention
Technical problem to be solved
In view of these problems, the invention provides a space-time ultrasonic navigation data acquisition method and system for a robot, intended to at least partially solve technical problems such as the high sensor cost of traditional robot navigation systems.
(II) technical scheme
The invention provides a space-time ultrasonic navigation data acquisition method for a robot equipped with at least one ultrasonic sensing device. The method comprises the following steps: S1, acquiring a robot motion control instruction and a data acquisition control instruction; S2, moving according to the robot motion control instruction, and acquiring the state information and control information of the robot in real time during motion to obtain navigation data over continuous time periods; S3, controlling the ultrasonic sensing device to acquire ultrasonic data over continuous time periods according to the data acquisition control instruction; and S4, performing space-time data aggregation on the navigation data and the ultrasonic data to obtain space-time ultrasonic navigation data.
Further, the acquired robot motion control instructions include accelerator, brake, gear and steering, which control the motion of the robot; the acquired data acquisition control instructions include start-acquisition and stop-acquisition instructions, which control the start and stop of the acquisition program.
Further, the state information collected in S2 includes the position and heading of the robot, and the collected control information includes the linear velocity and angular velocity of the robot.
Further, S2 specifically includes: converting the robot motion control instruction into bottom-layer linear-velocity and angular-velocity control instructions that the robot can execute directly, and controlling the operation of the robot accordingly.
Further, the method for converting the robot motion control instruction into the bottom-layer control instruction comprises the following steps: S201, denote the accelerator, brake and steering as proportionality coefficients p_gas, p_brake and p_steer, each with numerical range [0, 1], and denote the gear as p_gear, taking values in {0, 1, 2, 3}; S202, calculate the linear velocity according to the following formula:

[linear-velocity control law, rendered only as an image in the source]

where v_t is the linear velocity at the current moment, v_{t-1} is the linear velocity at the previous moment, and v_max is the maximum allowable linear velocity;
S203, calculate the angular velocity according to the following formula:

[angular-velocity control law, rendered only as an image in the source]

where w_t is the angular velocity at the current moment and w_{t-1} is the angular velocity at the previous moment;
S204, judge from the timestamps contained in the linear-velocity and angular-velocity bottom-layer control instructions whether they were issued at the same moment; if not, keep waiting for instruction updates until robot motion control instructions from the same moment are obtained; if so, send the bottom-layer control instructions to the robot.
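The patent's S202/S203 control laws appear only as images in this copy, so their exact form is not recoverable here. The sketch below substitutes a simple incremental law to illustrate the conversion; the constants V_MAX, W_MAX and ACCEL_STEP, the gear-based speed cap, and the steering-to-angular-velocity mapping are all assumptions, not the patented formulas:

```python
V_MAX = 1.0       # assumed maximum allowable linear velocity (m/s)
W_MAX = 1.0       # assumed maximum angular velocity (rad/s)
ACCEL_STEP = 0.1  # assumed per-tick velocity increment

def linear_velocity(v_prev, p_gas, p_brake, p_gear):
    """Assumed incremental law: throttle accelerates, brake decelerates,
    gear (0..3) scales the allowed top speed."""
    v_cap = V_MAX * p_gear / 3.0
    v = v_prev + ACCEL_STEP * (p_gas - p_brake)
    return max(0.0, min(v, v_cap))

def angular_velocity(w_prev, p_steer, p_gear):
    """Assumed steering law: p_steer in [0, 1] maps to a signed turn-rate
    increment; gear 0 (neutral) yields no turning."""
    if p_gear == 0:
        return 0.0
    w = w_prev + ACCEL_STEP * (2.0 * p_steer - 1.0)
    return max(-W_MAX, min(w, W_MAX))
```

In the patented method the two resulting commands are additionally timestamp-checked (S204) before being sent to the robot.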
Further, the ultrasonic data in S3 includes 2D point cloud data in at least one direction, and the point cloud data at a single moment is processed by the following formula:

d_i = sqrt(x_i^2 + y_i^2)

where the subscript i is the ultrasonic serial number, representing the direction, d is the Euclidean distance value, and (x, y) is the original point cloud value.
Further, S3 and S4 specifically include: S301, timestamp-synchronizing the point-cloud-processed ultrasonic data, the state information and the control information, and storing the data acquired at the same moment as one data sample; S302, storing the data acquired over continuous time into the same file, so that the data in the file form the space-time ultrasonic navigation data for that continuous time period; S401, repeating S301 and S302, storing the space-time ultrasonic navigation data collected in different time periods into different files; after acquisition is complete, all the files together form a space-time ultrasonic navigation data set.
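A minimal sketch of S301/S302, assuming each incoming record is a tuple whose first element is a timestamp and whose remaining elements are the payload; the 20 ms pairing tolerance and the record formats are assumptions:

```python
import csv

def aggregate(ultrasonic, state, control, tol=0.02):
    """S301: pair ultrasonic, state and control records whose timestamps
    agree within `tol` seconds into joint data samples."""
    samples = []
    for (tu, *u) in ultrasonic:
        # Nearest-in-time state and control records for this ultrasonic record.
        ts, *s = min(state, key=lambda r: abs(r[0] - tu))
        tc, *c = min(control, key=lambda r: abs(r[0] - tu))
        if abs(ts - tu) <= tol and abs(tc - tu) <= tol:
            samples.append([tu, *u, *s, *c])
    return samples

def save_period(samples, path):
    """S302: store one continuous-time period of samples in a single CSV
    file; the rows of that file form the period's space-time data."""
    with open(path, "w", newline="") as f:
        csv.writer(f).writerows(samples)

# One tiny acquisition run: two moments, three streams.
samples = aggregate(
    ultrasonic=[(0.00, 1.2), (0.10, 1.1)],         # (t, distance)
    state=[(0.01, 5.0, 90.0), (0.11, 5.1, 91.0)],  # (t, position, heading)
    control=[(0.00, 0.3, 0.0), (0.10, 0.3, 0.1)],  # (t, v, w)
)
```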
In another aspect, the present invention provides a space-time ultrasonic navigation data acquisition system for a robot, comprising: a vehicle control node for sending a robot motion control instruction and a data acquisition control instruction to the data acquisition node; a data acquisition platform for moving according to the robot motion control instruction and acquiring the state information and control information of the robot in real time during motion to obtain navigation data, and for controlling the ultrasonic sensing device to acquire ultrasonic data according to the data acquisition control instruction; and a data acquisition node for acquiring the robot motion control instruction and the data acquisition control instruction, sending the corresponding instructions to the data acquisition platform, and fusing the navigation data with the ultrasonic data to obtain space-time ultrasonic navigation data.
Furthermore, the data acquisition node comprises a robot control sub-node, a data processing sub-node and a data acquisition sub-node, wherein the robot control sub-node is used for converting a robot motion control instruction into a bottom layer control instruction and controlling the motion of the robot; the data acquisition sub-node is used for controlling the operation of the ultrasonic sensor according to the data acquisition control instruction and receiving the returned ultrasonic data; and the data processing sub-node is used for processing the ultrasonic data and the navigation data in real time.
In still another aspect, the present invention provides a training method for autonomous navigation of a robot, including constructing a spatiotemporal ultrasonic navigation data set according to the method for acquiring spatiotemporal ultrasonic navigation data for a robot, and training a neural network for autonomous navigation of a robot according to the spatiotemporal ultrasonic navigation data set.
(III) advantageous effects
The embodiment of the invention provides a method and a system for acquiring space-time ultrasonic navigation data for a robot, which construct a standard space-time ultrasonic navigation data set from simple, inexpensive ultrasonic sensors; a neural network controller trained on this data set can realize autonomous navigation of the robot in an indoor environment, at least partially avoiding the high sensor cost of traditional robot navigation systems.
Drawings
FIG. 1 schematically illustrates a flow diagram of a spatiotemporal ultrasound navigation data acquisition method for a robot in accordance with an embodiment of the present invention;
FIG. 2 schematically illustrates a flow diagram of a method for spatiotemporal data aggregation of navigation data with ultrasound data, in accordance with an embodiment of the present invention;
FIG. 3 schematically illustrates a structural diagram of a spatiotemporal ultrasound navigation data acquisition system for a robot in accordance with an embodiment of the present invention;
FIG. 4 schematically shows a flow chart of a robot control algorithm according to an embodiment of the invention;
FIG. 5 schematically illustrates a flow chart of a spatiotemporal ultrasound navigation data acquisition algorithm in accordance with an embodiment of the present invention;
FIG. 6 schematically shows a physical map of a data acquisition platform according to an embodiment of the invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention is described in further detail below with reference to specific embodiments and the accompanying drawings.
Before describing the embodiments of the present invention, some concepts related to the present invention will be explained.
Spatio-temporal data: spatio-temporal data is a class of data that has both temporal and spatial dimensions. For example, a single frame image contains only the spatial information of an environment at a certain moment, while a video contains the spatial information of the environment over a period of time, from which the motion information of objects can be extracted. Spatio-temporal data therefore carries more content and is more useful than traditional spatial or temporal data, but correspondingly the overhead required to process it is larger and the models are more complex.
ROS: the Robot Operating System, a widely used open-source robot operating system. At present, most robots realize their functions by installing this system and developing related algorithms on top of it.
The space-time ultrasonic navigation data set is a data set comprising space-time ultrasonic data and robot navigation data. Space-time ultrasound and space-time ultrasonic navigation are newly proposed concepts, defined as follows:
1. ultrasonic data acquired at a single moment is called ultrasonic data;
2. ultrasonic data acquired over a period of continuous time and organized in a specific way are called space-time ultrasonic data;
3. in the moving process of the robot, state information (position, course, and the like) and control information (linear velocity, angular velocity) of the robot are called navigation data;
4. a plurality of effective data which are formed by matching the space-time ultrasonic data and the navigation data and can be used for completing the robot navigation task are called space-time ultrasonic navigation data, and the plurality of data jointly form a space-time ultrasonic navigation data set.
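The four definitions above can be mirrored in a simple record type; the field names and the 16-direction layout below are illustrative assumptions, not part of the patent:

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class SpatioTemporalSample:
    """One space-time ultrasonic navigation data sample at a single moment.
    Field names and the 16-direction layout are illustrative assumptions."""
    timestamp: float
    distances: List[float]         # ultrasonic data: one d_i per direction
    position: Tuple[float, float]  # state information: (x, y)
    heading: float                 # state information: course angle
    linear_velocity: float         # control information
    angular_velocity: float        # control information

# Space-time ultrasonic navigation data for one continuous period is then
# simply an ordered list of such samples:
period: List[SpatioTemporalSample] = [
    SpatioTemporalSample(0.0, [1.0] * 16, (0.0, 0.0), 0.0, 0.2, 0.0),
]
```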
We believe that the potential of ultrasonic sensors can be further exploited: even with simple and inexpensive ultrasonic sensors, more complex functions, such as autonomous navigation of a robot, may be possible. We have explored this possibility in other work, where constructing a space-time ultrasonic navigation data set and training a neural network controller on the space-time ultrasonic navigation data achieved autonomous navigation of the robot in an indoor environment. In summary, the present invention focuses mainly on how to collect and produce a standard space-time ultrasonic navigation data set for subsequently completing tasks such as autonomous robot navigation.
The embodiment of the invention provides a space-time ultrasonic navigation data acquisition method for a robot, wherein the robot is provided with at least one ultrasonic sensing device, please refer to fig. 1, and the method comprises the following steps: s1, acquiring a robot motion control instruction and a data acquisition control instruction; s2, performing motion according to the robot motion control instruction, and acquiring state information and control information of the robot during motion in real time to obtain navigation data in continuous time periods; s3, controlling the ultrasonic sensing equipment to acquire ultrasonic data in continuous time periods according to the data acquisition control instruction; and S4, performing space-time data aggregation on the navigation data and the ultrasonic data to obtain space-time ultrasonic navigation data.
The space-time ultrasonic navigation data acquisition method mainly processes the raw ultrasonic data, vehicle state information and vehicle control information according to the data acquisition instructions to form space-time ultrasonic navigation data, which is stored to form a space-time ultrasonic navigation data set; the workflow is shown in fig. 2: detect the data acquisition instruction, and proceed to the next step if acquisition has started, otherwise remain in this detection step; receive the timestamped raw ultrasonic data (2D point cloud data in 16 directions), vehicle state information (including position and heading angle) and vehicle control information (including linear velocity and angular velocity); process the point cloud data at each single moment, computing the Euclidean distance values in the 16 directions as d_i = sqrt(x_i^2 + y_i^2); timestamp-synchronize the three kinds of data, store the data acquired at the same moment as one data sample in a csv file, and store the data acquired over continuous time in the same csv file, so that the data in the csv file form the space-time ultrasonic navigation data for a certain time period. The example of ultrasonic data in 16 directions does not limit the number of ultrasonic sensing devices to 16; any number is possible.
On the basis of the above embodiment, the robot motion control instructions obtained in S1 include accelerator, brake, gear and steering, which control the motion of the robot; the acquired data acquisition control instructions include start-acquisition and stop-acquisition instructions, which control the start and stop of the acquisition program.
The accelerator and brake control the robot's linear velocity, i.e. its speed of movement; the gear and steering control its angular velocity, i.e. its direction of movement; together, the four motion control instructions allow the robot's free movement to be controlled. Data acquisition is either stopped or started: once started, ultrasonic data, vehicle state information and vehicle control information are acquired simultaneously, and the ultrasonic data and the navigation data together form the space-time ultrasonic navigation data.
On the basis of the above embodiment, the state information collected in S2 includes the position and heading of the robot, and the collected control information includes the linear velocity and angular velocity of the robot.
From the robot's current position and heading, its current linear and angular velocity, and the path plan, the next movement speed and direction are obtained and the robot's free operation is controlled.
On the basis of the above embodiments, referring to fig. 5, S2 specifically includes: converting the robot motion control instruction into bottom-layer linear-velocity and angular-velocity control instructions that the robot can execute directly, and controlling the operation of the robot accordingly.
The robot control algorithm is mainly responsible for converting high-semantic-level robot control instructions, issued by a human driving expert through a remote control device, into bottom-layer control instructions that the robot can execute directly, thereby realizing control of the robot. The four commands 'accelerator', 'brake', 'gear' and 'steering' are converted into the directly executable bottom-layer commands 'linear velocity' and 'angular velocity', which control the actual operation of the robot.
On the basis of the embodiment, the method for converting the robot motion control instruction into the bottom-layer control instruction comprises the following steps: S201, denote the accelerator, brake and steering as proportionality coefficients p_gas, p_brake and p_steer, each with numerical range [0, 1], and denote the gear as p_gear, taking values in {0, 1, 2, 3}; S202, calculate the linear velocity according to the following formula:

[linear-velocity control law, rendered only as an image in the source]

where v_t is the linear velocity at the current moment, v_{t-1} is the linear velocity at the previous moment, and v_max is the maximum allowable linear velocity; S203, calculate the angular velocity according to the following formula:

[angular-velocity control law, rendered only as an image in the source]

where w_t is the angular velocity at the current moment and w_{t-1} is the angular velocity at the previous moment; S204, judge from the timestamps contained in the linear-velocity and angular-velocity bottom-layer control instructions whether they were issued at the same moment; if not, keep waiting for instruction updates until robot motion control instructions from the same moment are obtained; if so, send the bottom-layer control instructions to the robot.
The control instructions sent by the human driving expert through the remote control device serve as program input; the linear-velocity controller uses its control law to calculate the robot's linear-velocity control instruction at the current moment from the previous moment's linear velocity and the current accelerator, brake and gear values; the angular-velocity controller uses its control law to calculate the angular-velocity control instruction at the current moment from the previous moment's angular velocity and the current gear and steering values; if the linear-velocity and angular-velocity control instructions are judged to belong to the same moment, the synchronized control instructions are issued to the robot controller to control the robot's motion, and are simultaneously sent to the space-time ultrasonic navigation data acquisition algorithm for further processing.
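The same-moment judgment described above reduces to a timestamp comparison; the message format and tolerance below are assumptions for illustration:

```python
def synchronized(cmd_v, cmd_w, tol=1e-3):
    """Return True when a linear-velocity command and an angular-velocity
    command carry (approximately) the same timestamp."""
    return abs(cmd_v["stamp"] - cmd_w["stamp"]) <= tol

# Commands stamped at the same control tick are forwarded together;
# otherwise the caller keeps waiting for the lagging stream to update.
```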
On the basis of the above-described embodiment, the ultrasonic data in S3 includes 2D point cloud data in at least one direction, and the point cloud data at a single moment is processed by the following formula:

d_i = sqrt(x_i^2 + y_i^2)

where the subscript i is the ultrasonic serial number, representing the direction, d is the Euclidean distance value, and (x, y) is the original point cloud value.
The ultrasonic data contain obstacle position information reflected back from obstacles in multiple directions. A single ultrasonic device returns only a distance value in a certain direction, but by deploying multiple ultrasonic sensing devices on the robot, richer three-dimensional environment information can be obtained. Compared with image data sets acquired by cameras and point cloud data sets acquired by lidar, ultrasonic sensing equipment has the advantages of a simple structure and low price.
On the basis of the above embodiments, referring to fig. 3, S3 and S4 specifically include: S301, timestamp-synchronizing the point-cloud-processed ultrasonic data, the state information and the control information, and storing the data acquired at the same moment as one data sample; S302, storing the data acquired over continuous time into the same file, so that the data in the file form the space-time ultrasonic navigation data for that continuous time period; S401, repeating S301 and S302, storing the space-time ultrasonic navigation data collected in different time periods into different files; after acquisition is complete, all the files together form a space-time ultrasonic navigation data set.
Timestamp-synchronize the ultrasonic data and the navigation data, store the data acquired at the same moment as one data sample in a csv file, and store the data samples acquired over continuous time in the same csv file, so that the data in the csv file form the space-time ultrasonic navigation data for a certain time period; repeat this process, storing the space-time ultrasonic navigation data acquired in different time periods into different csv files, and finally all the csv files form the space-time ultrasonic navigation data set.
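The repeat-and-store step can be sketched as follows: each continuous-time period goes to its own csv file, and the files together form the data set. The directory layout and file naming are assumptions:

```python
import csv
import os

def save_dataset(periods, out_dir):
    """Store each continuous-time period of samples in its own CSV file;
    the resulting collection of files is the space-time ultrasonic
    navigation data set. File naming is an illustrative assumption."""
    os.makedirs(out_dir, exist_ok=True)
    paths = []
    for k, samples in enumerate(periods):
        path = os.path.join(out_dir, "period_%04d.csv" % k)
        with open(path, "w", newline="") as f:
            csv.writer(f).writerows(samples)
        paths.append(path)
    return paths
```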
Another embodiment of the present invention provides a space-time ultrasonic navigation data acquisition system for a robot, referring to fig. 4, comprising: a vehicle control node for sending a robot motion control instruction and a data acquisition control instruction to the data acquisition node; a data acquisition platform for moving according to the robot motion control instruction and acquiring the state information and control information of the robot in real time during motion to obtain navigation data, and for controlling the ultrasonic sensing device to acquire ultrasonic data according to the data acquisition control instruction; and a data acquisition node for acquiring the robot motion control instruction and the data acquisition control instruction, sending the corresponding instructions to the data acquisition platform, and fusing the navigation data with the ultrasonic data to obtain space-time ultrasonic navigation data.
The system mainly comprises a data acquisition platform, a vehicle control node and a data acquisition node, and the operation process of the whole system is as follows:
1. firstly, in a vehicle control node, a human driving expert (data acquisition engineer) sends a data acquisition control instruction and a robot motion control instruction with a high semantic level to the data acquisition node through a remote control device.
2. Then, in the data acquisition nodes, on one hand, the robot control sub-node converts the received robot motion control instruction with the high semantic level into a bottom control instruction which can be actually executed by the robot, and sends the bottom control instruction to the robot controller to control the robot to move; and on the other hand, the data acquisition sub-node controls the operation of the ultrasonic sensor according to the received data acquisition control instruction, receives the returned ultrasonic data, and simultaneously acquires the state information and the control information of the robot in real time to form navigation data.
3. Finally, the data processing sub-node processes the ultrasonic data and the robot navigation data in real time to form the space-time ultrasonic navigation data, which are stored in the data storage device to build the space-time ultrasonic navigation data set.
On the basis of the above embodiment, the data acquisition node comprises a robot control sub-node, a data acquisition sub-node and a data processing sub-node: the robot control sub-node converts the robot motion control instruction into a bottom-layer control instruction and controls the robot's motion; the data acquisition sub-node controls the operation of the ultrasonic sensors according to the data acquisition control instruction and receives the returned ultrasonic data; and the data processing sub-node processes the ultrasonic data and the navigation data in real time.
Robot control sub-node: receives the vehicle motion control instructions transmitted by the remote control device, namely the four instructions 'accelerator', 'brake', 'gear' and 'steering', converts them into bottom-layer control instructions that the robot can directly execute, namely 'linear velocity' and 'angular velocity', and controls the actual operation of the robot. The specific control logic is described in the robot control algorithm section, as shown in fig. 5.
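As an illustrative sketch (not part of the filing), the conversion described above could look as follows in Python. The filed update formulas are given only as images, so the incremental throttle/brake rule, the gear scaling of the speed limit, and the function name `cmd_to_twist` are all assumptions:

```python
# Hypothetical sketch of the command-conversion step: the actual update
# formulas in the filing are images, so the rules below are assumptions.

def cmd_to_twist(v_prev, p_gas, p_brake, p_steer, p_gear,
                 v_max=1.0, accel=0.05, steer_max=1.0):
    """Map throttle/brake/steering in [0, 1] and gear in {0, 1, 2, 3}
    to an executable (linear velocity, angular velocity) pair."""
    # Gear scales the allowed top speed (assumption).
    v_limit = v_max * p_gear / 3.0
    # Throttle accelerates, brake decelerates, relative to the
    # previous linear velocity (assumption).
    v = v_prev + accel * (p_gas - p_brake)
    v = max(0.0, min(v, v_limit))
    # Steering in [0, 1] is remapped to [-1, 1]; angular velocity is
    # taken proportional to the current linear velocity (assumption).
    w = (2.0 * p_steer - 1.0) * steer_max * v
    return v, w

v, w = cmd_to_twist(v_prev=0.5, p_gas=0.8, p_brake=0.0,
                    p_steer=0.75, p_gear=2)
```

A differential-drive robot such as the Pioneer3-DX consumes exactly such a (linear, angular) velocity pair at its controller interface.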
Data acquisition sub-node: controls the start and stop of ultrasonic data acquisition by receiving the 'start acquisition' and 'stop acquisition' instructions transmitted by the remote control device, and acquires the motion state and control information of the vehicle. See the space-time ultrasonic navigation data acquisition algorithm for details, as shown in fig. 2.
Data processing sub-node: processes the acquired ultrasonic data, vehicle motion state data and vehicle control data in real time to form the space-time ultrasonic navigation data, and stores them in the data storage device. See the space-time ultrasonic navigation data acquisition algorithm for details, as shown in fig. 2.
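The fusion performed by this sub-node, pairing each ultrasonic frame with the navigation record closest in time, can be sketched as below. The function name, dictionary field names and the 50 ms tolerance are illustrative assumptions, not details from the filing:

```python
# Assumed sketch of timestamp-based fusion: for each ultrasonic frame,
# pick the navigation record with the nearest timestamp, within `tol`.

def fuse(ultra, nav, tol=0.05):
    """ultra, nav: lists of dicts with a 'stamp' key, sorted by time.
    Returns one merged sample per ultrasonic frame that has a
    navigation record within `tol` seconds of it."""
    samples, j = [], 0
    for u in ultra:
        # Advance j while the next nav record is at least as close.
        while j + 1 < len(nav) and \
                abs(nav[j + 1]['stamp'] - u['stamp']) <= abs(nav[j]['stamp'] - u['stamp']):
            j += 1
        if abs(nav[j]['stamp'] - u['stamp']) <= tol:
            # The ultrasonic stamp wins on key collision.
            samples.append({**nav[j], **u})
    return samples

ultra = [{'stamp': 0.00, 'd': 1.0}, {'stamp': 0.10, 'd': 0.9}]
nav = [{'stamp': 0.01, 'v': 0.5}, {'stamp': 0.12, 'v': 0.6}]
samples = fuse(ultra, nav)
```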
In addition, the vehicle control node includes: the human driving expert, who is mainly responsible for manually controlling the robot in the working area and for acquiring the sensor data, robot motion data and robot control data generated during operation, i.e. the navigation information the robot needs for autonomous operation; and the remote control device, typically a manual remote controller connected to the robot controller via Bluetooth or a wireless network, which transmits the expert's control instructions and data acquisition instructions to the robot. The remote control covers data acquisition control and vehicle motion control: data acquisition control starts and stops the acquisition program by sending the two instructions 'start acquisition' or 'stop acquisition' to the data acquisition node, while vehicle motion control drives the robot by sending the four instructions 'accelerator', 'brake', 'gear' and 'steering' to the robot control sub-node.
The data acquisition platform includes: the wheeled mobile robot, a general-purpose wheeled mobile platform carrying an encoder, an IMU (inertial measurement unit), a gyroscope and similar devices to acquire the motion state of the robot in real time, and an NVIDIA TX2 development board serving as the robot's controller and data processing device; the data storage device, a mechanical hard disk connected to the TX2 for storing data; and the ultrasonic sensing device, comprising 16 ultrasonic sensors in total, distributed around the robot to acquire distance values of the robot body in 16 directions.
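Reducing each sensor's raw reading to one distance value per direction, as later formalized in claim 6, amounts to taking the Euclidean norm of the raw (x, y) point. A minimal sketch (the function name is an illustrative assumption):

```python
import math

# Each raw ultrasonic reading (x, y) in the robot frame is collapsed
# to a Euclidean range d_i = sqrt(x_i**2 + y_i**2), one value per
# sensor direction i.

def ranges_from_points(points):
    """points: list of 16 (x, y) tuples, one per ultrasonic sensor.
    Returns the 16 directional distance values."""
    return [math.hypot(x, y) for (x, y) in points]

# One sensor sees a point at (3, 4); the other 15 see points 1 m away.
d = ranges_from_points([(3.0, 4.0)] + [(0.0, 1.0)] * 15)
```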
An autonomously modified Pioneer3-DX mobile robot is used as the data acquisition platform; a photograph of the platform is shown in fig. 6.
In another embodiment of the present invention, a training method for autonomous navigation of a robot is provided, which includes constructing a spatiotemporal ultrasonic navigation data set according to the method for acquiring spatiotemporal ultrasonic navigation data of a robot, and training a neural network for autonomous navigation of a robot according to the spatiotemporal ultrasonic navigation data set.
By constructing a space-time ultrasonic navigation data set, acquiring sufficient data, organizing it into a standard data set, and training a neural network controller on the space-time ultrasonic navigation data, autonomous navigation of the robot in indoor environments can be realized. Of course, space-time ultrasonic navigation data leaves many more possibilities to be explored.
The above-mentioned embodiments are intended to illustrate the objects, technical solutions and advantages of the present invention in further detail, and it should be understood that the above-mentioned embodiments are only exemplary embodiments of the present invention, and are not intended to limit the present invention, and any modifications, equivalents, improvements and the like made within the spirit and principle of the present invention should be included in the protection scope of the present invention.

Claims (10)

1. A spatiotemporal ultrasonic navigation data acquisition method for a robot is provided, wherein the robot is provided with at least one ultrasonic sensing device, and the method comprises the following steps:
s1, acquiring a robot motion control instruction and a data acquisition control instruction;
s2, performing motion according to the robot motion control instruction, and acquiring state information and control information of the robot during motion in real time to obtain navigation data in a continuous time period;
s3, controlling the ultrasonic sensing equipment to acquire ultrasonic data in the continuous time period according to the data acquisition control instruction;
and S4, performing space-time data aggregation on the navigation data and the ultrasonic data to obtain space-time ultrasonic navigation data.
2. The spatiotemporal ultrasonic navigation data acquisition method for the robot as claimed in claim 1, wherein the robot motion control instructions obtained in S1 include throttle, brake, gear and steering to control the motion of the robot; and the data acquisition control instruction comprises a start-acquisition or stop-acquisition instruction and controls the starting and stopping of the acquisition program.
3. The method for acquiring spatiotemporal ultrasonic navigation data of a robot according to claim 1, wherein the state information acquired in S2 includes the position and heading of the robot, and the control information acquired includes linear velocity and angular velocity of the robot.
4. The spatiotemporal ultrasound navigation data acquisition method for robots according to claim 3, wherein said S2 specifically comprises: and converting the robot motion control instruction into a linear velocity and angular velocity bottom layer control instruction which can be directly executed by the robot, and controlling the operation of the robot.
5. The spatiotemporal ultrasound navigation data acquisition method for robots according to claim 4, wherein the method of translating the robot motion control commands into underlying control commands comprises:
S201, recording the accelerator, the brake and the steering as proportionality coefficients p_gas, p_brake and p_steer, each ranging over [0, 1], and the gear as p_gear, taking a value in {0, 1, 2, 3};
S202, calculating the linear velocity according to the following formula:

[formula given as image FDA0002874824550000021 in the original publication]

wherein v_t is the linear velocity at the current moment, v_{t-1} is the linear velocity at the previous moment, and v_max is the maximum allowable linear velocity;
S203, calculating the angular velocity according to the following formula:

[formula given as image FDA0002874824550000025 in the original publication]

wherein v_t is the linear velocity at the current moment and v_{t-1} is the linear velocity at the previous moment;
S204, judging, according to the timestamps contained in the linear velocity and angular velocity bottom-layer control instructions, whether they are instructions of the same moment; if not, continuing to wait for instruction updates until robot motion control instructions of the same moment are obtained; if so, sending the bottom-layer control instructions to the robot.
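The same-timestamp check of S204 can be sketched as follows (an illustrative, non-claimed rendering; the function name and tuple layout are assumptions):

```python
# Assumed sketch of S204: the latest linear- and angular-velocity
# commands are sent together only if their timestamps match.

def try_send(lin_cmd, ang_cmd):
    """Each command is a (stamp, value) pair. Returns the fused
    (stamp, v, w) command, or None to keep waiting for an update."""
    if lin_cmd[0] != ang_cmd[0]:
        return None  # stamps differ: wait for the next instruction
    return (lin_cmd[0], lin_cmd[1], ang_cmd[1])

stale = try_send((5, 0.4), (6, 0.1))   # mismatched stamps
out = try_send((6, 0.4), (6, 0.1))     # matched stamps
```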
6. The spatiotemporal ultrasonic navigation data acquisition method for a robot according to claim 1, wherein the ultrasonic data in S3 includes 2D point cloud data in at least one direction, and the single moment point cloud data processing is performed by the following formula:
d_i = sqrt(x_i^2 + y_i^2)

wherein the subscript i is the ultrasonic sensor serial number and denotes the direction, d is the Euclidean distance value, and (x, y) is the raw point cloud value.
7. The spatiotemporal ultrasound navigation data acquisition method for robots according to claim 6, wherein S3 and S4 specifically comprise:
s301, carrying out time stamp synchronization on the ultrasonic data, the state information and the control information after the point cloud data processing, and storing the data acquired at the same time as a data sample;
s302, storing data acquired in continuous time into the same file, wherein the data in the file form space-time ultrasonic navigation data in the continuous time period;
S401, repeating the operations of S301 and S302, storing the space-time ultrasonic navigation data acquired in different time periods into different files; after acquisition is completed, all the files together form the space-time ultrasonic navigation data set.
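A minimal sketch of this per-period storage scheme follows; the file naming pattern and the JSON-lines format are assumptions not specified in the claim:

```python
import json
import os
import tempfile

# Assumed layout: all fused samples of one continuous acquisition
# period go into one file; each new period starts a new file; the
# directory as a whole is the data set.

def save_period(samples, out_dir, period_idx):
    """Write one acquisition period's samples to its own file,
    one fused sample per line, and return the file path."""
    path = os.path.join(out_dir, f"period_{period_idx:03d}.jsonl")
    with open(path, "w") as f:
        for s in samples:
            f.write(json.dumps(s) + "\n")
    return path

out_dir = tempfile.mkdtemp()
p0 = save_period([{"stamp": 0.0, "v": 0.5}], out_dir, 0)
p1 = save_period([{"stamp": 9.0, "v": 0.2}], out_dir, 1)
n_files = len(os.listdir(out_dir))
```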
8. A spatiotemporal ultrasound navigation data acquisition system for a robot, comprising:
the vehicle control node is used for sending a robot motion control instruction and a data acquisition control instruction to the data acquisition node;
the data acquisition platform is used for carrying out movement according to the robot movement control instruction and acquiring state information and control information of the robot during movement in real time to obtain navigation data; controlling the ultrasonic sensing equipment to acquire ultrasonic data according to the data acquisition control instruction;
the data acquisition node is used for acquiring a robot motion control instruction and a data acquisition control instruction and sending corresponding instructions to the data acquisition platform; and carrying out data fusion on the navigation data and the ultrasonic data to obtain space-time ultrasonic navigation data.
9. The spatiotemporal ultrasonic navigation data acquisition system for robots of claim 8, wherein the data acquisition nodes comprise a robot control sub-node, a data processing sub-node, a data acquisition sub-node, the robot control sub-node is used for converting a robot motion control instruction into a bottom layer control instruction for controlling the robot motion; the data acquisition sub-node is used for controlling the operation of the ultrasonic sensor according to the data acquisition control instruction and receiving returned ultrasonic data; and the data processing sub-node is used for processing the ultrasonic data and the navigation data in real time.
10. A training method for robot autonomous navigation, comprising constructing a spatiotemporal ultrasonic navigation data set according to the method for spatiotemporal ultrasonic navigation data acquisition of a robot of any one of claims 1 to 7, and training a neural network for robot autonomous navigation according to the spatiotemporal ultrasonic navigation data set.
CN202011644928.5A 2020-12-31 2020-12-31 Space-time ultrasonic navigation data acquisition method and system for robot Pending CN112799403A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011644928.5A CN112799403A (en) 2020-12-31 2020-12-31 Space-time ultrasonic navigation data acquisition method and system for robot


Publications (1)

Publication Number Publication Date
CN112799403A true CN112799403A (en) 2021-05-14

Family

ID=75807592


Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102621986A (en) * 2012-04-13 2012-08-01 西北农林科技大学 Navigation control system based on vision and ultrasonic waves
CN105629970A (en) * 2014-11-03 2016-06-01 贵州亿丰升华科技机器人有限公司 Robot positioning obstacle-avoiding method based on supersonic wave


Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
HUITEMA, R.B., et al.: "Ultrasonic motion analysis system - measurement of temporal and spatial gait parameters", Journal of Biomechanics *
SHUAI WANG, et al.: "Spatio-Temporal Ultrasonic Dataset: Learning Driving from Spatial and Temporal Ultrasonic Cues", 2020 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS) *
YONGBIN QI, et al.: "Estimation of Spatial-Temporal Gait Parameters Using a Low-Cost Ultrasonic Motion Analysis System", Sensors *


Legal Events

Code Title/Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication (application publication date: 20210514)