CN109079799B - Robot perception control system and control method based on bionics - Google Patents

Robot perception control system and control method based on bionics

Info

Publication number
CN109079799B
CN109079799B (application CN201811236664.2A)
Authority
CN
China
Prior art keywords
robot
mechanical arm
visual
map
steering wheel
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201811236664.2A
Other languages
Chinese (zh)
Other versions
CN109079799A (en)
Inventor
陈浩耀
费耀南
全凤宇
李衍杰
刘云辉
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Graduate School Harbin Institute of Technology
Original Assignee
Shenzhen Graduate School Harbin Institute of Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Graduate School Harbin Institute of Technology
Priority to CN201811236664.2A
Publication of CN109079799A
Application granted
Publication of CN109079799B
Legal status: Active

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 Programme-controlled manipulators
    • B25J9/16 Programme controls
    • B25J9/1602 Programme controls characterised by the control system, structure, architecture
    • B25J9/1694 Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
    • B25J9/1697 Vision controlled systems

Abstract

The invention discloses a robot perception control system and control method based on bionics. A visual inertial navigation module serves as the visual navigation and positioning system of the bionic robot, and the multi-degree-of-freedom motion of a mechanical arm frees the robot's vision system from the constraints of the robot body's own motion, so that the robot can cope with complex and changing dynamic scenes. At the same time, visual-inertial SLAM provides real-time navigation and positioning, giving the bionic robot a strong human-machine interaction capability and making it applicable to a wide range of dynamic scenes.

Description

Robot perception control system and control method based on bionics
Technical Field
The invention relates to the field of unmanned aerial vehicles, in particular to a robot perception control system and a robot perception control method based on bionics.
Background
With the rapid development of robot technology, robots equipped with mechanical arms have made many breakthroughs in carrying, grasping and sorting scenarios, and positioning and navigation for robots that interact with their environment has become an important problem for companies and research institutions. In most environments, a flying mechanical arm can move and grasp only while the robot body remains static or in a stable state, which greatly restricts the robot's motion. Moreover, most current positioning and navigation modules are fixed to the robot body, so the robot's field of view is forced to change with the body's movement and remains fixed while the mechanical arm performs its actions; the environment outside the field of view cannot be perceived. This greatly limits the robot's awareness of its surroundings and prevents it from coping with unpredictable, dynamic application scenarios.
Chinese patent publication CN106529466A, "Unmanned vehicle path planning based on a bionic eye", discloses a path-planning method in which a motor-driven bionic eye moves up and down, left and right, and about its optical axis while capturing images of the surrounding environment; SIFT feature-point matching and Harris corner matching are applied to the images to achieve three-dimensional reconstruction of the environment, and a target vehicle is identified and tracked to obtain its position and motion information. The camera, however, is fixed to the upper part of the vehicle body and is merely rotated by the motor to enlarge the field of view; its degrees of freedom are limited and the camera body cannot move freely, so the applicable range is very limited.
Chinese patent publication CN106042005A, "Bionic eye positioning and tracking system and working method thereof", discloses a positioning and tracking method in which the bionic eye is realized by a pan-tilt module; the pan-tilt rotates the camera horizontally and vertically to capture the surrounding environment in real time. As in the patent above, the camera is attached to the robot through servo-controlled joints, and its range of motion is limited.
Chinese patent publication CN106155113A, "Bionic eye movement control system and method", discloses a control scheme in which four stepper motors control the pitch and yaw of two cameras; the velocity of a target in the image plane is computed from the difference between adjacent frames captured by the cameras, and the speeds of the four stepper motors are controlled according to the target's motion, imitating the saccadic and smooth-pursuit movements of the human eye. In this method, only the camera's image information drives the camera motion, and higher-level functions such as visual positioning and navigation are not provided, so the key requirement of visual navigation and positioning for a robot is not met.
Disclosure of Invention
To solve the problems that current robot positioning and navigation systems are fixed to the robot body, limiting the robot's field of view, and that the body's motion is insufficiently flexible, one objective of the present invention is to provide a bionics-based robot perception control system that, by imitating the visual system of birds, separates the robot's visual navigation and positioning system from the robot body, so that the field of view is no longer restricted by the body's degrees of freedom and the body's motion is no longer constrained while the mechanical arm grasps.
To achieve the above purpose, the invention adopts the following technical scheme:
a robot perception control system based on bionics comprises a robot body, a mechanical arm and a visual inertial navigation module, wherein a mechanical arm base is arranged on the robot body, the mechanical arm is mounted on the mechanical arm base, a plurality of rotary joints are arranged in the mechanical arm, a steering engine is arranged in each rotary joint, the visual inertial navigation module is mounted at the tail end of the mechanical arm, the visual inertial navigation module is used for carrying out real-time positioning and synchronously establishing a map by adopting a visual inertial odometer, and the visual inertial odometer comprises a fisheye camera and an IMU;
the fisheye camera is used for capturing images;
the IMU obtains attitude information of the IMU by measuring three-axis attitude angular velocity and acceleration and performing pre-integration, the IMU and the visual odometer adopt a tightly-coupled fusion mode to add image characteristic information into a state vector of a system for joint optimization, and the attitude information of the IMU and the visual information of the visual odometer are fused for updating an attitude map and updating a map in real time;
the mechanical arm adjusts the visual inertial navigation module at the tail end of the mechanical arm through the rotary joint to reach any appointed position and attitude in a working space, and the robot navigates and positions in the established map through the visual inertial navigation module.
Preferably, the visual inertial odometer is a monocular visual inertial odometer, and the fisheye camera is a monocular camera.
Preferably, the visual inertial odometer is a binocular visual inertial odometer, that is, two fisheye cameras are respectively arranged on the two sides of the IMU to widen the field of view.
Preferably, the mechanical arm has six degrees of freedom, each comprising a motor, a reduction gear set and an encoder. The reduction gear set is mounted on the output shaft of the motor and contains a plurality of intermeshing gears; the output gear of the set carries the rotation shaft that connects to the steering wheel of the steering engine.
Preferably, the steering engine further comprises a steering engine body, a steering engine base, an auxiliary steering wheel, a main steering wheel and a steering engine outer side wall arranged between the auxiliary and main steering wheels and distributed along the circumference of the steering engine body. The steering engine body is arranged on the steering engine base, and the auxiliary and main steering wheels are respectively arranged on the outsides of the opposite top and bottom walls of the steering engine body. The main and auxiliary steering wheels have mutually opposed mounting holes for fixing a transmission connector, so that the auxiliary steering wheel moves synchronously with the main steering wheel.
Preferably, the steering engines comprise a first steering engine, a second steering engine, a third steering engine, a fourth steering engine, a fifth steering engine and a sixth steering engine. The auxiliary steering wheel of the first steering engine is connected with the mechanical arm base, and the main steering wheel of the first steering engine is connected with the outer side wall of the second steering engine. The main steering wheel of the second steering engine is connected with the fixed side of one end of an extension arm rod, whose other end is connected with the steering engine base of the third steering engine. The main and auxiliary steering wheels of the third steering engine are respectively connected with one end of two mutually parallel connecting rods, whose other ends are respectively fixed on the outer side wall of the fourth steering engine. The main steering wheel of the fourth steering engine is connected with the base of the fifth steering engine; the main and auxiliary steering wheels of the fifth steering engine are respectively connected with one end of two mutually parallel connecting rods, whose other ends are connected with the outer side wall of the sixth steering engine; and the main steering wheel of the sixth steering engine is connected with the visual inertial navigation module. Each steering engine is driven by PWM waves output by a motor controller, which adjusts the rotation angle of the motor by adjusting the duty cycle of the PWM wave; each motor is fitted with an encoder for reading the angular velocity value, so feedback control of each joint can be realized.
Preferably, a rotation shaft is arranged in each steering engine to connect the motor with the main steering wheel; the direction of the rotation shaft of the third steering engine intersects that of the fourth steering engine, the direction of the rotation shaft of the fourth steering engine intersects that of the first steering engine, and the rotation shaft of the fourth steering engine has the same direction as that of the sixth steering engine.
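As an illustration of how the six joint angles determine the pose of the end-mounted module, a minimal forward-kinematics sketch follows; the link offsets and the simplification that every joint rotates about its local z axis are placeholders rather than the arm's real geometry.

```python
import numpy as np

def rot_z(theta):
    """Homogeneous rotation about a joint axis (taken here as local z)."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, 0, 0], [s, c, 0, 0], [0, 0, 1, 0], [0, 0, 0, 1.0]])

def trans(x, y, z):
    """Homogeneous translation along the link to the next joint."""
    T = np.eye(4)
    T[:3, 3] = [x, y, z]
    return T

# Placeholder link offsets in metres; the real values come from the arm.
LINKS = [trans(0, 0, 0.05), trans(0, 0, 0.04), trans(0.10, 0, 0),
         trans(0.10, 0, 0), trans(0, 0, 0.04), trans(0, 0, 0.03)]

def end_pose(joint_angles):
    """Chain the six joint rotations and link offsets into one 4x4 pose."""
    T = np.eye(4)
    for theta, link in zip(joint_angles, LINKS):
        T = T @ rot_z(theta) @ link
    return T  # pose of the visual inertial navigation module in the base frame

print(end_pose(np.deg2rad([0, 30, -45, 0, 15, 0])))
```

The same chain, evaluated with the encoder readings, is what later fixes the camera-to-body transform used for localization.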
Another object of the invention is to provide a control method for the bionics-based robot perception control system, which controls the motion of the robot body using the visual inertial navigation module at the end of the mechanical arm to achieve accurate positioning of the robot body during navigation. The method comprises the following steps:
the method comprises the following steps that (1), a robot traverses once in a required environment through a visual inertial navigation module to establish a map required by positioning navigation, and feature points and descriptors are stored in the map;
the robot captures information in the surrounding environment through a fisheye camera at the tail end of a mechanical arm and matches the information with key feature points in a built map to obtain position information of the fisheye camera in the current map;
and (3) determining the state of the mechanical arm in the current state to enable the tail end of the mechanical arm and the machine body to form a fixed connection relation, calculating a rotation and translation relation between the fisheye camera and the robot body according to the fixed connection relation, calculating the position information of the robot body in the current map by using the rotation and translation relation between the camera and the robot body, and controlling the robot to reach the target position according to the position information after calculating the position information of the robot body in the current map coordinate system.
Preferably, the process of establishing the map in step (1) is a SLAM process comprising a front-end part and a back-end part:
S1. The visual odometer detects and matches feature points in the images captured by the fisheye camera, judges whether the current frame is a key frame, computes the rotation-translation relation between images by the eight-point method, and recovers the three-dimensional information of the feature points of the current frame through triangulation to generate 3D points (a front-end sketch follows below);
S2. If the current frame is a key frame, back-end optimization begins: the IMU measures the three-axis attitude angular velocity and acceleration and pre-integrates them to obtain its attitude information; the IMU and the visual odometer are fused in a tightly coupled manner, adding the image feature information to the system state vector for joint optimization; pre-integration processes the IMU data by converting the reference frame of the IMU motion model from a fixed initial reference frame to a continually changing relative reference frame; the attitude information of the IMU is fused with the visual information of the visual odometer to update the pose graph for global optimization and to update the map in real time; and while detecting and matching feature points of similar images, the visual odometer updates the pose graph through loop closure detection, thereby updating the map.
The fisheye camera and the IMU serve as the visual navigation and positioning system of the bionic robot; the IMU information recovers the map scale in the monocular case, and multi-degree-of-freedom positioning of the robot's unmanned flying mechanical arm is realized on the basis of the visual-inertial odometer.
Preferably, step (3), in which the robot body reaches the target position, comprises the following steps:
a1. The pose of the fisheye camera at the end of the mechanical arm relative to the world coordinate system, including the translation $t_c^w$ and the rotation $R_c^w$, is obtained by means of the visual odometer; the extrinsic parameters between the fisheye camera and the mechanical arm are obtained by a hand-eye calibration method;

a2. The mechanical arm is fixedly connected below the robot body, and the pose of the fisheye camera at the end of the mechanical arm relative to the robot body, including the translation $t_c^b$ and the rotation $R_c^b$, can be obtained through the kinematic relation of the mechanical arm;

a3. The pose of the robot body relative to the world coordinate system, including the translation $t_b^w$ and the rotation $R_b^w$, is obtained through coordinate transformation in three-dimensional space, wherein

$R_b^w = R_c^w \, (R_c^b)^{-1}$, $\quad t_b^w = t_c^w - R_b^w \, t_c^b$
Preferably, the calculation by which the robot body reaches the target position in step (3) is:

$P_{cam}^{w} = R \, P_{cam}^{body} + T$

$P_{body}^{w} = P_{cam}^{w} - R \, P_{cam}^{body}$

$R = R_z(l) \, R_y(p) \, R_x(r)$

$T = (x, y, z)^{T}$

wherein $P_{body}^{w}$ denotes the position of the robot body in the world coordinate system, $P_{cam}^{body}$ the position of the fisheye camera in the robot body coordinate system, and $P_{cam}^{w}$ the position of the fisheye camera in the world coordinate system; $R$ denotes a rotation in three-dimensional space, with $r$, $p$, $l$ the angles of rotation about the X, Y, Z axes; $T$ denotes a translation in three-dimensional space, with $x$, $y$, $z$ the translations along the X, Y, Z axes.
Compared with the prior art, the invention has the following beneficial technical effects:
In the bionics-based robot perception control system and control method, the mechanical arm serves as the neck of the bionic robot, and the visual inertial navigation module fixed to its end serves as the robot's vision system, which maximizes the range of motion of the vision system and enhances the flexibility of the robot body for coping with various dynamic scenes. The positioning and navigation system combines a monocular visual-inertial odometer with real-time map construction and uses the generated grid map for real-time navigation and positioning. With the visual inertial navigation module as the visual navigation and positioning system of the bionic robot, the multi-degree-of-freedom motion of the mechanical arm frees the robot's vision system from the constraints of the body's motion and greatly enlarges the robot's field of view, so the robot can cope with complex and changing dynamic scenes; at the same time, visual-inertial SLAM provides real-time navigation and positioning, giving the bionic robot a strong human-machine interaction capability, applicability to a wide range of dynamic scenes, and great application value.
Drawings
FIG. 1 is a schematic structural diagram of the bionics-based robot perception control system of the invention;
FIG. 2 is a flow chart of the SLAM process in the control method of the bionics-based robot perception control system of the invention;
FIG. 3 is a flow chart of the robot vision control in the control method of the bionics-based robot perception control system of the invention.
Reference numerals:
1. a propeller; 2. a propeller motor; 3. a body frame; 4. a battery; 5. a mechanical arm base; 6. a first steering engine; 7. a second steering engine; 8. an extension arm; 9. a third steering engine; 10. a fourth steering engine; 11. a fifth steering engine; 12. a sixth steering engine; 13. a fisheye camera; 14. a camera support; 15. a connecting rod.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention is further described in detail with reference to the following embodiments, but the scope of the present invention is not limited to the following embodiments.
System embodiment
As shown in fig. 1 and 2, a bionics-based robot perception control system comprises a robot body, a mechanical arm and a visual inertial navigation module. Arranged on the robot body are a mechanical arm base 5, a plurality of evenly distributed propellers 1, propeller motors 2 that drive the propellers 1, a body frame 3 on which the propellers 1 are mounted, and a battery 4. The battery 4 is mounted on the upper surface of the mechanical arm base 5, and the mechanical arm is mounted on the lower surface of the mechanical arm base 5; flight of the robot is achieved by controlling the rotation of the propellers. A plurality of rotary joints are arranged in the mechanical arm, each corresponding to one degree of freedom and each fitted with a steering engine. Through the rotary joints, the mechanical arm moves the visual inertial navigation module at its end to any specified position and attitude within the workspace, and the robot navigates and localizes in the established point cloud map through the visual inertial navigation module.
The mechanical arm has six degrees of freedom, each comprising a motor, a reduction gear set, an encoder and a steering engine. The reduction gear set is mounted on the output shaft of the motor and contains a plurality of intermeshing gears; the output gear of the set carries the rotation shaft that connects to the steering wheel of the steering engine. Each steering engine further comprises a steering engine body, a steering engine base, an auxiliary steering wheel, a main steering wheel and a steering engine outer side wall arranged between the auxiliary and main steering wheels and distributed along the circumference of the steering engine body. The steering engine body is arranged on its base, and the auxiliary and main steering wheels are respectively arranged on the outsides of the opposite top and bottom walls of the body. The main and auxiliary steering wheels have mutually opposed mounting holes for fixing a transmission connector, so that the auxiliary steering wheel moves synchronously with the main steering wheel.
The steering engines comprise a first steering engine 6, a second steering engine 7, a third steering engine 9, a fourth steering engine 10, a fifth steering engine 11 and a sixth steering engine 12. The auxiliary steering wheel of the first steering engine 6 is connected with the mechanical arm base 5, and the main steering wheel of the first steering engine 6 is connected with the outer side wall of the second steering engine 7, thereby driving the second steering engine 7 to rotate. The main steering wheel of the second steering engine 7 is connected with the fixed side of one end of the extension arm rod 8 and drives it to rotate; the other end of the extension arm rod 8 is connected with the base of the third steering engine 9, so that the third steering engine 9 rotates with the extension arm rod 8. The main and auxiliary steering wheels of the third steering engine 9 are respectively connected with one end of two mutually parallel connecting rods 15, whose other ends are fixed on the outer side wall of the fourth steering engine 10; through the connecting rods 15, the steering wheels of the third steering engine 9 drive the fourth steering engine 10 to rotate. The main steering wheel of the fourth steering engine 10 is connected with the base of the fifth steering engine 11 and drives it to rotate. The main and auxiliary steering wheels of the fifth steering engine 11 are respectively connected with one end of two mutually parallel connecting rods 15, whose other ends are connected with the outer side wall of the sixth steering engine 12; through the connecting rods 15, the steering wheels of the fifth steering engine 11 drive the sixth steering engine 12 to rotate. The main steering wheel of the sixth steering engine 12 is connected with the visual inertial navigation module through a camera support 14 and drives the module on the camera support 14 to rotate. The motor and the main steering wheel of each steering engine are connected by a rotation shaft arranged inside it; the direction of the rotation shaft of the third steering engine 9 intersects that of the fourth steering engine 10, the direction of the rotation shaft of the fourth steering engine 10 intersects that of the first steering engine 6, the rotation shaft of the fourth steering engine 10 has the same direction as that of the sixth steering engine 12, the direction of the rotation shaft of the first steering engine 6 intersects that of the second steering engine 7, the rotation shaft of the second steering engine 7 has the same direction as that of the third steering engine 9, and the rotation shaft of the third steering engine 9 has the same direction as that of the fifth steering engine 11.
Each steering engine is driven by PWM waves output by a motor controller, which adjusts the rotation angle of the motor by adjusting the duty cycle of the PWM wave; each motor is fitted with an encoder for reading the angular velocity value, so feedback control of each joint can be realized. The six steering engines provide six degrees of freedom, so the mechanical arm can reach any specified position and attitude within its workspace. The mechanical arm serves as the neck of the bionic robot, and the visual inertial navigation module fixed to its end serves as the robot's vision system, maximizing the range of motion of the vision system and enhancing the flexibility of the robot body for coping with various dynamic scenes.
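As an illustration of this duty-cycle control, a minimal sketch follows; the 50 Hz frame rate, the 500-2500 microsecond pulse range, the 180-degree travel and the proportional gain are assumed typical hobby-servo values rather than figures taken from the invention.

```python
PWM_FREQ_HZ = 50                              # assumed servo frame rate
PULSE_MIN_US, PULSE_MAX_US = 500.0, 2500.0    # assumed pulse-width range
ANGLE_RANGE_DEG = 180.0                       # assumed joint travel

def angle_to_duty(angle_deg):
    """Map a commanded joint angle to a PWM duty ratio."""
    pulse_us = PULSE_MIN_US + (angle_deg / ANGLE_RANGE_DEG) * \
               (PULSE_MAX_US - PULSE_MIN_US)
    period_us = 1e6 / PWM_FREQ_HZ
    return pulse_us / period_us               # duty ratio, here 0.025-0.125

def feedback_step(target_deg, encoder_deg, k_p=0.8):
    """One proportional feedback step using the joint encoder reading."""
    corrected = encoder_deg + k_p * (target_deg - encoder_deg)
    return angle_to_duty(corrected)

print(feedback_step(90.0, 82.5))              # duty ratio for the next frame
```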
The visual inertial navigation module is mounted at the end of the mechanical arm and uses a visual-inertial odometer to build a point cloud map synchronously in real time; the visual-inertial odometer comprises a fisheye camera 13 and an IMU. The fisheye camera 13 serves as the image-capturing sensor. The visual odometer obtains the three-dimensional information of the current frame from the image sequence captured by the fisheye camera 13 to generate 3D points; the three-dimensional information is obtained by a feature-point method and determines the position and attitude of the robot. The IMU is an inertial measurement unit, a device that measures the three-axis attitude angles (or angular rates) and accelerations of an object; it obtains its attitude information by measuring the three-axis attitude angles and accelerations and pre-integrating them. The IMU and the visual odometer are fused in a tightly coupled manner, adding the image feature information to the system state vector for joint optimization; the attitude information of the IMU is fused with the visual information of the visual odometer to update the pose graph and the point cloud map in real time. The visual-inertial odometer is a monocular visual-inertial odometer, the fisheye camera 13 being a monocular camera; it can also be a binocular visual-inertial odometer, that is, two fisheye cameras respectively arranged on the two sides of the IMU to widen the field of view.
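Because the odometry equations assume a pinhole projection, raw fisheye detections are typically undistorted first. A short sketch using OpenCV's fisheye model follows; the intrinsics K and distortion coefficients D are placeholder values standing in for a real calibration of camera 13.

```python
import cv2
import numpy as np

# Placeholder calibration of the fisheye camera (normally obtained once
# with cv2.fisheye.calibrate); these numbers are illustrative only.
K = np.array([[285.0, 0.0, 320.0],
              [0.0, 285.0, 240.0],
              [0.0, 0.0, 1.0]])
D = np.array([0.01, -0.002, 0.0, 0.0])  # equidistant-model coefficients

def undistort_keypoints(pixels):
    """Map raw fisheye pixel coordinates to undistorted pixel coordinates
    so that pinhole-based odometry equations apply."""
    pts = np.asarray(pixels, dtype=np.float64).reshape(-1, 1, 2)
    return cv2.fisheye.undistortPoints(pts, K, D, P=K).reshape(-1, 2)

print(undistort_keypoints([[400.0, 300.0]]))
```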
Control method embodiment
As shown in fig. 2 and 3, another object of the present invention is to provide a control method for the bionics-based robot perception control system, which controls the motion of the robot body using the visual inertial navigation module at the end of the mechanical arm to achieve accurate positioning of the robot body during navigation. The method comprises the following steps:
the method comprises the following steps that (1), a robot traverses once in a required environment through a visual inertial navigation module to establish a map required by positioning navigation, and feature points and descriptors are stored in the map;
step (2), the robot captures information in the surrounding environment through a fisheye camera 13 at the tail end of the mechanical arm and matches the information with key feature points in the built map to obtain position information of the fisheye camera 13 in the current map;
and (3) determining the state of the mechanical arm in the current state to enable the tail end of the mechanical arm and the machine body to form a fixed connection relation, calculating a rotation and translation relation between the fisheye camera 13 and the robot body according to the fixed connection relation, calculating the position information of the robot body in the current map by using the rotation and translation relation between the camera and the robot body, and controlling the robot to reach the target position according to the position information after calculating the position information of the robot body in the current map coordinate system.
Preferably, the process of establishing the map in step (1) is a SLAM process comprising a front-end part and a back-end part:
S1. The visual odometer detects and matches feature points in the images captured by the fisheye camera 13, computes the rotation-translation relation between images by the eight-point method, recovers the three-dimensional information of the feature points of the current frame through triangulation to generate 3D points, and then judges whether the current frame is a key frame. Triangulation uses the direct linear transform: the 3D coordinates of a feature point in the world coordinate system are computed from its projected coordinates in two different frames;
and S2, if the current frame is a key frame, entering a rear-end optimization part, measuring three-axis attitude angular velocity and acceleration by the IMU and performing pre-integration to obtain attitude information of the IMU, adding image characteristic information into a state vector of the system by the IMU and the visual odometer in a tightly-coupled fusion mode to perform joint optimization, wherein the state vector is a value of the state variable at a certain moment, namely the state of the system at the moment. The value of the state variable at the time t ═ 0 is referred to as the initial state or starting state of the system, i.e. also as the initial state vector or starting state vector. Meanwhile, the IMU data is processed by a pre-integration technology, a reference system of a motion model of the IMU is converted into a constantly changing relative reference system from a fixed initial reference system, the posture information of the mechanical arm and the visual information of the visual odometer are fused and used for updating a posture graph for global optimization and updating a map in real time, and the visual odometer updates the posture graph through loop detection in the processes of detecting and matching the characteristic points of the approximate image, so that the map is updated. The fisheye camera 13 and the IMU are used as a visual navigation positioning system of the bionic robot, map scale information under a monocular scene is recovered through the IMU information, and meanwhile multi-degree-of-freedom positioning of the unmanned flying mechanical arm in the robot is achieved on the basis of the visual inertial odometer.
When the robot captures two frames with high similarity during flight, feature matching is used to check whether the number of matches is sufficient; a bag-of-words model is generally adopted to accelerate the matching. If there are enough matching features, the 3D points are updated, and with them the point cloud map.
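A minimal sketch of this loop-closure test follows; in a real system a bag-of-words index (such as DBoW) proposes the candidate keyframe quickly, and brute-force matching stands in for it here. The 0.75 ratio and the match-count threshold are assumed values.

```python
import cv2

MIN_LOOP_MATCHES = 50  # assumed threshold, tuned per deployment

def is_loop_closure(des_query, des_candidate):
    """Decide whether two keyframes close a loop by counting good
    descriptor matches (Lowe ratio test on binary descriptors)."""
    pairs = cv2.BFMatcher(cv2.NORM_HAMMING).knnMatch(des_query, des_candidate, k=2)
    good = [p[0] for p in pairs
            if len(p) == 2 and p[0].distance < 0.75 * p[1].distance]
    return len(good) >= MIN_LOOP_MATCHES
```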
Step (3), in which the robot body reaches the target position, comprises the following steps:
a1. The pose of the fisheye camera 13 at the end of the mechanical arm relative to the world coordinate system, including the translation $t_c^w$ and the rotation $R_c^w$, is obtained by means of the visual odometer; the extrinsic parameters between the fisheye camera 13 and the mechanical arm are obtained by a hand-eye calibration method;

a2. The mechanical arm is fixedly connected below the robot body, and the pose of the fisheye camera 13 at the end of the mechanical arm relative to the robot body, including the translation $t_c^b$ and the rotation $R_c^b$, can be obtained through the kinematic relation of the mechanical arm;

a3. The pose of the robot body relative to the world coordinate system, including the translation $t_b^w$ and the rotation $R_b^w$, is obtained through coordinate transformation in three-dimensional space, wherein

$R_b^w = R_c^w \, (R_c^b)^{-1}$, $\quad t_b^w = t_c^w - R_b^w \, t_c^b$
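The hand-eye calibration mentioned in a1 can be carried out, for example, with OpenCV's calibrateHandEye; a minimal sketch follows, where the pose lists are assumed to come from the arm kinematics and from images of a calibration target taken at several arm configurations.

```python
import cv2

def calibrate_camera_to_arm(R_end2base, t_end2base, R_target2cam, t_target2cam):
    """Recover the fixed camera-to-arm-end transform from paired poses:
    R_end2base/t_end2base:     arm-end poses in the arm base frame;
    R_target2cam/t_target2cam: calibration-target poses in the camera frame.
    Each list holds one entry per calibration station."""
    R_cam2end, t_cam2end = cv2.calibrateHandEye(
        R_gripper2base=R_end2base, t_gripper2base=t_end2base,
        R_target2cam=R_target2cam, t_target2cam=t_target2cam,
        method=cv2.CALIB_HAND_EYE_TSAI)
    return R_cam2end, t_cam2end
```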
Preferably, the calculation by which the robot body reaches the target position in step (3) is:

$P_{cam}^{w} = R \, P_{cam}^{body} + T$

$P_{body}^{w} = P_{cam}^{w} - R \, P_{cam}^{body}$

$R = R_z(l) \, R_y(p) \, R_x(r)$

$T = (x, y, z)^{T}$

wherein $P_{body}^{w}$ denotes the position of the robot body in the world coordinate system, $P_{cam}^{body}$ the position of the fisheye camera in the robot body coordinate system, and $P_{cam}^{w}$ the position of the fisheye camera in the world coordinate system; $R$ denotes a rotation in three-dimensional space, with $r$, $p$, $l$ the angles of rotation about the X, Y, Z axes; $T$ denotes a translation in three-dimensional space, with $x$, $y$, $z$ the translations along the X, Y, Z axes.
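A short numerical sketch of this computation follows; the Z-Y-X composition order of the three elementary rotations is an assumption, since the order is not stated.

```python
import numpy as np

def euler_to_R(r, p, l):
    """Rotation matrix from the angles r, p, l about the X, Y, Z axes."""
    cr, sr = np.cos(r), np.sin(r)
    cp, sp = np.cos(p), np.sin(p)
    cl, sl = np.cos(l), np.sin(l)
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    Rz = np.array([[cl, -sl, 0], [sl, cl, 0], [0, 0, 1]])
    return Rz @ Ry @ Rx  # composition order assumed

def body_position_in_world(P_cam_w, P_cam_body, r, p, l):
    """Robot body position in the world frame from the camera's world
    position (visual odometry) and its body-frame position (kinematics)."""
    return P_cam_w - euler_to_R(r, p, l) @ P_cam_body

print(body_position_in_world(np.array([1.0, 2.0, 0.5]),
                             np.array([0.1, 0.0, 0.2]),
                             0.0, 0.0, np.pi / 2))
```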
The positioning and navigation system combines a monocular visual-inertial odometer with real-time map construction and uses the generated grid map for real-time navigation and positioning. With the visual inertial navigation module as the visual navigation and positioning system of the bionic robot, the multi-degree-of-freedom motion of the mechanical arm frees the robot's vision system from the constraints of the body's motion and greatly enlarges the robot's field of view, so the robot can cope with complex and changing dynamic scenes; at the same time, visual-inertial SLAM provides real-time navigation and positioning, giving the bionic robot a strong human-machine interaction capability, applicability to a wide range of dynamic scenes, and great application value.
Variations and modifications of the above embodiments may occur to those skilled in the art and fall within the scope and spirit of the invention; therefore, the invention is not limited to the specific embodiments disclosed above, and modifications and variations of the invention falling within the scope of the claims are included. Although specific terms are employed herein, they are used in a generic and descriptive sense only and not for purposes of limitation.

Claims (10)

1. A robot perception control system based on bionics is characterized by comprising a robot body, a mechanical arm and a visual inertial navigation module, wherein a mechanical arm base is arranged on the robot body, the mechanical arm is mounted on the mechanical arm base, a plurality of rotary joints are arranged in the mechanical arm, a steering engine is arranged in each rotary joint, the visual inertial navigation module is mounted at the tail end of the mechanical arm, the visual inertial navigation module is used for carrying out real-time positioning and map building through a visual inertial odometer, and the visual inertial odometer comprises a fisheye camera and an IMU;
the fisheye camera is used for capturing images;
the IMU measures the three-axis attitude angular velocity and acceleration and performs pre-integration to obtain the relative attitude information of the IMU, the IMU and the visual odometer adopt a tightly coupled fusion mode to add image characteristic information into a state vector of a system for joint optimization, and the attitude information of the IMU and the visual information of the visual odometer are fused to update an attitude map and update a map in real time;
the mechanical arm adjusts the visual inertial navigation module at the tail end of the mechanical arm through the rotary joint to reach any position and any posture appointed in the working space of the mechanical arm, and the robot navigates and positions in the established map through the visual inertial navigation module;
the robot captures information about the surrounding environment through the fisheye camera at the end of the mechanical arm and matches it with key frame feature points in the built map to obtain the position information of the fisheye camera in the current map; the configuration of the mechanical arm in the current state is determined so that the end of the mechanical arm and the robot body form a fixed connection; the rotation-translation relation between the fisheye camera and the robot body is calculated from this fixed connection; the position information of the robot body in the current map is calculated using this rotation-translation relation; and after the position information of the robot body in the current map coordinate system is obtained, the robot is controlled to reach the target position according to the position information.
2. The bionic-based robot perception control system according to claim 1, wherein the visual inertial odometer is a monocular visual inertial odometer or a binocular visual inertial odometer, and the fisheye camera is a monocular camera.
3. The bionic-based robot perception control system according to claim 1, wherein the mechanical arm has six degrees of freedom, and each degree of freedom comprises a motor, a reduction gear set and an encoder.
4. The robot perception control system based on bionics of claim 3, characterized in that the steering engine comprises a steering engine body, a steering engine base, an auxiliary steering wheel and a main steering wheel; a steering engine outer side wall distributed along the circumference of the steering engine body is arranged between the auxiliary steering wheel and the main steering wheel; the steering engine body is arranged on the steering engine base; and the auxiliary steering wheel and the main steering wheel are respectively arranged on the outsides of the opposite top wall and bottom wall of the steering engine body.
5. The bionic-based robot perception control system according to claim 4, wherein the steering engines comprise a first steering engine, a second steering engine, a third steering engine, a fourth steering engine, a fifth steering engine and a sixth steering engine; the auxiliary steering wheel of the first steering engine is connected with the mechanical arm base; the main steering wheel of the first steering engine is connected with the outer side wall of the second steering engine; the main steering wheel of the second steering engine is connected with the fixed side of one end of an extension arm rod, the other end of which is connected with the steering engine base of the third steering engine; the main and auxiliary steering wheels of the third steering engine are respectively connected with one end of two mutually parallel connecting rods, the other ends of which are respectively fixed on the outer side wall of the fourth steering engine; the main steering wheel of the fourth steering engine is connected with the base of the fifth steering engine; the main and auxiliary steering wheels of the fifth steering engine are respectively connected with one end of two mutually parallel connecting rods, the other ends of which are connected with the outer side wall of the sixth steering engine; and the main steering wheel of the sixth steering engine is connected with the visual inertial navigation module.
6. The bionic-based robot perception control system according to claim 5, wherein a rotating shaft is arranged in the steering engine and used for being connected with the motor and the main steering wheel, the rotating shaft of the third steering engine is intersected with the rotating shaft of the fourth steering engine in direction, the rotating shaft of the fourth steering engine is intersected with the rotating shaft of the first steering engine in direction, and the rotating shaft of the fourth steering engine is identical to the rotating shaft of the sixth steering engine in direction.
7. A control method applying the bionic-based robot perception control system of any one of claims 1-6, characterized by comprising the following steps:
step (1): the robot traverses the required environment once with the visual inertial navigation module to establish the map required for positioning and navigation, and feature points and descriptors are stored in the map;
step (2): the robot captures information about the surrounding environment through the fisheye camera at the end of the mechanical arm and matches it with key frame feature points in the built map to obtain the position information of the fisheye camera in the current map;
step (3): the configuration of the mechanical arm in the current state is determined so that the end of the mechanical arm and the robot body form a fixed connection; the rotation-translation relation between the fisheye camera and the robot body is calculated from this fixed connection; the position information of the robot body in the current map is calculated using this rotation-translation relation; and after the position information of the robot body in the current map coordinate system is obtained, the robot is controlled to reach the target position according to the position information.
8. The control method of the bionic-based robot perception control system according to claim 7, wherein the process of establishing the map in step (1) is a SLAM process comprising a front-end part and a back-end part:
S1. the visual odometer detects and matches feature points in the images captured by the fisheye camera, judges whether the current frame is a key frame, computes the rotation-translation relation between images by the eight-point method, and recovers the three-dimensional information of the feature points of the current frame through triangulation to generate 3D points;
S2. if the current frame is a key frame, back-end optimization begins: the IMU measures the three-axis attitude angular velocity and acceleration and pre-integrates them to obtain its relative attitude information; the IMU and the visual odometer are fused in a tightly coupled manner, adding the image feature information to the system state vector for joint optimization; pre-integration processes the IMU data by converting the reference frame of the IMU motion model from a fixed initial reference frame to a continually changing relative reference frame; the attitude information of the IMU is fused with the visual information of the visual odometer to update the pose graph for global optimization and to update the map in real time; and while detecting and matching feature points of similar images, the visual odometer updates the pose graph through loop closure detection, thereby updating the map.
9. The control method of the bionic-based robot perception control system according to claim 7, wherein the steps by which the robot body reaches the target position in step (3) are as follows:
a1. the pose of the fisheye camera at the end of the mechanical arm relative to the world coordinate system, including the translation $t_c^w$ and the rotation $R_c^w$, is obtained by means of the visual odometer; the extrinsic parameters between the fisheye camera and the mechanical arm are obtained by a hand-eye calibration method;
a2. the mechanical arm is fixedly connected below the robot body, and the pose of the fisheye camera at the end of the mechanical arm relative to the robot body, including the translation $t_c^b$ and the rotation $R_c^b$, can be obtained through the kinematic relation of the mechanical arm;
a3. the pose of the robot body relative to the world coordinate system, including the translation $t_b^w$ and the rotation $R_b^w$, is obtained through coordinate transformation in three-dimensional space, wherein $R_b^w = R_c^w \, (R_c^b)^{-1}$ and $t_b^w = t_c^w - R_b^w \, t_c^b$.
10. The control method of the bionic-based robot perception control system according to claim 7, wherein the calculation by which the robot body reaches the target position in step (3) is:

$P_{cam}^{w} = R \, P_{cam}^{body} + T$

$P_{body}^{w} = P_{cam}^{w} - R \, P_{cam}^{body}$

$R = R_z(l) \, R_y(p) \, R_x(r)$

$T = (x, y, z)^{T}$

wherein $P_{body}^{w}$ denotes the position of the robot body in the world coordinate system, $P_{cam}^{body}$ denotes the position of the fisheye camera in the robot body coordinate system, $P_{cam}^{w}$ denotes the position of the fisheye camera in the world coordinate system, $R$ denotes a rotation in three-dimensional space with $r$, $p$, $l$ the angles of rotation about the X, Y, Z axes, and $T$ denotes a translation in three-dimensional space with $x$, $y$, $z$ the translations along the X, Y, Z axes.
CN201811236664.2A 2018-10-23 2018-10-23 Robot perception control system and control method based on bionics Active CN109079799B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811236664.2A CN109079799B (en) 2018-10-23 2018-10-23 Robot perception control system and control method based on bionics

Publications (2)

Publication Number Publication Date
CN109079799A CN109079799A (en) 2018-12-25
CN109079799B (en) 2021-11-12

Family

ID=64843842

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811236664.2A Active CN109079799B (en) 2018-10-23 2018-10-23 Robot perception control system and control method based on bionics

Country Status (1)

Country Link
CN (1) CN109079799B (en)

Families Citing this family (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110775288B (en) * 2019-11-26 2021-05-25 哈尔滨工业大学(深圳) Bionic-based flight mechanical neck eye system and control method
CN111645067B (en) * 2020-05-15 2022-05-31 深圳国信泰富科技有限公司 High-intelligence robot environment sensing method and system
CN112819943B (en) * 2021-01-15 2022-08-30 北京航空航天大学 Active vision SLAM system based on panoramic camera
CN112862818B (en) * 2021-03-17 2022-11-08 合肥工业大学 Underground parking lot vehicle positioning method combining inertial sensor and multi-fisheye camera
CN113305830B (en) * 2021-04-28 2022-08-16 吉林大学 Humanoid robot action system based on human body posture control and control method
CN114102639A (en) * 2022-01-26 2022-03-01 北京宜亿瑰夏科技有限公司 High-flexibility-ratio unmanned coffee robot based on visual positioning
CN114485648B (en) * 2022-02-08 2024-02-02 北京理工大学 Navigation positioning method based on bionic compound eye inertial system
CN114602323B (en) * 2022-02-18 2023-05-09 中国科学院水生生物研究所 Clamping type filter membrane replacement method and system for environmental DNA sampling
CN114770461B (en) * 2022-04-14 2023-12-01 深圳技术大学 Mobile robot based on monocular vision and automatic grabbing method thereof
CN115125923B (en) * 2022-08-03 2023-09-15 浙江傲宋智能科技有限公司 Small-size waters rubbish clearance robot

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20060249622A1 (en) * 2005-05-04 2006-11-09 Lockheed Martin Corporation Autonomous Environmental Control System and Method For Post-Capture and Pre-Launch Management of an Unmanned Air Vehicle
US10191486B2 (en) * 2016-03-28 2019-01-29 Aveopt, Inc. Unmanned surveyor

Patent Citations (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2003057A2 (en) * 2007-06-11 2008-12-17 Honeywell International Inc. Airborne manipulator unmanned aerial vehicle (UAV)
CN107850436A (en) * 2015-05-23 2018-03-27 深圳市大疆创新科技有限公司 Merged using the sensor of inertial sensor and imaging sensor
CN105437232A (en) * 2016-01-11 2016-03-30 湖南拓视觉信息技术有限公司 Method and device for controlling multi-joint moving robot to avoid obstacle
CN105953796A (en) * 2016-05-23 2016-09-21 北京暴风魔镜科技有限公司 Stable motion tracking method and stable motion tracking device based on integration of simple camera and IMU (inertial measurement unit) of smart cellphone
CN205891228U (en) * 2016-07-29 2017-01-18 华南理工大学 Flying robot
CN206633010U (en) * 2016-09-28 2017-11-14 三峡大学 A kind of snake robot with computer vision function
CN106803271A (en) * 2016-12-23 2017-06-06 成都通甲优博科技有限责任公司 A kind of camera marking method and device of vision guided navigation unmanned plane
CN106873619A (en) * 2017-01-23 2017-06-20 上海交通大学 A kind of processing method in unmanned plane during flying path
CN206505318U (en) * 2017-02-27 2017-09-19 哈瓦国际航空技术(深圳)有限公司 A kind of axle unmanned aerial vehicle onboard head of night vision camera three
CN106995053A (en) * 2017-04-25 2017-08-01 桂林电子科技大学 A kind of rotor wing unmanned aerial vehicle of new armed four
CN107504969A (en) * 2017-07-24 2017-12-22 哈尔滨理工大学 Four rotor-wing indoor air navigation aids of view-based access control model and inertia combination
CN107687850A (en) * 2017-07-26 2018-02-13 哈尔滨工业大学深圳研究生院 A kind of unmanned vehicle position and orientation estimation method of view-based access control model and Inertial Measurement Unit
CN108489482A (en) * 2018-02-13 2018-09-04 视辰信息科技(上海)有限公司 The realization method and system of vision inertia odometer

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Fei Yaonan (费耀南), "Swan-Inspired Unmanned Aerial Vehicles With Long-neck Visual Perception System", Proceedings of the 2019 IEEE/ASME International Conference on Advanced Intelligent Mechatronics, July 2019, pp. 1335-1340. *

Also Published As

Publication number Publication date
CN109079799A (en) 2018-12-25

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant