CN109917786A - Robot perception system and system operation method for operation in complex environments - Google Patents
- Publication number
- CN109917786A (application CN201910109167.4A)
- Authority
- CN
- China
- Prior art keywords
- robot
- module
- camera
- mechanical arm
- tracking control
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Abstract
The invention discloses a robot perception system and system operation method for operation in complex environments, comprising at least one laser radar, at least one binocular vision camera, at least one mechanical-arm hand-eye camera, and at least one inertial measurement unit (IMU). The system is intended to support the construction of planned lunar or Mars bases and subsequent surface service work. The invention provides stable walking and steering on terrain such as rough loose sandy soil, cement flooring, and grassland; autonomous navigation, obstacle avoidance, obstacle crossing, and path planning; refined operation on specific observed targets; configuration keeping, walking, steering, and standing/crouching under full load; resistance to external shock disturbances; robot pose measurement; and wireless communication.
Description
Technical field
The present invention relates to mobile robot technology, and in particular to a robot perception system and system operation method for operation in complex environments: a perception system and operation method for three-dimensional reconstruction, autonomous localization, and path planning, featuring fully autonomous roaming, refined operation, and a high degree of intelligence.
Background technique
China's subsequent deep-space exploration missions place distinct requirements on future space robots, including fully autonomous roaming, refined operation, and a high degree of intelligence. To meet this demand, a lightweight, highly integrated, highly mobile intelligent surface-exploration space robot with refined-operation capability is developed, intended to complete the construction of planned lunar or Mars bases and subsequent surface service work, specifically including base construction, operation of scientific instruments and equipment, maintenance of infrastructure, and utilization of lunar resources.
An intelligent surface-exploration robot is a comprehensive intelligent robot that integrates functions such as efficient locomotion, intelligent operation, and multi-robot cooperation. It can autonomously adapt to complex and changeable planetary surface environments and independently complete complex tasks such as surface patrol and survey, cargo transport, and equipment assembly and maintenance.
As an important component of the intelligent surface-exploration robot, the robot perception system is mainly responsible for a series of space tasks such as intelligent perception of complex environments, three-dimensional map building, robot self-localization, vision-guided navigation and path planning, multi-sensor information fusion, and three-dimensional pose measurement of specific observed targets.
Summary of the invention
To overcome the deficiencies of the prior art, the purpose of the present invention is to provide a robot perception system and system operation method for three-dimensional reconstruction, autonomous localization, and path planning in complex-environment operations. The present invention is realized by the following technical solution:
The invention discloses a robot perception system for operation in complex environments, comprising at least one laser radar, at least one binocular vision camera, at least one mechanical-arm hand-eye camera, at least one inertial measurement unit (IMU), a vision controller hosting the system environment in which the vision software runs, and at least one teleoperation computer. The laser radar, binocular vision camera, mechanical-arm hand-eye camera, and inertial measurement unit are connected to the vision controller by wire; the teleoperation computer is connected to the vision controller by a signal link.
As a further improvement, the robot perception system of the present invention can be applied to wheeled intelligent surface-exploration robots, legged intelligent surface-exploration robots, and the like; the robot type is not limited.
As a further improvement, the laser radar of the present invention is fixed on the head of the robot body with an unobstructed horizontal view, and is used for medium- and long-range environment perception around the robot.
As a further improvement, the binocular vision camera of the present invention is fixed on the head of the robot body and is used for short-range environment perception in the robot's direction of travel.
As a further improvement, the mechanical-arm hand-eye camera of the present invention is fixed at the end of the robot's mechanical arm and is used for three-dimensional pose measurement of specific observed targets, guiding the arm's end effector to complete refined operations; the mechanical arm is fixed on the robot body and is used for grasping objects.
As a further improvement, the IMU of the present invention is fixed on the head of the robot body, adjacent to the binocular vision camera, and is used for measuring the attitude angles of the robot body.
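For illustration only (this sketch is not part of the claimed invention): one common way a head-mounted IMU yields an attitude angle is a complementary filter that fuses gyroscope integration with the accelerometer's gravity reference. The filter choice, axis convention, and gain below are illustrative assumptions, not taken from the patent.

```python
import math

def complementary_filter(angle_prev, gyro_rate, accel_x, accel_z, dt, alpha=0.98):
    """Fuse gyro integration (smooth but drifting) with the accelerometer's
    gravity-referenced tilt (noisy but drift-free) into one attitude angle."""
    angle_gyro = angle_prev + gyro_rate * dt       # integrate angular rate
    angle_accel = math.atan2(accel_x, accel_z)     # tilt implied by gravity
    return alpha * angle_gyro + (1.0 - alpha) * angle_accel

# A stationary body (zero rate, gravity purely on z) should see any initial
# attitude error decay toward zero over repeated updates.
angle = 0.1
for _ in range(200):
    angle = complementary_filter(angle, 0.0, 0.0, 9.81, dt=0.01)
print(abs(angle) < 0.01)
```

The high `alpha` trusts the gyro over short horizons while the accelerometer term slowly removes drift.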
As a further improvement, the vision controller of the present invention is fixed on the back of the robot body. It receives the images and data acquired in real time by all sensors inside the perception subsystem; completes image processing, multi-source information fusion, obstacle detection, three-dimensional map building, robot self-localization, path planning, and three-dimensional measurement of observed targets; and handles data communication with each of the above sensors, the mechanical-arm controller, and the teleoperation computer.
As a further improvement, the teleoperation computer of the present invention is fixedly placed in a teleoperation platform area away from the robot body. It receives the telemetry data sent by the vision controller, completes the visualization of information such as robot self-localization, the three-dimensional map, and path planning as well as the display of the perception subsystem's operating status, and handles data communication with the motion controller. The motion controller controls the robot's movement in space and is connected to the vision controller by a signal link.
As a further improvement, the present invention comprises the following modules: a multi-sensor calibration module, a robot mapping and localization module, a robot perception module, a robot navigation module, a cooperative detection module, and an irregular-sample reconstruction module.
As a further improvement, the multi-sensor calibration module, robot mapping and localization module, robot perception module, robot navigation module, cooperative detection module, and irregular-sample reconstruction module of the present invention are connected to the teleoperation computer by signal links.
As a further improvement, the multi-sensor calibration module of the present invention is a prerequisite for multi-sensor information fusion and is used to calibrate the relative poses between the sensors;
the robot mapping and localization module supports real-time position feedback of the robot and human-machine interaction with the operator, enabling the robot to calculate the bearing of a target;
the robot perception module performs the robot's real-time perception of its surrounding environment and is used to judge obstacle parameters and determine traversable and non-traversable regions;
the robot navigation module, building on the two modules above, realizes collision-free path computation for the robot and thereby guides the robot's walking;
the cooperative detection module and the irregular-sample reconstruction module measure the robot's operation targets and thereby guide the movement of the mechanical arm.
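For illustration only: the relative poses produced by a multi-sensor calibration module are typically consumed as homogeneous transforms. The sketch below (function names and the 0.2 m offset are hypothetical, not taken from the patent) shows how a calibrated lidar-to-camera extrinsic lets lidar points be expressed in the camera frame before fusion.

```python
import numpy as np

def make_extrinsic(R, t):
    """Build a 4x4 homogeneous transform from rotation R and translation t."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def transform_points(T, pts):
    """Express Nx3 points, given in sensor A's frame, in sensor B's frame
    using the calibrated extrinsic T (A -> B)."""
    pts_h = np.hstack([pts, np.ones((len(pts), 1))])
    return (T @ pts_h.T).T[:, :3]

# Hypothetical calibration result: the lidar sits 0.2 m above the camera
# with the same orientation, so lidar points shift by -0.2 m along camera y.
T_lidar_to_cam = make_extrinsic(np.eye(3), np.array([0.0, -0.2, 0.0]))
lidar_pts = np.array([[1.0, 0.0, 5.0]])
cam_pts = transform_points(T_lidar_to_cam, lidar_pts)
print(cam_pts)
```

Once every sensor's points live in one common frame, obstacle detection and map building can treat them as a single cloud.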
As a further improvement, in the present invention a laser radar and an IMU are mounted on top of the robot to construct the robot mapping and localization module; a depth camera is mounted on the front of the robot to construct the robot perception module and robot navigation module, which build a local grid map and detect obstacles; a binocular camera is mounted at the mechanical arm's hand to construct the cooperative detection module and irregular-sample reconstruction module; and a forward-looking binocular camera is mounted on the front of the robot for image observation tasks.
The invention also discloses the system operation method of the robot perception system for operation in complex environments, as follows. The multi-sensor calibration module is a prerequisite for multi-sensor information fusion: the laser radar, binocular vision camera, mechanical-arm hand-eye camera, and inertial measurement unit must all undergo multi-sensor calibration before use. In the robot mapping and localization module, the robot is localized in real time during motion from the color images provided by the binocular camera; a grid map is then built by fusing depth-camera and laser information, realizing robot mapping. In the robot perception module and robot navigation module, using the global grid map, the robot's current position is obtained from the point-cloud data provided by the laser together with the acceleration and angular rate provided by the IMU; a target point is then given manually and robot path planning is carried out. In the cooperative detection module and irregular-sample reconstruction module, obstacles in the shared field of view are reconstructed from the images provided by the mechanical-arm hand-eye camera and grasped by the mechanical arm.
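For illustration only: localization and reconstruction from a binocular camera rest on rectified stereo geometry, where depth follows Z = f·B/d for focal length f, baseline B, and disparity d. The sketch below uses an illustrative rig (700 px focal length, 12 cm baseline); these values are assumptions, not the patent's hardware.

```python
def stereo_depth(disparity_px, focal_px, baseline_m):
    """Depth of a matched feature in a rectified stereo pair: Z = f * B / d."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

# Hypothetical rig: 700 px focal length, 12 cm baseline.  A feature with a
# 21 px disparity then lies 4 m away.
print(stereo_depth(disparity_px=21.0, focal_px=700.0, baseline_m=0.12))  # 4.0
```

The inverse relation between disparity and depth is why a short-baseline pair is accurate only near the robot, motivating the laser radar for the far field.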
As a further improvement of the method, a laser radar and an IMU are mounted on top of the robot to construct the robot mapping and localization module; a depth camera is mounted on the front of the robot to construct the robot perception module and robot navigation module, which build a local grid map and detect obstacles; a binocular camera is mounted at the mechanical arm's hand to construct the cooperative detection module and irregular-sample reconstruction module; and a forward-looking binocular camera is mounted on the front of the robot for image observation tasks.
Compared with the prior art, the beneficial effects of the present invention are:
The invention discloses a robot perception system and system operation method for three-dimensional reconstruction, autonomous localization, and path planning in complex-environment operations. It is a lightweight, highly integrated, highly mobile intelligent space-robot perception system with refined-operation capability, intended to complete the construction of planned lunar or Mars bases and subsequent surface service work, specifically including base construction, operation of scientific instruments and equipment, maintenance of infrastructure, and utilization of lunar resources. The invention provides stable walking and steering on terrain such as rough loose sandy (sandstone) ground, cement flooring, and grassland; autonomous navigation, obstacle avoidance, obstacle crossing, and path planning; refined operation on specific observed targets (static or moving); configuration keeping, walking, steering, and standing/crouching under full load (30 kg); resistance to external shock and vibration (except lateral); robot pose measurement; and wireless communication.
The method of the present invention has the following advantages:
Given the space exploration tasks and functional requirements of the invention, such as large-scale three-dimensional environment reconstruction and free revisiting, the present invention both resembles and differs from the robot perception systems commonly used at home and abroad. Large-scale reconstruction and free revisiting require that the robot's sensor selection further satisfy wide-field-of-view and long-term robustness requirements, for which laser radar has a clear advantage over vision. Considering, moreover, the technical bottlenecks that vision-based navigation has already exposed, and that the corresponding solutions show no significant trend of replacing it, the present invention adopts laser-radar-based navigation as the more suitable technical route for future surface robots. This sensor, in both size and technology, is more mature than laser-camera alternatives; and given the pace of development in autonomous driving, it will mature faster than other technologies.
Almost all work in local environment perception depends heavily on binocular depth recovery and binocular orientation. Combined with the laser sensor, there may be no need to solve far-field environment reconstruction with a wide-baseline binocular pair, because the laser already provides a sufficiently long-range field of view and depth reconstruction; this is also verified in current autonomous-driving applications. The main sensor selection and techniques can therefore use a short-range, short-baseline binocular or depth camera to achieve accurate terrain reconstruction of the small region in front of the robot, so that the far-field and near-field grid maps can be well fused together.
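The far-field/near-field fusion described above can be sketched as a simple overlay, assuming both occupancy grids share a frame and resolution (an assumption for illustration; the patent does not specify the fusion rule):

```python
import numpy as np

UNKNOWN, FREE, OCCUPIED = -1, 0, 1

def fuse_grids(far_grid, near_grid):
    """Overlay the near-field (depth-camera) grid onto the far-field (lidar)
    grid of the same shape: wherever the near sensor has an observation, it wins."""
    fused = far_grid.copy()
    known = near_grid != UNKNOWN
    fused[known] = near_grid[known]
    return fused

far = np.full((3, 3), FREE)
far[0, 0] = OCCUPIED                  # lidar marks a distant obstacle
near = np.full((3, 3), UNKNOWN)
near[0, 0] = FREE                     # depth camera sees that cell is clear
near[2, 2] = OCCUPIED                 # ...and a nearby obstacle lidar missed
fused = fuse_grids(far, near)
print(fused)
```

Letting the higher-resolution near-field sensor override the lidar wherever it has data is one simple policy; probabilistic log-odds fusion is a common alternative.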
In terms of path planning, a legged robot has more freedom of motion than a wheeled robot, so more complex motion primitives need not be considered. However, since a legged robot's body is a pronounced cuboid, moving through narrow indoor spaces still requires a three-dimensional path-planning algorithm that fully accounts for the body's orientation in order to pass through narrow terrain. In addition, by similarly coupling global navigation with local navigation, map reconstruction and revisiting of a given area are achieved while local path planning and obstacle avoidance are guaranteed.
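Accounting for body orientation in narrow spaces means planning over (x, y, heading) states rather than (x, y) cells. The uniform-cost search below is a minimal sketch of that idea (the grid encoding, move set, and costs are illustrative assumptions, not the patent's algorithm): a cell may admit only certain headings, so an elongated body can traverse a corridor only while aligned with it.

```python
import heapq

def plan_se2(allowed, start, goal_cell):
    """Uniform-cost search over (row, col, heading) states; allowed[r][c] is
    the set of headings (0=E, 1=N, 2=W, 3=S) at which the body fits in that cell."""
    STEP = {0: (0, 1), 1: (-1, 0), 2: (0, -1), 3: (1, 0)}
    rows, cols = len(allowed), len(allowed[0])
    open_set = [(0, start, [start])]
    seen = set()
    while open_set:
        cost, state, path = heapq.heappop(open_set)
        if (state[0], state[1]) == goal_cell:
            return path
        if state in seen:
            continue
        seen.add(state)
        r, c, h = state
        for nh in ((h + 1) % 4, (h - 1) % 4):   # rotate in place
            if nh in allowed[r][c]:
                heapq.heappush(open_set, (cost + 1, (r, c, nh), path + [(r, c, nh)]))
        dr, dc = STEP[h]                        # move one cell forward
        nr, nc = r + dr, c + dc
        if 0 <= nr < rows and 0 <= nc < cols and h in allowed[nr][nc]:
            heapq.heappush(open_set, (cost + 1, (nr, nc, h), path + [(nr, nc, h)]))
    return None

# A 1x3 corridor whose middle cell only admits an east-west heading, so the
# robot must stay aligned while passing through.
allowed = [[{0, 1, 2, 3}, {0, 2}, {0, 1, 2, 3}]]
path = plan_se2(allowed, (0, 0, 0), (0, 2))
print(path)
```

Adding a heuristic turns this into A*; the key point is only that heading is part of the search state.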
In addition, the robot of the invention also has an arm to be controlled, which is rarely seen in current surface-exploration robots, so few inventions address it. According to the requirements of the invention, the characteristics of cooperative and non-cooperative targets are essentially the same on a planetary surface as on Earth. Therefore, terrestrial mechanical-arm visual position servo techniques are mainly used to realize position detection and feedback for the two classes of targets and to guide the mechanical arm to operate correctly.
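For illustration only: position-based visual servoing of the kind referenced here drives the end effector by a fraction of the camera-measured pose error each control cycle. The gain, cycle count, and 3-vector pose representation below are illustrative assumptions, not the patent's controller.

```python
import numpy as np

def pbvs_step(pose_current, pose_target, gain=0.5):
    """One cycle of a position-based visual servo: command the end effector
    to close a fixed fraction of the camera-measured pose error."""
    return pose_current + gain * (pose_target - pose_current)

pose = np.zeros(3)                       # end-effector position (m)
target = np.array([0.4, 0.1, 0.3])       # target pose measured by the camera
for _ in range(20):
    pose = pbvs_step(pose, target)
print(np.allclose(pose, target, atol=1e-4))  # the error has all but vanished
```

Because the camera re-measures the target each cycle, this closed loop tolerates calibration error better than open-loop positioning.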
The present invention provides functions such as intelligent perception of complex environments, three-dimensional map building, robot self-localization, vision-guided navigation and path planning, multi-sensor information fusion, and three-dimensional pose measurement of specific observed targets.
Detailed description of the invention
Fig. 1 is the system hardware installation diagram of the method of the present invention;
Fig. 2 shows the information flow between the algorithm modules of the invention;
Fig. 3 shows the power supply and distribution interfaces of the invention.
In Fig. 1: 1 is the robot body; 2 is the laser radar; 3 is the left eye of the binocular vision camera; 4 is the right eye of the binocular vision camera; 5 is the IMU; 6 is the depth camera; 7 is the left eye of the mechanical-arm hand-eye camera; 8 is the right eye of the mechanical-arm hand-eye camera; and 9 is the vision controller.
Specific embodiment
Below, the technical solution of the present invention is further described with reference to the accompanying drawings and a specific embodiment.
Fig. 1 is the system hardware installation diagram of the method of the present invention. The system includes a robot body 1; a laser radar 2; the left eye 3 and right eye 4 of the binocular vision camera; the left eye 7 and right eye 8 of the mechanical-arm hand-eye camera; an inertial measurement unit (IMU) 5; a depth camera 6; a vision controller 9 hosting the system environment in which the vision software runs; and at least one teleoperation computer. The laser radar 2 is fixed on the head of the robot body 1 with an unobstructed horizontal view and is used for medium- and long-range environment perception around the robot. The binocular vision camera is fixed on the head of the robot body 1 and is used for short-range environment perception in the robot's direction of travel. The mechanical-arm hand-eye camera is fixed at the end of the mechanical arm and is used for three-dimensional pose measurement of specific observed targets, guiding the arm's end effector to complete refined operations; the mechanical arm is fixed on the robot body 1 and is used for grasping objects. The IMU 5 is fixed on the head of the robot body 1, adjacent to the binocular vision camera, and is used for measuring the attitude angles of the robot body 1. The vision controller 9 is fixed on the back of the robot body 1; it receives the images and data acquired in real time by all sensors inside the perception subsystem; completes image processing, multi-source information fusion, obstacle detection, three-dimensional map building, robot self-localization, path planning, and three-dimensional measurement of observed targets; and handles data communication with each of the above sensors, the mechanical-arm controller, and the teleoperation computer. The teleoperation computer is fixedly placed in a teleoperation platform area away from the robot body 1; it receives the telemetry data sent by the vision controller 9, completes the visualization of information such as robot self-localization, the three-dimensional map, and path planning as well as the display of the perception subsystem's operating status, and handles data communication with the motion controller. The motion controller controls the robot's movement in space.
The robot perception system of the present invention for complex-environment operation comprises the following modules: a multi-sensor calibration module, a robot mapping and localization module, a robot perception module, a robot navigation module, a cooperative detection module, and an irregular-sample reconstruction module. All of these modules are connected to the teleoperation computer by signal links.
The multi-sensor calibration module is a prerequisite for multi-sensor information fusion and is used to calibrate the relative poses between the sensors. The robot mapping and localization module supports real-time position feedback of the robot and human-machine interaction with the operator, enabling the robot to calculate the bearing of a target. The robot perception module performs the robot's real-time perception of its surrounding environment and is used to judge obstacle parameters and determine traversable and non-traversable regions. The robot navigation module, building on the two modules above, realizes collision-free path computation for the robot and thereby guides the robot's walking. The cooperative detection module and the irregular-sample reconstruction module measure the robot's operation targets and thereby guide the movement of the mechanical arm. To construct the robot mapping and localization module, the laser radar 2 and IMU 5 are mounted on top of the robot; to construct the robot perception module and robot navigation module, which build a local grid map and detect obstacles, the depth camera 6 is mounted on the front of the robot; to construct the cooperative detection module and irregular-sample reconstruction module, a binocular camera is mounted at the mechanical arm's hand; and for image observation tasks, a forward-looking binocular camera is mounted on the front of the robot.
The invention also discloses the system operation method of the robot perception system for complex-environment operation. The multi-sensor calibration module is a prerequisite for multi-sensor information fusion: the laser radar 2, binocular vision camera, mechanical-arm hand-eye camera, and inertial measurement unit must all undergo multi-sensor calibration before use. In the robot mapping and localization module, the robot is localized in real time during motion from the color images provided by the binocular camera; the grid map is then built by fusing information from the depth camera 6 and the laser, realizing robot mapping. In the robot perception module and robot navigation module, using the global grid map, the robot's current position is obtained from the point-cloud data provided by the laser together with the acceleration and angular rate provided by the IMU 5; a target point is then given manually and robot path planning is carried out. In the cooperative detection module and irregular-sample reconstruction module, obstacles in the shared field of view are reconstructed from the images provided by the mechanical-arm hand-eye camera and grasped by the mechanical arm.
Fig. 2 shows the information flow between the algorithm modules of the invention. The algorithms are divided into a driver layer and an algorithm layer. The main role of the driver layer is to acquire the data collected by the laser radar 2, the binocular vision camera, the mechanical-arm hand-eye camera, and the inertial measurement unit and feed it to the algorithm layer. The algorithm layer consists of the multi-sensor calibration module, the robot mapping and localization module, the robot perception module, the robot navigation module, the cooperative detection module, and the irregular-sample reconstruction module. The multi-sensor calibration module is a prerequisite for multi-sensor information fusion: the laser radar 2, binocular vision camera, mechanical-arm hand-eye camera, and inertial measurement unit must all undergo multi-sensor calibration before use. In the robot mapping and localization module, the robot is localized in real time during motion from the color images provided by the binocular camera; the grid map is then built by fusing the laser information, realizing robot mapping. In the robot perception module and robot navigation module, using the global grid map, the robot's current position is obtained from the point-cloud data provided by the laser together with the acceleration and angular rate provided by the IMU 5; a target point is then given manually and robot path planning is carried out. In the cooperative detection module and irregular-sample reconstruction module, stones in the shared field of view are reconstructed from the images provided by the mechanical-arm hand-eye camera and grasped by the mechanical arm. The role of irregular-sample reconstruction is to render images and reconstruct the objects in the field of view; the role of cooperative detection is to detect obstacles in the field of view, such as slopes, gullies, plateaus, and steps.
Fig. 3 shows the power supply and distribution interfaces of the invention. The laser radar 2, binocular vision camera, mechanical-arm hand-eye camera, and inertial measurement unit are all connected to the vision controller 9 by wire and are powered uniformly by the vision controller 9; the specific power interface types are shown in Fig. 3. The vision controller 9 is also wired to the power subsystem, display, keyboard, and mouse, which respectively handle power supply, display, and control. The teleoperation computer and motion controller are each connected to the vision controller 9 by signal links; the teleoperation computer receives the signals transmitted by the vision controller 9 and, after processing, sends them to the motion controller, which specifically controls the robot's movement.
Finally, it should also be noted that the above is only one specific embodiment of the invention. Obviously, the present invention is not limited to the above embodiment and admits many variations. All variations that a person skilled in the art can derive directly or by association from the disclosure of the present invention shall be considered within the protection scope of the present invention.
Claims (9)
1. A robot perception system for operation in complex environments, characterized by comprising at least one laser radar (2), at least one binocular vision camera (3)(4), at least one mechanical-arm hand-eye camera (7)(8), at least one inertial measurement unit IMU (5), a vision controller (9) hosting the system environment in which the vision software runs, and at least one teleoperation computer; the laser radar (2), binocular vision camera (3)(4), mechanical-arm hand-eye camera (7)(8), and inertial measurement unit (5) are connected to the vision controller (9) by wire, and the teleoperation computer is connected to the vision controller (9) by a signal link.
2. The robot perception system for operation in complex environments according to claim 1, characterized in that the robot perception system can be applied to wheeled intelligent surface-exploration robots, legged intelligent surface-exploration robots, and the like; the robot type is not limited.
3. The robot perception system for operation in complex environments according to claim 1, characterized in that:
the laser radar (2) is fixed on the head of the robot body (1) with an unobstructed horizontal view, and is used for medium- and long-range environment perception around the robot;
the binocular vision camera (3)(4) is fixed on the head of the robot body (1) and is used for short-range environment perception in the robot's direction of travel;
the mechanical-arm hand-eye camera (7)(8) is fixed at the end of the mechanical arm of the robot body (1) and is used for three-dimensional pose measurement of specific observed targets, guiding the arm's end effector to complete refined operations; the mechanical arm is fixed on the robot body (1) and is used for grasping objects;
the IMU (5) is fixed on the head of the robot body (1), adjacent to the binocular vision camera, and is used for measuring the attitude angles of the robot body (1);
the vision controller (9) is fixed on the back of the robot body (1) and is used for receiving the images and data acquired in real time by all sensors inside the perception subsystem; completing image processing, multi-source information fusion, obstacle detection, three-dimensional map building, robot self-localization, path planning, and three-dimensional measurement of observed targets; and handling data communication with each of the above sensors, the mechanical-arm controller, and the teleoperation computer;
the teleoperation computer is fixedly placed in a teleoperation platform area away from the robot body (1), and is used for receiving the telemetry data sent by the vision controller (9) and completing the visualization of information such as robot self-localization, the three-dimensional map, and path planning as well as the display of the perception subsystem's operating status.
4. The robot perception system for operation in complex environments according to claim 1 or 3, characterized by comprising the following modules: a multi-sensor calibration module, a robot mapping and localization module, a robot perception module, a robot navigation module, a cooperative detection module, and an irregular-sample reconstruction module.
5. The robot perception system for operation in complex environments according to claim 4, characterized in that the multi-sensor calibration module, robot mapping and localization module, robot perception module, robot navigation module, cooperative detection module, and irregular-sample reconstruction module are connected to the teleoperation computer by signal links.
6. The robot perception system for operation in complex environments according to claim 5, characterized in that:
the multi-sensor calibration module is a prerequisite for multi-sensor information fusion and is used to calibrate the relative poses between the sensors;
the robot mapping and localization module supports real-time position feedback of the robot and human-machine interaction with the operator, enabling the robot to calculate the bearing of a target;
the robot perception module performs the robot's real-time perception of its surrounding environment and is used to judge obstacle parameters and determine traversable and non-traversable regions;
the robot navigation module, building on the two modules above, realizes collision-free path computation for the robot and thereby guides the robot's walking;
the cooperative detection module and the irregular-sample reconstruction module measure the robot's operation targets and thereby guide the movement of the mechanical arm.
7. The robot perception system for operation in complex environments according to claim 5 or 6, characterized in that the laser radar (2) and IMU (5) are mounted on top of the robot to construct the robot mapping and localization module; the depth camera (6) is mounted on the front of the robot to construct the robot perception module and robot navigation module, which build a local grid map and detect obstacles; the binocular camera (7)(8) is mounted at the mechanical arm's hand to construct the cooperative detection module and irregular-sample reconstruction module; and the forward-looking binocular camera (3)(4) is mounted on the front of the robot for image observation tasks.
8. An operation method of the robot perception system for operation in complex environments according to claim 1, 2, 4, or 5, characterized in that: the multi-sensor calibration module is a prerequisite for multi-sensor information fusion, and the laser radar (2), binocular vision camera (3)(4), mechanical-arm hand-eye camera (7)(8), and inertial measurement unit (5) must all undergo multi-sensor calibration before use; in the robot mapping and localization module, the robot is localized in real time during motion from the color images provided by the binocular camera, after which the grid map is built by fusing information from the depth camera (6) and the laser radar (2), realizing robot mapping; in the robot perception module and robot navigation module, using the global grid map, the robot's current position is obtained from the point-cloud data provided by the laser together with the acceleration and angular rate provided by the IMU (5), after which a target point is given manually and robot path planning is carried out; in the cooperative detection module and irregular-sample reconstruction module, obstacles in the shared field of view are reconstructed from the images provided by the mechanical-arm hand-eye camera (7)(8) and grasped by the mechanical arm.
9. The operation method of the robot perception system for operation in complex environments according to claim 8, characterized in that the laser radar (2) and IMU (5) are mounted on top of the robot to construct the robot mapping and localization module; the depth camera (6) is mounted on the front of the robot to construct the robot perception module and robot navigation module, which build a local grid map and detect obstacles; the binocular camera (7)(8) is mounted at the mechanical arm's hand to construct the cooperative detection module and irregular-sample reconstruction module; and the forward-looking binocular camera (3)(4) is mounted on the front of the robot for image observation tasks.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910109167.4A CN109917786A (en) | 2019-02-04 | 2019-02-04 | A kind of robot tracking control and system operation method towards complex environment operation |
Publications (1)
Publication Number | Publication Date |
---|---|
CN109917786A true CN109917786A (en) | 2019-06-21 |
Family
ID=66961439
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201910109167.4A Pending CN109917786A (en) | 2019-02-04 | 2019-02-04 | A kind of robot tracking control and system operation method towards complex environment operation |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN109917786A (en) |
Citations (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102037326A (en) * | 2008-07-31 | 2011-04-27 | 电子地图有限公司 | Method of displaying navigation data in 3D |
CN104156972A (en) * | 2014-08-25 | 2014-11-19 | 西北工业大学 | Perspective imaging method based on laser scanning distance measuring instrument and multiple cameras |
CN104476549A (en) * | 2014-11-20 | 2015-04-01 | 北京卫星环境工程研究所 | Method for compensating motion path of mechanical arm based on vision measurement |
CN104933708A (en) * | 2015-06-07 | 2015-09-23 | 浙江大学 | Barrier detection method in vegetation environment based on multispectral and 3D feature fusion |
CN106646508A (en) * | 2016-11-24 | 2017-05-10 | 中国科学院自动化研究所 | Slope angle estimation method for slope region based on multiline laser radar |
US20180051991A1 (en) * | 2016-08-17 | 2018-02-22 | Sharp Laboratories Of America, Inc. | Lazier graph-based path planning for autonomous navigation |
CN207206431U (en) * | 2017-09-27 | 2018-04-10 | 哈工大机器人(合肥)国际创新研究院 | A kind of Movement Controller of Mobile Robot |
CN108710376A (en) * | 2018-06-15 | 2018-10-26 | 哈尔滨工业大学 | The mobile chassis of SLAM and avoidance based on Multi-sensor Fusion |
CN108776474A (en) * | 2018-05-24 | 2018-11-09 | 中山赛伯坦智能科技有限公司 | Robot embedded computing terminal integrating high-precision navigation positioning and deep learning |
Cited By (20)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110764110A (en) * | 2019-11-12 | 2020-02-07 | 深圳创维数字技术有限公司 | Path navigation method, device and computer readable storage medium |
CN110764110B (en) * | 2019-11-12 | 2022-04-08 | 深圳创维数字技术有限公司 | Path navigation method, device and computer readable storage medium |
CN110849351A (en) * | 2019-11-21 | 2020-02-28 | 大连理工大学 | Method for constructing grid map by using depth camera and binocular camera |
CN111123911A (en) * | 2019-11-22 | 2020-05-08 | 北京空间飞行器总体设计部 | Legged intelligent star catalogue detection robot sensing system and working method thereof |
WO2021147546A1 (en) * | 2020-01-20 | 2021-07-29 | 深圳市普渡科技有限公司 | Multi-sensor fusion slam system, multi-sensor fusion method, robot, and medium |
CN111174765B (en) * | 2020-02-24 | 2021-08-13 | 北京航天飞行控制中心 | Planet vehicle target detection control method and device based on visual guidance |
CN111174765A (en) * | 2020-02-24 | 2020-05-19 | 北京航天飞行控制中心 | Planet vehicle target detection control method and device based on visual guidance |
CN111531549A (en) * | 2020-06-18 | 2020-08-14 | 北京海益同展信息科技有限公司 | Robot system and positioning navigation method |
WO2021254367A1 (en) * | 2020-06-18 | 2021-12-23 | 京东科技信息技术有限公司 | Robot system and positioning navigation method |
CN112405489A (en) * | 2020-10-16 | 2021-02-26 | 国网上海市电力公司 | Visual-auditory cooperative electric power emergency robot and operation method |
CN112276975A (en) * | 2020-10-30 | 2021-01-29 | 北京市安全生产科学技术研究院 | Large-load emergency rescue robot control system facing complex environment |
CN112765299A (en) * | 2021-01-26 | 2021-05-07 | 中国科学院西北生态环境资源研究院 | Visualization method and device for irregular raster data, electronic equipment and storage medium |
CN113084809A (en) * | 2021-04-06 | 2021-07-09 | 泰州左岸信息科技有限公司 | Industrial AI (Artificial Intelligence) key technology based on binocular 3D (three-dimensional) perception |
CN113211447A (en) * | 2021-05-27 | 2021-08-06 | 山东大学 | Mechanical arm real-time perception planning method and system based on bidirectional RRT algorithm |
CN113211447B (en) * | 2021-05-27 | 2023-10-27 | 山东大学 | Mechanical arm real-time perception planning method and system based on bidirectional RRT algorithm |
CN113199454A (en) * | 2021-06-22 | 2021-08-03 | 北京航空航天大学 | Wheeled mobile intelligent logistics operation robot system |
CN114104139A (en) * | 2021-09-28 | 2022-03-01 | 北京炎凌嘉业机电设备有限公司 | Bionic foot type robot walking platform fusion obstacle crossing and autonomous following system |
CN114659556A (en) * | 2022-03-03 | 2022-06-24 | 中国科学院计算技术研究所 | Tour device oriented separable star catalogue material identification method and system |
CN114659556B (en) * | 2022-03-03 | 2024-03-12 | 中国科学院计算技术研究所 | Inspection device-oriented separable star table material identification method and system |
CN114715363A (en) * | 2022-04-02 | 2022-07-08 | 浙江大学 | Navigation method and system for submarine stratum space drilling robot and electronic equipment |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN109917786A (en) | A kind of robot tracking control and system operation method towards complex environment operation | |
Gao et al. | Review of wheeled mobile robots’ navigation problems and application prospects in agriculture | |
Özaslan et al. | Inspection of penstocks and featureless tunnel-like environments using micro UAVs | |
WO2021254367A1 (en) | Robot system and positioning navigation method | |
CN111123911B (en) | Legged intelligent star catalogue detection robot sensing system and working method thereof | |
CN104881027B (en) | Wheel-track combined Intelligent Mobile Robot active obstacle system and control method | |
CN111308490B (en) | Balance car indoor positioning and navigation system based on single-line laser radar | |
CN109765901A (en) | Dynamic cost digital map navigation method based on line laser and binocular vision | |
CN108958250A (en) | Multisensor mobile platform and navigation and barrier-avoiding method based on known map | |
CN106325275A (en) | Robot navigation system, robot navigation method and robot navigation device | |
CN108177149A (en) | Movable mechanical arm control system and method based on MR and motion planning technology | |
CN112461227B (en) | Wheel type chassis robot inspection intelligent autonomous navigation method | |
US11340620B2 (en) | Navigating a mobile robot | |
CN106066179A (en) | A kind of robot location based on ROS operating system loses method for retrieving and control system | |
Li et al. | Localization and navigation for indoor mobile robot based on ROS | |
CN112518739A (en) | Intelligent self-navigation method for reconnaissance of tracked chassis robot | |
CN113325837A (en) | Control system and method for multi-information fusion acquisition robot | |
CN214520204U (en) | Port area intelligent inspection robot based on depth camera and laser radar | |
CN204557216U (en) | Wheel-track combined Intelligent Mobile Robot active obstacle system | |
Balaram | Kinematic state estimation for a Mars rover | |
Chiu et al. | FUMA: environment information gathering wheeled rescue robot with one-DOF arm | |
Lamon et al. | The SmartTer-a vehicle for fully autonomous navigation and mapping in outdoor environments | |
US11746501B1 (en) | Autonomous control of operations of powered earth-moving vehicles using data from on-vehicle perception systems | |
Lamon | 3D-position tracking and control for all-terrain robots | |
Bajracharya et al. | Target tracking, approach, and camera handoff for automated instrument placement |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |
| RJ01 | Rejection of invention patent application after publication | Application publication date: 20190621 |