CN104965513B - Son hopping robot recovery system and recovery method - Google Patents

Son hopping robot recovery system and recovery method

Info

Publication number
CN104965513B
CN104965513B CN201510349751.9A CN201510349751A
Authority
CN
China
Prior art keywords
robot
sub
hopping
female
hopping robot
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
CN201510349751.9A
Other languages
Chinese (zh)
Other versions
CN104965513A (en)
Inventor
张军
宋光明
杨茜
宋爱国
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Southeast University
Original Assignee
Southeast University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Southeast University filed Critical Southeast University
Priority to CN201510349751.9A priority Critical patent/CN104965513B/en
Publication of CN104965513A publication Critical patent/CN104965513A/en
Application granted granted Critical
Publication of CN104965513B publication Critical patent/CN104965513B/en
Expired - Fee Related legal-status Critical Current
Anticipated expiration legal-status Critical


Abstract

The invention discloses a sub hopping robot recovery system and recovery method based on dynamic cooperation between a mother robot and a sub hopping robot. The system comprises the sub hopping robot, the mother robot, and a method for detecting the position and pose of the sub hopping robot through their dynamic cooperation. The head of the sub hopping robot carries a marker for visual detection. The mother robot acquires the marker's color and depth data through an RGB-D sensor; an image processing module extracts the marker's color and contour information; the azimuth of the sub hopping robot is calculated so that the mother robot can adjust its own orientation; the yaw angle of the sub hopping robot relative to the mother robot is obtained from the marker's ellipse parameters and color, and the sub hopping robot is wirelessly commanded to adjust its heading; and the depth data give the distance between the two robots, providing the information needed to adjust their relative position. The invention solves the difficult problem of short-range pose recognition of a miniature hopping robot during the carrying process and provides technical support for the automatic recovery of this class of robot.

Description

Recovery system and recovery method for a sub hopping robot
Technical field
The present invention relates to the technical fields of multi-robot systems, hopping robots and sensors, and in particular to a sub hopping robot pose recognition method based on dynamic mother-child robot cooperation.
Background technology
Microsensor technology and wireless communication between nodes give wireless sensor networks enormous application value, and they are already widely used in fields such as military reconnaissance and modern agriculture. Sensing nodes usually operate in harsh or even dangerous remote environments that people cannot approach, so manual deployment is difficult, and if nodes are scattered onto the ground by an aerial robot their landing points are highly random. Using multiple mobile robots that carry sensing nodes as mobile sensor nodes solves this problem: teams of miniature mobile robots can perform information gathering, environmental monitoring and temporary mobile-network construction in narrow spaces. After disasters such as earthquakes and building collapses the ground in the rugged environment is fractured; wheeled and tracked robots can hardly enter the site to search for signs of life, whereas a miniature hopping robot can clear obstacles by jumping and enter the ruins. Patent CN201210003779.3 proposes a single-motor-driven mechanism for a miniature hopping robot with self-righting, take-off direction and take-off angle adjustment, which achieves continuous hopping, but a miniature hopping robot has comparatively low locomotion efficiency and short endurance and cannot work continuously for long periods or over long distances, so it must be combined with a wheeled carrier robot or one using another locomotion mode. The carrier robot transports several miniature hopping robots to the vicinity of the destination, and the hopping robots then disperse and execute their assigned tasks. However, because the sensing and control resources of a miniature hopping robot are limited and its hopping motion over the ground is discontinuous, recovering the miniature hopping robots after the task is completed is a key difficulty.
Patent CN201310516094.3 proposes a method in which a child robot enters a cabin under the guidance of the mother robot's vision: the mother robot's camera observes static patterns on the side and top of the child robot and uses them to decide how to guide the child robot into the cabin. That patent, however, relies mainly on static patterns as markers.
Summary of the invention
The technical problem to be solved by the invention is the difficulty of recovering miniature hopping robot sensing nodes. The invention provides a system and method for quickly detecting and adjusting the orientation of a sub hopping robot so that the child robot can be recovered in a timely and effective manner.
To solve the above technical problem, the invention adopts the following technical solution:
A recovery system for a sub hopping robot comprises: an image acquisition device, a ranging sensor, an image processing module, an orientation recognition unit, a mother control unit and a wireless communication gateway arranged on a mother robot, and a marker, a wireless communication node and a sub control unit arranged on the sub hopping robot. The marker is circular and its two faces carry different visual features; a servomotor connected to the marker controls its rotation. The image acquisition device is used to recognize the marker on the sub hopping robot. The ranging sensor is used to measure the distance of the sub hopping robot from the mother robot. The image processing module processes the marker shape captured by the image acquisition device to obtain the ellipse center coordinates, the ellipse horizontal axis and the ellipse vertical axis, and processes the marker color information to obtain the visual feature of the visible face. From the ellipse center coordinates produced by the image processing module, the orientation recognition unit identifies where the sub hopping robot lies in the acquisition area of the image acquisition device and obtains the orientation coefficient of the sub hopping robot relative to the mother robot; from the ellipse horizontal axis, the ellipse vertical axis and the visible-face visual feature it recognizes the azimuth of the mother robot relative to the sub hopping robot. The mother control unit uses the orientation coefficient obtained by the orientation recognition unit to control the mother robot to adjust its orientation so that the sub hopping robot sits at the middle of the acquisition area of the image acquisition device. The wireless communication gateway is used to send the azimuth information obtained by the orientation recognition unit to the sub hopping robot; the wireless communication node is used to receive the azimuth information sent by the mother robot's wireless communication gateway; and the sub control unit controls the heading of the sub hopping robot according to the azimuth received by the wireless communication node so that the sub hopping robot squarely faces the mother robot.
The visual feature is color or brightness.
The head of the hopping robot of the invention is approximately a cuboid block. Its top carries a disc-shaped marker whose color and brightness are adjustable; driven by a stepper motor, the marker can rotate 360° about a vertical axis. For visual discrimination, the two faces of the marker disc are given different colors so that the mother robot can detect and recognize them.
In the recovery system of the invention, the image acquisition device mounted at the tail of the mother robot faces the rear of the mother robot, and the marker mounted on the head of the sub hopping robot faces straight ahead of the sub hopping robot; the marker disc is parallel to the front face of the sub hopping robot. The color of the marker disc side nearest the sub hopping robot's front face, i.e. the front side, is defined as color one, and the color of the other side, the back side, is color two. The sub hopping robot may be located anywhere around the mother robot, with an arbitrary heading angle.
The image acquisition device is a color camera, and the ranging sensor is an infrared depth camera.
A recovery method for a sub hopping robot comprises the following steps:
Step 1: the mother robot slowly rotates its own heading in place and intercepts an image sequence from the video stream produced by the image acquisition device;

Step 2: the image processing module of the mother robot parses the image sequence intercepted in step 1 in real time. When the marker of the sub hopping robot appears in an image, the mother robot stops rotating and, using the circular or elliptical shape feature of the marker, obtains the four parameters x, y, a, b of the marker's elliptical form, where x, y are the ellipse center coordinates, a is the ellipse horizontal axis and b is the ellipse vertical axis;

Step 3: from the coordinates (x, y) of the ellipse center in the image plane obtained in step 2, the orientation recognition unit calculates the orientation coefficient of the sub hopping robot relative to the mother robot;

Step 4: the mother robot pivots in place according to the orientation coefficient obtained in step 3 to adjust its own orientation until the sub hopping robot lies at the center of the image; the mother robot then squarely faces the sub hopping robot;

Step 5: using the ellipse horizontal and vertical axis parameters obtained in step 2 and the color of the visible face, the mother robot guides the sub hopping robot to adjust its heading so that the front face of the sub hopping robot squarely faces the mother robot;

Step 6: according to the distance between the sub hopping robot and the mother robot obtained by the ranging sensor and the jump distance and height of the sub hopping robot, the mother robot is controlled to move to within the jump recovery distance of the sub hopping robot;

Step 7: the sub hopping robot is commanded to jump into the mother robot's recycling box. (An illustrative outline of this sequence is sketched below.)
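The following Python sketch is one possible way to organize the seven steps above as a single loop. Every callable taken as a parameter (grab_frame, detect_marker, pivot_mother, command_sub_heading, read_distance_m, drive_mother, command_jump) is a hypothetical placeholder for a driver or vision routine the patent leaves abstract, and the gains, tolerances and jump range are assumptions, not values from the patent.

```python
# Hypothetical outline of the recovery sequence (steps 1-7); none of these names
# come from the patent itself.
def recover_sub_robot(grab_frame, detect_marker, pivot_mother, command_sub_heading,
                      read_distance_m, drive_mother, command_jump,
                      image_width=640, jump_range_m=(0.3, 0.8)):
    # Steps 1-2: rotate slowly in place until the marker appears in a frame.
    ellipse = detect_marker(grab_frame())          # (x, y, a, b) or None
    while ellipse is None:
        pivot_mother(0.1)                          # small in-place turn, radians
        ellipse = detect_marker(grab_frame())
    # Steps 3-4: pivot until the marker is centered, i.e. D = (2x - w)/w is near zero.
    d = (2.0 * ellipse[0] - image_width) / image_width
    while abs(d) > 0.05:
        pivot_mother(0.2 * d)                      # turn sign and gain are assumptions
        ellipse = detect_marker(grab_frame()) or ellipse
        d = (2.0 * ellipse[0] - image_width) / image_width
    # Step 5: command the sub robot to turn until its front face points at the mother.
    command_sub_heading(ellipse)
    # Steps 6-7: read the depth distance, close to within jump range, then recover.
    z = read_distance_m()
    if z > jump_range_m[1]:
        drive_mother(z - jump_range_m[1])          # too far: move toward the sub robot
    elif z < jump_range_m[0]:
        drive_mother(-(jump_range_m[0] - z))       # too close: back away
    command_jump()                                 # sub robot jumps into the recycling box
```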
The procedure for obtaining the four parameters x, y, a, b of the marker's elliptical form in step 2 is:

Step 21: extract the marker color from the intercepted image by color recognition to locate the region of interest; binarize the region-of-interest image with a suitable threshold, then apply smoothing filtering and edge detection;

Step 22: extract contours from the resulting edge image and, for the different viewing directions of the image acquisition device, run a randomized Hough circle transform to detect the sequence of ellipses in the binary image;

Step 23: choose a threshold and filter out elliptical contours that do not meet it;

Step 24: the randomized Hough circle transform determines the position and shape of the ellipse, and the four ellipse parameters x, y, a, b are obtained from the parameter set, where x, y are the ellipse center coordinates, a is the ellipse horizontal axis and b is the ellipse vertical axis.
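As one concrete realization of steps 21-24, the sketch below uses OpenCV. It substitutes a contour bounding-box fit for the randomized Hough circle transform named in the text, and the HSV color bounds for the marker are illustrative assumptions only.

```python
# Sketch of steps 21-24 with OpenCV. cv2.boundingRect on the best contour stands in
# for the randomized Hough circle transform; the color bounds are assumptions.
import cv2
import numpy as np

def detect_marker_ellipse(frame_bgr, hsv_lo=(0, 120, 80), hsv_hi=(10, 255, 255)):
    """Return (x, y, a, b): ellipse center plus horizontal/vertical semi-axes, or None."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, np.array(hsv_lo), np.array(hsv_hi))        # step 21: color ROI
    mask = cv2.medianBlur(mask, 5)                                     # smoothing filter
    edges = cv2.Canny(mask, 50, 150)                                   # edge detection
    contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_NONE)              # step 22: contours
    best = None
    for c in contours:
        if len(c) < 10 or cv2.contourArea(c) < 100:                    # step 23: filter
            continue
        bx, by, bw, bh = cv2.boundingRect(c)                           # step 24: fit extents
        x, y = bx + bw / 2.0, by + bh / 2.0                            # ellipse center (x, y)
        a, b = bw / 2.0, bh / 2.0                                      # horizontal / vertical semi-axes
        if best is None or b > best[3]:                                # keep the tallest candidate
            best = (x, y, a, b)
    return best
```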
The orientation coefficient D of the sub hopping robot relative to the mother robot in step 3 is defined as:

D = (2x - w)/w, where w is the image width;

When D ≈ -1, the sub hopping robot appears near the left border of the color camera's field of view;

When D ≈ 1, the sub hopping robot appears near the right border of the color camera's field of view;

When D ≈ 0, the sub hopping robot appears at the center of the color camera's field of view;

When -1 < D < 0, the sub hopping robot is in the left half of the color camera's field of view; when 0 < D < 1, it is in the right half.
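A direct transcription of this definition, assuming only that x is the ellipse center column in pixels and w the image width; the tolerance used to decide "center" is an assumption:

```python
# The orientation coefficient D = (2x - w) / w and its interpretation as stated above.
def orientation_coefficient(x: float, w: int) -> float:
    return (2.0 * x - w) / w     # -1 at the left border, 0 at the center, +1 at the right border

def region_of_view(d: float, tol: float = 0.05) -> str:
    if abs(d) <= tol:
        return "center of the field of view"     # D ~ 0: mother robot faces the sub robot
    return "left half of the field of view" if d < 0 else "right half of the field of view"
```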
The method for adjusting the heading of the sub hopping robot in step 5 is:

At the initial position, the marker disc is parallel to the sub hopping robot's front face:

If a = b, the contour extracted in step 2 is a circle and the marker squarely faces the image acquisition device. The color of the visible face then indicates which face of the marker faces the image acquisition device: if it is the front of the marker, the heading of the sub hopping robot is not adjusted; if it is the back, the sub hopping robot is rotated by the angle π;

If a ≈ 0, the front face of the sub hopping robot is perpendicular to the rear face of the mother robot. The marker of the sub hopping robot is rotated clockwise by a certain angle; if a < b is then detected together with color one, the sub hopping robot is facing to the left of the mother robot, and the sub hopping robot is commanded to adjust its heading clockwise by π/2; if a < b is detected together with color two, the sub hopping robot is facing to the right of the mother robot, and it is commanded to adjust its heading counterclockwise by π/2;

If 0 < a < b, the marker contour is an ellipse; denote the current value of a by a0.

(1) If the visible face is color one, the yaw angle of the sub hopping robot relative to the mother robot is θ = arccos(a0/b). The marker of the sub hopping robot is then rotated clockwise by a certain angle and the mother robot detects a again: if a increases, the front face of the sub hopping robot points toward the mother robot's left rear, and the sub hopping robot is commanded to adjust its heading clockwise by the angle θ; if a decreases, the front face points toward the mother robot's right rear, and the sub hopping robot is commanded to adjust its heading counterclockwise by the angle θ;

(2) If the visible face is color two, the yaw angle of the sub hopping robot relative to the mother robot is θ = π - arccos(a0/b). The marker of the sub hopping robot is then rotated clockwise by a certain angle and the mother robot detects a again: if a increases, the front face of the sub hopping robot points away from the mother robot's left rear, and the sub hopping robot is commanded to adjust its heading counterclockwise by the angle θ; if a decreases, the front face points toward the mother robot's right rear, and the sub hopping robot is commanded to adjust its heading clockwise by the angle θ.
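A compact sketch of the yaw estimate and turn decision described in step 5. It assumes a0 and a1 are the horizontal semi-axis measured before and after the marker disc's clockwise test rotation, b the vertical semi-axis, and front_face True when color one is seen; the circle-detection tolerance is an assumption.

```python
# Yaw-angle estimate and heading correction for step 5. Positive turn = clockwise
# (top view); a0/a1: horizontal semi-axis before/after the marker's test rotation.
import math

def heading_correction(a0: float, a1: float, b: float, front_face: bool):
    """Return (yaw_rad, turn_rad) of the sub robot relative to the mother robot."""
    if abs(a0 - b) / b < 0.05:                        # circle: marker faces the camera
        return 0.0, (0.0 if front_face else math.pi)  # front face -> no turn, back face -> turn pi
    theta = math.acos(min(a0 / b, 1.0))               # theta = arccos(a0 / b)
    if not front_face:
        theta = math.pi - theta                       # color two: back face visible
    a_grew = a1 > a0                                  # disambiguates left rear vs right rear
    if front_face:
        return theta, (theta if a_grew else -theta)   # color one: cw if a grew, ccw otherwise
    return theta, (-theta if a_grew else theta)       # color two: ccw if a grew, cw otherwise
```

Note that the a ≈ 0 case above falls out of the same formula, since arccos(0) = π/2 matches the π/2 corrections listed.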
Beneficial effects:

The invention proposes a child robot pose recognition method based on dynamic mother-child robot cooperation that identifies the pose of the hopping child robot from color and depth information. By analyzing where the sub hopping robot appears in the image acquired by the mother robot, the heading angle of the mother robot relative to the child robot is obtained; by dynamically adjusting the hopping child robot's marker and moving cooperatively with the wheeled mother robot, the heading angle of the sub hopping robot relative to the mother robot is detected and adjusted; and the depth information gives the distance of the sub hopping robot from the mother robot. Autonomous adjustment of heading angle and distance lays the foundation for carrying and recovering the hopping robot. The invention is simple, reliable and adapts well to the environment.
Description of the drawings
Fig. 1 is a schematic diagram of the mother-child multi-robot system of the invention;

Fig. 2 is a block diagram of the mother robot system;

Fig. 3 is a left-front view of the structure of the sub hopping robot of the invention;

Fig. 4 is a right-front view of the structure of the sub hopping robot of the invention;

Fig. 5 is an assembly diagram of the marker and motor of the sub hopping robot of the invention;

Fig. 6 is a schematic diagram (top view) of possible positions of the sub hopping robot in front of the mother robot's camera;

Fig. 7 is a schematic diagram (top view) of the viewing angles of the sub hopping robot directly in front of the mother robot's camera;

Fig. 8 is a flow chart of the sub hopping robot recovery method of the mother-child multi-robot system of the invention;

Fig. 9 is a flow chart of detecting and adjusting the heading angle of the sub hopping robot relative to the mother robot according to the invention.
Specific embodiment
The invention is further described in detail below with reference to the accompanying drawings and an embodiment.
Embodiment: with reference to Fig. 1, a sub hopping robot recovery system based on dynamic mother-child robot cooperation comprises a mother robot 1 used for carrying and transport and a sub hopping robot 2 that is the object to be recovered. The tail of the mother robot 1 carries a recycling box 1-8 with an open top for receiving the sub hopping robot 2 after it jumps in. The tail edge of the mother robot 1 is fitted with an image acquisition device 1-1 and a ranging sensor 1-2 whose mounting height is comparable to the height of the sub hopping robot; they are used to acquire images of the surroundings of the mother robot 1 and the distance between it and the sub hopping robot 2. The mother robot 1 has ample energy supply, relatively high motion accuracy and strong computing power; the sub hopping robot 2 is small and light, its sensing and computing resources are limited, and it cooperates with the mother robot 1 through wireless communication. The head of the sub hopping robot 2 is approximately a cuboid block whose top carries a disc-shaped marker 2-5 with adjustable color and brightness; driven by a stepper motor it can rotate 360° about a vertical axis, and the marker 2-5 is used for detection and recognition by the mother robot 1. The mother robot 1 can convert the recognized pose of the sub hopping robot 2 and the detected relative position into motion control commands fed back to the sub hopping robot 2 while adjusting its own azimuth and position.
With reference to Fig. 2, the mother robot 1 consists of the image acquisition device 1-1, the ranging sensor 1-2, an image processing module 1-3, an orientation recognition unit 1-4, a mother control unit 1-5, a wireless communication gateway 1-6, a power management system 1-7 and the recycling box 1-8. The mother robot 1 has relatively high motion accuracy, and its differential drive provides the basic motion functions of moving forward, moving backward and turning. The image acquisition device 1-1 is used to recognize the marker 2-5 on the sub hopping robot 2; the ranging sensor 1-2 measures the distance of the sub hopping robot 2 from the mother robot 1. The image processing module 1-3 processes the marker shape captured by the image acquisition device 1-1 to obtain the ellipse center coordinates, the ellipse horizontal axis and the ellipse vertical axis, and processes the marker color information to obtain the visual feature of the visible face. From the ellipse center coordinates produced by the image processing module 1-3, the orientation recognition unit 1-4 identifies where the sub hopping robot 2 lies in the acquisition area of the image acquisition device 1-1 and obtains the orientation coefficient of the sub hopping robot 2 relative to the mother robot 1; from the ellipse horizontal axis, the ellipse vertical axis and the visible-face visual feature it recognizes the azimuth of the mother robot 1 relative to the sub hopping robot 2. The mother control unit 1-5 uses the orientation coefficient obtained by the orientation recognition unit 1-4 to adjust the orientation of the mother robot 1 so that the sub hopping robot 2 sits at the middle of the acquisition area of the image acquisition device 1-1. The wireless communication gateway 1-6 sends the azimuth information obtained by the orientation recognition unit 1-4 to the sub hopping robot 2.
With reference to Figs. 3, 4 and 5, the sub hopping robot 2 is the object to be recovered. The parts involved in the invention are the sub hopping robot front face 2-1, a steering mechanism 2-2, a wireless communication node 2-3, a sub control unit 2-4, the marker 2-5 and a motor 2-6 that drives the marker. For visual discrimination, the two faces of the marker 2-5 disc are given different colors. Changing the structural parameters of the sub hopping robot 2 adjusts its jump height and distance, and controlling the steering mechanism 2-2 changes the robot's heading. The wireless communication node 2-3 receives the azimuth information sent by the wireless communication gateway 1-6 of the mother robot 1; the sub control unit 2-4 controls the heading of the sub hopping robot 2 according to the azimuth received by the wireless communication node 2-3 so that the sub hopping robot 2 faces the mother robot 1.
The image acquisition device 1-1 and the ranging sensor 1-2 are fixed to the tail of the mother robot 1 through a pan-tilt head with a horizontal swing degree of freedom and a pitch-angle adjustment degree of freedom. The image acquisition device 1-1 is an ordinary color camera, and the ranging sensor 1-2 is an infrared depth camera. The color camera acquires information about the environment around the mother robot 1 and is used to extract the attitude and azimuth information of the hopping robot 2; the infrared depth camera obtains the distance between the mother robot 1 and the sub hopping robot 2.
With reference to Figs. 6, 7, 8 and 9, the image acquisition device 1-1 and the ranging sensor 1-2 mounted at the tail of the mother robot 1 face the rear of the mother robot 1, and the marker 2-5 mounted on the head of the sub hopping robot 2 faces straight ahead of the hopping robot; the marker 2-5 disc is parallel to the front face 2-1 of the sub hopping robot 2. The color of the marker disc side nearest the sub hopping robot front face 2-1 is defined as color one, and the color of the other side as color two. The steps of the sub hopping robot pose recognition and recovery method based on dynamic mother-child robot cooperation are as follows:
Step S1: the mother robot 1 rotates slowly in place to adjust its own heading and intercepts an image sequence from the video stream produced by the image acquisition device 1-1;

Step S2: the image processing module 1-3 of the mother robot 1 parses the image sequence intercepted in step S1 in real time and processes the intercepted images by color recognition. When the color of the marker 2-5 of the sub hopping robot 2 appears in an image, the color of the marker 2-5 is recorded, the mother robot 1 stops rotating, and the following sub-steps are carried out:

1. Extract the marker color from the intercepted image by color recognition to locate the region of interest; binarize the region-of-interest image with a suitable threshold, then apply smoothing filtering and edge detection;

2. Extract contours from the resulting edge image and, using the clearly circular or elliptical shape feature of the marker 2-5, run a randomized Hough circle transform to detect the sequence of ellipses in the binary image;

3. To reduce interference from the background environment, choose a suitable threshold from empirical values and filter out elliptical contours that do not meet it;

4. The randomized Hough circle transform determines the position and shape of the ellipse, and the coordinates (x, y) of the ellipse center in the image plane are obtained from the parameter set;

If, in step S2, the mother robot 1 has adjusted its heading through 360 degrees without detecting the color of the marker 2-5 of the sub hopping robot 2, the heading of the sub hopping robot 2 is commanded to change by a certain angle and steps S1 and S2 are repeated until the mother robot 1 detects the marker 2-5 of the sub hopping robot 2; a sketch of this search loop is given below;
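A sketch of the search loop in steps S1-S2, including the branch where the sub hopping robot is told to change heading after a fruitless full revolution. All four callables, the step angle and the retry angle are hypothetical placeholders rather than values from the patent.

```python
# Search loop for steps S1-S2. rotate_mother / grab_frame / detect_marker /
# command_sub_turn are placeholder callables supplied by the caller.
import math

def search_for_marker(grab_frame, detect_marker, rotate_mother, command_sub_turn,
                      step_rad=math.radians(10), retry_turn_rad=math.radians(45)):
    """Rotate in place until the marker is seen; return the detected (x, y, a, b)."""
    while True:
        swept = 0.0
        while swept < 2.0 * math.pi:               # step S1: slow in-place rotation
            ellipse = detect_marker(grab_frame())  # step S2: parse each intercepted frame
            if ellipse is not None:
                return ellipse                     # mother robot stops turning here
            rotate_mother(step_rad)
            swept += step_rad
        # Full revolution without a detection: ask the sub robot to change its
        # heading by some angle, then repeat steps S1 and S2.
        command_sub_turn(retry_turn_rad)
```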
Step S3: calculate the orientation coefficient of the sub hopping robot 2 relative to the mother robot 1 to obtain the direction of the sub hopping robot 2 relative to the RGB-D sensor 1-3-1 (at the tail of the mother robot 1):

Define the orientation coefficient D: D = (2x - w)/w, where w is the image width;

When D ≈ -1, the sub hopping robot 2 appears near the left border of the color camera's field of view;

When D ≈ 1, the sub hopping robot 2 appears near the right border of the color camera's field of view;

When D ≈ 0, the sub hopping robot 2 appears at the center of the color camera's field of view;

When -1 < D < 0, the sub hopping robot 2 is in the left half of the color camera's field of view; when 0 < D < 1, it is in the right half;

Step S4: the mother robot 1 pivots in place according to the obtained azimuth information to adjust its own orientation, acquiring one frame after each adjustment, until the image feedback shows D ≈ 0. The sub hopping robot 2 is then near the center of the color camera's field of view, i.e. the sub hopping robot is directly behind the tail of the mother robot, the color camera of the mother robot 1 squarely faces the sub hopping robot 2, and the sub hopping robot 2 lies at the center of the color camera's view;
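One way to realize the frame-by-frame pivoting of step S4, reusing the orientation coefficient from step S3. The turn gain, tolerance, step budget and sign convention are assumptions, and the callables are placeholders for the real drivers.

```python
# Step S4 sketch: pivot the mother robot one frame at a time until D ~ 0.
def center_on_marker(grab_frame, detect_marker, pivot_mother,
                     image_width, gain=0.2, tol=0.05, max_steps=200):
    for _ in range(max_steps):
        ellipse = detect_marker(grab_frame())      # acquire one frame per adjustment
        if ellipse is None:
            continue                               # marker momentarily lost: next frame
        d = (2.0 * ellipse[0] - image_width) / image_width   # step S3 coefficient
        if abs(d) <= tol:
            return True                            # sub robot is dead astern of the mother
        pivot_mother(gain * d)                     # turn sign depends on the drive layout (assumption)
    return False
```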
Step S5: detect the yaw angle of the sub hopping robot 2 relative to the mother robot 1 and guide the sub hopping robot 2 to adjust its heading:

The mother robot 1 detects the marker 2-5 disc of the sub hopping robot 2 again. Because the sub hopping robot 2 does not necessarily face the mother robot 1, eight viewing-angle cases can occur; under these eight viewing angles the marker may appear circular or elliptical, and the detected marker color may be color one or color two. The randomized Hough circle transform determines the position and shape of the ellipse, and the coordinates (x, y) of the ellipse center in the image plane, the ellipse horizontal axis a and the ellipse vertical axis b are obtained from the parameter set. Since the mounting height of the RGB-D camera 1-3-1 is close to the mounting height of the marker of the sub hopping robot 2, the ellipse vertical axis b is close to the diameter of the marker disc, so the yaw angle of the sub hopping robot 2 relative to the mother robot 1 can be calculated from the values of a and b and the color of the marker 2-5. Denote the value of a detected at the initial moment by a0. Viewing angle one: if a = b and the color is color one, the contour of the marker 2-5 is a circle and the front face 2-1 of the sub hopping robot 2 squarely faces the rear of the mother robot 1; no heading adjustment is needed;

Viewing angle two: if a = b and the color is color two, the contour of the marker 2-5 is a circle and the front face 2-1 of the sub hopping robot 2 faces away from the rear of the mother robot 1; the heading of the sub hopping robot 2 is adjusted by 180 degrees;

Viewing angle three: if a ≈ 0, the front face 2-1 of the sub hopping robot 2 is perpendicular to the rear face of the mother robot 1. The marker 2-5 of the sub hopping robot 2 is rotated clockwise (top view) by a certain angle; if a < b is then detected together with color one, the sub hopping robot 2 is facing to the left of the mother robot 1, and the sub hopping robot 2 is commanded to adjust its heading clockwise (top view) by 90 degrees;

Viewing angle four: if a ≈ 0, the front face 2-1 of the sub hopping robot 2 is perpendicular to the rear face of the mother robot 1. The marker 2-5 of the sub hopping robot 2 is rotated clockwise (top view) by a certain angle; if a < b is then detected together with color two, the sub hopping robot 2 is facing to the right of the mother robot 1, and the sub hopping robot 2 is commanded to adjust its heading counterclockwise (top view) by 90 degrees;

Viewing angle five: if 0 < a < b and the color is color one, the contour of the marker 2-5 is an ellipse and the yaw angle of the sub hopping robot relative to the mother robot is θ = arccos(a0/b). The marker 2-5 of the sub hopping robot 2 is rotated clockwise (top view) by a certain angle and the mother robot 1 detects a again; if a increases, the front face 2-1 of the sub hopping robot 2 points toward the mother robot 1's left rear, and the sub hopping robot 2 is commanded to adjust its heading clockwise (top view) by the angle θ;

Viewing angle six: if 0 < a < b and the color is color one, the marker contour is an ellipse and the yaw angle of the sub hopping robot relative to the mother robot is θ = arccos(a0/b). The marker 2-5 of the sub hopping robot 2 is rotated clockwise (top view) by a certain angle and the mother robot 1 detects a again; if a decreases, the front face 2-1 of the sub hopping robot 2 points toward the mother robot 1's right rear, and the sub hopping robot 2 is commanded to adjust its heading counterclockwise (top view) by the angle θ;

Viewing angle seven: if 0 < a < b and the color is color two, the contour of the marker 2-5 is an ellipse and the yaw angle of the sub hopping robot relative to the mother robot is θ = π - arccos(a0/b). The marker 2-5 of the sub hopping robot 2 is rotated clockwise (top view) by a certain angle and the mother robot 1 detects a again; if a increases, the front face 2-1 of the sub hopping robot 2 faces away from the mother robot 1's left rear, and the sub hopping robot 2 is commanded to adjust its heading counterclockwise (top view) by the angle θ;

Viewing angle eight: if 0 < a < b and the color is color two, the contour of the marker 2-5 is an ellipse and the yaw angle of the sub hopping robot relative to the mother robot is θ = π - arccos(a0/b). The marker 2-5 of the sub hopping robot 2 is rotated clockwise (top view) by a certain angle and the mother robot 1 detects a again; if a decreases, the front face 2-1 of the sub hopping robot 2 faces away from the mother robot 1's right rear, and the sub hopping robot 2 is commanded to adjust its heading clockwise (top view) by the angle θ.

Step S6: after step S5 the front face of the sub hopping robot 2 and the marker 2-5 disc both face the ranging sensor 1-2 of the mother robot 1. The ranging sensor 1-2 acquires a depth map of the direction it faces. A coordinate transformation is performed between the coordinate system of the image acquisition device 1-1 and that of the ranging sensor 1-2: the marker 2-5 center coordinates (x, y) are mapped into the three-dimensional coordinate system of the ranging sensor 1-2 to obtain (x', y', z), where z is the perpendicular distance from the marker 2-5 disc to the plane of the ranging sensor 1-2; through this correspondence z can be converted into the distance between the sub hopping robot 2 and the tail of the mother robot 1;
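A minimal sketch of the distance read-out in step S6, assuming the depth image has already been registered to the color image so that the marker center pixel can be looked up directly; otherwise the extrinsic transform between the two sensors would have to be applied first, as the text describes.

```python
# Step S6 sketch: read the marker distance z from a depth image registered to the
# color image. Registration to the same pixel grid is an assumption of this sketch.
import numpy as np

def marker_distance_m(depth_m: np.ndarray, x: float, y: float, half_win: int = 3) -> float:
    """Median depth (meters) in a small window around the marker center (x, y)."""
    h, w = depth_m.shape
    u = min(max(int(round(x)), 0), w - 1)
    v = min(max(int(round(y)), 0), h - 1)
    win = depth_m[max(0, v - half_win):v + half_win + 1,
                  max(0, u - half_win):u + half_win + 1]
    valid = win[np.isfinite(win) & (win > 0)]      # drop missing depth returns
    return float(np.median(valid)) if valid.size else float("nan")
```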
Step S7: according to the value of z and the jump distance actually attainable by the sub hopping robot 2, the mother robot 1 moves straight forward or backward until the sub hopping robot 2 is within a distance from which it can jump into the recycling box 1-8 of the mother robot 1; the sub hopping robot 2 then prepares to take off;
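A small sketch of the straight-line approach in step S7; the jump window limits and the drive convention (positive = toward the sub robot) are assumptions, and drive_mother is a placeholder for the real motion driver.

```python
# Step S7 sketch: move the mother robot straight forward or backward until z falls
# inside the sub robot's attainable jump window.
def approach_within_jump_range(z_m: float, drive_mother,
                               jump_min_m: float = 0.3, jump_max_m: float = 0.8):
    if z_m > jump_max_m:
        drive_mother(z_m - jump_max_m)         # too far: close the gap
    elif z_m < jump_min_m:
        drive_mother(-(jump_min_m - z_m))      # too close: back away
    # otherwise z is already inside the window; the sub robot prepares to take off
```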
Step S8: the sub hopping robot 2 jumps into the open-topped recycling box 1-8.

Claims (7)

1. A recovery system for a sub hopping robot, comprising: an image acquisition device, a ranging sensor, an image processing module, an orientation recognition unit, a mother control unit and a wireless communication gateway arranged on a mother robot, and a marker, a wireless communication node and a sub control unit arranged on the sub hopping robot, characterized in that: the marker is circular and its two faces have different visual features, and a servomotor that controls the marker to rotate 360° about a vertical axis is connected to the marker; the image acquisition device is used to recognize the marker on the sub hopping robot; the ranging sensor is used to measure the distance of the sub hopping robot from the mother robot; the image processing module processes the marker shape captured by the image acquisition device to obtain ellipse center coordinates, an ellipse horizontal axis and an ellipse vertical axis; the orientation recognition unit recognizes, from the ellipse center coordinates obtained by the image processing module, the position of the sub hopping robot in the acquisition area of the image acquisition device, and recognizes, from the ellipse horizontal axis and ellipse vertical axis obtained by the image processing module, the azimuth of the sub hopping robot relative to the mother robot; the mother control unit controls the mother robot according to the position obtained by the orientation recognition unit to adjust its orientation so that the sub hopping robot squarely faces the middle of the acquisition area of the image acquisition device; the wireless communication gateway is used to send the azimuth information obtained by the image processing module to the sub hopping robot; the wireless communication node is used to receive the azimuth information sent by the mother robot's wireless communication gateway; and the sub control unit controls the heading of the sub hopping robot according to the azimuth received by the wireless communication node so that it squarely faces the mother robot.
2. The recovery system for a sub hopping robot according to claim 1, characterized in that: the visual feature is color or brightness.
3. The recovery system for a sub hopping robot according to claim 1 or 2, characterized in that: the image acquisition device is a color camera and the ranging sensor is an infrared depth camera.
4. A recovery method based on the recovery system according to claim 1, characterized by comprising the following steps:
Step 1: the mother robot slowly rotates its own heading in place and intercepts an image sequence from the video stream produced by the image acquisition device;

Step 2: the image processing module of the mother robot parses the image sequence intercepted in step 1 in real time; when the marker of the sub hopping robot appears in an image, the mother robot stops rotating and, using the circular or elliptical shape feature of the marker, obtains the four parameters x, y, a, b of the marker's elliptical form, where x, y are the ellipse center coordinates, a is the ellipse horizontal axis and b is the ellipse vertical axis;

Step 3: the coordinates (x, y) of the ellipse center in the image plane obtained by parsing in step 2 give the direction of the sub hopping robot relative to the mother robot;

Step 4: the mother robot pivots in place according to the direction information obtained in step 3 to adjust its own orientation so that the sub hopping robot lies at the center of the mother robot's view;

Step 5: the mother robot guides the sub hopping robot to adjust its heading according to the ellipse horizontal and vertical axis parameters obtained in step 2, so that the front face of the sub hopping robot squarely faces the mother robot;

Step 6: according to the distance between the sub hopping robot and the mother robot obtained by the ranging sensor and the jump distance of the hopping robot, the mother robot is controlled to move to within the jump recovery distance of the sub hopping robot;

Step 7: the sub hopping robot is controlled to jump into the mother robot's recycling box.
5. The recovery method according to claim 4, characterized in that the procedure for obtaining the four parameters x, y, a, b of the marker's elliptical form in step 2 is:

Step 21: extract the marker color from the intercepted image by color recognition to locate the region of interest; binarize the region-of-interest image with a suitable threshold, then apply smoothing filtering and edge detection;

Step 22: extract contours from the resulting edge image and, under the different viewing directions of the image acquisition device, run a randomized Hough circle transform to detect the sequence of ellipses in the binary image;

Step 23: choose a threshold and filter out elliptical contours that do not meet it;

Step 24: the randomized Hough circle transform determines the position and shape of the ellipse, and the four ellipse parameters x, y, a, b are obtained from the parameter set, where x, y are the ellipse center coordinates, a is the ellipse horizontal axis and b is the ellipse vertical axis.
6. The recovery method according to claim 4, characterized in that the method for obtaining the direction of the sub hopping robot relative to the mother robot in step 3 is:

Define the orientation coefficient D: D = (2x - w)/w, where w is the image width;

When D ≈ -1, the sub hopping robot appears near the left border of the color camera's field of view;

When D ≈ 1, the sub hopping robot appears near the right border of the color camera's field of view;

When D ≈ 0, the sub hopping robot appears at the center of the color camera's field of view;

When -1 < D < 0, the sub hopping robot is on the left side of the color camera's field of view; when 0 < D < 1, the sub hopping robot is on the right side of the color camera's field of view.
7. The recovery method according to claim 4, characterized in that the method for adjusting the heading of the sub hopping robot in step 5 is:

At the initial position, the marker disc is parallel to the sub hopping robot's front face:

If a = b, the contour extracted in step 2 is a circle and the marker squarely faces the image acquisition device. The color of the visible face indicates which face of the marker faces the image acquisition device: if it is the front of the marker, the heading of the sub hopping robot is not adjusted; if it is the back, the sub hopping robot is rotated by the angle π;

If a ≈ 0, the front face of the sub hopping robot is perpendicular to the rear face of the mother robot. The marker of the sub hopping robot is rotated clockwise by a certain angle; if a < b is then detected together with color one, the sub hopping robot is facing to the left of the mother robot, and the sub hopping robot is commanded to adjust its heading clockwise by π/2; if a < b is detected together with color two, the sub hopping robot is facing to the right of the mother robot, and it is commanded to adjust its heading counterclockwise by π/2;

If 0 < a < b, the marker contour is an ellipse; denote the current value of a by a0.

(1) If the visible face is color one, the yaw angle of the sub hopping robot relative to the mother robot is θ = arccos(a0/b). The marker of the sub hopping robot is then rotated clockwise by a certain angle and the mother robot detects a again: if a increases, the front face of the sub hopping robot points toward the mother robot's left rear, and the sub hopping robot is commanded to adjust its heading clockwise by the angle θ; if a decreases, the front face points toward the mother robot's right rear, and the sub hopping robot is commanded to adjust its heading counterclockwise by the angle θ;

(2) If the visible face is color two, the yaw angle of the sub hopping robot relative to the mother robot is θ = π - arccos(a0/b). The marker of the sub hopping robot is then rotated clockwise by a certain angle and the mother robot detects a again: if a increases, the front face of the sub hopping robot faces away from the mother robot's left rear, and the sub hopping robot is commanded to adjust its heading counterclockwise by the angle θ; if a decreases, the front face points toward the mother robot's right rear, and the sub hopping robot is commanded to adjust its heading clockwise by the angle θ.
CN201510349751.9A 2015-06-23 2015-06-23 Son hopping robot recovery system and recovery method Expired - Fee Related CN104965513B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201510349751.9A CN104965513B (en) 2015-06-23 2015-06-23 Son hopping robot recovery system and recovery method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201510349751.9A CN104965513B (en) 2015-06-23 2015-06-23 Son hopping robot recovery system and recovery method

Publications (2)

Publication Number Publication Date
CN104965513A CN104965513A (en) 2015-10-07
CN104965513B true CN104965513B (en) 2017-05-17

Family

ID=54219552

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201510349751.9A Expired - Fee Related CN104965513B (en) 2015-06-23 2015-06-23 Son hopping robot recovery system and recovery method

Country Status (1)

Country Link
CN (1) CN104965513B (en)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105665970B (en) * 2016-03-01 2018-06-22 中国科学院自动化研究所 For the path point automatic creation system and method for welding robot
CN107479544A (en) * 2016-06-08 2017-12-15 科沃斯机器人股份有限公司 Mother and sons' machine cooperative operation system and its method of work
CN107089275B (en) * 2017-03-27 2019-03-26 西北工业大学 It is a kind of aerial posture adjustment and the sufficient roll-type interval hopping robot of energy regenerating to be landed
CN109955265A (en) * 2019-03-08 2019-07-02 武汉理工大学 A kind of indoor range complex intelligence shell case cleaning robot
CN112515541B (en) * 2019-09-17 2022-02-15 佛山市云米电器科技有限公司 Cleaning method and system based on mother-child linkage type floor sweeping robot
CN113998021B (en) * 2021-12-10 2023-10-03 东南大学 Bionic search and rescue robot and space self-deployment method

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102353355A (en) * 2011-06-14 2012-02-15 西安工程大学 Method for measuring power transmission line pole and tower inclination based on video differences
CN102556193A (en) * 2012-01-09 2012-07-11 东南大学 Hopping robot capable of hopping continuously
CN103522304A (en) * 2013-10-28 2014-01-22 中国科学院自动化研究所 Capsule entry method of slave robots based on master robot vision
CN103593849A (en) * 2013-11-26 2014-02-19 北京建筑大学 Method for quickly recognizing and tracking image sequence oval artificial target points
CN104180808A (en) * 2014-08-05 2014-12-03 南京航空航天大学 Aerial autonomous refueling circular taper sleeve vision position and attitude resolving method

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101857122B1 (en) * 2010-12-17 2018-05-14 한국전자통신연구원 Method and system for providing seamless localization

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102353355A (en) * 2011-06-14 2012-02-15 西安工程大学 Method for measuring power transmission line pole and tower inclination based on video differences
CN102556193A (en) * 2012-01-09 2012-07-11 东南大学 Hopping robot capable of hopping continuously
CN103522304A (en) * 2013-10-28 2014-01-22 中国科学院自动化研究所 Capsule entry method of slave robots based on master robot vision
CN103593849A (en) * 2013-11-26 2014-02-19 北京建筑大学 Method for quickly recognizing and tracking image sequence oval artificial target points
CN104180808A (en) * 2014-08-05 2014-12-03 南京航空航天大学 Aerial autonomous refueling circular taper sleeve vision position and attitude resolving method

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
A bio-inspired jumping robot: Modeling, simulation, design, and experimental results; Jun Zhang et al.; Mechatronics; 2013; Vol. 23; pp. 1123-1140 *
Aerial Posture Adjustment of a Bio-Inspired Jumping Robot for Safe Landing: Modeling and Simulation; Jun Zhang et al.; Proceedings of the 2014 IEEE International Conference on Robotics and Biomimetics; 2014; pp. 968-973 *
Self-Righting, Steering and Takeoff Angle Adjusting for a Jumping Robot; Jun Zhang et al.; 2012 IEEE/RSJ International Conference on Intelligent Robots and Systems; 2012; pp. 2089-2094 *
A sub-robot recovery method based on mother-robot visual guidance; Zhao Peng et al.; Journal of Huazhong University of Science and Technology (Natural Science Edition); October 2013; Vol. 41, Suppl. 1; pp. 429-435 *

Also Published As

Publication number Publication date
CN104965513A (en) 2015-10-07

Similar Documents

Publication Publication Date Title
CN104965513B (en) Son hopping robot recovery system and recovery method
CN105512628B (en) Vehicle environmental sensory perceptual system based on unmanned plane and method
CN110782481B (en) Unmanned ship intelligent decision-making method and system
CN105929850B (en) A kind of UAV system and method with lasting locking and tracking target capability
CN105892471B (en) Automatic driving method and apparatus
CN105318888B (en) Automatic driving vehicle paths planning method based on unmanned plane perception
McGee et al. Obstacle detection for small autonomous aircraft using sky segmentation
CN102944224B (en) Work method for automatic environmental perception systemfor remotely piloted vehicle
CN206691107U (en) Pilotless automobile system and automobile
CN108229366A (en) Deep learning vehicle-installed obstacle detection method based on radar and fusing image data
CN109885086B (en) Unmanned aerial vehicle vertical landing method based on composite polygonal mark guidance
CN106896353A (en) A kind of unmanned vehicle crossing detection method based on three-dimensional laser radar
CN104808685A (en) Vision auxiliary device and method for automatic landing of unmanned aerial vehicle
CN105787478A (en) Face direction change recognition method based on neural network and sensitivity parameter
CN111735445A (en) Monocular vision and IMU (inertial measurement Unit) integrated coal mine tunnel inspection robot system and navigation method
CN108106617A (en) A kind of unmanned plane automatic obstacle-avoiding method
CN105243664A (en) Vision-based wheeled mobile robot fast target tracking method
CN106774363A (en) UAV flight control system and method
CN211890820U (en) Air-ground cooperative intelligent inspection robot
CN112783181B (en) Multi-rotor unmanned aerial vehicle cluster vision landing method based on fuzzy control
CN110132060A (en) A kind of method of the interception unmanned plane of view-based access control model navigation
Jun et al. Autonomous driving system design for formula student driverless racecar
CN106155082A (en) A kind of unmanned plane bionic intelligence barrier-avoiding method based on light stream
CN109521780A (en) The control system and control method of remote operation vehicle
Chiu et al. Vision-only automatic flight control for small UAVs

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20170517

Termination date: 20210623

CF01 Termination of patent right due to non-payment of annual fee