CN104965513A - Son hopping robot recovery system and recovery method - Google Patents


Publication number
CN104965513A
Authority
CN
China
Prior art keywords: robot, sub, hopping, hopping robot, female
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201510349751.9A
Other languages
Chinese (zh)
Other versions
CN104965513B (en)
Inventor
张军
宋光明
杨茜
宋爱国
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Southeast University
Original Assignee
Southeast University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Southeast University
Priority to CN201510349751.9A
Publication of CN104965513A
Application granted
Publication of CN104965513B
Legal status: Expired - Fee Related
Anticipated expiration

Abstract

The invention discloses a recovery system and recovery method for a sub hopping robot based on dynamic cooperation between the sub (child) hopping robot and a mother robot, together with a method for detecting the position and attitude of the sub hopping robot. The head of the sub hopping robot carries a marker for visual detection. The mother robot acquires color and depth data of the marker through an RGB-D sensor; the marker's color and contour information are extracted by an image processing module; the azimuth of the sub hopping robot is computed and used to adjust the heading of the mother robot; the yaw angle of the sub hopping robot relative to the mother robot is obtained from the marker's ellipse parameters and color, and the sub hopping robot is wirelessly commanded to adjust its course; and the depth data yield the distance between the two robots, providing the information needed to adjust their relative position. The invention solves the difficult problem of short-range position and attitude recognition of a miniature hopping robot during carriage, and provides technical support for the automatic recovery of robots of this kind.

Description

Recovery system and recovery method for a sub hopping robot
Technical field
The present invention relates to the technical fields of multi-robot systems, hopping robots, and sensors, and in particular to a position-and-attitude recognition method for a sub hopping robot based on dynamic cooperation between a mother robot and a sub robot.
Background technology
Micro-sensor technology and inter-node wireless communication give wireless sensor networks enormous application value, and they are now widely used in fields such as military reconnaissance and modern agriculture. Sensing nodes usually operate in harsh or even dangerous remote environments that people cannot approach, so manual deployment is difficult, and if the nodes are dropped from an aerial robot the landing points are highly random. Using multiple mobile robots to carry sensing nodes as mobile sensor nodes solves this problem: a team of miniature mobile robots can work in narrow spaces to collect information, monitor the environment, and build temporary mobile networks. After disasters such as earthquakes or building collapses the ground is rugged and fractured; wheeled and tracked robots can hardly enter the site to search for signs of life, whereas a miniature hopping robot can jump over obstacles and enter the ruins. Patent CN201210003779.3 proposes a miniature hopping robot in which a single motor drives the self-righting, take-off direction, and take-off angle adjustment mechanisms, enabling continuous hopping. By comparison, however, a miniature hopping robot has low locomotion efficiency and short endurance and cannot work continuously over long times and distances, so it must be combined with a carrier robot that is wheeled or uses another locomotion mode. The carrier robot transports several miniature hopping robots near the destination, where they scatter and perform their assigned tasks. But because the sensing and control resources of a miniature hopping robot are limited and its hopping motion makes only intermittent ground contact, recovering the miniature hopping robots after the task is completed is a key difficulty.
Patent CN201310516094.3 proposes a method, based on the vision of a mother robot, for guiding a child robot into a cabin: the mother robot's camera collects static patterns on the side and top of the child robot and uses them to make guidance decisions. That patent, however, relies mainly on static patterns as markers.
Summary of the invention
The technical problem to be solved by the invention is the difficulty of recovering miniature hopping-robot sensing nodes. The invention provides a system and method that quickly detect and adjust the azimuth of a sub hopping robot, so that the child robot can be recovered promptly and effectively.
To solve the above technical problem, the invention adopts the following technical solution:
A recovery system for a sub hopping robot comprises: an image acquisition device, a distance-measuring sensor, an image processing module, an orientation recognition unit, a mother control unit, and a wireless communication gateway mounted on a mother robot; and a marker, a wireless communication node, and a sub control unit mounted on the sub hopping robot. It is characterized in that the marker is a disc whose two faces have different visual features, and the marker is connected to a servo motor that rotates it. The image acquisition device identifies the marker on the sub hopping robot. The distance-measuring sensor measures the distance from the sub hopping robot to the mother robot. The image processing module processes the marker shape captured by the image acquisition device to obtain the ellipse center coordinates, the ellipse transverse axis, and the ellipse longitudinal axis, and processes the marker's color information to obtain the surface visual feature. The orientation recognition unit uses the ellipse center coordinates from the image processing module to identify the position of the sub hopping robot within the image acquisition device's field of view and thus obtain the orientation coefficient of the sub hopping robot relative to the mother robot; it also uses the ellipse transverse axis, the ellipse longitudinal axis, and the surface visual feature to identify the azimuth of the mother robot relative to the sub hopping robot. The mother control unit, according to the orientation coefficient from the orientation recognition unit, adjusts the heading of the mother robot so that the sub hopping robot faces the center of the image acquisition device's field of view. The wireless communication gateway sends the azimuth information obtained by the orientation recognition unit to the sub hopping robot. The wireless communication node receives the azimuth information sent by the mother robot's wireless communication gateway. The sub control unit adjusts the course of the sub hopping robot according to the azimuth received by the wireless communication node, so that the sub hopping robot faces the mother robot.
The visual feature is color or brightness.
In the invention, the head of the hopping robot is approximately a rectangular block. A disc-shaped marker whose color and brightness are both adjustable is mounted on top of the head and can rotate 360° about the vertical axis under the drive of a stepper motor. To allow visual discrimination, the two faces of the marker disc are given different colors, so that the mother robot can detect and identify them.
In the recovery system of the invention, the image acquisition device mounted at the tail of the mother robot faces rearward, the marker mounted on the head of the hopping robot faces directly forward, and the marker disc is parallel to the front end face of the sub hopping robot. The color of the marker disc on the side near the front end face of the sub hopping robot, the front face, is defined as color one; the color of the other side, the reverse face, is color two. The sub hopping robot may be located anywhere around the mother robot, with an arbitrary heading angle.
The image acquisition device is a color camera, and the distance-measuring sensor is an infrared depth camera.
A recovery method for a sub hopping robot comprises the following steps:
Step 1: the mother robot slowly rotates its heading in place, and image sequences are captured from the video stream produced by the image acquisition device;
Step 2: the image processing module of the mother robot parses the image sequence captured in step 1 in real time. When the marker of the sub hopping robot appears in an image, the mother robot stops rotating, and the circular or elliptical shape of the marker is used to obtain the four ellipse parameters x, y, a, b, where (x, y) are the ellipse center coordinates, a is the ellipse transverse axis, and b is the ellipse longitudinal axis;
Step 3: the orientation recognition unit takes the coordinates (x, y) of the ellipse center in the image plane from step 2 and computes the orientation coefficient of the sub hopping robot relative to the mother robot;
Step 4: the mother robot turns in place according to the orientation coefficient obtained in step 3 until the sub hopping robot is at the center of the image, at which point the mother robot directly faces the sub hopping robot;
Step 5: according to the ellipse transverse-axis and longitudinal-axis parameters obtained in step 2 and the surface color, the mother robot guides the sub hopping robot to adjust its course so that the front end face of the sub hopping robot faces the mother robot;
Step 6: according to the distance between the sub hopping robot and the mother robot obtained from the distance-measuring sensor, and the hopping distance and height of the sub hopping robot, the mother robot is moved to within the recovery hopping distance of the sub hopping robot;
Step 7: the sub hopping robot is commanded to jump into the mother robot's recovery box.
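The seven steps above can be sketched as a simple phase machine. This is an illustration only: the phase names, the `next_phase` helper, and the 0.05 centering tolerance are assumptions for exposition, not from the patent, and the sensing inputs are abstracted into booleans.

```python
from enum import Enum, auto

class Phase(Enum):
    SEARCH = auto()       # steps 1-2: rotate mother robot, look for the marker
    CENTER = auto()       # steps 3-4: turn until orientation coefficient D ~ 0
    ALIGN_CHILD = auto()  # step 5: adjust child heading from (a, b, color)
    APPROACH = auto()     # step 6: close to within hop range using depth data
    RECOVER = auto()      # step 7: child hops into the recovery box
    DONE = auto()

def next_phase(phase: Phase, marker_found: bool, d: float,
               heading_ok: bool, in_range: bool) -> Phase:
    """Advance the recovery pipeline one decision at a time."""
    if phase is Phase.SEARCH:
        return Phase.CENTER if marker_found else Phase.SEARCH
    if phase is Phase.CENTER:
        return Phase.ALIGN_CHILD if abs(d) < 0.05 else Phase.CENTER
    if phase is Phase.ALIGN_CHILD:
        return Phase.APPROACH if heading_ok else Phase.ALIGN_CHILD
    if phase is Phase.APPROACH:
        return Phase.RECOVER if in_range else Phase.APPROACH
    return Phase.DONE
```

In a real controller each phase would also command the motors (rotate in place, turn the marker, drive forward); the sketch only captures the ordering of the steps.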
The processing steps for obtaining the four ellipse parameters x, y, a, b in step 2 are:
Step 21: based on color recognition, extract the marker color from the captured image to locate the region of interest; binarize the region-of-interest image with a suitable threshold; then apply smoothing filtering and edge detection;
Step 22: extract contours from the resulting edge image, and run the Randomized Hough circle transform on the binary image to detect the sequence of ellipses seen from the different viewing angles of the image acquisition device;
Step 23: select a threshold to filter out ellipse contours that do not meet the conditions;
Step 24: the Randomized Hough circle transform determines the ellipse position and shape, and the four ellipse parameters x, y, a, b are taken from the parameter set, where (x, y) are the ellipse center coordinates, a is the ellipse transverse axis, and b is the ellipse longitudinal axis.
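As a toy stand-in for the Randomized Hough step, which the patent relies on but does not detail, the four parameters x, y, a, b can be recovered from a clean, axis-aligned marker contour by simple extremal statistics. The function below is a sketch under those assumptions (noise-free boundary points, no rotation of the ellipse in the image plane), not the patent's detector:

```python
import math

def fit_axis_aligned_ellipse(points):
    """Estimate (x, y, a, b) for an axis-aligned ellipse from boundary
    points: center = mean of the points, semi-axes = maximal offsets."""
    n = len(points)
    cx = sum(p[0] for p in points) / n
    cy = sum(p[1] for p in points) / n
    a = max(abs(p[0] - cx) for p in points)  # transverse (horizontal) semi-axis
    b = max(abs(p[1] - cy) for p in points)  # longitudinal (vertical) semi-axis
    return cx, cy, a, b

# Synthetic marker contour: ellipse centered at (320, 240) with a=40, b=80,
# as the disc would appear when viewed obliquely.
pts = [(320 + 40 * math.cos(t), 240 + 80 * math.sin(t))
       for t in (2 * math.pi * k / 360 for k in range(360))]
x, y, a, b = fit_axis_aligned_ellipse(pts)
print(tuple(round(v, 3) for v in (x, y, a, b)))  # -> (320.0, 240.0, 40.0, 80.0)
```

A production implementation would instead run edge detection and a robust ellipse detector (e.g. a Hough-style vote) on the binarized region of interest, as steps 21-24 describe.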
The orientation coefficient D of the sub hopping robot relative to the mother robot in step 3 is defined as:
D = (2x − w)/w, where w is the image width;
D ≈ −1 means the sub hopping robot 2 appears near the left boundary of the color camera's field of view;
D ≈ 1 means the sub hopping robot 2 appears near the right boundary of the color camera's field of view;
D ≈ 0 means the sub hopping robot 2 appears at the center of the color camera's field of view;
when −1 < D < 0 the sub hopping robot 2 is in the left region of the field of view, and when 0 < D < 1 it is in the right region.
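A minimal sketch of this coefficient and the region classification it implies; the function names and the 0.05 centering tolerance are illustrative assumptions, not values from the patent:

```python
def orientation_coefficient(x: float, w: float) -> float:
    """Map the ellipse center's horizontal pixel coordinate x in an image of
    width w to D in [-1, 1]: -1 at the left edge, 0 centered, +1 right edge."""
    return (2 * x - w) / w

def field_region(d: float, tol: float = 0.05) -> str:
    """Classify D into the regions described in step 3."""
    if abs(d) <= tol:
        return "center"
    return "left" if d < 0 else "right"

w = 640
print(orientation_coefficient(320, w))              # marker centered -> 0.0
print(field_region(orientation_coefficient(64, w)))  # near left edge -> left
```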
The method by which the sub hopping robot adjusts its course in step 5 is as follows. At the initial position, the marker disc is parallel to the front end face of the sub hopping robot:
If a = b, the contour extracted in step 2 is a circle, so the marker directly faces the image acquisition device. The surface color determines which face of the marker is toward the camera: if it is the front face, the course of the sub hopping robot needs no adjustment; if it is the reverse face, the sub hopping robot is rotated by π.
If a ≈ 0, the front end face of the sub hopping robot is perpendicular to the rear face of the mother robot. The marker of the sub hopping robot is rotated clockwise by a small angle: if a < b is then detected with color one, the sub hopping robot faces to the mother robot's left, and it is commanded to adjust its course clockwise by π/2; if a < b is detected with color two, the sub hopping robot faces to the mother robot's right, and it is commanded to adjust its course counterclockwise by π/2.
If 0 < a < b, the marker contour is an ellipse; record the current value of a as a0:
(1) if the surface is color one, the yaw angle of the sub hopping robot relative to the mother robot is θ = arccos(a0/b). The marker is then rotated clockwise by a small angle and the mother robot detects a again: if a increases, the front end face of the sub hopping robot points to the mother robot's left rear, and the sub hopping robot adjusts its course clockwise by θ; if a decreases, the front end face points to the mother robot's right rear, and the sub hopping robot adjusts its course counterclockwise by θ;
(2) if the surface is color two, the yaw angle of the sub hopping robot relative to the mother robot is θ = π − arccos(a0/b). The marker is then rotated clockwise by a small angle and the mother robot detects a again: if a increases, the front end face of the sub hopping robot points away from the mother robot's left rear, and the sub hopping robot adjusts its course counterclockwise by θ; if a decreases, the front end face points away from the mother robot's right rear, and the sub hopping robot adjusts its course clockwise by θ.
Beneficial effects:
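The yaw-angle formulas and the turn-direction rules of cases (1) and (2) can be condensed as follows; the helper names and the boolean encoding of the color and axis observations are illustrative:

```python
import math

def yaw_angle(a0: float, b: float, color_one_visible: bool) -> float:
    """Yaw of the sub robot relative to the mother robot, from the ellipse
    transverse axis a0 and longitudinal axis b (b ~ the marker disc diameter).
    Color one (front face) gives arccos(a0/b); color two gives pi - arccos(a0/b)."""
    theta = math.acos(a0 / b)
    return theta if color_one_visible else math.pi - theta

def turn_direction(color_one_visible: bool, a_increased: bool) -> str:
    """Disambiguate left-rear vs right-rear by rotating the marker clockwise
    and observing whether a grew, per cases (1) and (2) of step 5."""
    if color_one_visible:
        return "clockwise" if a_increased else "counterclockwise"
    return "counterclockwise" if a_increased else "clockwise"

# Disc seen with a0/b = 0.5 and the front face (color one) visible:
print(round(math.degrees(yaw_angle(0.5, 1.0, True)), 6))  # -> 60.0
```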
The invention proposes a child-robot position-and-attitude recognition method based on dynamic cooperation between a mother robot and a child robot, identifying the pose of the hopping child robot from color and depth information. The heading angle of the mother robot relative to the child robot is obtained by analyzing where the sub hopping robot appears in the image collected by the mother robot; by dynamically adjusting the marker on the hopping child robot and moving cooperatively with the wheeled mother robot, the yaw angle of the sub hopping robot relative to the mother robot is detected and adjusted; and the distance between the sub hopping robot and the mother robot is obtained from the depth information. Through autonomous adjustment of yaw angle and distance, a foundation is laid for the carrier-based recovery of the sub hopping robot. The invention has the advantages of being simple, reliable, and well adapted to its environment.
Brief description of the drawings
Fig. 1 is a schematic diagram of the mother-child multi-robot system of the invention;
Fig. 2 is a block diagram of the mother robot system;
Fig. 3 is a front-left view of the structure of the sub hopping robot of the invention;
Fig. 4 is a front-right view of the structure of the sub hopping robot of the invention;
Fig. 5 is a schematic diagram of the assembly of the marker of the sub hopping robot with the stepper motor;
Fig. 6 is a schematic diagram (top view) of possible positions of the sub hopping robot in front of the mother robot's camera;
Fig. 7 is a schematic diagram (top view) of the sub hopping robot seen from directly in front of the mother robot's camera;
Fig. 8 is a flowchart of the sub-hopping-robot recovery method of the mother-child multi-robot system of the invention;
Fig. 9 is a flowchart of measuring and adjusting the heading angle of the sub hopping robot relative to the mother robot.
Detailed description of the embodiments
The invention is further described in detail below with reference to the drawings and an embodiment.
Embodiment: referring to Fig. 1, a sub-hopping-robot recovery system based on dynamic cooperation between a mother robot and a sub robot comprises a mother robot 1 for transport and carriage and a sub hopping robot 2 that is the object to be recovered. The tail of the mother robot 1 carries a recovery box 1-8 with an open top for holding the sub hopping robot 2 after it jumps in. The tail edge of the mother robot 1 is fitted with an image acquisition device 1-1 and a distance-measuring sensor 1-2, mounted at a height comparable to that of the sub hopping robot, for acquiring images of the surroundings of the mother robot 1 and the distance between it and the sub hopping robot 2. The mother robot 1 has an ample energy supply, high motion accuracy, and strong computing power; the sub hopping robot 2 is small and light, with limited sensing and computing resources, and cooperates with the mother robot 1 over wireless communication. The head of the sub hopping robot 2 is approximately a rectangular block; on top of the head is a disc-shaped marker 2-5 whose color and brightness are both adjustable and which can rotate 360° about the vertical axis under the drive of a stepper motor; the marker 2-5 is detected and identified by the mother robot 1. The mother robot 1 converts the recognized pose of the sub hopping robot 2 and the detected relative position into motion control commands fed back to the sub hopping robot 2, while adjusting its own azimuth and position.
Referring to Fig. 2, the mother robot 1 consists of the image acquisition device 1-1, the distance-measuring sensor 1-2, an image processing module 1-3, an orientation recognition unit 1-4, a mother control unit 1-5, a wireless communication gateway 1-6, a power management system 1-7, and the recovery box 1-8. The mother robot 1 has high motion accuracy, and differential drive provides the basic motion functions of advancing, reversing, and turning. The image acquisition device 1-1 identifies the marker 2-5 on the sub hopping robot 2; the distance-measuring sensor 1-2 measures the distance from the sub hopping robot 2 to the mother robot 1; the image processing module 1-3 processes the marker shape captured by the image acquisition device 1-1 to obtain the ellipse center coordinates, the ellipse transverse axis, and the ellipse longitudinal axis, and processes the marker color information to obtain the surface visual feature. The orientation recognition unit 1-4 uses the ellipse center coordinates obtained by the image processing module 1-3 to identify the position of the sub hopping robot 2 in the field of view of the image acquisition device 1-1 and thereby obtain the orientation coefficient of the sub hopping robot 2 relative to the mother robot 1; it also uses the ellipse transverse axis, the ellipse longitudinal axis, and the surface visual feature to identify the azimuth of the mother robot 1 relative to the sub hopping robot 2. The mother control unit 1-5 adjusts the heading of the mother robot 1 according to the orientation coefficient from the orientation recognition unit 1-4 so that the sub hopping robot 2 faces the center of the field of view of the image acquisition device 1-1. The wireless communication gateway 1-6 sends the azimuth information obtained by the orientation recognition unit 1-4 to the sub hopping robot 2.
Referring to Fig. 3, Fig. 4, and Fig. 5, the sub hopping robot 2 is the object to be recovered. The main components involved in the invention are the front end face 2-1 of the sub hopping robot, a steering mechanism 2-2, a wireless communication node 2-3, a sub control unit 2-4, the marker 2-5, and the stepper motor 2-6 that drives the marker. To allow visual discrimination, the two faces of the marker 2-5 disc are given different colors. Changing the structural parameters of the sub hopping robot 2 adjusts its jumping height and distance, and controlling the steering mechanism 2-2 changes its course. The wireless communication node 2-3 receives the azimuth information sent by the wireless communication gateway 1-6 of the mother robot 1; the sub control unit 2-4 adjusts the course of the sub hopping robot 2 according to the azimuth received by the wireless communication node 2-3, so that the sub hopping robot 2 faces the mother robot 1.
The image acquisition device 1-1 and the distance-measuring sensor 1-2 are fixed to the tail of the mother robot 1 by a pan-tilt head, with one horizontal-swing degree of freedom and one pitch-adjustment degree of freedom. The image acquisition device 1-1 is an ordinary color camera and the distance-measuring sensor 1-2 is an infrared depth camera. The color camera collects information about the surroundings of the mother robot 1 for extracting the attitude and azimuth of the sub hopping robot 2; the infrared depth camera obtains the distance between the mother robot 1 and the sub hopping robot 2.
Referring to Fig. 6, Fig. 7, Fig. 8, and Fig. 9: the image acquisition device 1-1 and distance-measuring sensor 1-2 mounted at the tail of the mother robot 1 face rearward, the marker 2-5 mounted on the head of the sub hopping robot 2 faces directly forward, and the marker 2-5 disc is parallel to the front end face 2-1 of the sub hopping robot 2. The color of the marker disc on the side near the front end face 2-1 is defined as color one, and the color of the other side as color two. The steps of the pose recognition and recovery method for the sub hopping robot based on mother-child dynamic cooperation are as follows:
Step S1: the mother robot 1 slowly rotates in place to adjust its heading, and image sequences are captured from the video stream produced by the image acquisition device 1-1;
Step S2: the image processing module 1-3 of the mother robot 1 parses the image sequence captured in step S1 in real time, processing the captured images by color recognition. When the color of the marker 2-5 of the sub hopping robot 2 appears in an image, the color of the marker 2-5 is recorded, the mother robot 1 stops rotating, and the following sub-steps are performed:
1. based on color recognition, extract the marker color from the captured image to locate the region of interest; binarize the region-of-interest image with a suitable threshold; then apply smoothing filtering and edge detection;
2. extract contours from the resulting edge image and, exploiting the distinctly circular or elliptical shape of the marker 2-5, run the Randomized Hough circle transform to detect the sequence of ellipses in the binary image;
3. to reduce interference from the background environment, select a suitable empirical threshold and filter out ellipse contours that do not meet the conditions;
4. the Randomized Hough circle transform determines the ellipse position and shape, and the coordinates (x, y) of the ellipse center in the image plane are obtained from the parameter set.
If the mother robot 1 completes a 360° heading sweep in step S2 without detecting the colored marker 2-5 of the sub hopping robot 2, the course of the sub hopping robot 2 is adjusted by some angle, and steps S1 and S2 are repeated until the mother robot 1 detects the marker 2-5 of the sub hopping robot 2.
Step S3: compute the orientation coefficient of the sub hopping robot 2 relative to the mother robot 1, obtaining the bearing of the sub hopping robot 2 relative to the RGB-D sensor 1-3-1 at the tail of the mother robot 1:
define the orientation coefficient D = (2x − w)/w, where w is the image width;
D ≈ −1 means the sub hopping robot 2 appears near the left boundary of the color camera's field of view;
D ≈ 1 means the sub hopping robot 2 appears near the right boundary of the color camera's field of view;
D ≈ 0 means the sub hopping robot 2 appears at the center of the color camera's field of view;
when −1 < D < 0 the sub hopping robot 2 is in the left region of the field of view, and when 0 < D < 1 it is in the right region;
Step S4: the mother robot 1 turns in place according to the obtained azimuth information, capturing one image frame after each adjustment, until the image feedback shows D ≈ 0. The sub hopping robot 2 is then near the center of the color camera's field of view, i.e. directly behind the tail of the mother robot; the color camera of the mother robot 1 directly faces the sub hopping robot 2, which is centered in the camera's view;
Step S5: the yaw angle of the sub hopping robot 2 relative to the mother robot 1 is detected and used to guide the course adjustment of the sub hopping robot 2:
The mother robot 1 again detects the marker 2-5 disc of the sub hopping robot 2. Because the sub hopping robot 2 does not necessarily face the mother robot 1, eight viewing-angle cases can occur: under these views the marker may appear circular or elliptical, and the detected marker color may be color one or color two. The Randomized Hough circle transform determines the ellipse position and shape, and the center coordinates (x, y) in the image plane, the ellipse transverse axis a, and the ellipse longitudinal axis b are obtained from the parameter set. Since the mounting height of the RGB-D camera 1-3-1 is close to the mounting height of the marker of the sub hopping robot 2, the ellipse longitudinal axis b is close to the diameter of the marker disc, so the yaw angle of the sub hopping robot 2 relative to the mother robot 1 can be calculated from a, b, and the color of the marker 2-5. The value of a detected at the initial moment is recorded as a0.
View one: if a = b with color one, the marker 2-5 contour is a circle and the front end face 2-1 of the sub hopping robot 2 faces the rear of the mother robot 1; no course adjustment is needed;
View two: if a = b with color two, the marker 2-5 contour is a circle but the front end face 2-1 faces away from the rear of the mother robot 1; the course of the sub hopping robot 2 is adjusted by 180°;
View three: if a ≈ 0, the front end face 2-1 of the sub hopping robot 2 is perpendicular to the rear face of the mother robot 1. The marker 2-5 is rotated clockwise (in top view) by a small angle; if a < b is then detected with color one, the sub hopping robot 2 faces to the mother robot's left, and it is commanded to adjust its course clockwise (top view) by 90°;
View four: if a ≈ 0 and, after the marker 2-5 is rotated clockwise (top view) by a small angle, a < b is detected with color two, the sub hopping robot 2 faces to the mother robot's right, and it is commanded to adjust its course counterclockwise (top view) by 90°;
View five: if 0 < a < b with color one, the marker 2-5 contour is an ellipse and the yaw angle of the sub hopping robot relative to the mother robot is θ = arccos(a0/b). The marker 2-5 is rotated clockwise (top view) by a small angle and the mother robot 1 detects a again; if a increases, the front end face 2-1 points to the mother robot's left rear, and the sub hopping robot 2 adjusts its course clockwise (top view) by θ;
View six: if 0 < a < b with color one and, after the same clockwise marker rotation, a decreases, the front end face 2-1 points to the mother robot's right rear, and the sub hopping robot 2 adjusts its course counterclockwise (top view) by θ;
View seven: if 0 < a < b with color two, the marker 2-5 contour is an ellipse and the yaw angle is θ = π − arccos(a0/b). After the clockwise marker rotation, if a increases, the front end face 2-1 points away from the mother robot's left rear, and the sub hopping robot 2 adjusts its course counterclockwise (top view) by θ;
View eight: if 0 < a < b with color two and, after the clockwise marker rotation, a decreases, the front end face 2-1 points away from the mother robot's right rear, and the sub hopping robot 2 adjusts its course clockwise (top view) by θ.
Step S6: after step S5, the front end face of the sub hopping robot 2 and the marker 2-5 disc both face the distance-measuring sensor 1-2 of the mother robot 1. The distance-measuring sensor 1-2 obtains the depth map of the direction it faces; a coordinate transformation is performed between the coordinate system of the image acquisition device 1-1 and that of the distance-measuring sensor 1-2; the marker 2-5 center coordinates (x, y) are mapped into the three-dimensional coordinate system of the distance-measuring sensor 1-2 to obtain (x', y', z), giving the perpendicular distance z between the marker 2-5 disc and the plane of the distance-measuring sensor 1-2, which is converted by the corresponding relation into the distance between the sub hopping robot 2 and the tail of the mother robot 1;
Step S7: according to z value and the actual attainable spring distance of sub-hopping robot 2, Female Robot 1 straight forward or retrogressing, in the distance range hopping robot 2 being entered can jump into Female Robot 1 recycling box 1-8, sub-hopping robot 2 prepares take-off;
Step S8: the son hopping robot 2 jumps into the top-opening recycling box 1-8.
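Assuming the color and depth images share a single registered pinhole model (focal lengths fx, fy, principal point cx, cy), steps S6 and S7 can be sketched as follows; every name and parameter here is illustrative, not from the patent, and on real RGB-D hardware an extrinsic calibration between image acquisition device 1-1 and distance sensor 1-2 replaces the single-model assumption:

```python
def marker_distance(x, y, depth_frame, fx, fy, cx, cy):
    """Step S6 sketch: map the marker center (x, y) found in the color
    image into the depth sensor's frame and read off the perpendicular
    distance z (assumes the two images are already registered)."""
    z = depth_frame[int(y)][int(x)]   # depth at the marker center
    x3 = (x - cx) * z / fx            # back-project pixel to 3-D
    y3 = (y - cy) * z / fy
    return x3, y3, z

def move_command(z, hop_min, hop_max):
    """Step S7 sketch: drive the mother robot forward or backward until
    the marker distance z lies inside the son robot's hop range."""
    if z > hop_max:
        return "forward"        # mother robot approaches the son robot
    if z < hop_min:
        return "backward"
    return "stop"               # within hop range: ready for take-off

# toy 4x4 depth frame, 1.5 m everywhere; marker center at pixel (2, 1)
frame = [[1.5] * 4 for _ in range(4)]
x3, y3, z = marker_distance(2.0, 1.0, frame, fx=500.0, fy=500.0, cx=2.0, cy=2.0)
```

With a hop range of 0.4 m to 0.8 m, z = 1.5 m yields the command "forward", i.e. the mother robot keeps approaching before triggering step S8.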

Claims (7)

1. A recovery system for a son hopping robot, comprising: an image acquisition device, a distance measuring sensor, an image processing module, an orientation recognition unit, a mother control unit and a wireless communication gateway, all mounted on the mother robot; and a marker, a wireless communication node and a sub control unit mounted on the son hopping robot; characterized in that: the marker is circular, its two faces carry different visual features, and the marker is connected to a servomotor that rotates it; the image acquisition device is used to identify the marker on the son hopping robot; the distance measuring sensor is used to measure the distance from the son hopping robot to the mother robot; the image processing module processes the marker shape captured by the image acquisition device to obtain the ellipse center coordinates, the ellipse horizontal axis and the ellipse vertical axis; the orientation recognition unit identifies, from the ellipse center coordinates obtained by the image processing module, the position of the son hopping robot within the field of view of the image acquisition device, and, from the ellipse horizontal axis and ellipse vertical axis, the azimuth of the son hopping robot relative to the mother robot; the mother control unit, according to the position obtained by the orientation recognition unit, steers the mother robot so that the son hopping robot lies at the center of the field of view of the image acquisition device; the wireless communication gateway is used to send the azimuth information obtained by the image processing module to the son hopping robot; the wireless communication node is used to receive the azimuth information sent by the wireless communication gateway of the mother robot; and the sub control unit, according to the azimuth received by the wireless communication node, adjusts the heading of the son hopping robot so that it faces the mother robot.
2. The recovery system for a son hopping robot according to claim 1, characterized in that: the visual feature is color or brightness.
3. The recovery system for a son hopping robot according to claim 1 or 2, characterized in that: the image acquisition device is a color camera and the distance measuring sensor is an infrared depth camera.
4. A recovery method based on the recovery system of claim 1, characterized by comprising the following steps:
Step 1: the mother robot slowly rotates in place about its heading axis and captures an image sequence from the video stream produced by the image acquisition device;
Step 2: the image processing module of the mother robot parses the image sequence captured in step 1 in real time; when the marker of the son hopping robot 2 appears in an image, the mother robot stops rotating and, using the circular or elliptical shape of the marker, obtains the four parameters x, y, a, b of the marker's elliptical contour, where (x, y) are the ellipse center coordinates, a is the ellipse horizontal axis and b is the ellipse vertical axis;
Step 3: from the ellipse center coordinates (x, y) in the image plane obtained in step 2, determine the bearing of the son hopping robot relative to the mother robot;
Step 4: the mother robot turns in place according to the bearing obtained in step 3 so that the son hopping robot lies at the center of the mother robot's field of view;
Step 5: the mother robot directs the heading adjustment of the son hopping robot according to the ellipse horizontal-axis and vertical-axis parameters obtained in step 2, so that the front face of the son hopping robot faces the mother robot;
Step 6: according to the distance between the son hopping robot and the mother robot obtained by the distance measuring sensor and the hop distance of the son hopping robot, the mother robot is moved into the hop recovery range of the son hopping robot;
Step 7: the son hopping robot is commanded to jump into the recycling box of the mother robot.
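The seven steps of claim 4 form a linear sense-and-move procedure. A minimal skeleton, with each hardware-dependent step stubbed out as a callable (all names here are illustrative, not from the patent):

```python
def recover(search, locate, center, align, approach, jump):
    """Claim-4 recovery method as a linear procedure.
    search()   - step 1: rotate in place, grab frames until the marker is seen
    locate()   - steps 2-3: ellipse parameters and bearing of the son robot
    center(p)  - step 4: turn the mother robot until the marker is centered
    align(p)   - step 5: command the son robot's heading to face the mother
    approach(p)- step 6: drive until the marker depth is inside hop range
    jump()     - step 7: trigger the hop into the recycling box"""
    search()
    params = locate()
    center(params)
    align(params)
    approach(params)
    jump()
    return "recovered"

# exercise the skeleton with logging stubs in place of hardware
log = []
result = recover(
    search=lambda: log.append("search"),
    locate=lambda: (log.append("locate"), {"x": 320, "a": 20, "b": 40})[1],
    center=lambda p: log.append("center"),
    align=lambda p: log.append("align"),
    approach=lambda p: log.append("approach"),
    jump=lambda: log.append("jump"),
)
```

The fixed ordering matters: heading alignment (step 5) presupposes the marker is already centered (step 4), and the distance adjustment (step 6) presupposes the marker disc faces the depth sensor.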
5. The recovery method according to claim 4, characterized in that the steps for obtaining the four parameters x, y, a, b of the marker's elliptical contour in step 2 are:
Step 21: using color recognition, extract the marker color from the captured image to locate the region of interest; binarize the region-of-interest image with a suitable threshold and apply smoothing filtering and edge detection;
Step 22: extract contours from the resulting edge image and, over the different viewing angles of the image acquisition device, detect the ellipse sequence in the binary image by randomized Hough transform;
Step 23: select a threshold and filter out elliptical contours that do not meet the conditions;
Step 24: determine the position and shape of the ellipse by randomized Hough transform and obtain the four ellipse parameters x, y, a, b from the parameter set, where (x, y) are the ellipse center coordinates, a is the ellipse horizontal axis and b is the ellipse vertical axis.
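Steps 21 to 24 reduce the marker to the four parameters x, y, a, b. As a minimal stand-in for the randomized Hough transform, the sketch below estimates those parameters from a contour by its bounding box; this suffices only when the projected ellipse is roughly axis-aligned, which holds here because the circular marker rotates about a vertical axis. All names are illustrative:

```python
import math

def ellipse_params(contour):
    """Estimate the marker's ellipse parameters (x, y, a, b) from contour
    points: center from the bounding box midpoint, a = horizontal
    semi-axis, b = vertical semi-axis."""
    xs = [p[0] for p in contour]
    ys = [p[1] for p in contour]
    x = (max(xs) + min(xs)) / 2.0   # ellipse center
    y = (max(ys) + min(ys)) / 2.0
    a = (max(xs) - min(xs)) / 2.0   # horizontal semi-axis
    b = (max(ys) - min(ys)) / 2.0   # vertical semi-axis
    return x, y, a, b

# synthetic contour: a circle of radius 40 centered at (120, 80), viewed
# at 60 degrees yaw, so the horizontal axis shrinks to 40*cos(60) = 20
pts = [(120 + 20 * math.cos(t * 0.01), 80 + 40 * math.sin(t * 0.01))
       for t in range(629)]
x, y, a, b = ellipse_params(pts)
```

In the full method the randomized Hough transform replaces this shortcut, since it also rejects partial and spurious contours (step 23).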
6. The recovery method according to claim 4, characterized in that the method for obtaining the bearing of the son hopping robot relative to the mother robot in step 3 is:
Define the orientation coefficient D = (2x - w)/w, where w is the image width;
D ≈ -1 indicates that the son hopping robot 2 appears near the left border of the color camera's field of view;
D ≈ 1 indicates that the son hopping robot 2 appears near the right border of the color camera's field of view;
D ≈ 0 indicates that the son hopping robot 2 appears at the center of the color camera's field of view;
for -1 < D < 0 the son hopping robot 2 is in the left half of the field of view, and for 0 < D < 1 it is in the right half.
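The orientation coefficient of claim 6 is a one-line computation; a sketch follows (function names are illustrative, and the eps tolerance is an assumption, not part of the claim):

```python
def orientation_coefficient(x, w):
    """Orientation coefficient D = (2x - w) / w of claim 6: x is the
    ellipse-center column of the marker, w the image width. D runs
    from -1 (left border) through 0 (center) to +1 (right border)."""
    return (2.0 * x - w) / w

def side(d, eps=0.05):
    """Classify the marker position from D; eps is an assumed
    dead-band within which the marker counts as centered."""
    if abs(d) <= eps:
        return "center"
    return "left" if d < 0 else "right"

# marker at column 480 of a 640-pixel-wide image: right half, D = 0.5
d = orientation_coefficient(480, 640)
```

In step 4 the mother robot turns in place until D falls inside the dead-band, i.e. side(D) returns "center".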
7. The recovery method according to claim 4, characterized in that the method of adjusting the heading of the son hopping robot in step 5 is:
In the initial position the marker disc is parallel to the front face of the son hopping robot:
If a = b, the contour extracted in step 2 is a circle and the marker directly faces the image acquisition device; which face of the marker faces the device is judged from the surface color: if it is the front face of the marker, the heading of the son hopping robot needs no adjustment; if it is the reverse face, the son hopping robot is rotated by π;
If a ≈ 0, the front face of the son hopping robot is perpendicular to the rear face of the mother robot; the marker of the son hopping robot is rotated clockwise by a small angle, and if a < b is then detected together with color one, the son hopping robot faces to the left of the mother robot and is commanded to turn clockwise by π/2; if a < b is detected together with color two, the son hopping robot faces to the right of the mother robot and is commanded to turn counterclockwise by π/2;
If 0 < a < b, the marker contour is an ellipse; denote the current value of a by a0:
(1) if the visible face is color one, the yaw angle of the son hopping robot relative to the mother robot is θ = arccos(a0/b); the marker is rotated clockwise by a small angle and the mother robot measures a again: if a increases, the front face of the son hopping robot points toward the left rear of the mother robot and the son hopping robot is commanded to turn clockwise by θ; if a decreases, the front face points toward the right rear of the mother robot and the son hopping robot is commanded to turn counterclockwise by θ;
(2) if the visible face is color two, the yaw angle of the son hopping robot relative to the mother robot is θ = π - arccos(a0/b); the marker is rotated clockwise by a small angle and the mother robot measures a again: if a increases, the front face of the son hopping robot faces away from the mother robot toward its left rear and the son hopping robot is commanded to turn counterclockwise by θ; if a decreases, the front face faces away from the mother robot toward its right rear and the son hopping robot is commanded to turn clockwise by θ.
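Under the claim-7 geometry, the yaw angle and the turn direction resolved by the clockwise probe rotation of the marker can be sketched as follows; function names are illustrative, and the color labels follow the claim's "color one"/"color two" convention:

```python
import math

def yaw_angle(a0, b, face_color):
    """Yaw angle of the son hopping robot relative to the mother robot
    per claim 7: theta = arccos(a0/b) when the front face (color one)
    is visible, pi - arccos(a0/b) when the reverse face (color two)
    is visible. a0 is the measured horizontal semi-axis, b the
    vertical one (0 < a0 < b)."""
    base = math.acos(a0 / b)
    return base if face_color == "one" else math.pi - base

def turn_command(face_color, a_increased):
    """Direction in which the son hopping robot should turn (top view)
    after the marker has been rotated clockwise by a small probe
    angle and the horizontal axis a measured again."""
    if face_color == "one":
        return "clockwise" if a_increased else "counterclockwise"
    return "counterclockwise" if a_increased else "clockwise"

# marker seen at half its true width on the color-one face: theta = pi/3
theta = yaw_angle(20.0, 40.0, "one")
```

The probe rotation is needed because a0 alone fixes only the magnitude of the yaw angle; whether a grows or shrinks after the known clockwise marker rotation disambiguates its sign.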
CN201510349751.9A 2015-06-23 2015-06-23 Son hopping robot recovery system and recovery method Expired - Fee Related CN104965513B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201510349751.9A CN104965513B (en) 2015-06-23 2015-06-23 Son hopping robot recovery system and recovery method

Publications (2)

Publication Number Publication Date
CN104965513A true CN104965513A (en) 2015-10-07
CN104965513B CN104965513B (en) 2017-05-17

Family

ID=54219552

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201510349751.9A Expired - Fee Related CN104965513B (en) 2015-06-23 2015-06-23 Son hopping robot recovery system and recovery method

Country Status (1)

Country Link
CN (1) CN104965513B (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102353355A (en) * 2011-06-14 2012-02-15 西安工程大学 Method for measuring power transmission line pole and tower inclination based on video differences
US20120158177A1 (en) * 2010-12-17 2012-06-21 The Industry & Academic Cooperation In Chungnam National University Method and system for performing seamless localization
CN102556193A (en) * 2012-01-09 2012-07-11 东南大学 Hopping robot capable of hopping continuously
CN103522304A (en) * 2013-10-28 2014-01-22 中国科学院自动化研究所 Capsule entry method of slave robots based on master robot vision
CN103593849A (en) * 2013-11-26 2014-02-19 北京建筑大学 Method for quickly recognizing and tracking image sequence oval artificial target points
CN104180808A (en) * 2014-08-05 2014-12-03 南京航空航天大学 Aerial autonomous refueling circular taper sleeve vision position and attitude resolving method

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
JUN ZHANG et al., "A bio-inspired jumping robot: Modeling, simulation, design, and experimental results", Mechatronics *
JUN ZHANG et al., "Aerial Posture Adjustment of a Bio-Inspired Jumping Robot for Safe Landing: Modeling and Simulation", Proceedings of the 2014 IEEE International Conference on Robotics and Biomimetics *
JUN ZHANG et al., "Self-Righting, Steering and Takeoff Angle Adjusting for a Jumping Robot", 2012 IEEE/RSJ International Conference on Intelligent Robots and Systems *
ZHAO Peng et al., "A recovery method for a child robot guided by the mother robot's vision", Journal of Huazhong University of Science and Technology (Natural Science Edition) *

Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105665970A (en) * 2016-03-01 2016-06-15 中国科学院自动化研究所 System and method for automatic generation for path points of welding robot
WO2017211315A1 (en) * 2016-06-08 2017-12-14 科沃斯机器人股份有限公司 Cooperative work system formed by mother robot and child robot, and operation method thereof
CN107479544A (en) * 2016-06-08 2017-12-15 科沃斯机器人股份有限公司 Mother and sons' machine cooperative operation system and its method of work
US11648675B2 (en) 2016-06-08 2023-05-16 Ecovacs Robotics Co., Ltd. Mother-child robot cooperative work system and work method thereof
CN107089275A (en) * 2017-03-27 2017-08-25 西北工业大学 It is a kind of can posture adjustment in the air and land energy regenerating sufficient roll-type interval hopping robot
CN107089275B (en) * 2017-03-27 2019-03-26 西北工业大学 It is a kind of aerial posture adjustment and the sufficient roll-type interval hopping robot of energy regenerating to be landed
CN109955265A (en) * 2019-03-08 2019-07-02 武汉理工大学 A kind of indoor range complex intelligence shell case cleaning robot
CN112515541A (en) * 2019-09-17 2021-03-19 佛山市云米电器科技有限公司 Cleaning method and system based on mother-child linkage type floor sweeping robot
CN112515541B (en) * 2019-09-17 2022-02-15 佛山市云米电器科技有限公司 Cleaning method and system based on mother-child linkage type floor sweeping robot
CN113998021A (en) * 2021-12-10 2022-02-01 东南大学 Bionic search and rescue robot and space self-deployment method
CN113998021B (en) * 2021-12-10 2023-10-03 东南大学 Bionic search and rescue robot and space self-deployment method

Also Published As

Publication number Publication date
CN104965513B (en) 2017-05-17


Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20170517

Termination date: 20210623