US20080300722A1 - Amphibious robotic device - Google Patents

Amphibious robotic device

Info

Publication number
US20080300722A1
Authority
US
United States
Prior art keywords
legs
motion
robotic device
control system
leg
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US12/192,579
Inventor
Gregory Dudek
Chris Prahacs
Shane Saunderson
Philippe Giguere
Junaed Sattar
Michael Jenkin
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
McGill University
Original Assignee
McGill University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by McGill University filed Critical McGill University
Priority to US12/192,579
Assigned to MCGILL UNIVERSITY reassignment MCGILL UNIVERSITY ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: DUDEK, GREGORY, GIGUERE, PHILIPPE, PRAHACS, CHRIS, SATTAR, JUNAED, SAUNDERSON, SHANE
Publication of US20080300722A1

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60FVEHICLES FOR USE BOTH ON RAIL AND ON ROAD; AMPHIBIOUS OR LIKE VEHICLES; CONVERTIBLE VEHICLES
    • B60F3/00Amphibious vehicles, i.e. vehicles capable of travelling both on land and on water; Land vehicles capable of travelling under water
    • B60F3/0007Arrangement of propulsion or steering means on amphibious vehicles
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B63SHIPS OR OTHER WATERBORNE VESSELS; RELATED EQUIPMENT
    • B63CLAUNCHING, HAULING-OUT, OR DRY-DOCKING OF VESSELS; LIFE-SAVING IN WATER; EQUIPMENT FOR DWELLING OR WORKING UNDER WATER; MEANS FOR SALVAGING OR SEARCHING FOR UNDERWATER OBJECTS
    • B63C11/00Equipment for dwelling or working underwater; Means for searching for underwater objects
    • B63C11/34Diving chambers with mechanical link, e.g. cable, to a base
    • B63C11/36Diving chambers with mechanical link, e.g. cable, to a base of closed type
    • B63C11/42Diving chambers with mechanical link, e.g. cable, to a base of closed type with independent propulsion or direction control

Definitions

  • the present invention relates to robotic devices, particularly to robotic devices designed to maneuver in a liquid medium as well as on a solid medium.
  • underwater robotics poses certain unique challenges that render many of the principles of terrestrial robotics problematic.
  • a robot underwater is able to move along six degrees of freedom, and maneuvering with six degrees of freedom creates serious complications.
  • a computationally straightforward task of pose maintenance on land becomes far more challenging under water, because of environmental factors such as strong currents in marine environments.
  • Infra-red sensors lose some of their effectiveness in water as well.
  • Wireless radio communications are also impossible over a large distance in water compared to ground based control. All these issues make underwater robotics problems more difficult than terrestrial robotics.
  • visual tracking is the process of repeatedly computing a position of a feature or sets of features in a sequence of input images.
  • a number of methods for visual tracking in a dry environment (i.e. not underwater) based on the color of the target are known.
  • One of the known approaches is color-blob tracking, where the trackers segment out sections of the image that match a threshold level for the given target and, based on the segmentation output, track the shape, size or centroid of the blob, among other features.
  • Another approach is the matching of color histograms, which are a measure of color distribution over an image.
  • a robotic device for navigating in at least a liquid medium
  • the robotic device comprising: a legged propulsion system having a series of legs external of a body of the robotic device, each of the legs being independently driven and mounted to the body for pivotal movement about a respective transverse axis, each of the legs being operable to at least oscillate relative to the body about the respective transverse axis such that interaction between the legs and the liquid medium produces propulsive forces that displace the robotic device within the liquid medium; and a control system operatively connected to the legged propulsion system for autonomous control and operation of the robotic device based on information received from at least one sensor providing data about an environment of the device, the control system using data from the at least one sensor to determine a desired motion of the robotic device and a corresponding required leg motion of each of the legs to produce the desired motion, and the control system autonomously actuating each of the legs of the legged propulsion system in accordance with the corresponding required leg motion.
  • an amphibious robotic device comprising: a legged propulsion system having a series of legs, each of said legs being driven by an actuator and mounted for pivotal movement about a respective transverse axis in one of at least a swimming mode and a walking mode, said legs being configured to pivotally oscillate relative to the transverse axis in said swimming mode when the device is in a liquid medium such that interaction between said legs and the liquid medium provides propulsive forces that displace the vehicle body within the liquid medium, said legs being configured to rotate relative to the transverse axis in said walking mode when the device is on a solid medium such that interaction between said legs and the solid medium provides propulsive forces that displace the vehicle body in a desired direction on the solid medium; and a control system having at least one sensor operable to autonomously detect with which of the liquid medium and the solid medium the robotic device is interacting and a leg controller synchronously operating said legs in either one of the swimming mode and the walking mode based on the detected medium.
  • a control system for autonomously maneuvering a robotic device in at least one of a liquid medium and a solid medium the robotic device including a propulsion system having a series of individually controlled legs
  • the control system comprising: at least one visual sensor retrieving an image of an environment of the device in the medium; an image analyzing module receiving the image, determining a presence of an object of a given type therein and analyzing at least one property of the object; a motion calculator determining a desired motion of the device based on the at least one property of the object; and a controller operating the propulsion system of the device, the controller calculating a respective required leg motion of each of the legs to obtain the desired motion of the device and operating each of the legs based on the respective required leg motion calculated, such that the robotic device autonomously maneuvers in said medium.
  • FIG. 1 is a perspective view of a robotic device in accordance with a particular embodiment of the present invention
  • FIG. 2 is a top view of the device of FIG. 1 , with a top panel thereof removed for improved clarity;
  • FIG. 3 is a perspective view of a leg of the device in accordance with an alternate embodiment of the present invention.
  • FIG. 4 is a cross-sectional view of the leg of FIG. 3 ;
  • FIG. 5 is a perspective view of a leg of the device in accordance with another alternate embodiment of the present invention.
  • FIG. 6 is a cross-sectional view of the leg of FIG. 5 ;
  • FIG. 7 is a perspective view of a leg of the device in accordance with a further alternate embodiment of the present invention.
  • FIG. 8 is a cross-sectional view of the leg of FIG. 7 ;
  • FIG. 9 is a block diagram of a control system for the device of FIG. 1 ;
  • FIG. 10 is a block diagram of a particular embodiment of the control system of FIG. 9 ;
  • FIG. 11 is a block diagram of another particular embodiment of the control system of FIG. 9 ;
  • FIG. 12A is a schematic top view of the device of FIG. 1 showing parameters used in a hovering gait thereof;
  • FIG. 12B is a schematic side view of the device of FIG. 1 illustrating a stand-by range and a thrust range of the hovering gait;
  • FIG. 12C is a schematic, partial side view of the device of FIG. 1 illustrating the angle, thrust and pressure drag for one of the legs.
  • a robotic device in accordance with a particular embodiment of the present invention is generally shown at 10 .
  • the device 10 is designed as an aquatic swimming robot that is also capable of operating on a solid medium, including compact ground surfaces and sand.
  • the device 10 is said to be amphibious in that it can walk on a solid surface and penetrate a neighboring liquid medium, swim in that liquid medium and exit the liquid medium on an appropriate interface between the solid and liquid medium (e.g. a ramped up bottom surface of the liquid environment becoming the solid medium such as a beach).
  • the device 10 comprises a body 12 including a waterproof shell 14 which can be for example made of aluminum, inside which all electronics, sensors, power supplies, actuators, etc. are housed.
  • the shell 14 includes buoyancy plates which allow the buoyancy of the device 10 to be adjusted depending on the properties of the liquid medium the device 10 is going to be submerged in (e.g. salt water, fresh water).
  • the buoyancy of the device 10 is preferably adjusted to near neutral.
  • the device 10 comprises six legs 16 a which are attached to the body such as to be rotatable about a respective rotational axis 18 extending transversely with respect to the body 12 .
  • the motion of each leg 16 a is controlled by a single respective actuator 20 (see FIG. 2 ), thus minimizing the complexity of the motion of the device 10 as well as the required energy to actuate the legs 16 a .
  • the legs 16 a give the device the ability to turn sideways (yaw), dive (pitch) and rotate on its horizontal axis (roll), as well as the ability to move forward and backward (surge), and up and down (heave).
  • the device 10 comprises a front camera assembly 90 mounted in the front of the body 12 and a rear camera assembly 92 mounted in the rear of the body 12 .
  • the device also includes a power source, for example a series of batteries 94 (see FIG. 2 ).
  • the device 10 can further comprise various internal sensors for monitoring the device functions, for example battery power and power consumption levels for the leg actuators 20 .
  • each actuator 20 is mounted rigidly to a cylindrical hip extension 22 , 22 ′ which is in turn rigidly mounted to the body 12 .
  • Each leg 16 a includes a leg mount 24 which is rigidly mounted on an output shaft 25 (see FIG. 1 ) of the respective actuator 20 passing through the respective hip extension 22 , 22 ′, such that the leg mount 24 is rotated by the output shaft 25 .
  • Each leg 16 a also includes a flipper 26 which extends from the leg mount 24 such that a longitudinal axis 27 (see FIG. 2 ) of the flipper 26 is perpendicular or substantially perpendicular to the rotational axis 18 of the respective leg 16 a .
  • each flipper 26 produces thrust underwater through an oscillatory motion, i.e. a reciprocating pivotal motion of each flipper 26 about its respective rotational axis 18 .
  • each flipper 26 includes two flexible sheets, for example made of vinyl, which are glued together over a series of rods.
  • the flexible sheets provide the necessary flexibility to capture water when the legs 16 a are oscillating, while the rods provide the necessary stiffness to generate thrust.
  • the legs 16 a are distributed symmetrically about the body 12 , with three legs 16 a being equally spaced apart on each side thereof.
  • the middle hip extensions 22 ′ are longer than the front and rear hip extensions 22 , such that the middle legs 16 a extend outwardly of the other legs 16 a to avoid interference between adjacent flippers 26 .
  • the relative size of the body 12 and the flippers 26 is such that the hip extensions 22 have a similar length without interference between the legs 16 a of a same side. While the legs 16 a are particularly well suited for use underwater, they are less efficient for motion on a solid medium.
  • an amphibious leg 16 b according to a particular embodiment of the present invention is shown.
  • This leg 16 b allows for a good compromise between underwater propulsion and propulsion on a solid medium.
  • By replacing the legs 16 a of the device 10 with legs such as the leg 16 b , good underwater propulsion is provided, although generally less efficient than with the legs 16 a , and efficient propulsion on a solid medium is provided, both forward and backward.
  • Each leg 16 b includes a leg mount 24 similar to that of the previously described legs 16 a , and which rigidly engages the respective actuator output shaft 25 (not shown in FIGS. 3-4 ) for rotation and/or oscillation about the respective axis 18 .
  • An upper member 30 b of the leg 16 b includes an attachment plate 32 rigidly attached to the leg mount 24 , and spaced apart upper rigid plates 34 b extending from the attachment plate 32 perpendicularly to the rotational axis 18 of the leg 16 b .
  • the upper plates 34 b are teardrop-shaped, with their distal larger end 38 b being interconnected by two spaced apart transverse plates 40 b .
  • a joint pin 42 b also extends between the distal ends 38 b of the upper plates 34 b between the transverse plates 40 b and the attachment plate 32 .
  • a lower member 44 b of the leg 16 b includes lower spaced apart rigid plates 46 b also extending perpendicularly to the rotational axis 18 of the leg 16 b .
  • the proximal ends 48 b of the lower plates 46 b extend outwardly of the distal ends 38 b of the upper plates 34 b and are also interconnected by the joint pin 42 b .
  • the joint pin 42 b is rotationally connected to at least one of the upper and lower members 30 b , 44 b , such that the members 30 b , 44 b are pivotally interconnected by the joint pin 42 b .
  • the lower plates 46 b are also teardrop-shaped, with their distal larger end 50 b being interconnected by two spaced apart end pins 52 b , and by a rounded end connecting member 54 .
  • Each of the upper plates 34 b forms with its associated lower plate 46 b a substantially planar frame 56 b mainly defined in a plane perpendicular to the rotational axis 18 of the leg 16 b , and as such creating minimal interference with the water when the device 10 is submerged.
  • the frames 56 b are made of a rigid material which can be for example an adequate metal or composite material.
  • the leg 16 b also includes an elastic member 58 b which extends between the frames 56 b perpendicularly thereto.
  • the elastic member 58 b is a double member which extends from the attachment plate 32 to around the end pins 52 b , passing between the transverse plates 40 b and on each side of the joint pin 42 b .
  • the elastic member 58 b is a thrust producing member when the leg 16 b is moved underwater.
  • the spaced apart end pins 52 b shape the elastic member 58 b so that during swimming, the liquid medium is forced away from the end of the lower section 54 , thus reducing the drag.
  • the transverse plates 40 b offset the bending point of the elastic member 58 b from the joint pin 42 b , such as to increase the amount by which the elastic member 58 b stretches upon relative pivoting of the members 30 b , 44 b .
  • the elastic member 58 b thus provides compliance to the leg 16 b , while limiting the relative pivoting motion of the upper and lower members 30 b , 44 b such that the frames 56 b can bear the weight of the device 10 when the device 10 moves on solid ground.
  • the elastic member 58 b is made of a material providing increased resistance to the pivoting motion of the members 30 b , 44 b about the joint pin 42 b as the members 30 b , 44 b are pivoted away from the aligned position shown in the Figures.
  • the leg 16 b therefore defines a jointed limb that is at least partially compliant when used on a solid medium and at least partially stiff when used in a liquid medium.
  • the elastic member 58 b acts similarly to a ligament which interconnects the two members 30 b , 44 b of the leg 16 b and constrains the relative pivotal movement such as to arrive at a desired gait of the device, both on a solid medium and in a liquid medium (e.g. water).
  • each leg 16 c includes a leg mount 24 (not shown in FIGS. 5-6 ) which is similar to that of the previously described legs 16 a,b and which rigidly engages the respective actuator output shaft 25 (not shown in FIGS. 5-6 ).
  • An attachment plate 32 is rigidly attached to the leg mount 24 , and spaced apart rigid plates 34 c extend from the attachment plate 32 perpendicularly to the rotational axis 18 of the leg 16 c .
  • the plates 34 c have rounded distal ends 38 c interconnected at each extremity thereof by a transverse plate 40 c .
  • a flexible toe plate 60 is rigidly connected to each transverse plate 40 c , the toe plate 60 flexing upon contact of the leg 16 c with a solid surface to augment a contact area between the leg 16 c and the solid surface, and springing back into an unflexed position when contact with the ground is lost.
  • the plates 40 c thus define spaced apart rigid frames 56 c which, by being mainly defined in a plane perpendicular to the rotational axis 18 of the leg 16 c , have a minimal interference with the water when the device 10 is submerged.
  • the leg 16 c also includes a flexible flipper 62 , the proximal end 64 thereof being rigidly connected to the attachment plate 32 and the distal end 66 thereof being free.
  • the flipper 62 extends between the rigid frames 56 c perpendicularly thereto. Because of its flexibility, the flipper 62 is free to flap between the rigid frames 56 c .
  • the flipper 62 is shorter than the rigid frames 56 c such as to avoid contact with the ground in a dry environment.
  • the flipper 62 and toe plates 60 are thrust producing members when the leg 16 c is moved underwater.
  • the rigid frames 56 c provide a channeling effect underwater such that there is no spill-over when the flipper 62 bends, thus increasing the thrust produced.
  • each leg 16 d includes a leg mount 24 (not shown in FIGS. 7-8 ) which is similar to that of the previously described legs 16 a,b,c , and which rigidly engages the respective actuator output shaft 25 (not shown in FIGS. 7-8 ).
  • An upper member 30 d includes an attachment plate 32 rigidly attached to the leg mount 24 , and spaced apart upper rigid plates 34 d extending from the attachment plate 32 perpendicularly to the rotational axis 18 of the leg 16 d .
  • the upper plates 34 d are oblong-shaped, although any shape conducive to good fluid flow underwater while maintaining adequate strength and rigidity when loaded by the weight of the device 10 on land can be used.
  • the proximal end 36 d of the upper plates 34 d is interconnected by first and second spaced apart end pins 68 , 69 .
  • An upper joint pin 42 d also extends between the distal ends 38 d of the upper plates 34 d.
  • a middle member 70 of the leg 16 d includes middle spaced apart rigid plates 72 also extending perpendicularly to the rotational axis 18 of the leg 16 d .
  • the middle plates 72 are also oblong-shaped, although here again any shape conducive to good fluid flow underwater while maintaining adequate strength and rigidity when loaded by the weight of the device 10 on land can be used.
  • the proximal ends 74 of the middle plates 72 extend inwardly of the upper plates 34 d and are also interconnected by the upper joint pin 42 d .
  • the upper joint pin 42 d is rotationally connected to at least one of the upper and middle members 30 d , 70 , such that the upper and middle members 30 d , 70 are pivotally interconnected by the upper joint pin 42 d .
  • the proximal ends 74 of the middle plates 72 also include a stopper 76 which limits the pivoting motion between the upper and middle members 30 d , 70 along a single direction from the aligned position shown in the Figures.
  • the distal ends 78 of the middle plates 72 are interconnected by a lower joint pin 80 .
  • a lower member 44 d of the leg 16 d includes lower spaced apart rigid plates 46 d also extending perpendicularly to the rotational axis 18 of the leg 16 d .
  • the lower plates 46 d are semi-circular, with a rounded edge 82 defining a contact area with the ground surface.
  • the lower plates 46 d , along a proximal end 48 d of the rounded edge 82 , extend outwardly of the middle plates 72 and are also interconnected by the lower joint pin 80 .
  • the lower joint pin 80 is rotationally connected to at least one of the middle and lower members 70 , 44 d , such that the middle and lower members 70 , 44 d are pivotally interconnected by the lower joint pin 80 .
  • the distal ends 78 of the middle plates 72 include a stopper 81 which limits the pivoting motion between the middle and lower members 70 , 44 d along a single direction from the aligned position shown in the Figures.
  • the lower plates 46 d , along a distal end 50 d of the rounded edge 82 , are interconnected by a third end pin 52 d.
  • Each set of connected upper, middle and lower members 30 d , 70 , 44 d thus defines a substantially planar frame 56 d , the two spaced apart frames 56 d being mainly defined in a plane perpendicular to the rotational axis 18 of the leg 16 d and as such having minimal interference with the water when the device is submerged.
  • the leg 16 d also includes an elastic member 58 d which extends between the frames 56 d perpendicularly thereto.
  • the elastic member 58 d is a double member which extends around the first end pin 68 and the third end pin 52 d , passing in between in a zigzag pattern against the second end pin 69 , against the upper joint pin 42 d on a side thereof opposite that of the second end pin 69 , and against the lower joint pin 80 on a side thereof opposite that of the upper joint pin 42 d .
  • the elastic member 58 d is a thrust producing member when the leg 16 d is moved underwater.
  • the elastic member 58 d also provides compliance to the leg 16 d while limiting the relative pivoting motion between the members 30 d , 70 , 44 d about the joint pins 42 d , 80 , by forcing the leg 16 d in the aligned position, shown in the Figures, against the stoppers 76 , 81 .
  • the frames 56 d can bear the weight of the device 10 when the device 10 moves on solid ground.
  • the elastic member 58 d is made of a material providing increased resistance to the pivoting motion of the members 30 d , 70 , 44 d about the joint pins 42 d , 80 as the members 30 d , 70 , 44 d are pivoted away from the aligned position.
  • the system 102 optionally includes an operator control unit 104 allowing an operator to directly control the device 10 , the operator control unit 104 sending signals to a motion calculator 106 in accordance with the operator input.
  • the operator control unit 104 receives feedback from at least one visual sensor 108 of a visual control system 120 , which in a particular embodiment includes a camera from one or both of the front and rear camera assemblies 90 , 92 , which provides streaming video from the device 10 .
  • the operator control unit 104 also receives feedback from an inertial sensor 110 , or Inertial Measurement Unit (IMU), installed within the body 12 for orientation and acceleration sensing.
  • the operator control unit 104 can also receive feedback from the motion calculator 106 containing any other relevant data to assist in controlling the device 10 . Communication between the device 10 and the operator control unit 104 is optionally done over a fiber optic tether (not shown).
  • the visual control system 120 also controls the device 10 , and as such the operator control unit 104 can be omitted or used in conjunction with the visual control system 120 .
  • the motion calculator 106 computes a desired motion of the device 10 , i.e. pitch, roll, yaw, heave and/or surge, based for example on the signal from the operator control unit 104 , and communicates this desired motion to a leg controller 112 .
  • the leg controller 112 computes a required thrust at each leg 16 a,b,c,d to obtain the desired motion, and determines the corresponding motion for each leg 16 a,b,c,d .
  • the leg controller 112 moves the legs 16 a,b,c,d in accordance with a series of preset gaits programmed therein. Gaits are a combination of leg parameters that generate a fixed motion for a fixed set of parameters. Depending on whether the device 10 is swimming or walking, different table-driven gaits are used to drive the device 10 . Walking gaits move the device 10 through complete rotation of the legs 16 a,b,c,d , while swimming gaits move the device through a “kicking” motion of the legs 16 a,b,c,d , i.e. a reciprocating pivotal oscillation of the legs about their respective rotational axes 18 .
  • the leg controller 112 computes a series of three parameters for each leg 16 a,b,c,d based on the required leg thrust: amplitude, offset and phase.
  • the leg controller 112 actuates the actuator 20 of each leg 16 a,b,c,d based on the three parameters calculated.
  • the amplitude parameter governs the distance the legs 16 a,b,c,d sweep along the spherical arc defined around the rotational axis 18 during each cycle. Offset dictates the starting orientation of the legs 16 a,b,c,d relative to each other at the beginning of the cycle.
  • Direction of the leg motion is controlled by the phase parameters of each leg 16 a,b,c,d.
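The amplitude, offset and phase parameterization described above can be illustrated with a short sketch. This is a minimal illustration under assumed names and values, not the patent's implementation; the sinusoidal form is inferred from the description of the oscillatory gaits.

```python
import math

def leg_angle(t, amplitude, offset, phase, period=0.4):
    """Angle (rad) of one leg at time t for an oscillatory (swimming) gait.

    amplitude: half-sweep of the leg about its rotational axis per cycle
    offset:    starting/centre orientation of the leg
    phase:     per-leg phase shift; changing it changes the direction of motion
    period:    oscillation period in seconds (250-500 ms per the description)
    """
    omega = 2.0 * math.pi / period          # fixed oscillation frequency
    return offset + amplitude * math.sin(omega * t + phase)

# Example: six legs sharing one gait table, with alternating phases (values are placeholders)
gait = [dict(amplitude=0.5, offset=0.0, phase=(n % 2) * math.pi) for n in range(6)]
angles = [leg_angle(0.1, **g) for g in gait]
```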
  • the swimming gaits only allow the device 10 to turn if it is moving forward or backward.
  • the leg controller 112 also includes hovering gaits to allow the device 10 to move along 5 degrees of freedom, i.e. pitch, roll, yaw, heave and/or surge without forward or rearward movement, such as to be able to hold a fixed position despite water currents, and to turn at that fixed location, for example to keep an object within camera range.
  • the motion calculator 106 also receives inertial data from the inertial sensor 110 which can influence the required motion communicated to the leg controller 112 .
  • the motion calculator 106 uses the inertial data to determine if the device 10 is in a dry or underwater environment, and directs the leg controller 112 to use either the walking or the swimming/hovering gaits accordingly. As such the device 10 can autonomously switch to the appropriate gaits upon entering or coming out of the water.
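As a hedged illustration of this gait switching, the sketch below dispatches between the walking, swimming and hovering gait tables once the medium has been detected. The detection itself is left as a boolean input because the text does not specify how the inertial data are interpreted; all names here are placeholders.

```python
def choose_gaits(is_underwater, walking_gaits, swimming_gaits, hovering_gaits,
                 station_keeping=False):
    """Dispatch to the appropriate preset gait table for the detected medium.

    How the motion calculator derives is_underwater from the inertial sensor 110
    is not detailed in the text, so the flag is simply passed in here.
    """
    if not is_underwater:
        return walking_gaits
    return hovering_gaits if station_keeping else swimming_gaits
```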
  • the hovering gaits of the leg controller 112 which allow the device 10 to perform station keeping, are described below.
  • the leg controller 112 receives the required motion input from the motion calculator 106 including pitch (Cp), roll (Cr), yaw (Cy), heave (Ch) and surge (Cs).
  • the leg controller first computes Fx and Fz, the column vectors representing the desired thrust at each leg location in the x and z directions (see FIG. 12A , where arrow 114 indicates the forward direction).
  • the legs 16 a,b,c,d cannot generate thrust in the y direction. Accordingly, the desired thrust Fx and Fz for each leg 16 a,b,c,d is computed by the leg controller 112 according to the following:
  • the size of the column vectors Fx and Fz is equal to the number of legs 16 a,b,c,d on the device 10 , which in this case is six (6).
  • the legs 16 a,b,c,d are shown in FIG. 12A as identified by a number n ranging from 0 to 5, the legs 0 and 5 facing forward to provide an extended moment arm and symmetric pitching moment with the legs 2 and 3 , as well as to provide quick reverse surge.
  • the selected thrust angle θc,n and magnitude Tc,n for each leg n are thus computed by the leg controller 112 as:
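The equation referenced above is not reproduced in this extract. The sketch below shows the conventional planar decomposition that is consistent with the surrounding description (per-leg desired thrust components Fx,n and Fz,n mapped to a thrust angle and magnitude); treat it as a reconstruction, not the patent's exact formula.

```python
import math

def leg_thrust_command(fx_n, fz_n):
    """Thrust angle theta_c,n and magnitude T_c,n for leg n, derived from the
    desired thrust components in the x (forward) and z (vertical) directions.
    A reconstruction consistent with the description, not a verbatim equation."""
    theta_c_n = math.atan2(fz_n, fx_n)   # direction of the desired thrust
    t_c_n = math.hypot(fx_n, fz_n)       # magnitude of the desired thrust
    return theta_c_n, t_c_n
```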
  • the most efficient way to generate thrust in a direction θr is through rapid oscillation (period of 250-500 ms) of the leg 16 a,b,c,d around that angle θr.
  • the magnitude of the thrust is approximately proportional to the amplitude of the oscillation.
  • the leg angle θf over time is thus:
  • θf(t) = Tn sin(ωf t + φf) + θr
  • ωf is the fixed frequency of oscillation and a phase shift φf is used between the legs.
  • the leg angle θf and thrust Tn are shown in FIG. 12C for one of the legs 16 a,b,c,d.
  • the leg controller 112 limits the range of thrust angle for each leg to a region 116 (see FIG. 12B ) of 180°. This reduces the average reorientation angle responsible for the parasitic forces at the cost of reduced maximum device thrust. For example, the front legs are not used when a forward thrust is commanded, thereby reducing the maximum possible forward thrust of the device 10 . To further improve the reaction time of the device 10 , the leg controller 112 uses the pressure drag forces D (see FIG. 12C ) generated when the legs 16 a,b,c,d are reoriented.
  • When the difference between the desired thrust angle and the current leg angle is greater than 45°, the leg 16 a,b,c,d is rotated at a rate that generates a pressure drag D consistent with the desired thrust Tc via a constant KPD. As the leg surface passes the 45° region, the oscillation amplitude is increased until it reaches its selected amplitude as given by the equation for θf(t) set out above. Using discrete time equations and letting θr be the ramped value of the computed thrust angle θc and magnitude Tc, the motion for each leg 16 a,b,c,d is set by the leg controller 112 in accordance with the two following equations:
  • the ramp(rate, a, b) function ramps value b toward a at a constant rate.
  • the leg controller 112 gradually moves the legs 16 a,b,c,d back to a stand-by range 118 (see FIG. 12B ) of 90°, such that the legs 16 a,b,c,d are always able to generate the proper thrust rapidly by making the leg surface or its normal no more than 45° away from any desired thrust within the thrust range.
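A minimal sketch of one hovering-gait update step follows, combining the ramp function described above, the pressure-drag reorientation used when the leg is far from the desired thrust angle, and the oscillation law θf(t) = Tn sin(ωf t + φf) + θr near it. The function and parameter names, the ramp rate and the value of KPD are placeholders; the patent's two discrete-time equations are not reproduced in this extract, and the return to the 90° stand-by range is omitted.

```python
import math

def ramp(rate, a, b):
    """Move value b toward a by at most `rate` per step (constant-rate ramp)."""
    return b + max(-rate, min(rate, a - b))

def hover_leg_angle(t, theta_leg, theta_c, t_c, theta_r, t_r,
                    ramp_rate=0.05, omega_f=2 * math.pi / 0.35, phi_f=0.0, k_pd=1.0):
    """One control step for one leg of the hovering gait (illustrative only).

    theta_leg      : current leg angle
    theta_c, t_c   : computed thrust angle and magnitude for this leg
    theta_r, t_r   : ramped versions of theta_c and t_c (state kept between calls)
    Returns (commanded angle, updated theta_r, updated t_r).
    """
    theta_r = ramp(ramp_rate, theta_c, theta_r)
    t_r = ramp(ramp_rate, t_c, t_r)

    if abs(theta_c - theta_leg) > math.radians(45):
        # Far from the desired thrust direction: rotate the leg so that the
        # pressure drag it generates approximates the desired thrust magnitude.
        step = t_c / k_pd                                  # per-step rotation (placeholder scaling)
        cmd = theta_leg + math.copysign(step, theta_c - theta_leg)
    else:
        # Near the desired direction: oscillate about theta_r with an
        # amplitude proportional to the ramped thrust magnitude.
        cmd = t_r * math.sin(omega_f * t + phi_f) + theta_r
    return cmd, theta_r, t_r
```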
  • the hovering gaits thus allow the device 10 to maintain position based on operator commands through the operator control unit 104 , the visual control system 120 (described further below) and/or input from the inertial sensor 110 .
  • the device 10 also includes the visual control system 120 , 120 a , 120 b allowing the device 10 to be controlled based on visual input, i.e. autonomously, without input from the operator control unit 104 .
  • the visual control system 120 , 120 a , 120 b includes the motion calculator 106 , leg controller 112 and visual sensor 108 described above, as well as an image analyzing module 122 receiving data from the visual sensor 108 .
  • the visual sensor 108 includes a digital camera which is part of the front camera assembly 90 and which interfaces with the image analyzing module 122 .
  • the image analyzing module 122 detects the presence of a target of a selected type, and sends data on at least one property of that target to the motion calculator 106 , as will be described in more detail below.
  • the motion calculator 106 computes the required device motion and communicates it to the leg controller 112 which controls the leg actuators 20 in accordance with the appropriate gaits.
  • the target to be recognized by the visual control system 120 , 120 a , 120 b can be designed to be highly robust and easy to read by both a person and the device 10 , and can for example be arranged in the form of a booklet and/or on the faces of a geometric object (such as for example a cube) so as to be easy to manipulate and transport.
  • the device 10 includes two computers 124 , one for the leg controller 112 , and the other for the motion calculator 106 and image analyzing module 122 .
  • Both computers 124 are of the PC104/Plus form factor, due to the space restrictions inside the body 12 .
  • These two computers 124 , along with additional port and interface circuit boards stacked on top of each other, connect via ISA and PCI buses.
  • the visual control system 120 , 120 a , 120 b can be used with a robotic device having a propulsion system other than legs, such as for example thrusters, the leg controller 112 being replaced by an equivalent controller determining the required thrust of the propulsion system and actuating it accordingly.
  • FIG. 10 illustrates a particular embodiment of the visual control system 120 a .
  • the image analyzing module 122 includes a visual tracking module 126 , which receives an image from the visual sensor 108 , determines the position of a given target and calculates an error in the target's position through comparison with a desired position signal from a desired position module 128 , the desired position usually corresponding to the center of the image.
  • the visual tracking module 126 preferably uses at least one of a color blob tracker algorithm, a histogram tracker algorithm and a mean-shift tracker algorithm, which are described below.
  • the color blob tracker is initialized with the target's color properties in the normalized RGB space, which is in effect an over-represented hue space, where the effect of lighting changes common underwater is minimized. This makes the tracking more effective in the underwater environment.
  • the tracker scans the image converted in normalized RGB format, pixel-by-pixel, and the pixels falling within a given threshold of the color values of the target are turned on in the output image while the other pixels are turned off.
  • the median filtering algorithm is used over the segmented image with either 5-by-5 or 7-by-7 pixel grids, with typical threshold values of 30%-40%.
  • the tracking algorithm detects the blob in the binary image in every frame, and calculates its centroid, which is taken as the new target location.
  • the total mass of the blob is used as a confidence factor.
  • the error signal is computed using the Euclidean distance in the image between the centroid of the blob and the center of the image frame. Two error signals are used for pitch and yaw, and both these signals are sent to the motion controller. A yellow target was found to work well with this type of tracker.
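A hedged sketch of one pass of the colour-blob tracker described above: segment pixels near the target colour in normalized RGB, median-filter the binary image, and turn the blob centroid's offset from the image centre into pitch and yaw error signals, with the blob mass as a confidence value. The thresholding scheme, parameter values and library calls are assumptions, not the patent's implementation.

```python
import numpy as np
from scipy.ndimage import median_filter

def color_blob_error(frame_rgb, target_rgb, threshold=0.35, filter_size=5):
    """One tracking step (illustrative sketch).

    frame_rgb:  HxWx3 uint8 image
    target_rgb: normalized-RGB colour of the target, e.g. (0.45, 0.45, 0.10)
    Returns (yaw_error, pitch_error, confidence) or None when no blob is found.
    """
    img = frame_rgb.astype(np.float32)
    norm = img / (img.sum(axis=2, keepdims=True) + 1e-6)    # normalized RGB (hue-like)
    mask = np.all(np.abs(norm - np.asarray(target_rgb)) < threshold, axis=2)
    mask = median_filter(mask.astype(np.uint8), size=filter_size) > 0  # 5x5 or 7x7 filter

    ys, xs = np.nonzero(mask)
    if xs.size == 0:
        return None
    cx, cy = xs.mean(), ys.mean()                            # blob centroid
    h, w = mask.shape
    yaw_err = cx - w / 2.0          # horizontal offset -> yaw error signal
    pitch_err = cy - h / 2.0        # vertical offset   -> pitch error signal
    confidence = int(xs.size)       # total blob mass as confidence factor
    return yaw_err, pitch_err, confidence
```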
  • the histogram tracker compares rectangular regions of the input frame with the target region by comparing their corresponding color histograms.
  • a histogram of the target to be tracked is created and stored. Every image from the camera is divided into rectangular regions and their normalized histograms having a fixed number of bins over the hue space are calculated. Computationally, the color histogram is formed by discretizing the colors within an image and counting the number of pixels of each color. Depending on the target and the size of the image frame, a different number of bins can be used, with preferred numbers being 32 or 64, and with the regions having either one-eighth or one-sixteenth the dimension of the image frame.
  • the use of normalized histograms reduces the effect on color matching of brightness changes in the underwater environment.
  • the histograms are one-dimensional vectors that combine the multi-hue channel data. Similarity between histograms is computed by known measures, for example the histogram intersection measure, the χ2 (Chi-squared) measure, the Bhattacharyya distance measure, or Jeffrey's Divergence. Since the histograms are normalized, the measures return values ranging from 0 to 1, with higher values indicating a higher degree of similarity. The minimum similarity measure is preferably taken as 0.5; any measure below this threshold is not accepted as a valid target region. The center of the chosen window is taken as the new target location. As in the case of the color blob tracker, two error signals are used for pitch and yaw, and both these signals are sent to the motion controller. This type of tracker is suitable for tracking objects that have a variety of colors.
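Two of the similarity measures listed above can be sketched directly. The hue-histogram construction here (via matplotlib's rgb_to_hsv) is an illustrative choice and the bin count is one of the preferred values mentioned in the text; none of this is claimed to be the patent's implementation.

```python
import numpy as np
from matplotlib.colors import rgb_to_hsv

def hue_histogram(region_rgb, bins=32):
    """Normalized hue histogram of an image region (32 or 64 bins per the text)."""
    hsv = rgb_to_hsv(region_rgb.astype(np.float32) / 255.0)
    hist, _ = np.histogram(hsv[..., 0], bins=bins, range=(0.0, 1.0))
    return hist / max(hist.sum(), 1)

def histogram_intersection(p, q):
    """Ranges from 0 to 1; 1 means identical normalized histograms."""
    return float(np.minimum(p, q).sum())

def bhattacharyya_distance(p, q):
    """Smaller is more similar; derived from the Bhattacharyya coefficient."""
    bc = float(np.sqrt(p * q).sum())
    return float(np.sqrt(max(0.0, 1.0 - bc)))

# Per the description, a candidate region would be accepted only when the
# chosen similarity measure is at least 0.5.
```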
  • color histograms are also used as the underlying distribution.
  • the histograms are three-dimensional arrays in this case, one each for the three RGB channels, with preferably 16 bins per channel.
  • the target histogram is computed in a square window with sides preferably equaling 100 pixels.
  • the color model probability density function for the target is calculated by overlaying the sub window by a kernel having the Epanechnikov profile.
  • the weights for the mean-shift vector are calculated using the Epanechnikov kernel.
  • the tracker is initialized with the last known location of the target and the target PDF model.
  • the candidate window having the same size as the target is created at the location of the last known target position, the candidate PDF model is calculated and the weights for each pixel are calculated, leading to a new candidate position.
  • the mean-shift process preferably uses 10 iteration steps to choose a new target location.
  • the Bhattacharyya distance between the candidate PDF model and the target PDF model is calculated to quantify the similarity between the target and the new candidate location.
  • the location with the minimum Bhattacharyya distance is chosen as the new target location.
  • two error signals (pitch and yaw) are sent to the motion controller.
  • the mean shift tracker is resistant to changes in lighting and appearance of duplicate objects in the frame, but necessitates substantially more computation than the preceding trackers.
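A condensed, illustrative sketch of the mean-shift update described above: a kernel-weighted RGB PDF model with 16 bins per channel, pixel weights computed from the ratio of the target and candidate PDFs, and the weighted centroid taken as the new candidate position. The bin mapping, windowing and numerical guards are assumptions; the Bhattacharyya check from the preceding bullets would be applied to the result, and roughly 10 such iterations would be run per frame.

```python
import numpy as np

def rgb_pdf(window, bins=16):
    """Epanechnikov-kernel-weighted RGB histogram (PDF model) of a square window."""
    h, w, _ = window.shape
    yy, xx = np.mgrid[0:h, 0:w]
    r2 = ((yy - h / 2.0) / (h / 2.0)) ** 2 + ((xx - w / 2.0) / (w / 2.0)) ** 2
    kern = np.clip(1.0 - r2, 0.0, None)                  # Epanechnikov profile
    idx = (window.astype(np.int32) * bins) // 256        # per-channel bin index
    pdf = np.zeros((bins, bins, bins))
    np.add.at(pdf, (idx[..., 0], idx[..., 1], idx[..., 2]), kern)
    return pdf / max(pdf.sum(), 1e-9)

def mean_shift_step(frame, center, size, target_pdf, bins=16):
    """One mean-shift iteration: weight each pixel by sqrt(q/p) and move the
    candidate window to the weighted centroid (illustrative sketch only)."""
    cy, cx = center
    half = size // 2
    window = frame[cy - half:cy + half, cx - half:cx + half]
    cand_pdf = rgb_pdf(window, bins)
    idx = (window.astype(np.int32) * bins) // 256
    p = cand_pdf[idx[..., 0], idx[..., 1], idx[..., 2]]
    q = target_pdf[idx[..., 0], idx[..., 1], idx[..., 2]]
    weights = np.sqrt(q / np.maximum(p, 1e-9))
    yy, xx = np.mgrid[0:window.shape[0], 0:window.shape[1]]
    denom = max(weights.sum(), 1e-9)
    new_cy = (cy - half) + (weights * yy).sum() / denom
    new_cx = (cx - half) + (weights * xx).sum() / denom
    return int(round(new_cy)), int(round(new_cx))
```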
  • the visual tracking module 126 is able to track the target at almost 15 frames/second, and therefore without filtering, the commands sent to the leg actuators 20 by the leg controller 112 would be changing at such a high rate that it would yield a highly unstable swimming behavior.
  • the motion calculator 106 which receives the error signals from the visual tracking module 126 includes a pitch controller 130 and a yaw controller 132 which are both PID controllers, used to take these target locations and produce pitch and yaw commands at a rate to ensure stable behavior of the device 10 .
  • the roll axis is not affected by this particular type of visual control. Given the input from the visual tracking module 126 at any instant, and the previous tracker inputs, each of the pitch and yaw controllers 130 , 132 generates commands based on the following control law:
  • êt is the time-averaged error signal at time t and is defined recursively as:
  • et is the error signal at time t
  • KP, KI and KD are respectively the proportional, integral and differential gains
  • is the error propagation constant.
  • the pitch and yaw controllers 130 , 132 work identically as follows.
  • Each controller 130 , 132 includes a low-pass first-order infinite-impulse response or IIR filter (i.e. a digital filter blocking high frequency signals), smoothing out fast changing pitch and yaw commands by averaging them over a period of time.
  • a time constant is defined for the low-pass filter for each controller 130 , 132 .
  • the gains KP, KI and KD for each controller 130 , 132 are input manually, with limits to truncate the gains.
  • Each controller 130 , 132 has a dead band limit applied to the error signal, i.e. a range of change in output for which the controller 130 , 132 will not respond at all. This prevents the controller output from changing too frequently, by ignoring small changes in the error signal.
  • a sleep time between each iteration in servoing is also introduced to reduce command overhead of the controllers 130 , 132 .
  • the parameters for the controllers 130 , 132 are as follows: a KP of 1.0 with corresponding limit of 1.0, a KI of 0.0 with corresponding limit of 0.3, and a KD of 0.0 with corresponding limit of 1.0 for both controllers 130 , 132 , a dead band of 0.2 for both controllers 130 , 132 , a time constant of 0.35 for the pitch controller 130 and 0.05 for the yaw controller 132 , and a command limit of 1.0 for both controllers 130 , 132 .
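The pitch and yaw controllers described above can be sketched as follows. The control-law and filter equations themselves are not reproduced in this extract, so this is a conventional PID with a first-order IIR output filter, a dead band and a command limit, parameterized with the values listed in the preceding bullet; it is illustrative only.

```python
class AxisController:
    """One-axis (pitch or yaw) controller: PID gains with a dead band on the
    error, a command limit, and a first-order low-pass IIR filter smoothing
    fast-changing commands. Formulation assumed, not taken from the patent."""

    def __init__(self, kp=1.0, ki=0.0, kd=0.0, dead_band=0.2,
                 time_constant=0.35, command_limit=1.0, dt=1.0 / 15.0):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.dead_band = dead_band
        self.alpha = dt / (time_constant + dt)   # first-order IIR coefficient
        self.command_limit = command_limit
        self.dt = dt
        self.prev_error = 0.0
        self.integral = 0.0
        self.filtered = 0.0

    def update(self, error):
        if abs(error) < self.dead_band:          # ignore small error changes
            error = 0.0
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        raw = self.kp * error + self.ki * self.integral + self.kd * derivative
        raw = max(-self.command_limit, min(self.command_limit, raw))
        # Low-pass IIR filter smoothing the pitch/yaw command over time
        self.filtered += self.alpha * (raw - self.filtered)
        return self.filtered

# Time constants from the description: 0.35 for pitch, 0.05 for yaw
pitch_controller = AxisController(time_constant=0.35)
yaw_controller = AxisController(time_constant=0.05)
```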
  • the pitch and yaw controllers 130 , 132 thus send required pitch and yaw of the device 10 to the leg controller 112 , which as mentioned above computes a required thrust at each leg 16 a,b,c,d , determines a corresponding leg motion following the appropriate gaits, and controls the actuator 20 of each leg 16 a,b,c,d accordingly to obtain the required pitch and yaw.
  • the visual tracking module 126 also compares the size of the target with a reference size, and sends a size error signal to the motion calculator 106 , which computes a desired speed change for the device 10 , sending corresponding motion data to the leg controller 112 . As such the device 10 can remain within a given distance of the target by modifying its speed.
  • the visual control system 120 a thus allows the device to follow a moving target or, through use of the hovering gait, hold position relative to a stationary target.
  • FIG. 11 illustrates another embodiment of the visual control system 120 b .
  • the image analyzing module 122 includes a marker detection module 134 , which is configured to detect a target of a particular type, for example an ARTag marker.
  • ARTag markers include both symbolic and geometric content, and are constructed using an error-correcting code to enhance robustness.
  • the marker detection module 134 sends the description of the marker, which in the case of an ARTag marker is a binary number corresponding to the black and white regions of the marker, to a marker identification module 136 which is part of the motion calculator 106 .
  • the motion calculator 106 also includes a marker library 138 , which contains a list of the possible markers, each being associated with a particular command, for example turn right, change speed to 1 m/s, go to location X, film during a given period, switch to the hovering, swimming or walking gaits, etc.
  • the marker identification module 136 thus accesses the marker library 138 to identify the marker and retrieve the associated command.
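A minimal sketch of the marker-library lookup follows: detected ARTag IDs map to commands such as those listed above. The marker IDs, command names and dispatch call are hypothetical placeholders, not values from the patent.

```python
# Hypothetical marker IDs and commands; the actual ARTag codes are not given in the text.
MARKER_LIBRARY = {
    17: ("turn_right", {}),
    23: ("set_speed", {"m_per_s": 1.0}),
    42: ("switch_gait", {"mode": "hovering"}),
}

def execute_marker(marker_id, leg_controller):
    """Look a detected ARTag marker up in the library and dispatch the
    associated command to the leg controller (sketch only)."""
    entry = MARKER_LIBRARY.get(marker_id)
    if entry is None:
        return False                                 # unknown marker: ignore
    command, params = entry
    leg_controller.dispatch(command, **params)       # hypothetical dispatch API
    return True
```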
  • the marker identification module 136 sends the required motion signal (pitch, yaw, roll, heave and/or surge) to the leg controller 112 , which as mentioned above computes a required thrust at each leg 16 a,b,c,d , determines the motion of each leg 16 a,b,c,d based on the appropriate gaits and controls the actuator 20 of each leg 16 a,b,c,d accordingly to produce the required motion.
  • a diver can thus communicate directly with the device 10 and give it a series of instructions simply by showing it different cards having the adequate markers illustrated thereon.
  • the motion calculator 106 can also memorize a series of commands such that the diver can in fact program in advance a series of motions or tasks for the device 10 .
  • the visual control systems 120 a,b of FIG. 10 and FIG. 11 can be used together, such that for example the device 10 follows a given target unless a marker is detected, at which point the device 10 stops following the target and obeys the commands dictated by the marker.
  • the visual control system 120 , 120 a , 120 b can be used with a robotic device moved by a propulsion system other than legs, the leg controller 112 being replaced by an equivalent controller receiving the desired device motion signal, determining the corresponding required thrust of the propulsion system and actuating the propulsion system accordingly.
  • the visual control abilities of the device 10 allow it to follow a moving object, for instance a diver, and/or accept commands from the diver on presentation of cards carrying predetermined markers corresponding with tasks to be performed.
  • a complete sequence of actions can be programmed into the device 10 using the predetermined markers.
  • a diver can communicate directly with the device 10 without the assistance of an operator located on the surface and as such with or without a tether.
  • the device 10 can thus be operated in a semi-autonomous manner, with or without input from an operator on the surface through the operator control unit 104 .
  • the device 10 of the present invention can be used in a wide range of applications. These include underwater search and rescue, coral health monitoring, monitoring of underwater establishments (e.g. oil pipelines, communication cables) and many more. Specifically, environmental assessment tasks in which visual measurements of a marine ecosystem must be taken on a regular basis can be performed by the device 10 .
  • the device 10 can also be used in a variety of diver-assisting tasks, such as monitoring divers from a surface, providing lighting (for example while following a diver), providing communication between divers and the surface, carrying cargo and/or tools, carrying audio equipment or air reserves, etc.
  • the device 10 includes an acoustic transducer and as such allows the diver to hear sounds transmitted from the surface, stored on the device 10 and/or synthesized by the device 10 , as well as to send acoustic signals back to the surface by having them relayed by the device 10 , while following the diver or another target and/or responding to commands given by the diver through the use of visual markers.
  • the sounds could be, for example, music, instructional narrative and/or cautionary information.
  • the sounds emitted by the device 10 can depend on various factors that can be sensed by the device 10 , for example, on the depth or location of the device 10 , the length of time the diver has been underwater, the water temperature, or other environmental parameters.
  • the visual tracking module 126 can be used to recognize given landmarks and as such allow the device 10 to return autonomously to its starting point once a given task is performed.
  • the amphibious legs 16 b,c,d allow the device to start from and return to a location on dry land while performing a task (such as video surveillance) underwater.

Landscapes

  • Engineering & Computer Science (AREA)
  • Chemical & Material Sciences (AREA)
  • Combustion & Propulsion (AREA)
  • Mechanical Engineering (AREA)
  • Transportation (AREA)
  • Ocean & Marine Engineering (AREA)
  • Manipulator (AREA)

Abstract

A robotic device for navigating in at least a liquid medium includes a legged propulsion system having a series of legs external of a body of the robotic device, each of the legs being independently driven and mounted to the body for pivotal movement about a respective transverse axis. The legs oscillate relative to the body about the respective transverse axis such that interaction between the legs and the liquid medium produces propulsive forces that displace the robotic device within the liquid medium. A control system is operatively connected to the legged propulsion system for autonomous control and operation of the robotic device based on information received from at least one sensor providing data about an environment of the device.

Description

    CROSS-REFERENCE TO RELATED APPLICATIONS
  • The present application is a continuation of U.S. patent application Ser. No. 11/497,302 filed Aug. 2, 2006, the entire contents of which are incorporated herein by reference.
  • FIELD OF THE INVENTION
  • The present invention relates to robotic devices, particularly to robotic devices designed to maneuver in a liquid medium as well as on a solid medium.
  • BACKGROUND ART
  • In general, underwater robotics poses certain unique challenges that render many of the principles of terrestrial robotics problematic. A robot underwater is able to move along six degrees of freedom, and maneuvering with six degrees of freedom creates serious complications. A computationally straightforward task of pose maintenance on land becomes far more challenging under water, because of environmental factors such as strong currents in marine environments. Infra-red sensors lose some of their effectiveness in water as well. Wireless radio communications are also impossible over a large distance in water compared to ground based control. All these issues make underwater robotics problems more difficult than terrestrial robotics.
  • The traditional approach used to propel undersea vehicles is by using propellers or thrusters. Although simple by design, these vehicles lack the maneuverability and agility seen in fish and other marine species. In addition, thrusters are not an energy efficient approach to station keeping underwater.
  • In computer vision, visual tracking is the process of repeatedly computing a position of a feature or sets of features in a sequence of input images. A number of methods for visual tracking in a dry environment (i.e. not underwater) based on the color of the target are known. One of the known approaches is color-blob tracking, where the trackers segment out sections of the image that match a threshold level for the given target and, based on the segmentation output, track the shape, size or centroid of the blob, among other features. Another approach is the matching of color histograms, which are a measure of color distribution over an image. Some of the tracking methods are combined with statistical methods to provide more accurate results, one example being mean-shift tracking algorithms, which attempt to maximize the statistical correlation between two distributions. However, the tracking of objects in a dry environment is very different from the tracking of objects underwater. Underwater, vision is impaired by the turbidity of the water caused by floating sedimentation (“aquatic snow”) and other floating debris. The behavior of the light beams is altered by many factors including refraction, which is influenced by waves and water salinity level, scattering, which causes a reduction of contrast between objects and influences color hues, and absorption, which is frequency dependent and makes detection of certain colors difficult. As such, vision in underwater environments has rarely been examined due to the complications involved.
  • SUMMARY OF INVENTION
  • It is therefore an aim of the present invention to provide an improved robotic device.
  • Therefore, in accordance with the present invention, there is provided a robotic device for navigating in at least a liquid medium, the robotic device comprising: a legged propulsion system having a series of legs external of a body of the robotic device, each of the legs being independently driven and mounted to the body for pivotal movement about a respective transverse axis, each of the legs being operable to at least oscillate relative to the body about the respective transverse axis such that interaction between the legs and the liquid medium produces propulsive forces that displace the robotic device within the liquid medium; and a control system operatively connected to the legged propulsion system for autonomous control and operation of the robotic device based on information received from at least one sensor providing data about an environment of the device, the control system using data from the at least one sensor to determine a desired motion of the robotic device and a corresponding required leg motion of each of the legs to produce the desired motion, and the control system autonomously actuating each of the legs of the legged propulsion system in accordance with the corresponding required leg motion.
  • There is also provided, in accordance with the present invention, an amphibious robotic device comprising: a legged propulsion system having a series of legs, each of said legs being driven by an actuator and mounted for pivotal movement about a respective transverse axis in one of at least a swimming mode and a walking mode, said legs being configured to pivotally oscillate relative to the transverse axis in said swimming mode when the device is in a liquid medium such that interaction between said legs and the liquid medium provides propulsive forces that displace the vehicle body within the liquid medium, said legs being configured to rotate relative to the transverse axis in said walking mode when the device is on a solid medium such that interaction between said legs and the solid medium provides propulsive forces that displace the vehicle body in a desired direction on the solid medium; and a control system having at least one sensor operable to autonomously detect with which of the liquid medium and the solid medium the robotic device is interacting and a leg controller synchronously operating said legs in either one of the swimming mode and the walking mode based on the detected medium.
  • There is further provided, in accordance with the present invention, a control system for autonomously maneuvering a robotic device in at least one of a liquid medium and a solid medium, the robotic device including a propulsion system having a series of individually controlled legs, the control system comprising: at least one visual sensor retrieving an image of an environment of the device in the medium; an image analyzing module receiving the image, determining a presence of an object of a given type therein and analyzing at least one property of the object; a motion calculator determining a desired motion of the device based on the at least one property of the object; and a controller operating the propulsion system of the device, the controller calculating a respective required leg motion of each of the legs to obtain the desired motion of the device and operating each of the legs based on the respective required leg motion calculated, such that the robotic device autonomously maneuvers in said medium.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • Reference will now be made to the accompanying drawings, showing by way of illustration a particular embodiment of the present invention and in which:
  • FIG. 1 is a perspective view of a robotic device in accordance with a particular embodiment of the present invention;
  • FIG. 2 is a top view of the device of FIG. 1, with a top panel thereof removed for improved clarity;
  • FIG. 3 is a perspective view of a leg of the device in accordance with an alternate embodiment of the present invention;
  • FIG. 4 is a cross-sectional view of the leg of FIG. 3;
  • FIG. 5 is a perspective view of a leg of the device in accordance with another alternate embodiment of the present invention;
  • FIG. 6 is a cross-sectional view of the leg of FIG. 5;
  • FIG. 7 is a perspective view of a leg of the device in accordance with a further alternate embodiment of the present invention;
  • FIG. 8 is a cross-sectional view of the leg of FIG. 7;
  • FIG. 9 is a block diagram of a control system for the device of FIG. 1;
  • FIG. 10 is a block diagram of a particular embodiment of the control system of FIG. 9;
  • FIG. 11 is a block diagram of another particular embodiment of the control system of FIG. 9;
  • FIG. 12A is a schematic top view of the device of FIG. 1 showing parameters used in a hovering gait thereof;
  • FIG. 12B is a schematic side view of the device of FIG. 1 illustrating a stand-by range and a thrust range of the hovering gait; and
  • FIG. 12C is a schematic, partial side view of the device of FIG. 1 illustrating the angle, thrust and pressure drag for one of the legs.
  • DETAILED DESCRIPTION OF PARTICULAR EMBODIMENTS
  • Referring to FIGS. 1-2, a robotic device in accordance with a particular embodiment of the present invention is generally shown at 10. The device 10 is designed as an aquatic swimming robot that is also capable of operating on a solid medium, including compact ground surfaces and sand. As such, the device 10 is said to be amphibious in that it can walk on a solid surface and penetrate a neighboring liquid medium, swim in that liquid medium and exit the liquid medium on an appropriate interface between the solid and liquid medium (e.g. a ramped up bottom surface of the liquid environment becoming the solid medium such as a beach).
  • The device 10 comprises a body 12 including a waterproof shell 14 which can be for example made of aluminum, inside which all electronics, sensors, power supplies, actuators, etc. are housed. The shell 14 includes buoyancy plates which allow the buoyancy of the device 10 to be adjusted depending on the properties of the liquid medium the device 10 is going to be submerged in (e.g. salt water, fresh water). The buoyancy of the device 10 is preferably adjusted to near neutral.
  • The device 10 comprises six legs 16 a which are attached to the body such as to be rotatable about a respective rotational axis 18 extending transversely with respect to the body 12. As such, the motion of each leg 16 a is controlled by a single respective actuator 20 (see FIG. 2), thus minimizing the complexity of the motion of the device 10 as well as the required energy to actuate the legs 16 a. Underwater, the legs 16 a give the device the ability to turn sideways (yaw), dive (pitch) and rotate about its longitudinal axis (roll), as well as the ability to move forward and backward (surge), and up and down (heave).
  • The device 10 comprises a front camera assembly 90 mounted in the front of the body 12 and a rear camera assembly 92 mounted in the rear of the body 12. The device also includes a power source, for example a series of batteries 94 (see FIG. 2). The device 10 can further comprise various internal sensors for monitoring the device functions, for example battery power and power consumption levels for the leg actuators 20.
  • Leg Design
  • In the embodiment of FIGS. 1-2, the legs 16 a shown are mainly designed for water propulsion. Each actuator 20 is mounted rigidly to a cylindrical hip extension 22, 22′ which is in turn rigidly mounted to the body 12. Each leg 16 a includes a leg mount 24 which is rigidly mounted on an output shaft 25 (see FIG. 1) of the respective actuator 20 passing through the respective hip extension 22, 22′, such that the leg mount 24 is rotated by the output shaft 25. Each leg 16 a also includes a flipper 26 which extends from the leg mount 24 such that a longitudinal axis 27 (see FIG. 2) of the flipper 26 is perpendicular or substantially perpendicular to the rotational axis 18 of the respective leg 16 a. The legs 16 a produce thrust underwater through an oscillatory motion, i.e. a reciprocating pivotal motion of each flipper 26 about its respective rotational axis 18. In a particular embodiment, each flipper 26 includes two flexible sheets, for example made of vinyl, which are glued together over a series of rods. The flexible sheets provide the necessary flexibility to capture water when the legs 16 a are oscillating, while the rods provide the necessary stiffness to generate thrust. The legs 16 a are distributed symmetrically about the body 12, with three legs 16 a being equally spaced apart on each side thereof. The middle hip extensions 22′ are longer than the front and rear hip extensions 22, such that the middle legs 16 a extend outwardly of the other legs 16 a to avoid interference between adjacent flippers 26. In an alternate embodiment, the relative size of the body 12 and the flippers 26 is such that the hip extensions 22 have a similar length without interference between the legs 16 a of a same side. While the legs 16 a are particularly well suited for use underwater, they are less efficient for motion on a solid medium.
  • Referring to FIGS. 3-4, an amphibious leg 16 b according to a particular embodiment of the present invention is shown. This leg 16 b allows for a good compromise between underwater propulsion and propulsion on a solid medium. By replacing the legs 16 a of the device 10 with legs such as the leg 16 b, good underwater propulsion is provided, although generally less efficient than with the legs 16 a, and efficient propulsion on a solid medium is provided, both forward and backward. Each leg 16 b includes a leg mount 24 similar to that of the previously described legs 16 a, and which rigidly engages the respective actuator output shaft 25 (not shown in FIGS. 3-4) for rotation and/or oscillation about the respective axis 18. An upper member 30 b of the leg 16 b includes an attachment plate 32 rigidly attached to the leg mount 24, and spaced apart upper rigid plates 34 b extending from the attachment plate 32 perpendicularly to the rotational axis 18 of the leg 16 b. The upper plates 34 b are teardrop-shaped, with their distal larger end 38 b being interconnected by two spaced apart transverse plates 40 b. A joint pin 42 b also extends between the distal ends 38 b of the upper plates 34 b between the transverse plates 40 b and the attachment plate 32.
  • A lower member 44 b of the leg 16 b includes lower spaced apart rigid plates 46 b also extending perpendicularly to the rotational axis 18 of the leg 16 b. The proximal ends 48 b of the lower plates 46 b extend outwardly of the distal ends 38 b of the upper plates 34 b and are also interconnected by the joint pin 42 b. The joint pin 42 b is rotationally connected to at least one of the upper and lower members 30 b, 44 b, such that the members 30 b, 44 b are pivotally interconnected by the joint pin 42 b. The lower plates 46 b are also teardrop-shaped, with their distal larger end 50 b being interconnected by two spaced apart end pins 52 b, and by a rounded end connecting member 54.
  • Each of the upper plates 34 b forms with its associated lower plate 46 b a substantially planar frame 56 b mainly defined in a plane perpendicular to the rotational axis 18 of the leg 16 b, and as such creating minimal interference with the water when the device 10 is submerged. The frames 56 b are made of a rigid material which can be for example an adequate metal or composite material.
  • The leg 16 b also includes an elastic member 58 b which extends between the frames 56 b perpendicularly thereto. Referring particularly to FIG. 4, the elastic member 58 b is a double member which extends from the attachment plate 32 to around the end pins 52 b, passing between the transverse plates 40 b and on each side of the joint pin 42 b. The elastic member 58 b is a thrust producing member when the leg 16 b is moved underwater. The spaced apart end pins 52 b shape the elastic member 58 b so that during swimming, the liquid medium is forced away from the end of the lower section 54, thus reducing the drag. The transverse plates 40 b offset the bending point of the elastic member 58 b from the joint pin 42 b, such as to increase the amount by which the elastic member 58 b stretches upon relative pivoting of the members 30 b, 44 b. The elastic member 58 b thus provides compliance to the leg 16 b, while limiting the relative pivoting motion of the upper and lower members 30 b, 44 b such that the frames 56 b can bear the weight of the device 10 when the device 10 moves on solid ground. In a particular embodiment, the elastic member 58 b is made of a material providing increased resistance to the pivoting motion of the members 30 b, 44 b about the joint pin 42 b as the members 30 b, 44 b are pivoted away from the aligned position shown in the Figures.
  • The leg 16 b therefore defines a jointed limb that is at least partially compliant when used on a solid medium and at least partially stiff when used in a liquid medium. The elastic member 58 b acts similarly to a ligament which interconnects the two members 30 b, 44 b of the leg 16 b and constrains the relative pivotal movement such as to arrive at a desired gait of the device, both on a solid medium and in a liquid medium (e.g. water).
  • Referring to FIGS. 5-6, an amphibious leg 16 c according to an alternate embodiment of the present invention is shown. Each leg 16 c includes a leg mount 24 (not shown in FIGS. 5-6) which is similar to that of the previously described legs 16 a,b and which rigidly engages the respective actuator output shaft 25 (not shown in FIGS. 5-6). An attachment plate 32 is rigidly attached to the leg mount 24, and spaced apart rigid plates 34 c extend from the attachment plate 32 perpendicularly to the rotational axis 18 of the leg 16 c. The plates 34 c have rounded distal ends 38 c interconnected at each extremity thereof by a transverse plate 40 c. A flexible toe plate 60 is rigidly connected to each transverse plate 40 c, the toe plate 60 flexing upon contact of the leg 16 c with a solid surface to augment a contact area between the leg 16 c and the solid surface, and springing back into an unflexed position when contact with the ground is lost. The plates 34 c thus define spaced apart rigid frames 56 c which, by being mainly defined in a plane perpendicular to the rotational axis 18 of the leg 16 c, have a minimal interference with the water when the device 10 is submerged.
  • The leg 16 c also includes a flexible flipper 62, the proximal end 64 thereof being rigidly connected to the attachment plate 32 and the distal end 66 thereof being free. The flipper 62 extends between the rigid frames 56 c perpendicularly thereto. Because of its flexibility, the flipper 62 is free to flap between the rigid frames 56 c. The flipper 62 is shorter than the rigid frames 56 c such as to avoid contact with the ground in a dry environment. The flipper 62 and toe plates 60 are thrust producing members when the leg 16 c is moved underwater. The rigid frames 56 c provide a channeling effect underwater such that there is no spill-over when the flipper 62 bends, thus increasing the thrust produced.
  • Referring to FIGS. 7-8, an amphibious leg 16 d according to a further alternate embodiment of the present invention is shown. This leg 16 d has, however, limited efficiency for backward movement on land because of its directionally limited pivoting motion, as will be described further below. Each leg 16 d includes a leg mount 24 (not shown in FIGS. 7-8) which is similar to that of the previously described legs 16 a,b,c, and which rigidly engages the respective actuator output shaft 25 (not shown in FIGS. 7-8). An upper member 30 d includes an attachment plate 32 rigidly attached to the leg mount 24, and spaced apart upper rigid plates 34 d extending from the attachment plate 32 perpendicularly to the rotational axis 18 of the leg 16 d. In the embodiment shown, the upper plates 34 d are oblong-shaped, although any shape conducive to good fluid flow underwater while maintaining adequate strength and rigidity when loaded by the weight of the device 10 on land can be used. The proximal end 36 d of the upper plates 34 d is interconnected by first and second spaced apart end pins 68, 69. An upper joint pin 42 d also extends between the distal ends 38 d of the upper plates 34 d.
  • A middle member 70 of the leg 16 d includes middle spaced apart rigid plates 72 also extending perpendicularly to the rotational axis 18 of the leg 16 d. In the embodiment shown, the middle plates 72 are also oblong-shaped, although here again any shape conducive to good fluid flow underwater while maintaining adequate strength and rigidity when loaded by the weight of the device 10 on land can be used. The proximal ends 74 of the middle plates 72 extend inwardly of the upper plates 34 d and are also interconnected by the upper joint pin 42 d. The upper joint pin 42 d is rotationally connected to at least one of the upper and middle members 30 d, 70, such that the upper and middle members 30 d, 70 are pivotally interconnected by the upper joint pin 42 d. The proximal ends 74 of the middle plates 72 also include a stopper 76 which limits the pivoting motion between the upper and middle members 30 d, 70 along a single direction from the aligned position shown in the Figures. The distal ends 78 of the middle plates 72 are interconnected by a lower joint pin 80.
  • A lower member 44 d of the leg 16 d includes lower spaced apart rigid plates 46 d also extending perpendicularly to the rotational axis 18 of the leg 16 d. The lower plates 46 d are semi-circular, with a rounded edge 82 defining a contact area with the ground surface. The lower plates 46 d, along a proximal end 48 d of the rounded edge 82, extend outwardly of the middle plates 72 and are also interconnected by the lower joint pin 80. The lower joint pin 80 is rotationally connected to at least one of the middle and lower members 70, 44 d, such that the middle and lower members 70, 44 d are pivotally interconnected by the lower joint pin 80. The distal ends 78 of the middle plates 72 include a stopper 81 which limits the pivoting motion between the middle and lower members 70, 44 d along a single direction from the aligned position shown in the Figures. The lower plates 46 d, along a distal end 50 d of the rounded edge 82, are interconnected by a third end pin 52 d.
  • Each set of connected upper, middle and lower members 30 d, 70, 44 d thus defines a substantially planar frame 56 d, the two spaced apart frames 56 d being mainly defined in a plane perpendicular to the rotational axis 18 of the leg 16 d and as such having a minimal interference with the water when the device is submerged.
  • The leg 16 d also includes an elastic member 58 d which extends between the frames 56 d perpendicularly thereto. The elastic member 58 d is a double member which extends around the first end pin 68 and the third end pin 52 d, passing in between in a zigzag pattern against the second end pin 69, against the upper joint pin 42 d on a side thereof opposite that of the second end pin 69 and against the lower joint pin 80 on a side thereof opposite that of the upper joint pin 42 d. The elastic member 58 d is a thrust producing member when the leg 16 d is moved underwater. The elastic member 58 d also provides compliance to the leg 16 d while limiting the relative pivoting motion between the members 30 d, 70, 44 d about the joint pins 42 d, 80, by forcing the leg 16 d into the aligned position, shown in the Figures, against the stoppers 76, 81. As such, the frames 56 d can bear the weight of the device 10 when the device 10 moves on solid ground. In a particular embodiment, the elastic member 58 d is made of a material providing increased resistance to the pivoting motion of the members 30 d, 70, 44 d about the joint pins 42 d, 80 as the members 30 d, 70, 44 d are pivoted away from the aligned position.
  • Control System
  • Referring to FIG. 9, a control system 102 for the device 10 is shown. The system 102 optionally includes an operator control unit 104 allowing an operator to directly control the device 10, the operator control unit 104 sending signals to a motion calculator 106 in accordance with the operator input. The operator control unit 104 receives feedback from at least one visual sensor 108 of a visual control system 120, which in a particular embodiment includes a camera from one or both of the front and rear camera assemblies 90, 92, which provides streaming video from the device 10. The operator control unit 104 also receives feedback from an inertial sensor 110, or Inertial Measurement Unit (IMU), installed within the body 12 for orientation and acceleration sensing. The operator control unit 104 can also receive feedback from the motion calculator 106 containing any other relevant data to assist in controlling the device 10. Communication between the device 10 and the operator control unit 104 is optionally done over a fiber optic tether (not shown).
  • As described further below, the visual control system 120 also controls the device 10, and as such the operator control unit 104 can be omitted or used in conjunction with the visual control system 120.
  • The motion calculator 106 computes a desired motion of the device 10, i.e. pitch, roll, yaw, heave and/or surge, based for example on the signal from the operator control unit 104, and communicates this desired motion to a leg controller 112.
  • The leg controller 112 computes a required thrust at each leg 16 a,b,c,d to obtain the desired motion, and determines the corresponding motion for each leg 16 a,b,c,d. The leg controller 112 moves the legs 16 a,b,c,d in accordance with a series of preset gaits programmed therein. Gaits are combinations of leg parameters that generate a fixed motion for a fixed set of parameters. Depending on whether the device 10 is swimming or walking, different table-driven gaits are used to drive the device 10. Walking gaits move the device 10 through complete rotation of the legs 16 a,b,c,d. Swimming gaits move the device through a "kicking" motion of the legs 16 a,b,c,d, i.e. an oscillatory motion of the legs 16 a,b,c,d with various phase and amplitude offsets. In both cases, the leg controller 112 computes a series of three parameters for each leg 16 a,b,c,d based on the required leg thrust: amplitude, offset and phase. The leg controller 112 actuates the actuator 20 of each leg 16 a,b,c,d based on the three parameters calculated. The amplitude parameter governs the distance the legs 16 a,b,c,d sweep along the spherical arc defined around the rotational axis 18 during each cycle. Offset dictates the starting orientation of the legs 16 a,b,c,d relative to each other at the beginning of the cycle. Direction of the leg motion is controlled by the phase parameters of each leg 16 a,b,c,d.
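  • By way of illustration only, the following sketch (which is not part of the described embodiment) shows one way the three per-leg parameters could drive an oscillatory swimming gait. The oscillation frequency, the amplitude value and the half-cycle phase split between the two sides of the body are assumptions chosen for the example.

```python
import math

def leg_angle(t, amplitude, offset, phase, frequency_hz=2.0):
    """Commanded angle (radians) of one leg at time t for an oscillatory
    swimming gait: the leg sweeps +/- amplitude about its offset angle."""
    return offset + amplitude * math.sin(2.0 * math.pi * frequency_hz * t + phase)

# Hypothetical six-leg swimming gait: all legs share the same amplitude and
# frequency; the two sides are half a cycle out of phase to balance thrust.
gait = [
    {"amplitude": 0.5, "offset": 0.0, "phase": 0.0},       # legs 0-2 (one side)
    {"amplitude": 0.5, "offset": 0.0, "phase": math.pi},   # legs 3-5 (other side)
]

t = 0.1  # seconds into the cycle
angles = [leg_angle(t, **gait[0 if n < 3 else 1]) for n in range(6)]
print([f"{a:+.3f}" for a in angles])
```

  • A walking gait would instead command a continuous rotation of each leg rather than the sinusoidal sweep shown here.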
  • However, the swimming gaits only allow the device 10 to turn if it is moving forward or backward. As such, the leg controller 112 also includes hovering gaits to allow the device 10 to move along 5 degrees of freedom, i.e. pitch, roll, yaw, heave and/or surge without forward or rearward movement, such as to be able to hold a fixed position despite water currents, and to turn at that fixed location, for example to keep an object within camera range.
  • The motion calculator 106 also receives inertial data from the inertial sensor 110 which can influence the required motion communicated to the leg controller 112. In a particular embodiment, the motion calculator 106 uses the inertial data to determine if the device 10 is in a dry or underwater environment, and directs the leg controller 112 to use either the walking or the swimming/hovering gaits accordingly. As such the device 10 can autonomously switch to the appropriate gaits upon entering or coming out of the water.
  • Hovering Gaits
  • The hovering gaits of the leg controller 112, which allow the device 10 to perform station keeping, are described below. The leg controller 112 receives the required motion input from the motion calculator 106 including pitch (Cp), roll (Cr), yaw (Cy), heave (Ch) and surge (Cs). The leg controller first computes Fx and Fz, the column vectors representing the desired thrust at each leg location in the x and z directions (see FIG. 12A, where arrow 114 indicates the forward direction). The legs 16 a,b,c,d cannot generate thrust in the y direction. Accordingly, the desired thrust Fx and Fz for each leg 16 a,b,c,d is computed by the leg controller 112 according to the following:
  • $$F_x = \begin{pmatrix} 0 & 0 & -k_y & 0 & k_s \\ 0 & 0 & -k_y & 0 & k_s \\ 0 & 0 & -k_y & 0 & k_s \\ 0 & 0 & k_y & 0 & k_s \\ 0 & 0 & k_y & 0 & k_s \\ 0 & 0 & k_y & 0 & k_s \end{pmatrix} \times C \qquad F_z = \begin{pmatrix} k_p & k_r & 0 & k_h & 0 \\ 0 & k_r & 0 & k_h & 0 \\ -k_p & k_r & 0 & k_h & 0 \\ -k_p & -k_r & 0 & k_h & 0 \\ 0 & -k_r & 0 & k_h & 0 \\ k_p & -k_r & 0 & k_h & 0 \end{pmatrix} \times C$$
  • where $C = [C_p\ C_r\ C_y\ C_h\ C_s]^T$ and the constants $k_p$, $k_r$, $k_y$, $k_h$ and $k_s$ are used to scale down the input so that the absolute maximum value of the output is less than or equal to 1. The size of the column vectors $F_x$ and $F_z$ is equal to the number of legs 16 a,b,c,d on the device 10, which in this case is six (6). The legs 16 a,b,c,d are shown in FIG. 12A as identified by a number n ranging from 0 to 5, the legs 0 and 5 facing forward to provide an extended moment arm and a symmetric pitching moment with the legs 2 and 3, as well as to provide quick reverse surge.
  • The selected thrust angle θc,n and magnitude Tc,n for each leg n is thus computed by the leg controller 112 as:
  • $$\theta_{c,n} = \arctan\!\left(\frac{F_{z,n}}{F_{x,n}}\right) \qquad \text{and} \qquad T_{c,n} = \sqrt{F_{x,n}^2 + F_{z,n}^2}$$
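  • A minimal numerical sketch of this thrust-allocation step is shown below. It is illustrative only: the gain values are placeholders, and arctan2 is used in place of the plain arctangent so that the thrust angle falls in the correct quadrant.

```python
import numpy as np

# Scaling constants k_p, k_r, k_y, k_h, k_s: placeholder values chosen only so
# that the outputs stay within [-1, 1] for unit commands.
kp, kr, ky, kh, ks = 0.2, 0.2, 0.2, 0.2, 0.2

# Rows correspond to legs 0..5 (see FIG. 12A); columns to [Cp, Cr, Cy, Ch, Cs].
MX = np.array([[0, 0, -ky, 0, ks]] * 3 + [[0, 0, ky, 0, ks]] * 3)
MZ = np.array([
    [ kp,  kr, 0, kh, 0],
    [  0,  kr, 0, kh, 0],
    [-kp,  kr, 0, kh, 0],
    [-kp, -kr, 0, kh, 0],
    [  0, -kr, 0, kh, 0],
    [ kp, -kr, 0, kh, 0],
])

def allocate(C):
    """Map a command vector C = [Cp, Cr, Cy, Ch, Cs] to a desired thrust
    angle theta_c (rad) and magnitude T_c for each of the six legs."""
    C = np.asarray(C, dtype=float)
    Fx, Fz = MX @ C, MZ @ C
    theta_c = np.arctan2(Fz, Fx)   # quadrant-aware form of arctan(Fz/Fx)
    T_c = np.hypot(Fx, Fz)
    return theta_c, T_c

theta_c, T_c = allocate([0.0, 0.0, 0.5, 1.0, 0.0])   # combined yaw and heave request
print(np.degrees(theta_c).round(1), T_c.round(3))
```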
  • The most efficient way to generate thrust in a direction θr is through rapid oscillation (period of 250-500 ms) of the leg 16 a,b,c,d around that angle θr. The magnitude of the thrust is approximately proportional to the amplitude of the oscillation. The leg angle θf over time is thus:

  • $$\theta_f(t) = T_n \sin(\omega_f t + \varphi_f) + \theta_r$$
  • where ωf is the fixed frequency of oscillation and a phase shift φf is used between the legs. The leg angle θf and thrust Tn are shown in FIG. 12C for one of the legs 16 a,b,c,d.
  • To minimize delay in the execution of commands and parasitic forces generated upon orientation of the legs 16 a,b,c,d, the leg controller 112 limits the range of thrust angle for each leg to a region 116 (see FIG. 12B) of 180°. This reduces the average reorientation angle responsible for the parasitic forces at the cost of reduced maximum device thrust. For example, the front legs are not used when a forward thrust is commanded, thereby reducing the maximum possible forward thrust of the device 10. To further improve the reaction time of the device 10, the leg controller 112 uses the pressure drag forces D (see FIG. 12C) generated when the legs 16 a,b,c,d are reoriented. When the difference between the desired thrust angle and the current leg angle is greater than 45°, the leg 16 a,b,c,d is rotated at a rate that generates a pressure drag D consistent with the desired thrust Tc via a constant KPD. As the leg surface passes the 45° region, the oscillation amplitude is increased until it reaches its selected amplitude as given by the equation for θf(t) set out above. Using discrete time equations and letting θr be the ramped value of the computed thrust angle θc and magnitude Tc, the motion for each leg 16 a,b,c,d is set by the leg controller 112 in accordance with the two following equations:
  • $$\theta_r[t+1] = \begin{cases} \theta_r[t] & \text{if } \theta_c[t] \text{ is outside of the thrust range} \\ \mathrm{ramp}\!\left(K_{PD}\,T_c,\ \theta_r[t],\ \theta_c[t]\right) & \text{otherwise} \end{cases}$$
  • $$\theta_f[t+1] = \begin{cases} \theta_r[t] & \text{if } \left|\theta_c[t] - \theta_f[t]\right| > 45^\circ \\ T_c\,\dfrac{\left|\theta_r[t] - \theta_f[t]\right|}{45^\circ}\,\sin(\omega_f t + \varphi_f) + \theta_r[t] & \text{otherwise} \end{cases}$$
  • where the ramp(rate,a,b) function ramps value b toward a at a constant rate.
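  • The sketch below mirrors the reconstructed update equations for a single leg. The ±90° thrust range, the ramp implementation and the numeric constants are assumptions made for illustration rather than values taken from the description above.

```python
import math

def ramp(rate, target, value):
    """Move `value` toward `target` by at most `rate` per iteration."""
    return value + max(-rate, min(rate, target - value))

def hovering_step(theta_r, theta_f, theta_c, T_c, t,
                  omega_f=2 * math.pi * 3.0,   # ~3 Hz oscillation (250-500 ms period)
                  phi_f=0.0, K_PD=0.05,
                  thrust_range=(-math.pi / 2, math.pi / 2)):
    """One leg-controller iteration; returns the new (theta_r, theta_f)."""
    # theta_r[t+1]: ramp the oscillation centre toward the commanded angle at a
    # rate proportional to the commanded thrust, but only while the command lies
    # inside this leg's thrust range (assumed +/-90 deg here).
    if thrust_range[0] <= theta_c <= thrust_range[1]:
        theta_r_next = ramp(K_PD * T_c, theta_c, theta_r)
    else:
        theta_r_next = theta_r
    # theta_f[t+1]: far from the commanded angle, simply reorient the leg so that
    # pressure drag supplies the force; within 45 deg, oscillate about theta_r
    # with an amplitude that ramps in toward the selected amplitude.
    if abs(theta_c - theta_f) > math.radians(45):
        theta_f_next = theta_r
    else:
        amplitude = T_c * abs(theta_r - theta_f) / math.radians(45)
        theta_f_next = amplitude * math.sin(omega_f * t + phi_f) + theta_r
    return theta_r_next, theta_f_next

# Illustrative call: leg at rest, commanded to thrust toward 30 degrees.
print(hovering_step(theta_r=0.0, theta_f=0.0, theta_c=math.radians(30), T_c=0.4, t=0.0))
```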
  • Also, in order to improve the response to slowly changing commands, when the demanded thrust Tc reaches zero, the leg controller 112 gradually moves the legs 16 a,b,c,d back to a stand-by range 118 (see FIG. 12B) of 90°, such that the legs 16 a,b,c,d can always generate the proper thrust rapidly, the leg surface or its normal being no more than 45° away from any desired thrust direction within the thrust range.
  • The hovering gaits thus allow the device 10 to maintain position based on operator commands through the operator control unit 104, the visual control system 120 (described further below) and/or input from the inertial sensor 110.
  • Visual Control Systems
  • Referring back to FIG. 9, the device 10 also includes the visual control system 120, 120 a, 120 b allowing the device 10 to be controlled based on visual input, i.e. independently of input from the operator control unit 104. The visual control system 120, 120 a, 120 b includes the motion calculator 106, leg controller 112 and visual sensor 108 described above, as well as an image analyzing module 122 receiving data from the visual sensor 108. In a particular embodiment, the visual sensor 108 includes a digital camera which is part of the front camera assembly 90 and which interfaces with the image analyzing module 122. The image analyzing module 122 detects the presence of a target of a selected type, and sends data on at least one property of that target to the motion calculator 106, as will be described in more detail below. As described above, the motion calculator 106 computes the required device motion and communicates it to the leg controller 112, which controls the leg actuators 20 in accordance with the appropriate gaits.
  • The target to be recognized by the visual control system 120, 120 a, 120 b can be designed to be highly robust and easy to read by both a person and the device 10, and can for example be arranged in the form of a booklet and/or on the faces of a geometric object (such as for example a cube) so as to be easy to manipulate and transport.
  • Referring back to FIG. 2, in a particular embodiment, the device 10 includes two computers 124, one for the leg controller 112, and the other for the motion calculator 106 and image analyzing module 122. Both computers 124 are of the PC104/Plus form factor, due to the space restrictions inside the body 12. These two computers 124, along with additional port and interface circuit boards stacked on top of each other, connect via ISA and PCI buses.
  • The visual control system 120, 120 a, 120 b can be used with a robotic device having a propulsion system other than legs, such as for example thrusters, the leg controller 112 being replaced by an equivalent controller determining the required thrust of the propulsion system and actuating it accordingly.
  • Tracker-Based Visual Control System
  • FIG. 10 illustrates a particular embodiment of the visual control system 120 a. The image analyzing module 122 includes a visual tracking module 126, which receives an image from the visual sensor 108, determines the position of a given target and calculates an error in the target's position through comparison with a desired position signal from a desired position module 128, the desired position usually corresponding to the center of the image. The visual tracking module 126 preferably uses at least one of a color blob tracker algorithm, a histogram tracker algorithm and a mean-shift tracker algorithm, which are described below.
  • The color blob tracker is initialized with the target's color properties in the normalized RGB space, which is in effect an over-represented hue space where the effect of the lighting changes common underwater is minimal. This makes the tracking more effective in the underwater environment. The tracker scans the image converted to normalized RGB format, pixel-by-pixel, and the pixels falling within a given threshold of the color values of the target are turned on in the output image while the other pixels are turned off. To remove high-frequency (or shot/salt-and-pepper) noise, the median filtering algorithm is used over the segmented image with either 5-by-5 or 7-by-7 pixel grids, with typical threshold values of 30%-40%. The tracking algorithm detects the blob in the binary image in every frame, and calculates its centroid, which is taken as the new target location. The total mass of the blob is used as a confidence factor. The error signal is computed using the Euclidean distance in the image between the centroid of the blob and the center of the image frame. Two error signals are used for pitch and yaw, and both these signals are sent to the motion controller. A yellow target was found to work well with this type of tracker.
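  • The following sketch captures the essence of such a colour blob tracker. The reference colour, the 35% threshold and the use of scipy's median filter are illustrative choices and not the exact implementation described above.

```python
import numpy as np
from scipy.ndimage import median_filter

def track_blob(frame_rgb, target_rgb, threshold=0.35, median_size=5):
    """Segment pixels close to the target colour in normalized-RGB space and
    return (centroid_xy, blob_mass); returns (None, 0) if no blob survives."""
    frame = frame_rgb.astype(float)
    chroma = frame / (frame.sum(axis=2, keepdims=True) + 1e-6)   # normalized RGB
    target = np.asarray(target_rgb, dtype=float)
    target = target / target.sum()
    mask = np.abs(chroma - target).max(axis=2) < threshold       # within threshold of target colour
    mask = median_filter(mask.astype(np.uint8), size=median_size) > 0   # remove salt-and-pepper noise
    ys, xs = np.nonzero(mask)
    if xs.size == 0:
        return None, 0
    return (xs.mean(), ys.mean()), xs.size    # centroid = new target location, mass = confidence

def pitch_yaw_error(centroid_xy, frame_shape):
    """Signed image-plane errors relative to the frame centre (y -> pitch, x -> yaw)."""
    h, w = frame_shape[:2]
    return centroid_xy[1] - h / 2.0, centroid_xy[0] - w / 2.0
```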
  • The histogram tracker compares rectangular regions of the input frame with the target region by comparing their corresponding color histograms. A histogram of the target to be tracked is created and stored. Every image from the camera is divided into rectangular regions and their normalized histograms having a fixed number of bins over the hue space are calculated. Computationally, the color histogram is formed by discretizing the colors within an image and counting the number of pixels of each color. Depending on the target and the size of the image frame, different numbers of bins can be used, with preferred numbers being 32 or 64, and with the regions having either one-eighth or one-sixteenth the dimension of the image frame. The use of normalized histograms reduces the effect on color matching of brightness changes in the underwater environment. The histograms are one-dimensional vectors that combine the multi-hue channel data. Similarity between histograms is computed by known measures, for example the histogram intersection measure, the χ2 (Chi-squared) measure, the Bhattacharyya distance measure, or Jeffrey's Divergence. Since the histograms are normalized, the measures return values ranging from 0 to 1, higher values indicating a higher degree of similarity. The minimum similarity measure is preferably taken as 0.5; any measure below this threshold is not accepted as a valid target region. The center of the chosen window is taken as the new target location. As in the case of the color blob tracker, two error signals are used for pitch and yaw, and both these signals are sent to the motion controller. This type of tracker is suitable for tracking objects that have a variety of colors.
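  • A compact sketch of this histogram-based matching is given below. The hand-rolled hue conversion, the window stride and the choice of the Bhattacharyya coefficient are one possible realization; the 0.5 acceptance threshold follows the description above.

```python
import numpy as np

def hue_histogram(region_rgb, bins=32):
    """Normalized hue histogram of an RGB image region."""
    r, g, b = [region_rgb[..., i].astype(float) / 255.0 for i in range(3)]
    mx = np.maximum(np.maximum(r, g), b)
    mn = np.minimum(np.minimum(r, g), b)
    delta = mx - mn + 1e-6
    hue = np.where(mx == r, (g - b) / delta,
          np.where(mx == g, 2.0 + (b - r) / delta, 4.0 + (r - g) / delta))
    hue = (hue / 6.0) % 1.0                                   # hue mapped to [0, 1)
    hist, _ = np.histogram(hue, bins=bins, range=(0.0, 1.0))
    return hist / max(hist.sum(), 1)

def bhattacharyya(p, q):
    """Bhattacharyya coefficient between two normalized histograms (1 = identical)."""
    return float(np.sqrt(p * q).sum())

def best_window(frame_rgb, target_hist, win, step, min_similarity=0.5):
    """Slide a win x win window over the frame and return the centre of the most
    target-like region, or None if nothing exceeds the similarity threshold."""
    h, w, _ = frame_rgb.shape
    best, best_score = None, min_similarity
    for y in range(0, h - win + 1, step):
        for x in range(0, w - win + 1, step):
            score = bhattacharyya(hue_histogram(frame_rgb[y:y + win, x:x + win]), target_hist)
            if score > best_score:
                best, best_score = (x + win // 2, y + win // 2), score
    return best

# The target model would be built once from the reference target region:
# target_hist = hue_histogram(reference_region)
```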
  • For the mean-shift tracker, color histograms are also used as the underlying distribution. The histograms are three-dimensional arrays in this case, one for each of the three RGB channels, with preferably 16 bins per channel. The target histogram is computed in a square window of sides preferably equaling 100 pixels. The color model probability density function for the target is calculated by overlaying the sub-window with a kernel having the Epanechnikov profile. The weights for the mean-shift vector are calculated using the Epanechnikov kernel. The tracker is initialized with the last known location of the target and the target PDF model. In each successive tracking step, the candidate window having the same size as the target is created at the location of the last known target position, the candidate PDF model is calculated and the weights for each pixel are calculated, leading to a new candidate position. The mean-shift process preferably uses 10 iteration steps to choose a new target location. The Bhattacharyya distance between the candidate PDF model and the target PDF model is calculated to quantify the similarity between the target and the new candidate location. The location with the minimum Bhattacharyya distance is chosen as the new target location. As for the other trackers, two error signals (pitch and yaw) are sent to the motion controller. The mean-shift tracker is resistant to changes in lighting and to the appearance of duplicate objects in the frame, but necessitates substantially more computation than the preceding trackers.
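  • The sketch below shows a single, simplified mean-shift iteration in the spirit of the description above: a kernel-weighted RGB histogram model, per-pixel weights of the form sqrt(q/p), and a weighted-mean shift of the candidate window. Border handling and the iteration and stopping logic are omitted, and the implementation details are assumptions rather than the exact method described.

```python
import numpy as np

BINS = 16  # bins per RGB channel

def bin_indices(window):
    """Flat 3-D RGB histogram bin index of each pixel (BINS bins per channel)."""
    q = (window.astype(int) * BINS) // 256
    return q[..., 0] * BINS * BINS + q[..., 1] * BINS + q[..., 2]

def weighted_histogram(window):
    """Kernel-weighted, normalized colour histogram of a square window
    (Epanechnikov profile: weight decreases with distance from the centre)."""
    h, w, _ = window.shape
    ys, xs = np.mgrid[0:h, 0:w]
    r2 = ((ys - h / 2) / (h / 2)) ** 2 + ((xs - w / 2) / (w / 2)) ** 2
    kernel = np.clip(1.0 - r2, 0.0, None)
    hist = np.bincount(bin_indices(window).ravel(),
                       weights=kernel.ravel(), minlength=BINS ** 3)
    return hist / max(hist.sum(), 1e-12)

def mean_shift_step(frame, centre, size, target_hist):
    """One mean-shift iteration; returns the new (x, y) centre of the window."""
    x, y = centre
    half = size // 2
    window = frame[y - half:y + half, x - half:x + half]   # assumed fully inside the frame
    cand_hist = weighted_histogram(window)
    # Pixels whose colour is under-represented in the candidate window relative
    # to the target model pull the window toward them.
    ratio = np.sqrt(target_hist / np.maximum(cand_hist, 1e-12))
    w = ratio[bin_indices(window)]
    ys, xs = np.mgrid[0:window.shape[0], 0:window.shape[1]]
    total = max(w.sum(), 1e-12)
    new_x = x - half + (w * xs).sum() / total
    new_y = y - half + (w * ys).sum() / total
    return int(round(new_x)), int(round(new_y))

# The target model would be built once from the initial 100 x 100 target window:
# target_hist = weighted_histogram(frame[y0:y0 + 100, x0:x0 + 100])
```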
  • In all cases, the visual tracking module 126 is able to track the target at almost 15 frames/second, and therefore without filtering, the commands sent to the leg actuators 20 by the leg controller 112 would be changing at such a high rate that it would yield a highly unstable swimming behavior. As such the motion calculator 106 which receives the error signals from the visual tracking module 126 includes a pitch controller 130 and a yaw controller 132 which are both PID controllers, used to take these target locations and produce pitch and yaw commands at a rate to ensure stable behavior of the device 10. The roll axis is not affected by this particular type of visual control. Given the input from the visual tracking module 126 at any instant, and the previous tracker inputs, each of the pitch and yaw controllers 130, 132 generates commands based on the following control law:
  • $$\Delta = K_P\,\bar{\varepsilon}_t + K_I \int \bar{\varepsilon}_t\,dt + K_D\,\frac{d\bar{\varepsilon}_t}{dt}$$
  • where $\bar{\varepsilon}_t$ is the time-averaged error signal at time t and is defined recursively as:
  • $$\bar{\varepsilon}_t = \varepsilon_t + \gamma\,\bar{\varepsilon}_{t-1}$$
  • where $\varepsilon_t$ is the error signal at time t, $K_P$, $K_I$ and $K_D$ are respectively the proportional, integral and differential gains, and $\gamma$ is the error propagation constant.
  • The pitch and yaw controllers 130, 132 work identically as follows. Each controller 130, 132 includes a low-pass first-order infinite-impulse response or IIR filter (i.e. a digital filter blocking high frequency signals), smoothing out fast changing pitch and yaw commands by averaging them over a period of time. A time constant is defined for the low-pass filter for each controller 130, 132. The gains KP, KI and KD for each controller 130, 132 are input manually, with limits to truncate the gains. Each controller 130, 132 has a dead band limit applied to the error signal, i.e. a range of change in output for which the controller 130, 132 will not respond at all. This prevents the controller output from changing too frequently, by ignoring small changes in the error signal. A sleep time between each iteration in servoing is also introduced to reduce command overhead of the controllers 130, 132.
  • In a particular embodiment, the parameters for the controllers 130, 132 are as follows: a KP of 1.0 with corresponding limit of 1.0, a KI of 0.0 with corresponding limit of 0.3, and a KD of 0.0 with corresponding limit of 1.0 for both controllers 130, 132, a dead band of 0.2 for both controllers 130, 132, a time constant of 0.35 for the pitch controller 130 and 0.05 for the yaw controller 132, and a command limit of 1.0 for both controllers 130, 132.
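  • A self-contained sketch of one such axis controller is given below. The error propagation constant, the update rate and the reading of the dead band as an ignore-small-errors test are assumptions; the default gains, limits and time constants follow the values listed above, and a command limit stands in for the gain truncation.

```python
class AxisController:
    """PID controller for one axis (pitch or yaw) using a recursively
    time-averaged error, a dead band, output limiting and a first-order
    IIR low-pass filter on the command."""

    def __init__(self, kp=1.0, ki=0.0, kd=0.0, gamma=0.9,
                 dead_band=0.2, time_constant=0.35, command_limit=1.0, dt=1.0 / 15):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.gamma = gamma                       # error propagation constant (assumed value)
        self.dead_band = dead_band
        self.alpha = dt / (time_constant + dt)   # low-pass IIR coefficient
        self.command_limit = command_limit
        self.dt = dt
        self.avg_err = 0.0                       # time-averaged error
        self.integral = 0.0
        self.output = 0.0

    def update(self, error):
        if abs(error) < self.dead_band:          # dead band: ignore small errors
            return self.output
        prev = self.avg_err
        self.avg_err = error + self.gamma * self.avg_err   # recursive time-averaged error
        self.integral += self.avg_err * self.dt
        derivative = (self.avg_err - prev) / self.dt
        raw = self.kp * self.avg_err + self.ki * self.integral + self.kd * derivative
        raw = max(-self.command_limit, min(self.command_limit, raw))
        # Low-pass filtering keeps fast-changing commands from destabilizing swimming.
        self.output += self.alpha * (raw - self.output)
        return self.output

pitch_controller = AxisController(time_constant=0.35)
yaw_controller = AxisController(time_constant=0.05)
```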
  • The pitch and yaw controllers 130, 132 thus send required pitch and yaw of the device 10 to the leg controller 112, which as mentioned above computes a required thrust at each leg 16 a,b,c,d, determines a corresponding leg motion following the appropriate gaits, and controls the actuator 20 of each leg 16 a,b,c,d accordingly to obtain the required pitch and yaw.
  • In an alternate embodiment not shown, the visual tracking module 126 also compares the size of the target with a reference size, and sends a size error signal to the motion calculator 106, which computes a desired speed change for the device 10, sending corresponding motion data to the leg controller 112. As such the device 10 can remain within a given distance of the target by modifying its speed.
  • The visual control system 120 a thus allows the device to follow a moving target or, through use of the hovering gait, hold position relative to a stationary target.
  • Marker-Based Visual Control System
  • FIG. 11 illustrates another embodiment of the visual control system 120 b. The image analyzing module 122 includes a marker detection module 134, which is configured to detect a target of a particular type, for example an ARTag marker. ARTag markers include both symbolic and geometric content, and are constructed using an error-correcting code to enhance robustness. The marker detection module 134 sends the description of the marker, which in the case of an ARTag marker is a binary number corresponding to the black and white regions of the marker, to a marker identification module 136 which is part of the motion calculator 106. The motion calculator 106 also includes a marker library 138, which contains a list of the possible markers, each being associated with a particular command, for example turn right, change speed to 1 m/s, go to location X, film during a given period, switch to the hovering, swimming or walking gaits, etc. The marker identification module 136 thus accesses the marker library 138 to identify the marker and retrieve the associated command. If the command is one of motion of the device 10, the marker identification module 136 sends the required motion signal (pitch, yaw, roll, heave and/or surge) to the leg controller 112, which as mentioned above computes a required thrust at each leg 16 a,b,c,d, determines the motion of each leg 16 a,b,c,d based on the appropriate gaits and controls the actuator 20 of each leg 16 a,b,c,d accordingly to produce the required motion. In this embodiment a diver can thus communicate directly with the device 10 and give it a series of instructions simply by showing it different cards having the adequate markers illustrated thereon. The motion calculator 106 can also memorize a series of commands such that the diver can in fact program in advance a series of motions or tasks for the device 10.
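  • The sketch below illustrates the marker-library lookup in its simplest form. The marker identifiers, the command set and the leg-controller interface are all hypothetical placeholders used only to show the flow from a detected marker to a queued or executed command.

```python
from dataclasses import dataclass
from typing import Callable, List, Optional

@dataclass
class Command:
    name: str
    motion: Optional[dict] = None               # pitch/roll/yaw/heave/surge request, if any
    action: Optional[Callable[[], None]] = None

# Hypothetical marker library: marker IDs mapped to commands.
MARKER_LIBRARY = {
    17: Command("turn right", motion={"yaw": 0.5}),
    23: Command("surge at 1 m/s", motion={"surge": 1.0}),
    42: Command("switch to hovering gait", action=lambda: print("hover mode")),
}

queued: List[Command] = []   # a diver can queue several commands in advance

def send_to_leg_controller(motion: dict):
    print("desired motion:", motion)            # placeholder for the real interface

def dispatch(cmd: Command):
    if cmd.motion is not None:
        send_to_leg_controller(cmd.motion)
    if cmd.action is not None:
        cmd.action()

def on_marker_detected(marker_id: int, execute_now: bool = True):
    """Look up a detected marker and either execute or queue its command."""
    cmd = MARKER_LIBRARY.get(marker_id)
    if cmd is None:
        return                                  # unknown marker: ignore
    if execute_now:
        dispatch(cmd)
    else:
        queued.append(cmd)

on_marker_detected(17)
```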
  • The visual control systems 120 a,b of FIG. 10 and FIG. 11 can be used together, such that for example the device 10 follows a given target unless a marker is detected, at which point the device 10 stops following the target and obeys the commands dictated by the marker. As mentioned above, the visual control system 120, 120 a, 120 b can be used with a robotic device moved by a propulsion system other than legs, the leg controller 112 being replaced by an equivalent controller receiving the desired device motion signal, determining the corresponding required thrust of the propulsion system and actuating the propulsion system accordingly.
  • The use of visual sensing to control the device 10 makes use of a passive sensing medium which is thus both non-intrusive and energy efficient. These are both important considerations (in contrast to sonar, for example) in applications ranging from environmental assays to security surveillance. Alternative sensing media such as sonar also suffer from several deficiencies which make them difficult to use for tracking moving targets at close range in potentially turbulent water.
  • The visual control abilities of the device 10 allow it to follow a moving object, for instance a diver, and/or accept commands from the diver on presentation of cards carrying predetermined markers corresponding with tasks to be performed. A complete sequence of actions can be programmed into the device 10 using the predetermined markers. As such, a diver can communicate directly with the device 10 without the assistance of an operator located on the surface, and thus with or without a tether.
  • The device 10 can thus be operated in a semi-autonomous manner, with or without input from an operator on the surface through the operator control unit 104.
  • The device 10 of the present invention can be used in a wide range of applications. These include underwater search and rescue, coral health monitoring, monitoring of underwater establishments (e.g. oil pipelines, communication cables) and many more. Specifically, environmental assessment tasks in which visual measurements of a marine ecosystem must be taken on a regular basis can be performed by the device 10.
  • The device 10 can also be used in a variety of diver-assisting tasks, such as monitoring divers from a surface, providing lighting (for example while following a diver), providing communication between divers and the surface, carrying cargo and/or tools, carrying audio equipment or air reserves, etc.
  • In a particular embodiment, the device 10 includes an acoustic transducer and as such allows the diver to hear sounds transmitted from the surface, stored on the device 10 and/or synthesized by the device 10, as well as to send acoustic signals back to the surface by having them relayed by the device 10, while following the diver or another target and/or responding to commands given by the diver through the use of visual markers. The sounds could be, for example, music, instructional narrative and/or cautionary information. The sounds emitted by the device 10 can depend on various factors that can be sensed by the device 10, for example, on the depth or location of the device 10, the length of time the diver has been underwater, the water temperature, or other environmental parameters.
  • The visual tracking module 126 can be used to recognize given landmarks and as such allow the device 10 to return autonomously to its starting point once a given task is performed. The amphibious legs 16 b,c,d allow the device to start from and return to a location on dry land while performing a task (such as video surveillance) underwater.
  • The embodiments of the invention described above are intended to be exemplary. Those skilled in the art will therefore appreciate that the foregoing description is illustrative only, and that various alternate configurations and modifications can be devised without departing from the spirit of the present invention. Accordingly, the present invention is intended to embrace all such alternate configurations, modifications and variances which fall within the scope of the appended claims.

Claims (25)

1. A robotic device for navigating in at least a liquid medium, the robotic device comprising:
a legged propulsion system having a series of legs external of a body of the robotic device, each of the legs being independently driven and mounted to the body for pivotal movement about a respective transverse axis, each of the legs being operable to at least oscillate relative to the body about the respective transverse axis such that interaction between the legs and the liquid medium produces propulsive forces that displace the robotic device within the liquid medium; and
a control system operatively connected to the legged propulsion system for autonomous control and operation of the robotic device based on information received from at least one sensor providing data about an environment of the device, the control system using data from the at least one sensor to determine a desired motion of the robotic device and a corresponding required leg motion of each of the legs to produce the desired motion, and the control system autonomously actuating each of the legs of the legged propulsion system in accordance with the corresponding required leg motion.
2. The robotic device according to claim 1, wherein the desired motion includes a series of at least two consecutive steps, each step including one of movement in at least one of six degrees of freedom and station keeping.
3. The robotic device according to claim 1, wherein the at least one sensor includes a visual sensor retrieving an image of an environment of the device, the control system determining a presence of an object of a given type in the image, determining an identity of the object from a given list of possible objects of the given type, and determining the desired motion associated with the identity of the object in the list.
4. The robotic device according to claim 1, wherein the at least one sensor includes a visual sensor retrieving an image of an environment of the device, the control system determining a presence of an object of a given type in the image, determining a position of the object on the image and comparing the position to a desired position, and determining the desired motion of the device such as to change the position to correspond to the desired position.
5. The robotic device according to claim 1, wherein the control system includes a motion calculator having at least one angular controller which calculates a required angular displacement necessary to achieve said desired motion.
6. The robotic device according to claim 5, wherein the at least one angular controller includes a yaw controller calculating a required yaw of the device, a pitch controller calculating a required pitch of the device and a roll controller calculating a required roll of the device, the desired motion including at least one of the required yaw, the required pitch and the required roll.
7. The robotic device according to claim 1, wherein each of the legs is also operable to rotate about the respective transverse axis such that interaction between the legs and a solid medium allows the robotic device to move on the solid medium, thereby making the robotic device amphibious.
8. The robotic device according to claim 1, wherein each of the legs defines at least two members pivotally interconnected to relatively pivot about a pivot axis parallel to the respective transverse axis.
9. An amphibious robotic device comprising:
a legged propulsion system having a series of legs, each of said legs being driven by an actuator and mounted for pivotal movement about a respective transverse axis in one of at least a swimming mode and a walking mode, said legs being configured to pivotally oscillate relative to the transverse axis in said swimming mode when the device is in a liquid medium such that interaction between said legs and the liquid medium provides propulsive forces that displace the vehicle body within the liquid medium, said legs being configured to rotate relative to the transverse axis in said walking mode when the device is on a solid medium such that interaction between said legs and the solid medium provides propulsive forces that displace the vehicle body in a desired direction on the solid medium; and
a control system having at least one sensor operable to autonomously detect with which of the liquid medium and the solid medium the robotic device is interacting and a leg controller synchronously operating said legs in either one of the swimming mode and the walking mode based on the detected medium.
10. The amphibious robotic device according to claim 9, wherein each leg defines at least two members pivotally interconnected to relatively pivot about a pivot axis parallel to the respective transverse axis.
11. The amphibious robotic device according to claim 10, wherein each leg includes an elastic material extending through the members and providing resistance to a relative pivoting motion of the members about the pivot axis.
12. The amphibious robotic device according to claim 11, wherein the resistance to the relative pivoting motion of the members increases as the members pivot away from an aligned position.
13. The amphibious robotic device according to claim 9, wherein the control system is operatively connected to the legged propulsion system for autonomous control and operation of the robotic device based on information received from the at least one sensor, the control system using data from the at least one sensor to determine a desired motion of the robotic device and a corresponding required leg motion of each of the legs to produce the desired motion, and the control system autonomously actuating each of the legs of the legged propulsion system in accordance with the corresponding required leg motion.
14. The amphibious robotic device according to claim 13, wherein the desired motion includes a series of at least two consecutive steps, each step including one of movement in at least one of six degrees of freedom and station keeping.
15. The amphibious robotic device according to claim 13, wherein the control system includes a motion calculator having at least one angular controller which calculates a required angular displacement necessary to achieve said desired motion.
16. The amphibious robotic device according to claim 15, wherein the at least one angular controller includes a yaw controller calculating a required yaw of the device, a pitch controller calculating a required pitch of the device and a roll controller calculating a required roll of the device, the desired motion including at least one of the required yaw, the required pitch and the required roll.
17. A control system for autonomously maneuvering a robotic device in at least one of a liquid medium and a solid medium, the robotic device including a propulsion system having a series of individually controlled legs, the control system comprising:
at least one visual sensor retrieving an image of an environment of the device in the medium;
an image analyzing module receiving the image, determining a presence of an object of a given type therein and analyzing at least one property of the object;
a motion calculator determining a desired motion of the device based on the at least one property of the object; and
a controller operating the propulsion system of the device, the controller calculating a respective required leg motion of each of the legs to obtain the desired motion of the device and operating each of the legs based on the respective required leg motion calculated, such that the robotic device autonomously maneuvers in said medium.
18. The control system according to claim 17, wherein the desired motion of the device includes station keeping.
19. The control system according to claim 17, wherein the motion calculator is programmed upon reception of the at least one property to memorize a series of at least two consecutive steps, each step including one of movement in at least one of six degrees of freedom and station keeping, the desired motion successively corresponding to each of the consecutive steps.
20. The control system according to claim 17, wherein the image analyzing module determines an identity of the object from a given list of possible objects of the given type, the motion calculator determining the desired motion of the device from the list where a different desired motion is associated with each of at least some of the possible objects of the given type.
21. The control system according to claim 17, wherein the image analyzing module determines a position of the object on the image and compares the position to a desired position, and the motion calculator determining the desired motion of the device such as to change the position to correspond to the desired position.
22. The control system according to claim 21, wherein the object is moving, and the desired motion of the device allows the device to follow the object.
23. The control system according to claim 17, wherein the motion calculator includes at least one angular controller which calculates a required angular displacement necessary to achieve said desired motion.
24. The control system according to claim 23, wherein the at least one angular controller includes a yaw controller calculating a required yaw of the device, a pitch controller calculating a required pitch of the device and a roll controller calculating a required roll of the device, the desired motion including at least one of the required yaw, the required pitch and the required roll.
25. The control system according to claim 24, wherein the yaw, pitch and roll controllers are PID controllers.
US12/192,579 2006-08-02 2008-08-15 Amphibious robotic device Abandoned US20080300722A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US12/192,579 US20080300722A1 (en) 2006-08-02 2008-08-15 Amphibious robotic device

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US11/497,302 US7427220B2 (en) 2006-08-02 2006-08-02 Amphibious robotic device
US12/192,579 US20080300722A1 (en) 2006-08-02 2008-08-15 Amphibious robotic device

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US11/497,302 Continuation US7427220B2 (en) 2006-08-02 2006-08-02 Amphibious robotic device

Publications (1)

Publication Number Publication Date
US20080300722A1 true US20080300722A1 (en) 2008-12-04

Family

ID=39029765

Family Applications (2)

Application Number Title Priority Date Filing Date
US11/497,302 Active US7427220B2 (en) 2006-08-02 2006-08-02 Amphibious robotic device
US12/192,579 Abandoned US20080300722A1 (en) 2006-08-02 2008-08-15 Amphibious robotic device

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US11/497,302 Active US7427220B2 (en) 2006-08-02 2006-08-02 Amphibious robotic device

Country Status (1)

Country Link
US (2) US7427220B2 (en)


Families Citing this family (27)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8065046B2 (en) * 2007-09-17 2011-11-22 The United States Of America As Represented By The Secretary Of The Navy Olivo-cerebellar controller
CN102050162B (en) * 2009-11-09 2012-11-07 中国科学院沈阳自动化研究所 Amphibious robot with integrally-driven wheel paddle legs
CN102049983B (en) * 2009-11-09 2012-10-24 中国科学院沈阳自动化研究所 Footplate driving-type amphibious robot
CN102303491A (en) * 2011-06-16 2012-01-04 哈尔滨工程大学 Rotary foot type amphibious mine disaster search and rescue robot
WO2013089442A1 (en) * 2011-12-15 2013-06-20 한국해양연구원 Multi-joint underwater robot having complex movement functions of walking and swimming and underwater exploration system using same
CN102825988B (en) * 2012-06-15 2014-12-24 北京理工大学 Amphibious mobile robot platform
CN102785542B (en) * 2012-08-02 2014-12-10 中国科学技术大学 Amphibious robot with deformable foot-web compounded propulsion mechanism
US9580172B2 (en) * 2013-09-13 2017-02-28 Sandia Corporation Multiple environment unmanned vehicle
JP2017171137A (en) * 2016-03-24 2017-09-28 ソニー株式会社 Imaging part support device
CN105946483B (en) * 2016-05-06 2018-01-30 重庆大学 With the amphibious multi-foot robot for becoming cell type pedipulator
CN105882339B (en) * 2016-05-06 2018-06-29 重庆大学 Become cell type pedipulator
CN106628072B (en) * 2016-09-30 2019-06-11 哈尔滨工程大学 A kind of bionical deep-sea unmanned submersibles of state that navigate more
USD855549S1 (en) * 2017-01-04 2019-08-06 Powervision Tech Inc. Unmanned underwater vehicle
CN107571694B (en) * 2017-09-20 2019-07-12 重庆大学 Take turns web formula integration walking mechanism
CN108016522B (en) * 2017-12-14 2021-03-19 安庆师范大学 Expanded chain type bilateral rolling ditch crossing investigation robot
US11242162B2 (en) * 2018-03-27 2022-02-08 Massachusetts Institute Of Technology Methods and apparatus for in-situ measurements of atmospheric density
CN108820064B (en) * 2018-06-01 2019-10-25 重庆大学 Deformation leg-type mobile amphibious robot with fast junction apparatus
CN108859637B (en) * 2018-07-27 2023-11-14 北京理工大学 Spherical amphibious robot
CN109670435B (en) * 2018-12-13 2022-12-13 深圳市信义科技有限公司 Method for identifying identity based on human gait characteristics
CN109733136A (en) * 2019-01-14 2019-05-10 浙江理工大学 A kind of imitative die Schwimmhaut crawl stroke formula propulsion robot
CN110027692B (en) * 2019-05-14 2023-06-13 西南石油大学 Amphibious robot propelled by fluctuation fin
CN112327860B (en) * 2020-11-16 2023-12-12 西安应用光学研究所 Amphibious bionic robot self-adaptive motion control system
CN113296524B (en) * 2021-04-25 2022-11-29 哈尔滨工程大学 Thrust vector distribution optimization method for underwater bionic spherical/hemispherical robot
CN114802660B (en) * 2022-04-08 2024-01-12 中国科学院深圳先进技术研究院 Underwater robot
CN115071934A (en) * 2022-04-26 2022-09-20 哈尔滨工程大学 Novel underwater robot based on flapping wing propulsion
CN114889749B (en) * 2022-06-07 2023-07-07 浙江理工大学 Water wave adaptation method for actively adapting water wave to water operation robot
CN116165958A (en) * 2023-04-25 2023-05-26 舜泰汽车有限公司 Automatic driving system of amphibious special unmanned platform


Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5337434A (en) * 1993-04-12 1994-08-16 Aqua Products, Inc. Directional control means for robotic swimming pool cleaners
US6377013B2 (en) * 1999-12-24 2002-04-23 Honda Giken Kogyo Kabushiki Kaisha Control apparatus for legged mobile robot
JP2001277166A (en) * 2000-03-31 2001-10-09 Sony Corp Robot and behaivoir determining method therefor
JP3870257B2 (en) * 2002-05-02 2007-01-17 独立行政法人 宇宙航空研究開発機構 Robot with offset rotary joint

Patent Citations (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5377106A (en) * 1987-03-24 1994-12-27 Fraunhofer Gesellschaft Zur Foerderung Der Angewandten Forschung E.V. Process for navigating an unmanned vehicle and a vehicle for the same
US5220261A (en) * 1991-03-19 1993-06-15 Bodenseewerk Geratetechnik Gmbh Method of calibrating highly precise robots
US5400244A (en) * 1991-06-25 1995-03-21 Kabushiki Kaisha Toshiba Running control system for mobile robot provided with multiple sensor information integration system
US5521843A (en) * 1992-01-30 1996-05-28 Fujitsu Limited System for and method of recognizing and tracking target mark
US5831408A (en) * 1992-12-02 1998-11-03 Cybernet Systems Corporation Force feedback system
US6179683B1 (en) * 1993-02-10 2001-01-30 Nekton Technologies, Inc. Swimming aquatic creature simulator
US5911767A (en) * 1994-10-04 1999-06-15 Garibotto; Giovanni Navigation system for an autonomous mobile robot
US6058847A (en) * 1995-09-21 2000-05-09 Gec-Marconi Limited Submersible mine neutralisation vehicle
US6250585B1 (en) * 1997-09-05 2001-06-26 Nekton Technologies, Inc. Impellers with bladelike elements and compliant tuned transmission shafts and vehicles including same
US6089178A (en) * 1997-09-18 2000-07-18 Mitsubishi Heavy Industries, Ltd. Submersible vehicle having swinging wings
US6481513B2 (en) * 2000-03-16 2002-11-19 Mcgill University Single actuator per leg robotic hexapod
US6647853B2 (en) * 2000-11-02 2003-11-18 Christopher Daniel Dowling Hickey Seabed mine clearance
US6974356B2 (en) * 2003-05-19 2005-12-13 Nekton Research Llc Amphibious robot devices and related methods
US7007626B2 (en) * 2003-05-19 2006-03-07 Nekton Research Llc Amphibious robot devices

Cited By (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110154934A1 (en) * 2008-08-27 2011-06-30 ABB Research Ltd. Robot for harsh outdoor environment
US8627740B2 (en) * 2008-08-27 2014-01-14 Abb Research Ltd. Robot for harsh outdoor environment
US8060250B2 (en) * 2008-12-15 2011-11-15 GM Global Technology Operations LLC Joint-space impedance control for tendon-driven manipulators
US20100152898A1 (en) * 2008-12-15 2010-06-17 Gm Global Technology Operations, Inc. Joint-space impedance control for tendon-driven manipulators
US20130269585A1 (en) * 2010-12-22 2013-10-17 Samsung Heavy Ind. Co., Ltd. Underwater moving apparatus and moving method thereof
US9051036B2 (en) * 2010-12-22 2015-06-09 Samsung Heavy Ind. Co., Ltd. Underwater moving apparatus and moving method thereof
CN103587605A (en) * 2012-08-14 2014-02-19 中国科学院合肥物质科学研究院 Double-foot amphibious robot
CN104675468A (en) * 2013-11-28 2015-06-03 中国科学院沈阳自动化研究所 Air intake and exhaust device and method for ocean robot
US9731571B2 (en) 2014-02-20 2017-08-15 Ontario Drive And Gear Limited Vehicle drive unit and remotely controllable vehicle therewith
US10300753B2 (en) 2014-02-20 2019-05-28 Ontario Drive and Gear, Ltd. Vehicle drive unit and remotely controllable vehicle therewith
US9511639B2 (en) 2014-02-20 2016-12-06 Ontario Drive and Gear, Ltd. Vehicle drive unit and remotely controllable vehicle therewith
CN104503231A (en) * 2014-11-25 2015-04-08 北京理工大学 Swing-arm-driven motion control method for an amphibious frogboard robot
CN105059071A (en) * 2015-07-28 2015-11-18 中国科学技术大学 Amphibious composite propelling leg based on arc-shaped foot-web switching mechanism
CN108482509A (en) * 2018-01-23 2018-09-04 夏荧 Wheel leg travelling robot

Also Published As

Publication number Publication date
US20080032571A1 (en) 2008-02-07
US7427220B2 (en) 2008-09-23

Similar Documents

Publication Publication Date Title
US7427220B2 (en) Amphibious robotic device
Sinisterra et al. Stereovision-based target tracking system for USV operations
US9816812B2 (en) Systems and methods for automated vessel navigation using sea state prediction
Dudek et al. Aqua: An amphibious autonomous robot
Yu et al. Design and control of an embedded vision guided robotic fish with multiple control surfaces
Manderson et al. Vision-based autonomous underwater swimming in dense coral for combined collision avoidance and target selection
Negre et al. Robust vision‐based underwater homing using self‐similar landmarks
Kim et al. Artificial landmark-based underwater localization for AUVs using weighted template matching
CA2555148A1 (en) Amphibious robotic device
Yu Development of real-time acoustic image recognition system using by autonomous marine vehicle
Sattar et al. A visual servoing system for an aquatic swimming robot
Shi et al. Underwater formation system design and implement for small spherical robots
Chen et al. A novel unmanned surface vehicle with 2d-3d fused perception and obstacle avoidance module
Rodríguez-Teiles et al. Vision-based reactive autonomous navigation with obstacle avoidance: Towards a non-invasive and cautious exploration of marine habitat
Cortés-Pérez et al. A mirror-based active vision system for underwater robots: From the design to active object tracking application
Meng et al. Vision-based underwater target following control of an agile robotic manta with flexible pectoral fins
Maldonado-Ramírez et al. Ethologically inspired reactive exploration of coral reefs with collision avoidance: Bridging the gap between human and robot spatial understanding of unstructured environments
Salazar et al. Multi-robot visual control of autonomous soft robotic fish
Hu et al. Underwater target following with a vision-based autonomous robotic fish
Dudek et al. Sensor-based behavior control for an autonomous underwater vehicle
Sattar et al. Sensor-based behavior control for an autonomous underwater vehicle
Sun et al. Active visual tracking of free-swimming robotic fish based on automatic recognition
Kim et al. Development of Bioinspired Multimodal Underwater Robot “HERO-BLUE” for Walking, Swimming, and Crawling
Zhao et al. Development of vision-based autonomous robotic fish and its application in water-polo-attacking task
Garcia A proposal to estimate the motion of an underwater vehicle through visual mosaicking

Legal Events

Date Code Title Description
AS Assignment

Owner name: MCGILL UNIVERSITY, CANADA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:DUDEK, GREGORY;PRAHACS, CHRIS;SAUNDERSON, SHANE;AND OTHERS;REEL/FRAME:021653/0224

Effective date: 20060726

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION