US20170274529A1 - Detection of Movable Ground Areas of a Robot's Environment Using a Transducer Array - Google Patents
- Publication number
- US20170274529A1 (U.S. application Ser. No. 15/618,822)
- Authority
- US
- United States
- Prior art keywords
- robot
- ground surface
- transducer
- pressure wave
- depth
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1628—Programme controls characterised by the control loop
- B25J9/163—Programme controls characterised by the control loop learning, adaptive, model based, rule based expert control
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1674—Programme controls characterised by safety, monitoring, diagnostic
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J13/00—Controls for manipulators
- B25J13/08—Controls for manipulators by means of sensing devices, e.g. viewing or touching devices
- B25J13/088—Controls for manipulators by means of sensing devices, e.g. viewing or touching devices with position, velocity or acceleration sensors
- B25J13/089—Determining the position of the robot with reference to its environment
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J19/00—Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
- B25J19/02—Sensing devices
- B25J19/021—Optical sensing devices
- B25J19/023—Optical sensing devices including video camera means
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J19/00—Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
- B25J19/02—Sensing devices
- B25J19/026—Acoustical sensing devices
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J19/00—Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
- B25J19/02—Sensing devices
- B25J19/027—Electromagnetic sensing devices
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J5/00—Manipulators mounted on wheels or on carriages
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1656—Programme controls characterised by programming, planning systems for manipulators
- B25J9/1664—Programme controls characterised by programming, planning systems for manipulators characterised by motion, path, trajectory planning
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S15/00—Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems
- G01S15/88—Sonar systems specially adapted for specific applications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/02—Systems using the reflection of electromagnetic waves other than radio waves
- G01S17/50—Systems of measurement based on relative movement of target
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/88—Lidar systems specially adapted for specific applications
- G01S17/89—Lidar systems specially adapted for specific applications for mapping or imaging
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0212—Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory
- G05D1/0214—Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory in accordance with safety or protection criteria, e.g. avoiding hazardous areas
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0255—Control of position or course in two dimensions specially adapted to land vehicles using acoustic signals, e.g. ultra-sonic signals
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S15/00—Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems
- G01S15/02—Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems using reflection of acoustic waves
- G01S15/50—Systems of measurement, based on relative movement of the target
- G01S15/58—Velocity or trajectory determination systems; Sense-of-movement determination systems
- G01S15/60—Velocity or trajectory determination systems; Sense-of-movement determination systems wherein the transmitter and receiver are mounted on the moving object, e.g. for determining ground speed, drift angle, ground track
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y10—TECHNICAL SUBJECTS COVERED BY FORMER USPC
- Y10S—TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y10S901/00—Robots
- Y10S901/01—Mobile robot
Definitions
- Robotic devices may be used for applications involving material handling, transportation, welding, assembly, and dispensing, among others.
- The manner in which these robotic systems operate is becoming more intelligent, efficient, and intuitive.
- As robotic systems become increasingly prevalent in numerous aspects of modern life, it is desirable for them to operate efficiently. This demand for efficient robotic systems has helped open up a field of innovation in actuators, movement, sensing techniques, and component design and assembly.
- The present application discloses implementations that relate to detection of unstable surfaces within an environment.
- The present application describes a method.
- The method involves receiving, from a depth sensor coupled to a mobile robot, a first depth measurement between the depth sensor and a ground surface.
- The method also involves causing at least one transducer coupled to the mobile robot to emit a directional pressure wave toward the ground surface.
- The method further involves receiving, from the depth sensor, a second depth measurement between the depth sensor and the ground surface after emitting the directional pressure wave.
- The method involves identifying one or more differences between the first depth measurement and the second depth measurement indicating that the ground surface includes a movable element. Further, the method involves providing navigation instructions to the mobile robot based on the identified one or more differences between the first depth measurement and the second depth measurement.
- The present application describes a robot.
- The robot includes a depth sensor, at least one transducer, and a computing device.
- The depth sensor is configured to measure distances between the depth sensor and one or more surfaces in an environment.
- The environment includes a ground surface.
- The at least one transducer is configured to emit pressure waves.
- The computing device is configured to execute instructions that cause performance of a set of operations.
- The operations include obtaining, from the depth sensor, a first depth measurement between the depth sensor and the ground surface.
- The operations also include causing the at least one transducer to emit a directional pressure wave toward the ground surface.
- The operations further include obtaining, from the depth sensor, a second depth measurement between the depth sensor and the ground surface after causing the at least one transducer to emit the directional pressure wave.
- The operations include identifying one or more differences between the first depth measurement and the second depth measurement indicating that the ground surface includes a movable element. Further, the operations include providing navigation instructions to the robot based on the identified one or more differences between the first depth measurement and the second depth measurement.
- The present application describes a non-transitory computer-readable medium having instructions stored thereon that, upon execution by at least one processor, cause a mobile robot to perform a set of operations.
- The operations include obtaining a first depth map of a ground surface proximate the mobile robot.
- The operations also include causing an array of transducers to emit a plurality of pressure waves in accordance with a timing sequence.
- The plurality of pressure waves collectively form a directional pressure wave directed toward the ground surface.
- The operations further include obtaining a second depth map of the ground surface after the directional pressure wave reaches the ground surface.
- The operations include identifying one or more differences between a first area of the first depth map and a corresponding second area of the second depth map.
- The operations include providing navigation instructions to the mobile robot based on the identified one or more differences between the first depth map and the second depth map.
- The present application describes a system.
- The system includes a means for receiving, from a depth sensor coupled to a mobile robot, a first depth measurement between the depth sensor and a ground surface.
- The system also includes a means for causing at least one transducer coupled to the mobile robot to emit a directional pressure wave toward the ground surface.
- The system further includes a means for receiving, from the depth sensor, a second depth measurement between the depth sensor and the ground surface after emitting the directional pressure wave.
- The system includes a means for identifying one or more differences between the first depth measurement and the second depth measurement indicating that the ground surface includes a movable element.
- The system includes a means for providing navigation instructions to the mobile robot based on the identified one or more differences between the first depth measurement and the second depth measurement.
- FIG. 1 illustrates a configuration of a robotic system, according to an example embodiment.
- FIG. 2 illustrates a perspective view of a quadruped robot, according to an example embodiment.
- FIG. 3 illustrates a perspective view of a biped robot, according to an example embodiment.
- FIG. 4 illustrates a flowchart, according to an example embodiment.
- FIG. 5 is a conceptual illustration of an operation of a phased transducer array, according to an example embodiment.
- FIG. 6A illustrates an example operation of a robot at a first time, according to an example embodiment.
- FIG. 6B illustrates an example operation of a robot at a second time, according to an example embodiment.
- FIG. 7 illustrates a comparison between a pair of depth maps, according to an example embodiment.
- FIG. 8 illustrates an example computer-readable medium, according to an example embodiment.
- The present application discloses implementations that relate to detection of unstable surfaces within an environment.
- A mobile robot may encounter a variety of terrains. Some terrains—such as pavement, asphalt, cement, and wood—are stable and solid and may be confidently stepped on. However, other terrains may be unstable or include movable elements—such as carpet, grass, or other vegetation—which may cause the robot to slip or misstep. Thus, it may be desirable for the robot either to avoid stepping onto such unstable surfaces or to modify its stepping behavior in order to maintain stability.
- An example embodiment involves a robot with a depth sensor and a transducer.
- The depth sensor first measures a distance between the sensor and a particular location on the ground proximate the robot. Then, the transducer is operated to emit a pressure wave directed toward that particular location on the ground. The depth sensor then performs another distance measurement on that particular location on the ground.
- A control system of the robot may then determine whether there are any differences between the two depth measurements. If there is at least one identified difference between the two depth measurements, a movable feature may have been detected at that particular location. Upon detecting the movable feature, the control system may instruct the robot to avoid that particular location on the ground and/or otherwise modify the robot's stepping behavior.
- The robot may include a plurality of transducers arranged as an array, each of which can be operated independently. Once the particular location on the ground has been identified, selected, or determined, a control system of the robot may determine a phase timing with which to operate the transducer array. By operating the transducer array in this manner, relatively weak individual pressure waves emitted by each transducer may constructively interfere at a particular focal point to form a stronger, directed pressure wave. A known relationship between the arrangement of the transducers and the speed of the pressure wave in the air (or other medium) may enable a control system to calculate the phase timing necessary to direct the pressure wave at a desired focal point.
- The operating principle behind a phased transducer array is described in more detail with respect to FIG. 5.
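The phase-timing calculation described above can be sketched as follows. This is a hedged illustration rather than the patent's implementation: the array geometry, focal point, speed of sound, and all names below are assumptions introduced for the example.

```python
import math

SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 degrees C (assumed medium)

def focusing_delays(transducer_positions, focal_point):
    """Per-transducer firing delays (seconds) chosen so the individual
    pressure waves all arrive at focal_point at the same instant and
    constructively interfere there."""
    distances = [math.dist(p, focal_point) for p in transducer_positions]
    farthest = max(distances)
    # The element farthest from the focal point fires first (zero delay);
    # nearer elements wait long enough for the wavefronts to coincide.
    return [(farthest - d) / SPEED_OF_SOUND for d in distances]

# Example: a 4-element linear array spaced 2 cm apart along x,
# focused 0.5 m below the array's center.
positions = [(i * 0.02, 0.0, 0.0) for i in range(4)]
focal_point = (0.03, 0.0, -0.5)
delays = focusing_delays(positions, focal_point)
```

Because this example focal point lies on the array's axis of symmetry, the two outer elements share one delay and the two inner elements share another; an off-axis focal point would both focus and steer the combined wave.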
- The depth sensor may be configured to capture a depth map that includes a set of spatially-mapped distance measurements for an area near the robot.
- The depth sensor may capture two depth maps—one before the pressure wave is emitted, and another after the pressure wave is emitted.
- A control system of the robot may then compare the two depth maps and identify any differences between them. If the depth maps are different, the control system may determine that a movable feature has been detected and accordingly instruct the robot and/or modify the robot's stepping behavior.
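A rough sketch of that comparison, assuming the two depth maps are spatially aligned grids of distances in meters; the noise threshold and all names are illustrative values, not details from the source:

```python
NOISE_THRESHOLD = 0.005  # meters; assumed sensor-noise floor

def movable_mask(depth_before, depth_after, threshold=NOISE_THRESHOLD):
    """Flag each cell whose measured depth changed by more than the
    noise threshold between the pre-wave and post-wave depth maps."""
    return [
        [abs(b - a) > threshold for b, a in zip(row_b, row_a)]
        for row_b, row_a in zip(depth_before, depth_after)
    ]

def movable_feature_detected(mask):
    """True if any cell moved, i.e. the surface includes a movable element."""
    return any(any(row) for row in mask)

before = [[1.00, 1.00],
          [1.00, 1.00]]
after = [[1.00, 1.00],
         [1.02, 1.00]]  # one cell deflected about 2 cm by the pressure wave
mask = movable_mask(before, after)
```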
- A variety of navigation instructions may be provided to the robot.
- The robot's behavior and trajectory may remain unchanged.
- The robot's trajectory and/or planned stepping route may be modified in order to avoid stepping onto the ground surface containing the movable feature.
- The robot's stepping behavior may be modified to proceed cautiously onto the ground surface containing the movable feature.
- The robot may be instructed to lower its foot more slowly onto the ground surface until it is firmly planted on the stable portion of the ground surface beneath the movable feature.
- The robot's stepping pattern may become shorter, and/or the robot's velocity may be reduced.
- The information indicating that a movable feature has been detected may be provided to other planning or control systems of the robot and considered together with other information about the environment.
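One way the alternative responses above might be selected can be sketched as follows; the function name, parameters, and scaling values are assumptions made for illustration and do not appear in the source.

```python
def plan_next_step(movable_detected, can_avoid):
    """Choose a stepping behavior for the next footstep based on whether
    a movable feature was detected and whether it can be stepped around."""
    if not movable_detected:
        # Stable ground: behavior and trajectory remain unchanged.
        return {"route": "unchanged", "stride_scale": 1.0, "foot_lower_speed": 1.0}
    if can_avoid:
        # Re-plan the stepping route around the unstable area.
        return {"route": "avoid", "stride_scale": 1.0, "foot_lower_speed": 1.0}
    # Proceed cautiously: shorter strides, and lower the foot slowly until
    # it plants firmly on the stable surface beneath the movable feature.
    return {"route": "proceed_cautiously", "stride_scale": 0.5, "foot_lower_speed": 0.3}
```

In a full system this decision would be one input among many to the robot's planners, alongside other information about the environment.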
- FIG. 1 illustrates an example configuration of a robotic system that may be used in connection with the implementations described herein.
- The robotic system 100 may be configured to operate autonomously, semi-autonomously, and/or using directions provided by user(s).
- The robotic system 100 may be implemented in various forms, such as a biped robot, quadruped robot, or some other arrangement.
- The robotic system 100 may also be referred to as a robot, robotic device, or mobile robot, among other designations.
- The robotic system 100 may include processor(s) 102, data storage 104, and controller(s) 108, which together may be part of a control system 118.
- The robotic system 100 may also include sensor(s) 112, power source(s) 114, mechanical components 110, and electrical components 116. Nonetheless, the robotic system 100 is shown for illustrative purposes and may include more or fewer components.
- The various components of robotic system 100 may be connected in any manner, including wired or wireless connections. Further, in some examples, components of the robotic system 100 may be distributed among multiple physical entities rather than a single physical entity. Other example illustrations of robotic system 100 may exist as well.
- Processor(s) 102 may operate as one or more general-purpose hardware processors or special-purpose hardware processors (e.g., digital signal processors, application-specific integrated circuits, etc.).
- The processor(s) 102 may be configured to execute computer-readable program instructions 106 and manipulate data 107, both of which are stored in the data storage 104.
- The processor(s) 102 may also directly or indirectly interact with other components of the robotic system 100, such as sensor(s) 112, power source(s) 114, mechanical components 110, and/or electrical components 116.
- The data storage 104 may be one or more types of hardware memory.
- The data storage 104 may include or take the form of one or more computer-readable storage media that can be read or accessed by processor(s) 102.
- The one or more computer-readable storage media can include volatile and/or non-volatile storage components, such as optical, magnetic, organic, or another type of memory or storage, which can be integrated in whole or in part with processor(s) 102.
- The data storage 104 can be a single physical device.
- The data storage 104 can be implemented using two or more physical devices, which may communicate with one another via wired or wireless communication.
- The data storage 104 may include the computer-readable program instructions 106 and the data 107.
- The data 107 may be any type of data, such as configuration data, sensor data, and/or diagnostic data, among other possibilities.
- The controller 108 may include one or more electrical circuits, units of digital logic, computer chips, and/or microprocessors that are configured (perhaps among other tasks) to interface between any combination of the mechanical components 110, the sensor(s) 112, the power source(s) 114, the electrical components 116, the control system 118, and/or a user of the robotic system 100.
- The controller 108 may be a purpose-built embedded device for performing specific operations with one or more subsystems of the robotic device 100.
- The control system 118 may monitor and physically change the operating conditions of the robotic system 100. In doing so, the control system 118 may serve as a link between portions of the robotic system 100, such as between mechanical components 110 and/or electrical components 116. In some instances, the control system 118 may serve as an interface between the robotic system 100 and another computing device. Further, the control system 118 may serve as an interface between the robotic system 100 and a user. For instance, the control system 118 may include various components for communicating with the robotic system 100, including a joystick, buttons, and/or ports, etc. The example interfaces and communications noted above may be implemented via a wired or wireless connection, or both. The control system 118 may perform other operations for the robotic system 100 as well.
- The control system 118 may communicate with other systems of the robotic system 100 via wired or wireless connections, and may further be configured to communicate with one or more users of the robot.
- The control system 118 may receive an input (e.g., from a user or from another robot) indicating an instruction to perform a particular gait in a particular direction and at a particular speed.
- A gait is a pattern of movement of the limbs of an animal, robot, or other mechanical structure.
- The control system 118 may perform operations to cause the robotic device 100 to move according to the requested gait.
- A control system may receive an input indicating an instruction to move to a particular geographical location.
- The control system 118 (perhaps with the assistance of other components or systems) may determine a direction, speed, and/or gait based on the environment through which the robotic system 100 is moving en route to the geographical location.
- Operations of the control system 118 may be carried out by the processor(s) 102. Alternatively, these operations may be carried out by the controller 108, or a combination of the processor(s) 102 and the controller 108. In some implementations, the control system 118 may partially or wholly reside on a device other than the robotic system 100, and therefore may at least in part control the robotic system 100 remotely.
- Mechanical components 110 represent hardware of the robotic system 100 that may enable the robotic system 100 to perform physical operations.
- The robotic system 100 may include physical members such as leg(s), arm(s), and/or wheel(s).
- The physical members or other parts of robotic system 100 may further include actuators arranged to move the physical members in relation to one another.
- The robotic system 100 may also include one or more structured bodies for housing the control system 118 and/or other components, and may further include other types of mechanical components.
- The particular mechanical components 110 used in a given robot may vary based on the design of the robot, and may also be based on the operations and/or tasks the robot may be configured to perform.
- The mechanical components 110 may include one or more removable components.
- The robotic system 100 may be configured to add and/or remove such removable components, which may involve assistance from a user and/or another robot.
- The robotic system 100 may be configured with removable arms, hands, feet, and/or legs, so that these appendages can be replaced or changed as needed or desired.
- The robotic system 100 may include one or more removable and/or replaceable battery units or sensors. Other types of removable components may be included within some implementations.
- The robotic system 100 may include sensor(s) 112 arranged to sense aspects of the robotic system 100.
- The sensor(s) 112 may include one or more force sensors, torque sensors, velocity sensors, acceleration sensors, position sensors, proximity sensors, motion sensors, location sensors, load sensors, temperature sensors, touch sensors, depth sensors, ultrasonic range sensors, infrared sensors, object sensors, and/or cameras, among other possibilities.
- The robotic system 100 may be configured to receive sensor data from sensors that are physically separated from the robot (e.g., sensors that are positioned on other robots or located within the environment in which the robot is operating).
- The sensor(s) 112 may provide sensor data to the processor(s) 102 (perhaps by way of data 107) to allow for interaction of the robotic system 100 with its environment, as well as monitoring of the operation of the robotic system 100.
- The sensor data may be used in evaluation of various factors for activation, movement, and deactivation of mechanical components 110 and electrical components 116 by control system 118.
- The sensor(s) 112 may capture data corresponding to the terrain of the environment or the location of nearby objects, which may assist with environment recognition and navigation.
- Sensor(s) 112 may include RADAR (e.g., for long-range object detection, distance determination, and/or speed determination), LIDAR (e.g., for short-range object detection, distance determination, and/or speed determination), SONAR (e.g., for underwater object detection, distance determination, and/or speed determination), VICON® (e.g., for motion capture), one or more cameras (e.g., stereoscopic cameras for 3D vision), a global positioning system (GPS) transceiver, and/or other sensors for capturing information of the environment in which the robotic system 100 is operating.
- The sensor(s) 112 may monitor the environment in real time, and detect obstacles, elements of the terrain, weather conditions, temperature, and/or other aspects of the environment.
- The robotic system 100 may include sensor(s) 112 configured to receive information indicative of the state of the robotic system 100, including sensor(s) 112 that may monitor the state of the various components of the robotic system 100.
- The sensor(s) 112 may measure activity of systems of the robotic system 100 and receive information based on the operation of the various features of the robotic system 100, such as the operation of extendable legs, arms, or other mechanical and/or electrical features of the robotic system 100.
- The data provided by the sensor(s) 112 may enable the control system 118 to determine errors in operation as well as monitor overall operation of components of the robotic system 100.
- The robotic system 100 may use force sensors to measure load on various components of the robotic system 100.
- The robotic system 100 may include one or more force sensors on an arm or a leg to measure the load on the actuators that move one or more members of the arm or leg.
- The robotic system 100 may use one or more position sensors to sense the position of the actuators of the robotic system. For instance, such position sensors may sense states of extension, retraction, or rotation of the actuators on arms or legs.
- The sensor(s) 112 may include one or more velocity and/or acceleration sensors.
- The sensor(s) 112 may include an inertial measurement unit (IMU).
- The IMU may sense velocity and acceleration in the world frame, with respect to the gravity vector. The velocity and acceleration sensed by the IMU may then be translated to that of the robotic system 100 based on the location of the IMU in the robotic system 100 and the kinematics of the robotic system 100.
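That translation follows standard rigid-body kinematics. A minimal sketch, assuming the IMU is rigidly mounted at a known offset from the body origin; all names and example values are illustrative, not from the source:

```python
def cross(a, b):
    """Cross product of two 3-vectors given as tuples."""
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def body_velocity(v_imu, omega, r_body_to_imu):
    """Linear velocity of the body origin, given the IMU's measured
    linear velocity, the body's angular velocity, and the vector from
    the body origin to the IMU: v_body = v_imu - omega x r."""
    w_x_r = cross(omega, r_body_to_imu)
    return tuple(v - w for v, w in zip(v_imu, w_x_r))

# Example: the robot spins at 1 rad/s about z with the IMU mounted 1 m
# out along x; the IMU reads a tangential velocity even though the body
# origin itself is stationary.
v_body = body_velocity((0.0, 1.0, 0.0), (0.0, 0.0, 1.0), (1.0, 0.0, 0.0))
```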
- the robotic system 100 may include other types of sensors not explicitly discussed herein. Additionally or alternatively, the robotic system may use particular sensors for purposes not enumerated herein.
- the robotic system 100 may also include one or more power source(s) 114 configured to supply power to various components of the robotic system 100 .
- the robotic system 100 may include a hydraulic system, electrical system, batteries, and/or other types of power systems.
- the robotic system 100 may include one or more batteries configured to provide charge to components of the robotic system 100 .
- Some of the mechanical components 110 and/or electrical components 116 may each connect to a different power source, may be powered by the same power source, or be powered by multiple power sources.
- the robotic system 100 may include a hydraulic system configured to provide power to the mechanical components 110 using fluid power. Components of the robotic system 100 may operate based on hydraulic fluid being transmitted throughout the hydraulic system to various hydraulic motors and hydraulic cylinders, for example.
- the hydraulic system may transfer hydraulic power by way of pressurized hydraulic fluid through tubes, flexible hoses, or other links between components of the robotic system 100 .
- the power source(s) 114 may charge using various types of charging, such as wired connections to an outside power source, wireless charging, combustion, or other examples.
- the electrical components 116 may include various mechanisms capable of processing, transferring, and/or providing electrical charge or electric signals.
- the electrical components 116 may include electrical wires, circuitry, and/or wireless communication transmitters and receivers to enable operations of the robotic system 100 .
- the electrical components 116 may interwork with the mechanical components 110 to enable the robotic system 100 to perform various operations.
- the electrical components 116 may be configured to provide power from the power source(s) 114 to the various mechanical components 110 , for example.
- the robotic system 100 may include electric motors. Other examples of electrical components 116 may exist as well.
- the robotic system 100 may include a body, which may connect to or house appendages and components of the robotic system.
- the structure of the body may vary within examples and may further depend on particular operations that a given robot may have been designed to perform. For example, a robot developed to carry heavy loads may have a wide body that enables placement of the load. Similarly, a robot designed to reach high speeds may have a narrow, small body that does not have substantial weight.
- the body and/or the other components may be developed using various types of materials, such as metals or plastics.
- a robot may have a body with a different structure or made of various types of materials.
- the body and/or the other components may include or carry the sensor(s) 112 . These sensors may be positioned in various locations on the robotic device 100 , such as on the body and/or on one or more of the appendages, among other examples.
- the robotic device 100 may carry a load, such as a type of cargo that is to be transported.
- the load may also represent external batteries or other types of power sources (e.g., solar panels) that the robotic device 100 may utilize. Carrying the load represents one example use for which the robotic device 100 may be configured, but the robotic device 100 may be configured to perform other operations as well.
- the robotic system 100 may include various types of legs, arms, wheels, and so on.
- the robotic system 100 may be configured with zero or more legs.
- An implementation of the robotic system with zero legs may include wheels, treads, or some other form of locomotion.
- An implementation of the robotic system with two legs may be referred to as a biped, and an implementation with four legs may be referred to as a quadruped. Implementations with six or eight legs are also possible.
- biped and quadruped implementations of the robotic system 100 are described below.
- FIG. 2 illustrates a quadruped robot 200 , according to an example implementation.
- the robot 200 may be configured to perform some of the operations described herein.
- the robot 200 includes a control system, and legs 204 A, 204 B, 204 C, 204 D connected to a body 208 .
- Each leg may include a respective foot 206 A, 206 B, 206 C, 206 D that may contact a surface (e.g., a ground surface).
- the robot 200 is illustrated with sensor(s) 210 , and may be capable of carrying a load on the body 208 .
- the robot 200 may include more or fewer components, and thus may include components not shown in FIG. 2 .
- the robot 200 may be a physical representation of the robotic system 100 shown in FIG. 1 , or may be based on other configurations.
- the robot 200 may include one or more of mechanical components 110 , sensor(s) 112 , power source(s) 114 , electrical components 116 , and/or control system 118 , among other possible components or systems.
- the configuration, position, and/or structure of the legs 204 A- 204 D may vary in example implementations.
- the legs 204 A- 204 D enable the robot 200 to move relative to its environment, and may be configured to operate in multiple degrees of freedom to enable different techniques of travel.
- the legs 204 A- 204 D may enable the robot 200 to travel at various speeds according to the mechanics set forth within different gaits.
- the robot 200 may use one or more gaits to travel within an environment, which may involve selecting a gait based on speed, terrain, the need to maneuver, and/or energy efficiency.
- robots may use different gaits due to variations in design. Although some gaits may have specific names (e.g., walk, trot, run, bound, gallop, etc.), the distinctions between gaits may overlap. The gaits may be classified based on footfall patterns—the locations on a surface for the placement of the feet 206 A- 206 D. Similarly, gaits may also be classified based on ambulatory mechanics.
- the body 208 of the robot 200 connects to the legs 204 A- 204 D and may house various components of the robot 200 .
- the body 208 may include or carry sensor(s) 210 .
- These sensors may be any of the sensors discussed in the context of sensor(s) 112 , such as a camera, LIDAR, or an infrared sensor.
- the locations of sensor(s) 210 are not limited to those illustrated in FIG. 2 .
- sensor(s) 210 may be positioned in various locations on the robot 200 , such as on the body 208 and/or on one or more of the legs 204 A- 204 D, among other examples.
- FIG. 3 illustrates a biped robot 300 according to another example implementation. Similar to robot 200 , the robot 300 may correspond to the robotic system 100 shown in FIG. 1 , and may be configured to perform some of the implementations described herein. Thus, like the robot 200 , the robot 300 may include one or more of mechanical components 110 , sensor(s) 112 , power source(s) 114 , electrical components 116 , and/or control system 118 .
- the robot 300 may include legs 304 and 306 connected to a body 308 .
- Each leg may consist of one or more members connected by joints and configured to operate with various degrees of freedom with respect to one another.
- Each leg may also include a respective foot 310 and 312 , which may contact a surface (e.g., the ground surface).
- the legs 304 and 306 may enable the robot 300 to travel at various speeds according to the mechanics set forth within gaits.
- the robot 300 may utilize different gaits from that of the robot 200 , due at least in part to the differences between biped and quadruped capabilities.
- the robot 300 may also include arms 318 and 320 . These arms may facilitate object manipulation, load carrying, and/or balancing for the robot 300 . Like legs 304 and 306 , each arm may consist of one or more members connected by joints and configured to operate with various degrees of freedom with respect to one another. Each arm may also include a respective hand 322 and 324 . The robot 300 may use hands 322 and 324 for gripping, turning, pulling, and/or pushing objects. The hands 322 and 324 may include various types of appendages or attachments, such as fingers, grippers, welding tools, cutting tools, and so on.
- the robot 300 may also include sensor(s) 314 , corresponding to sensor(s) 112 , and configured to provide sensor data to its control system. In some cases, the locations of these sensors may be chosen in order to suggest an anthropomorphic structure of the robot 300 . Thus, as illustrated in FIG. 3 , the robot 300 may contain vision sensors (e.g., cameras, infrared sensors, object sensors, range sensors, etc.) within its head 316 .
- FIG. 4 illustrates a flowchart of an example method 400 for detecting a movable element on a ground surface, according to an example embodiment.
- Method 400 shown in FIG. 4 presents an embodiment of a method that could be used or implemented by the robot 200 of FIG. 2 and/or the robot 300 of FIG. 3 , for example, or more generally by one or more components of any computing device.
- Method 400 may include one or more operations, functions, or actions as illustrated by one or more blocks of 402 - 410 . Although the blocks are illustrated in a sequential order, these blocks may also be performed in parallel, and/or in a different order than those described herein. Also, the various blocks may be combined into fewer blocks, divided into additional blocks, and/or removed based upon the desired implementation.
- each block may represent a module, a segment, or a portion of program code, which includes one or more instructions executable by a processor or computing device for implementing specific logical operations or steps in the process.
- the program code may be stored on any type of computer-readable medium, for example, such as a storage device including a disk or hard drive.
- the computer-readable medium may include a non-transitory computer-readable medium, for example, such as computer-readable media that stores data for short periods of time like register memory, processor cache and/or random access memory (RAM).
- the computer-readable medium may also include non-transitory media, such as secondary or persistent long-term storage, like read-only memory (ROM), optical or magnetic disks, and compact-disc read-only memory (CD-ROM), for example.
- the computer-readable media may be considered a computer-readable storage medium, for example, or a tangible storage device.
- each block in FIG. 4 may represent circuitry that is wired to perform the specific logical operations in the process.
- operations of the method 400 may be performed by a control system, such as control system 118 of FIG. 1 .
- the operations of method 400 may be distributed across multiple control systems that are interfaced with the depth sensor and transducers in order to perform the specified actions.
- the method 400 involves receiving a first depth measurement between a depth sensor and a ground surface.
- the specific location of the ground surface may or may not be known before the first depth measurement is received.
- the robot may be configured to assess the ground surfaces proximate the robot, in which case the ground surface's location may be selected as a part of a sweep across a larger area of ground surfaces, for example.
- the ground surface's specific location may be selected because it is a step location that lies on the robot's planned trajectory.
- the first depth measurement of the ground surface may be recorded by the depth sensor and provided to the robot's control system.
- the first depth measurement is a single distance value indicative of a distance between the depth sensor and the ground surface (or a movable element present thereon).
- the first depth measurement includes a set of distance values that collectively form a depth map of the ground surface (and/or any movable elements present thereon).
- depth measurement refers to any kind of depth information that may be collected by the depth sensor, including single measurements and/or depth maps.
- the method 400 involves causing at least one transducer to emit a directional pressure wave toward the ground surface.
- a “pressure wave” may be any vibration within a medium (e.g., air, among other possible mediums).
- a pressure wave may possess one or more frequencies within the infrasound, acoustic, or ultrasonic range. Although the description herein may generally refer to pressure waves as ultrasound waves, a variety of frequencies may be utilized depending upon the particular implementation.
- the robot may utilize a monolithic transducer that is designed to emit a focused pressure wave.
- the monolithic transducer may be aimed toward a desired location on the ground in a number of ways.
- the transducer may be mounted on a body of the robot, may be turned, rotated, or otherwise moved in order to aim the transducer toward a desired location on the ground.
- the transducer may be coupled to an actuated mount, which can be controlled to physically orient the transducer to a desired position.
- the robot may utilize an array of transducers designed to produce a focused pressure wave by operating the transducers according to a specific phase timing.
- a control system of the robot may, based on a known three-dimensional position of a ground surface, generate a phase timing.
- when the transducer array is operated in accordance with the phase timing, the pressure waves emitted by each transducer in the array constructively interfere at or near the known three-dimensional position of the ground surface. In this manner, the transducer array may precisely “aim” the pressure wave at a desired location, without the need for any additional robotic movements or actuated mounts.
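As a rough sketch of how such a phase timing might be computed, the delays below cause every transducer's wave to arrive at the focal point at the same instant, producing constructive interference there. The speed of sound, function name, and array layout are illustrative assumptions, not values from the disclosure:

```python
import numpy as np

SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 degrees C (assumed)

def phase_timing(transducer_positions, focal_point, c=SPEED_OF_SOUND):
    """Compute per-transducer emission delays so that all pressure waves
    arrive at the focal point simultaneously.

    transducer_positions: (N, 3) array of transducer coordinates (m)
    focal_point:          (3,) target position on the ground surface (m)
    Returns delays in seconds; the farthest transducer fires at t = 0.
    """
    positions = np.asarray(transducer_positions, dtype=float)
    distances = np.linalg.norm(positions - np.asarray(focal_point), axis=1)
    # Transducers closer to the focal point wait longer before emitting,
    # so that every wave covers its distance by the same arrival time.
    return (distances.max() - distances) / c
```

Varying the focal point argument regenerates the timing, which corresponds to electronically re-aiming the array without moving the robot.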
- the pressure wave or waves produced by the at least one transducer are ultrasound waves. Unlike acoustic waves, some ultrasound waves may not be audible to humans, thereby allowing the robot to perform the techniques of the present application without producing a human-audible tone.
- the pressure wave may be directed or otherwise form at or near the ground surface.
- the pressure wave may then interact with any movable features present on the ground surface. This interaction may perturb the movable feature, which may change its position.
- many movable features may be present on the ground surface (e.g., many blades of grass), some of which may be moved while others are not.
- the method 400 involves receiving a second depth measurement between the depth sensor and the ground surface after emitting the directional pressure wave. If a movable feature is present on the ground surface and is perturbed through its interaction with the pressure wave, the second depth measurement may capture the depth of this feature in its new position.
- the point in time at which the second depth measurement is captured may be determined based on the first distance measurement.
- the first distance measurement may be indicative of a distance between the depth sensor and the ground surface.
- the control system may determine the length of time between the emission of the pressure wave and the point in time when the pressure wave reaches the ground surface.
- the depth sensor may capture the second depth measurement. Note that, in some cases, the distance between the at least one transducer and the ground surface may first be determined trigonometrically based on a known relative position of the at least one transducer and the depth sensor.
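A minimal sketch of this timing calculation, assuming the transducer is laterally offset from the depth sensor by a known baseline perpendicular to the depth axis (the names, geometry, and speed of sound here are illustrative assumptions):

```python
import math

SPEED_OF_SOUND = 343.0  # m/s (assumed ambient conditions)

def second_measurement_delay(depth_to_ground, sensor_to_transducer_offset,
                             c=SPEED_OF_SOUND):
    """Estimate when the pressure wave reaches the ground surface, so that
    the second depth measurement can be scheduled at or after that instant.

    depth_to_ground:             first depth measurement, sensor to ground (m)
    sensor_to_transducer_offset: baseline between the depth sensor and the
                                 transducer (m), assumed perpendicular to
                                 the depth axis
    """
    # Transducer-to-ground distance from the right triangle formed by the
    # depth axis and the sensor/transducer baseline (the trigonometric
    # determination mentioned in the text).
    transducer_to_ground = math.hypot(depth_to_ground,
                                      sensor_to_transducer_offset)
    return transducer_to_ground / c
```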
- two or more second depth measurements may be recorded at different points in time after the pressure wave has been emitted. Each of the two or more second depth measurements may be compared to the first depth measurement to determine whether any differences can be identified between them at block 408 .
- the method 400 involves identifying one or more differences between the first depth measurement and the second depth measurement indicating that the ground surface includes a movable element.
- identifying one or more differences between them may involve determining whether or not the two measurements are the same. If the distance recorded by the first depth measurement differs from the distance recorded by the second depth measurement, the control system may consider the ground surface to include a movable element.
- the two distances may be compared to determine whether they differ by at least a threshold amount.
- the depth sensor's measurements may be noisy, which might produce slight variations in the distance measurements.
- a threshold distance difference may be set, which must be exceeded before the ground surface is determined to include a movable element.
- the threshold may be predetermined based on a number of factors, such as an expected environment, a tolerance of the controller, and/or a desired behavior of the robot, among other possible factors.
- a difference in depth between the two distance measurements may not lead to a binary decision as to whether or not the ground surface contains a movable element. Rather, in some embodiments, the extent of difference between the two distance measurements may serve as a basis for informing the robot of the degree to which the ground surface is unstable. For example, a short-fiber carpet may only produce a small difference (if any) between the two distance measurements, whereas a wild grass may produce a large difference between the two distance measurements. In this example, the small difference may indicate to a control system that the ground surface is only slightly unstable, while the large difference may indicate to the control system that the ground surface is very unstable.
- identifying one or more differences between them may involve determining a number of differences between the two depth maps.
- the control system may determine that the ground surface contains a movable feature if a single difference is identified between the depth maps. In other instances, the control system may determine that the ground surface contains a movable feature if a threshold number of differences are identified between the depth maps. More complex comparisons also may be performed to quantitatively determine the extent of difference between two depth maps.
- a given comparison between corresponding depth measurements of the two depth maps may only count as an “identified difference” if that difference exceeds a threshold amount. The number of “identified differences” may then be counted, the sum of which may serve as a basis for determining whether or not the ground surface includes thereon a movable feature.
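The thresholded counting described above might be sketched as follows; the 2 cm threshold and minimum count are arbitrary illustrative values, not figures from the disclosure:

```python
import numpy as np

def count_identified_differences(depth_map_a, depth_map_b, threshold=0.02):
    """Count corresponding depth measurements between two depth maps that
    differ by more than a threshold (here 0.02 m, an assumed value)."""
    diff = np.abs(np.asarray(depth_map_a) - np.asarray(depth_map_b))
    return int(np.count_nonzero(diff > threshold))

def surface_has_movable_element(depth_map_a, depth_map_b,
                                threshold=0.02, min_count=5):
    """Declare a movable element present if enough measurements changed."""
    return count_identified_differences(depth_map_a, depth_map_b,
                                        threshold) >= min_count
```

The raw count (or the sum of difference magnitudes) could equally serve as the graded instability measure described above, rather than a binary decision.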
- the results of the comparison between the depth measurements or depth maps may serve as a basis for characterizing the ground surface.
- the characterization may be whether or not the ground surface has thereon a movable element or feature. Other characterizations may indicate a degree to which the ground surface is unstable.
- the method 400 involves providing navigation instructions to a mobile robot based on the identified one or more differences between the first depth measurement and the second depth measurement.
- the navigation instructions may cause the robot to alter its trajectory or planned step path. For example, the navigation instructions may cause the robot to avoid stepping onto a ground surface determined to include a movable element. As another example, the navigation instructions may cause the robot to step over the ground surface. In instances where the robot has a trajectory planning system or other computing system, the navigation instructions may be supplied to that system of the robot in order to inform it of the characterization determined at block 408 .
- the navigation instructions may modify the robot's stepping behavior, gait, speed, or other aspects of its movement. For example, the navigation instructions may cause the robot to step cautiously onto a ground surface determined to include a movable element. As another example, the navigation instructions may cause the robot to reduce its speed. As yet another example, the robot may alter its gait (e.g., from running to walking).
- the navigation instructions may not alter the robot's trajectory or behavior in any way.
- the robot's control system may determine that the ground surface only has a small degree of instability which does not necessitate a change to the robot's gait, speed, stepping behavior, or trajectory.
- the robot's control system may determine that the ground surface is stable.
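One hypothetical way a control system might map the measured depth difference onto the navigation outcomes described above is sketched below; the thresholds and behavior labels are invented for illustration and do not appear in the disclosure:

```python
def navigation_instruction(depth_difference,
                           stable_threshold=0.01, unstable_threshold=0.10):
    """Map the magnitude of the depth change (m) between the first and
    second measurements to a coarse behavior. Thresholds are assumed."""
    if depth_difference < stable_threshold:
        return "maintain"         # surface treated as stable; no change
    if depth_difference < unstable_threshold:
        return "step_cautiously"  # slightly unstable: slow the step or gait
    return "avoid"                # very unstable: re-plan the step path
```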
- FIG. 5 is a conceptual illustration 500 of an operation of a phased transducer array, according to an example embodiment.
- a robot may have coupled thereto a phased transducer array operated by a control system of the robot.
- the phased transducer array may be operated according to a phase timing in order to produce directional waves aimed toward a focal point.
- FIG. 5 illustrates a conceptual, two-dimensional illustration 500 of the operating principle behind a phased transducer array.
- a set of signals 510 are provided to a corresponding set of transducers 520 .
- the signals 510 have a specific phase timing as shown in illustration 500 .
- the top and bottom signals of signals 510 arrive at the respective top and bottom transducers of transducers 520 first. Then, after a delay, the signal that is second from the top and the signal that is second from the bottom arrives at their respective transducers. After yet another delay, the signal that is third from the top and the signal that is third from the bottom arrives at their respective transducers. Finally, after another delay, the middle signal arrives at the middle transducer.
- a pressure wave is emitted from that respective transducer.
- a set of pressure waves 530 are emitted that collectively form a wavefront 540 .
- the wavefront 540 focuses over time to focal point 550 .
- the focal point 550 represents a position at which the entire set of pressure waves 530 constructively interfere.
- the resulting pressure wave at the focal point 550 may have a greater amplitude than each individual pressure wave of the set of pressure waves 530 .
- the conceptual illustration 500 depicts one example of how phase timing can be used to generate a collective pressure wave at a particular location using a set of individual pressure waves.
- the phase timing can be varied in order to select a variety of focal points at various locations in space.
- although the conceptual illustration 500 shows a two-dimensional example of a phased transducer array, any two- or three-dimensional arrangement of transducers may be utilized to implement a phased transducer array.
- employing a phase timing for a three-dimensional arrangement of transducers may generate a set of pressure waves that constructively interfere at a three-dimensional point in space. It should be understood that, although some examples and figures described herein may refer to or illustrate two-dimensional examples, the techniques of the present application may be applied in three dimensions.
- FIG. 5 is provided to facilitate understanding of implementations of the present application that utilize a phased transducer array.
- FIG. 6A and FIG. 6B depict a scenario in which techniques of the present application may be used at two different points in time.
- the robot 610 may be similar to robot 200 of FIG. 2 , robot 300 of FIG. 3 , and/or may include any combination of components of robot 200 , robot 300 , a transducer or transducer array as described herein, and/or a depth sensor as described herein.
- a robot 610 encounters an environment containing grass 622 on ground surface 620 .
- the robot 610 utilizes techniques of the present application in order to detect the presence of movable elements—in this example, the grass 622 —on the ground surface 620 .
- a control system and/or computing device of the robot 610 may also carry out some or all of these operations.
- FIG. 6A illustrates an example operation of a robot at a first time 600 , according to an example embodiment.
- the robot 610 uses the depth sensor 616 to capture the depth measurement 630 of the grass 622 .
- the robot 610 also determines the height 640 of the grass 622 with respect to the depth sensor 616 based on a trigonometric relationship between the depth sensor 616 and the known position of the grass 622 .
- the robot 610 controls the transducer array 612 to emit a pressure wave 614 directed toward the grass 622 at a second time 650 .
- the pressure wave is directed toward the grass 622 using phase timing techniques as described herein, or using other kinds of beamforming techniques.
- the pressure wave 614 interacts with the grass 622 , perturbing it and causing the grass 622 to separate at the focal point of the pressure wave 614 .
- the robot 610 uses the depth sensor 616 to measure the depth measurement 632 of the grass 622 while or after it has interacted with the pressure wave 614 .
- the depth measurement 632 is larger than the depth measurement 630 .
- the robot 610 also determines the height 642 of the grass 622 after it has been perturbed, which may be similar to or the same as the height of the ground surface 620 .
- the robot 610 may then determine whether or not the location at which the depth sensor 616 was capturing depth measurements (in this example, the grass 622 ) includes a movable element. In one embodiment, the robot 610 compares the depth measurement 630 to the depth measurement 632 to determine if they are different. As previously described, the depth measurement comparison may simply determine whether the two depth measurements are the same, or determine whether the two depth measurements differ by a threshold amount. In another embodiment, the robot 610 may calculate a difference between the depth measurement 630 and the depth measurement 632 to determine the degree to which grass 622 is unstable.
- the depth sensor 616 gathers depth maps of the grass 622 and compares a depth map captured at the first time 600 to another depth map captured at the second time 650 .
- An example depth map comparison is illustrated in FIG. 7 and described in more detail below.
- the robot 610 may also use that determined height of the grass 622 as an input into a stepping controller or other control system. For example, upon detecting the movable feature of grass 622 , the robot 610 may modify its stepping behavior to step more slowly onto the grass 622 . The control system that implements this more cautious stepping behavior may move a foot of the robot quickly to some height at or above the determined height 640 and/or height 642 , and then proceed to slowly lower the foot until it touches down onto the ground surface 620 . In other examples, the robot 610 may implement other stepping controllers that utilize an estimated ground surface height (e.g., height 642 ) as a basis for controlling the robot's stepping behavior.
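The cautious stepping behavior described above (moving the foot quickly to a height above the detected grass, then slowly lowering it until touchdown on the ground surface) might be sketched as a sequence of target foot heights fed to a stepping controller. All parameters and the 2 cm clearance are illustrative assumptions:

```python
def cautious_step_heights(current_height, grass_top_height,
                          ground_height, fast_steps=3, slow_steps=10):
    """Generate a descending sequence of target foot heights: a fast phase
    down to just above the detected grass top, then a slow phase down to
    the estimated ground surface height (e.g., height 642)."""
    clearance = grass_top_height + 0.02  # hover 2 cm above the grass top
    # Fast phase: large height decrements toward the clearance height.
    fast = [current_height + (clearance - current_height) * (i + 1) / fast_steps
            for i in range(fast_steps)]
    # Slow phase: small decrements until touchdown at the ground height.
    slow = [clearance + (ground_height - clearance) * (i + 1) / slow_steps
            for i in range(slow_steps)]
    return fast + slow
```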
- the depth sensor 616 may be any device capable of measuring distances between the depth sensor 616 and surfaces within an environment.
- the depth sensor 616 may be a stereoscopic camera, an imaging device coupled with a structured light projector, a RADAR device, and/or a LIDAR device, among other possible depth sensors.
- the depth sensor 616 may be any other device utilizing stereo triangulation, structured or patterned light, time-of-flight, and/or interferometry techniques.
- a “movable element” may be any object, item, material, artifact, or other feature that may interact with, be perturbed by, or otherwise react to a pressure wave.
- the movable element may be partially fixed or secured onto the ground surface, in some instances. In other instances, the movable element may be on the ground surface, but is not affixed to it in any way, such as a loose object that landed onto the ground surface.
- although the first time 600 and the second time 650 depict particular instances in time of the scenario shown in FIG. 6A and FIG. 6B , the first time 600 and/or the second time 650 may represent multiple operations of the robot 610 that occur over some period of time. It should be understood that the “first time” and “second time” distinguish between two states of the environment and may not necessarily represent a single point in time.
- FIG. 7 illustrates comparison 700 between a pair of depth maps, according to an example embodiment.
- the depth sensor 616 of the robot 610 may capture depth map 710 at the first time 600 , and depth map 720 at the second time 650 in this example.
- Depth map 710 depicts blades of grass 622 as seen from the perspective of the robot 610 .
- the depth map 710 includes some lightly shaded blades of grass 622 which represent the blades of grass closest to the depth sensor 616 , some medium shaded blades of grass 622 which represent blades of grass behind the lightly shaded blades of grass 622 , and some darkly shaded blades of grass 622 behind the medium shaded blades of grass 622 .
- the comparison between the depth map 710 and the depth map 720 may be performed in a variety of ways.
- Each depth map may have a spatial resolution indicative of a discrete number of depth measurements included therein.
- Each individual depth measurement within depth map 710 may be compared against its corresponding depth measurement within depth map 720 for each spatial position.
- a control system may count the number of corresponding depth measurements between the depth map 710 and the depth map 720 that differ (by any amount, or by at least a threshold amount).
- a region of the depth map 710 and the depth map 720 may be compared. For example, as shown in FIG. 7 , the region 712 of the depth map 710 may be compared against the corresponding region 722 of the depth map 720 . In this example, the blades of grass 622 have moved about such that the region 712 and the region 722 no longer appear to be the same or similar, while other regions of the entire depth map 710 and depth map 720 appear unchanged between the two depth maps.
- a control system or computing device may be able to determine a number of regions (if any) between the two depth maps that differ.
- one different region may be indicative of the presence of a movable element.
- a threshold number of differing regions may indicate the presence of a movable element.
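A region-based comparison of this kind might be sketched as follows; the tile size and threshold are illustrative assumptions:

```python
import numpy as np

def differing_regions(map_a, map_b, region=(8, 8), threshold=0.02):
    """Split two equally sized depth maps into tiles and count the tiles
    whose mean absolute depth change exceeds a threshold. The tile size
    (8 x 8 measurements) and 0.02 m threshold are assumed values."""
    a, b = np.asarray(map_a), np.asarray(map_b)
    rows, cols = a.shape
    rh, cw = region
    count = 0
    for r in range(0, rows - rh + 1, rh):
        for c in range(0, cols - cw + 1, cw):
            tile_diff = np.abs(a[r:r + rh, c:c + cw] - b[r:r + rh, c:c + cw])
            if np.mean(tile_diff) > threshold:
                count += 1
    return count
```

The returned count could then be compared against one (a single differing region indicates a movable element) or against a larger threshold number of regions, as described above.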
- the present application discloses a robot, computing device, or control system identifying differences between the depth map 710 and the depth map 720 , and using those identified differences as a basis for maintaining the robot's current control behavior, for modifying the robot's behavior, speed, step path, or trajectory, and/or for informing a control system of either the presence of a movable element and/or of a degree of instability of a ground surface.
- each blade of grass 622 may be determined with higher granularity. Additionally, a given blade of grass 622 may have multiple depths associated with it, such as if a given blade of grass bends away from the perspective of robot 610 .
- the distinct shading levels and simplified depth representations of each blade of grass 622 are provided for the purposes of explanation and may not necessarily correspond to an actual depth map or scenario.
- FIG. 8 illustrates an example computer-readable medium configured according to at least some implementation described herein.
- the example system can include one or more processors, one or more forms of memory, one or more input devices/interfaces, one or more output devices/interfaces, and machine readable instructions that when executed by the one or more processors cause a robotic device to carry out the various operations, tasks, capabilities, etc., described above.
- FIG. 8 is a schematic illustrating a conceptual partial view of a computer program product that includes a computer program for executing a computer process on a computing device, arranged according to at least some implementations disclosed herein.
- the example computer program product 800 may include one or more program instructions 802 that, when executed by one or more processors, may provide functionality or portions of the functionality described above with respect to FIGS. 1-7 .
- the computer program product 800 may include a computer-readable medium 804 , such as, but not limited to, a hard disk drive, a Compact Disc (CD), a Digital Video Disk (DVD), a digital tape, memory, etc.
- the computer program product 800 may include a computer recordable medium 806 , such as, but not limited to, memory, read/write (R/W) CDs, R/W DVDs, etc.
- the one or more program instructions 802 can be, for example, computer executable and/or logic implemented instructions.
- a computing device is configured to provide various operations, or actions in response to the program instructions 802 conveyed to the computing device by the computer readable medium 804 and/or the computer recordable medium 806 .
- the computing device can be an external device in communication with a device coupled to the robotic device.
- the computer readable medium 804 can also be distributed among multiple data storage elements, which could be remotely located from each other.
- the computing device that executes some or all of the stored instructions could be an external computer, or a mobile computing platform, such as a smartphone, tablet device, personal computer, or a wearable device, among others.
- the computing device that executes some or all of the stored instructions could be a remotely located computer system, such as a server.
- the computer program product 800 can implement operations discussed in reference to FIGS. 4-7 .
Abstract
Description
- The present disclosure is a continuation of U.S. patent application Ser. No. 14/822,009, filed on Aug. 10, 2015, and entitled “Detection of Movable Ground Areas of a Robot's Environment Using a Transducer Array,” the entire contents of which are herein incorporated by reference as if fully set forth in this description.
- As technology advances, various types of robotic devices are being created for performing a variety of functions that may assist users. Robotic devices may be used for applications involving material handling, transportation, welding, assembly, and dispensing, among others. Over time, the manner in which these robotic systems operate is becoming more intelligent, efficient, and intuitive. As robotic systems become increasingly prevalent in numerous aspects of modern life, it is desirable for robotic systems to be efficient. Therefore, a demand for efficient robotic systems has helped open up a field of innovation in actuators, movement, sensing techniques, as well as component design and assembly.
- The present application discloses implementations that relate to detection of unstable surfaces within an environment. In one example, the present application describes a method. The method involves receiving, from a depth sensor coupled to a mobile robot, a first depth measurement between the depth sensor and a ground surface. The method also involves causing at least one transducer coupled to the mobile robot to emit a directional pressure wave toward the ground surface. The method further involves receiving, from the depth sensor coupled to the mobile robot, a second depth measurement between the depth sensor and the ground surface after emitting the directional pressure wave. Additionally, the method involves identifying one or more differences between the first depth measurement and the second depth measurement indicating that the ground surface includes a movable element. Further, the method involves providing navigation instructions to the mobile robot based on the identified one or more differences between the first depth measurement and the second depth measurement.
- In another example, the present application describes a robot. The robot includes a depth sensor, at least one transducer, and a computing device. The depth sensor is configured to measure distances between the depth sensor and one or more surfaces in an environment. The environment includes a ground surface. The at least one transducer is configured to emit pressure waves. The computing device is configured to execute instructions that cause performance of a set of operations. The operations include obtaining, from the depth sensor, a first depth measurement between the depth sensor and the ground surface. The operations also include causing the at least one transducer to emit a directional pressure wave toward the ground surface. The operations further include obtaining, from the depth sensor, a second depth measurement between the depth sensor and the ground surface after causing the at least one transducer to emit the directional pressure wave. Additionally, the operations include identifying one or more differences between the first depth measurement and the second depth measurement indicating that the ground surface includes a movable element. Further, the operations include providing navigation instructions to the robot based on the identified one or more differences between the first depth measurement and the second depth measurement.
- In still another example, the present application describes a non-transitory computer-readable medium having instructions stored thereon that, upon execution by at least one processor, causes a mobile robot to perform a set of operations. The operations include obtaining a first depth map of a ground surface proximate the mobile robot. The operations also include causing an array of transducers to emit a plurality of pressure waves in accordance with a timing sequence. The plurality of pressure waves collectively form a directional pressure wave directed toward the ground surface. The operations further include obtaining a second depth map of the ground surface after the directional pressure wave reaches the ground surface. Additionally, the operations include identifying one or more differences between a first area of the first depth map and a corresponding second area of the second depth map. Further, the operations include providing navigation instructions to the mobile robot based on the identified one or more differences between the first depth map and the second depth map.
- In yet another example, the present application describes a system. The system includes a means for receiving, from a depth sensor coupled to a mobile robot, a first depth measurement between the depth sensor and a ground surface. The system also includes a means for causing at least one transducer coupled to the mobile robot to emit a directional pressure wave toward the ground surface. The system further includes a means for receiving, from the depth sensor coupled to the mobile robot, a second depth measurement between the depth sensor and the ground surface after emitting the directional pressure wave. Additionally, the system includes a means for identifying one or more differences between the first depth measurement and the second depth measurement indicating that the ground surface includes a movable element. Further, the system includes a means for providing navigation instructions to the mobile robot based on the identified one or more differences between the first depth measurement and the second depth measurement.
- The foregoing summary is illustrative only and is not intended to be in any way limiting. In addition to the illustrative aspects, embodiments, and features described above, further aspects, embodiments, and features will become apparent by reference to the figures and the following detailed description and the accompanying drawings.
-
FIG. 1 illustrates a configuration of a robotic system, according to an example embodiment. -
FIG. 2 illustrates a perspective view of a quadruped robot, according to an example embodiment. -
FIG. 3 illustrates a perspective view of a biped robot, according to an example embodiment. -
FIG. 4 illustrates a flowchart, according to an example embodiment. -
FIG. 5 is a conceptual illustration of an operation of a phased transducer array, according to an example embodiment. -
FIG. 6A illustrates an example operation of a robot at a first time, according to an example embodiment. -
FIG. 6B illustrates an example operation of a robot at a second time, according to an example embodiment. -
FIG. 7 illustrates comparison between a pair of depth maps, according to an example embodiment. -
FIG. 8 illustrates an example computer-readable medium, according to an example embodiment. - The following detailed description describes various features and functions of the disclosed systems and methods with reference to the accompanying figures. The illustrative system and method embodiments described herein are not meant to be limiting. It may be readily understood that certain aspects of the disclosed systems and methods can be arranged and combined in a wide variety of different configurations, all of which are contemplated herein.
- The present application discloses implementations that relate to detection of unstable surfaces within an environment. A mobile robot may encounter a variety of terrains. Some terrains—such as pavement, asphalt, cement, and wood—are stable and solid and may be confidently stepped on. However, other terrains may be unstable or include movable elements—such as carpet, grass, or other vegetation—which may cause the robot to slip or misstep. Thus, it may be desirable for the robot to either avoid stepping onto such unstable surfaces, or to modify its stepping behavior in order to maintain stability.
- An example embodiment involves a robot with a depth sensor and a transducer. The depth sensor first measures a distance between the sensor and a particular location on the ground proximate the robot. Then, the transducer is operated to emit a pressure wave directed toward that particular location on the ground. The depth sensor then performs another distance measurement on that particular location on the ground. A control system of the robot may then determine whether there are any differences between the two depth measurements. If there is at least one identified difference between the two depth measurements, a movable feature may have been detected at that particular location. Upon detecting the movable feature, the control system may instruct the robot to avoid that particular location on the ground and/or otherwise modify the robot's stepping behavior.
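- The measure-emit-measure sequence of this example embodiment might be organized as in the following sketch. The sensor and transducer interfaces (`measure`, `emit_toward`), the fake stand-in classes, and the difference threshold are hypothetical names and values chosen for illustration, not part of this disclosure.

```python
def ground_is_movable(depth_sensor, transducer, location,
                      difference_threshold=0.005):
    """Measure depth at a ground location, emit a directional pressure wave
    toward it, measure again, and report whether the two readings differ
    enough (threshold in meters, an assumption) to suggest a movable element."""
    first = depth_sensor.measure(location)    # first depth measurement
    transducer.emit_toward(location)          # directional pressure wave
    second = depth_sensor.measure(location)   # second depth measurement
    return abs(second - first) > difference_threshold

# Minimal stand-ins so the sketch can be exercised without robot hardware.
class FakeDepthSensor:
    def __init__(self, readings):
        self._readings = iter(readings)
    def measure(self, location):
        return next(self._readings)

class FakeTransducer:
    def emit_toward(self, location):
        pass  # a real transducer would drive a pressure wave here
```

A detection would then feed into the control system's decision to avoid the location or to modify the stepping behavior, as described above.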
- In some embodiments, the robot may include a plurality of transducers arranged as an array, each of which can be operated independently. Once the particular location on the ground has been identified, selected, or determined, a control system of the robot may determine a phase timing with which to operate the transducer array. By operating the transducer array in this manner, relatively weak individual pressure waves emitted by each transducer may constructively interfere at a particular focal point to form a stronger and directed pressure wave. A known relationship between the arrangement of the transducers and the speed of the pressure wave in the air (or other medium) may enable a control system to calculate the phase timing necessary to direct the pressure wave at a desired focal point. The operating principle behind a phased transducer array is described in more detail with respect to
FIG. 5 . - In some embodiments, the depth sensor may be configured to capture a depth map that includes a set of spatially-mapped distance measurements for an area near the robot. When detecting movable elements on ground surfaces, the depth sensor may capture two depth maps—one before the pressure wave is emitted, and another after the pressure wave is emitted. A control system of the robot may then compare the two depth maps and identify any differences between the two depth maps. If the depth maps are different, the control system may determine that a movable feature has been detected and accordingly instruct the robot and/or modify the robot's stepping behavior.
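- Given the known arrangement of the transducers and the speed of the pressure wave in the medium, the phase timing for the array can be computed as in the sketch below. The two-dimensional element coordinates and the speed-of-sound constant are illustrative assumptions.

```python
import math

SPEED_OF_SOUND = 343.0  # m/s, pressure-wave speed in air at roughly 20 degrees C

def focal_delays(element_positions, focal_point, wave_speed=SPEED_OF_SOUND):
    """Per-element emission delays (in seconds) chosen so that the individual
    pressure waves arrive at the focal point at the same instant and
    constructively interfere there.  The element farthest from the focal
    point fires first (delay 0); nearer elements wait out the difference
    in path length divided by the wave speed."""
    distances = [math.dist(p, focal_point) for p in element_positions]
    farthest = max(distances)
    return [(farthest - d) / wave_speed for d in distances]
```

For a three-element linear array focused at a point straight ahead of the middle element, the two outer elements fire first and the middle element fires last, since its path to the focal point is shortest.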
- If a movable feature is detected on a ground surface on which the robot has been instructed to step on, a variety of navigation instructions may be provided to the robot. In some cases, the robot's behavior and trajectory may remain unchanged. In other cases, the robot's trajectory and/or planned stepping route may be modified in order to avoid stepping onto the ground surface containing the movable feature. Alternatively, the robot's stepping behavior may be modified to proceed cautiously onto the ground surface containing the movable feature. As one example, the robot may be instructed to more slowly lower its foot onto the ground surface until it is firmly planted onto the stable portion of the ground surface beneath the movable feature. The robot's stepping pattern may become shorter, and/or the robot's velocity may be reduced. In some instances, the information indicating that a movable feature has been detected may be provided to other planning or control systems of the robot and be considered with other information about the environment.
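- One way a control system might encode the cautious-stepping responses described above is sketched below; the parameter names, the halving factors, and the floor values are illustrative assumptions rather than values from this disclosure.

```python
def adjust_gait(gait, movable_detected, min_step_length=0.1, min_speed=0.2):
    """Return gait parameters, shortened and slowed (down to assumed floors)
    when a movable element was detected on the intended ground surface;
    otherwise return the gait unchanged."""
    if not movable_detected:
        return dict(gait)
    return {
        "step_length": max(gait["step_length"] * 0.5, min_step_length),  # shorter steps
        "speed": max(gait["speed"] * 0.5, min_speed),                    # reduced velocity
        "lower_foot_slowly": True,  # plant the foot gently until firm contact
    }
```

In practice such an adjustment would be one input among several to the robot's planning and control systems, alongside other information about the environment.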
-
FIG. 1 illustrates an example configuration of a robotic system that may be used in connection with the implementations described herein. The robotic system 100 may be configured to operate autonomously, semi-autonomously, and/or using directions provided by user(s). The robotic system 100 may be implemented in various forms, such as a biped robot, quadruped robot, or some other arrangement. Furthermore, the robotic system 100 may also be referred to as a robot, robotic device, or mobile robot, among other designations. - As shown in
FIG. 1 , the robotic system 100 may include processor(s) 102, data storage 104, and controller(s) 108, which together may be part of a control system 118. The robotic system 100 may also include sensor(s) 112, power source(s) 114, mechanical components 110, and electrical components 116. Nonetheless, the robotic system 100 is shown for illustrative purposes, and may include more or fewer components. The various components of robotic system 100 may be connected in any manner, including wired or wireless connections. Further, in some examples, components of the robotic system 100 may be distributed among multiple physical entities rather than a single physical entity. Other example illustrations of robotic system 100 may exist as well. - Processor(s) 102 may operate as one or more general-purpose hardware processors or special purpose hardware processors (e.g., digital signal processors, application specific integrated circuits, etc.). The processor(s) 102 may be configured to execute computer-
readable program instructions 106, and manipulate data 107, both of which are stored in the data storage 104. The processor(s) 102 may also directly or indirectly interact with other components of the robotic system 100, such as sensor(s) 112, power source(s) 114, mechanical components 110, and/or electrical components 116. - The
data storage 104 may be one or more types of hardware memory. For example, the data storage 104 may include or take the form of one or more computer-readable storage media that can be read or accessed by processor(s) 102. The one or more computer-readable storage media can include volatile and/or non-volatile storage components, such as optical, magnetic, organic, or another type of memory or storage, which can be integrated in whole or in part with processor(s) 102. In some implementations, the data storage 104 can be a single physical device. In other implementations, the data storage 104 can be implemented using two or more physical devices, which may communicate with one another via wired or wireless communication. As noted previously, the data storage 104 may include the computer-readable program instructions 106 and the data 107. The data 107 may be any type of data, such as configuration data, sensor data, and/or diagnostic data, among other possibilities. - The
controller 108 may include one or more electrical circuits, units of digital logic, computer chips, and/or microprocessors that are configured to interface (perhaps among other tasks) between any combination of the mechanical components 110, the sensor(s) 112, the power source(s) 114, the electrical components 116, the control system 118, and/or a user of the robotic system 100. In some implementations, the controller 108 may be a purpose-built embedded device for performing specific operations with one or more subsystems of the robotic device 100. - The
control system 118 may monitor and physically change the operating conditions of the robotic system 100. In doing so, the control system 118 may serve as a link between portions of the robotic system 100, such as between mechanical components 110 and/or electrical components 116. In some instances, the control system 118 may serve as an interface between the robotic system 100 and another computing device. Further, the control system 118 may serve as an interface between the robotic system 100 and a user. For instance, the control system 118 may include various components for communicating with the robotic system 100, including a joystick, buttons, and/or ports, etc. The example interfaces and communications noted above may be implemented via a wired or wireless connection, or both. The control system 118 may perform other operations for the robotic system 100 as well. - During operation, the
control system 118 may communicate with other systems of the robotic system 100 via wired or wireless connections, and may further be configured to communicate with one or more users of the robot. As one possible illustration, the control system 118 may receive an input (e.g., from a user or from another robot) indicating an instruction to perform a particular gait in a particular direction, and at a particular speed. A gait is a pattern of movement of the limbs of an animal, robot, or other mechanical structure. - Based on this input, the
control system 118 may perform operations to cause the robotic device 100 to move according to the requested gait. As another illustration, a control system may receive an input indicating an instruction to move to a particular geographical location. In response, the control system 118 (perhaps with the assistance of other components or systems) may determine a direction, speed, and/or gait based on the environment through which the robotic system 100 is moving en route to the geographical location. - Operations of the
control system 118 may be carried out by the processor(s) 102. Alternatively, these operations may be carried out by the controller 108, or a combination of the processor(s) 102 and the controller 108. In some implementations, the control system 118 may partially or wholly reside on a device other than the robotic system 100, and therefore may at least in part control the robotic system 100 remotely. -
Mechanical components 110 represent hardware of the robotic system 100 that may enable the robotic system 100 to perform physical operations. As a few examples, the robotic system 100 may include physical members such as leg(s), arm(s), and/or wheel(s). The physical members or other parts of robotic system 100 may further include actuators arranged to move the physical members in relation to one another. The robotic system 100 may also include one or more structured bodies for housing the control system 118 and/or other components, and may further include other types of mechanical components. The particular mechanical components 110 used in a given robot may vary based on the design of the robot, and may also be based on the operations and/or tasks the robot may be configured to perform. - In some examples, the
mechanical components 110 may include one or more removable components. The robotic system 100 may be configured to add and/or remove such removable components, which may involve assistance from a user and/or another robot. For example, the robotic system 100 may be configured with removable arms, hands, feet, and/or legs, so that these appendages can be replaced or changed as needed or desired. In some implementations, the robotic system 100 may include one or more removable and/or replaceable battery units or sensors. Other types of removable components may be included within some implementations. - The
robotic system 100 may include sensor(s) 112 arranged to sense aspects of the robotic system 100. The sensor(s) 112 may include one or more force sensors, torque sensors, velocity sensors, acceleration sensors, position sensors, proximity sensors, motion sensors, location sensors, load sensors, temperature sensors, touch sensors, depth sensors, ultrasonic range sensors, infrared sensors, object sensors, and/or cameras, among other possibilities. Within some examples, the robotic system 100 may be configured to receive sensor data from sensors that are physically separated from the robot (e.g., sensors that are positioned on other robots or located within the environment in which the robot is operating). - The sensor(s) 112 may provide sensor data to the processor(s) 102 (perhaps by way of data 107) to allow for interaction of the
robotic system 100 with its environment, as well as monitoring of the operation of the robotic system 100. The sensor data may be used in evaluation of various factors for activation, movement, and deactivation of mechanical components 110 and electrical components 116 by control system 118. For example, the sensor(s) 112 may capture data corresponding to the terrain of the environment or location of nearby objects, which may assist with environment recognition and navigation. In an example configuration, sensor(s) 112 may include RADAR (e.g., for long-range object detection, distance determination, and/or speed determination), LIDAR (e.g., for short-range object detection, distance determination, and/or speed determination), SONAR (e.g., for underwater object detection, distance determination, and/or speed determination), VICON® (e.g., for motion capture), one or more cameras (e.g., stereoscopic cameras for 3D vision), a global positioning system (GPS) transceiver, and/or other sensors for capturing information of the environment in which the robotic system 100 is operating. The sensor(s) 112 may monitor the environment in real time, and detect obstacles, elements of the terrain, weather conditions, temperature, and/or other aspects of the environment. - Further, the
robotic system 100 may include sensor(s) 112 configured to receive information indicative of the state of the robotic system 100, including sensor(s) 112 that may monitor the state of the various components of the robotic system 100. The sensor(s) 112 may measure activity of systems of the robotic system 100 and receive information based on the operation of the various features of the robotic system 100, such as the operation of extendable legs, arms, or other mechanical and/or electrical features of the robotic system 100. The data provided by the sensor(s) 112 may enable the control system 118 to determine errors in operation as well as monitor overall operation of components of the robotic system 100. - As an example, the
robotic system 100 may use force sensors to measure load on various components of the robotic system 100. In some implementations, the robotic system 100 may include one or more force sensors on an arm or a leg to measure the load on the actuators that move one or more members of the arm or leg. As another example, the robotic system 100 may use one or more position sensors to sense the position of the actuators of the robotic system. For instance, such position sensors may sense states of extension, retraction, or rotation of the actuators on arms or legs. - As another example, the sensor(s) 112 may include one or more velocity and/or acceleration sensors. For instance, the sensor(s) 112 may include an inertial measurement unit (IMU). The IMU may sense velocity and acceleration in the world frame, with respect to the gravity vector. The velocity and acceleration sensed by the IMU may then be translated to that of the
robotic system 100 based on the location of the IMU in the robotic system 100 and the kinematics of the robotic system 100. - The
robotic system 100 may include other types of sensors not explicitly discussed herein. Additionally or alternatively, the robotic system may use particular sensors for purposes not enumerated herein. - The
robotic system 100 may also include one or more power source(s) 114 configured to supply power to various components of the robotic system 100. Among other possible power systems, the robotic system 100 may include a hydraulic system, electrical system, batteries, and/or other types of power systems. As an example illustration, the robotic system 100 may include one or more batteries configured to provide charge to components of the robotic system 100. Some of the mechanical components 110 and/or electrical components 116 may each connect to a different power source, may be powered by the same power source, or be powered by multiple power sources. - Any type of power source may be used to power the
robotic system 100, such as electrical power or a gasoline engine. Additionally or alternatively, the robotic system 100 may include a hydraulic system configured to provide power to the mechanical components 110 using fluid power. Components of the robotic system 100 may operate based on hydraulic fluid being transmitted throughout the hydraulic system to various hydraulic motors and hydraulic cylinders, for example. The hydraulic system may transfer hydraulic power by way of pressurized hydraulic fluid through tubes, flexible hoses, or other links between components of the robotic system 100. The power source(s) 114 may charge using various types of charging, such as wired connections to an outside power source, wireless charging, combustion, or other examples. - The
electrical components 116 may include various mechanisms capable of processing, transferring, and/or providing electrical charge or electric signals. Among possible examples, the electrical components 116 may include electrical wires, circuitry, and/or wireless communication transmitters and receivers to enable operations of the robotic system 100. The electrical components 116 may interwork with the mechanical components 110 to enable the robotic system 100 to perform various operations. The electrical components 116 may be configured to provide power from the power source(s) 114 to the various mechanical components 110, for example. Further, the robotic system 100 may include electric motors. Other examples of electrical components 116 may exist as well. - Although not shown in
FIG. 1 , the robotic system 100 may include a body, which may connect to or house appendages and components of the robotic system. As such, the structure of the body may vary within examples and may further depend on particular operations that a given robot may have been designed to perform. For example, a robot developed to carry heavy loads may have a wide body that enables placement of the load. Similarly, a robot designed to reach high speeds may have a narrow, small body that does not have substantial weight. Further, the body and/or the other components may be developed using various types of materials, such as metals or plastics. Within other examples, a robot may have a body with a different structure or made of various types of materials. - The body and/or the other components may include or carry the sensor(s) 112. These sensors may be positioned in various locations on the
robotic device 100, such as on the body and/or on one or more of the appendages, among other examples. - On its body, the
robotic device 100 may carry a load, such as a type of cargo that is to be transported. The load may also represent external batteries or other types of power sources (e.g., solar panels) that the robotic device 100 may utilize. Carrying the load represents one example use for which the robotic device 100 may be configured, but the robotic device 100 may be configured to perform other operations as well. - As noted above, the
robotic system 100 may include various types of legs, arms, wheels, and so on. In general, the robotic system 100 may be configured with zero or more legs. An implementation of the robotic system with zero legs may include wheels, treads, or some other form of locomotion. An implementation of the robotic system with two legs may be referred to as a biped, and an implementation with four legs may be referred to as a quadruped. Implementations with six or eight legs are also possible. For purposes of illustration, biped and quadruped implementations of the robotic system 100 are described below. -
FIG. 2 illustrates a quadruped robot 200, according to an example implementation. Among other possible features, the robot 200 may be configured to perform some of the operations described herein. The robot 200 includes a control system and legs 204A-204D connected to a body 208. Each leg may include a respective foot 206A-206D that may contact a surface. Further, the robot 200 is illustrated with sensor(s) 210, and may be capable of carrying a load on the body 208. Within other examples, the robot 200 may include more or fewer components, and thus may include components not shown in FIG. 2 . - The
robot 200 may be a physical representation of the robotic system 100 shown in FIG. 1 , or may be based on other configurations. Thus, the robot 200 may include one or more of mechanical components 110, sensor(s) 112, power source(s) 114, electrical components 116, and/or control system 118, among other possible components or systems. - The configuration, position, and/or structure of the
legs 204A-204D may vary in example implementations. The legs 204A-204D enable the robot 200 to move relative to its environment, and may be configured to operate in multiple degrees of freedom to enable different techniques of travel. In particular, the legs 204A-204D may enable the robot 200 to travel at various speeds according to the mechanics set forth within different gaits. The robot 200 may use one or more gaits to travel within an environment, which may involve selecting a gait based on speed, terrain, the need to maneuver, and/or energy efficiency. - Further, different types of robots may use different gaits due to variations in design. Although some gaits may have specific names (e.g., walk, trot, run, bound, gallop, etc.), the distinctions between gaits may overlap. The gaits may be classified based on footfall patterns—the locations on a surface for the placement of the
feet 206A-206D. Similarly, gaits may also be classified based on ambulatory mechanics. - The
body 208 of the robot 200 connects to the legs 204A-204D and may house various components of the robot 200. For example, the body 208 may include or carry sensor(s) 210. These sensors may be any of the sensors discussed in the context of sensor(s) 112, such as a camera, LIDAR, or an infrared sensor. Further, the locations of sensor(s) 210 are not limited to those illustrated in FIG. 2 . Thus, sensor(s) 210 may be positioned in various locations on the robot 200, such as on the body 208 and/or on one or more of the legs 204A-204D, among other examples. -
FIG. 3 illustrates a biped robot 300 according to another example implementation. Similar to robot 200, the robot 300 may correspond to the robotic system 100 shown in FIG. 1 , and may be configured to perform some of the implementations described herein. Thus, like the robot 200, the robot 300 may include one or more of mechanical components 110, sensor(s) 112, power source(s) 114, electrical components 116, and/or control system 118. - For example, the
robot 300 may include legs connected to a body 308. Each leg may consist of one or more members connected by joints and configured to operate with various degrees of freedom with respect to one another. Each leg may also include a respective foot, which may contact the ground surface. Like the robot 200, the legs may enable the robot 300 to travel at various speeds according to the mechanics set forth within gaits. The robot 300, however, may utilize different gaits from those of the robot 200, due at least in part to the differences between biped and quadruped capabilities. - The
robot 300 may also include arms, which may enable the robot 300 to manipulate objects within its environment. Like the legs, each arm may consist of one or more members connected by joints, and each arm may include a respective hand. The robot 300 may use its hands for gripping, carrying, or otherwise interacting with objects, and the hands may also assist with other operations of the robot 300. - The
robot 300 may also include sensor(s) 314, corresponding to sensor(s) 112, and configured to provide sensor data to its control system. In some cases, the locations of these sensors may be chosen in order to suggest an anthropomorphic structure of the robot 300. Thus, as illustrated in FIG. 3, the robot 300 may contain vision sensors (e.g., cameras, infrared sensors, object sensors, range sensors, etc.) within its head 316. -
FIG. 4 illustrates a flowchart of an example method 400 for detecting a movable element on a ground surface, according to an example embodiment. Method 400 shown in FIG. 4 presents an embodiment of a method that could be used or implemented by the robot 200 of FIG. 2 and/or the robot 300 of FIG. 3, for example, or more generally by one or more components of any computing device. Method 400 may include one or more operations, functions, or actions as illustrated by one or more of blocks 402-410. Although the blocks are illustrated in a sequential order, these blocks may also be performed in parallel, and/or in a different order than those described herein. Also, the various blocks may be combined into fewer blocks, divided into additional blocks, and/or removed based upon the desired implementation. - In addition, for the
method 400 and other processes and methods disclosed herein, the block diagram shows the functionality and operation of one possible implementation of the present embodiments. In this regard, each block may represent a module, a segment, or a portion of program code, which includes one or more instructions executable by a processor or computing device for implementing specific logical operations or steps in the process. The program code may be stored on any type of computer-readable medium, for example, such as a storage device including a disk or hard drive. The computer-readable medium may include a non-transitory computer-readable medium, for example, such as computer-readable media that stores data for short periods of time like register memory, processor cache, and/or random access memory (RAM). The computer-readable medium may also include non-transitory media, such as secondary or persistent long-term storage, like read-only memory (ROM), optical or magnetic disks, and compact-disc read-only memory (CD-ROM), for example. The computer-readable media may be considered a computer-readable storage medium, for example, or a tangible storage device. - In addition, for the
method 400 and other processes, methods, and operations described herein, each block inFIG. 4 may represent circuitry that is wired to perform the specific logical operations in the process. - In one embodiment, operations of the
method 400 may be performed by a control system, such as control system 118 of FIG. 1. In other embodiments, the operations of method 400 may be distributed across multiple control systems that are interfaced with the depth sensor and transducers in order to perform the specified actions. - At
block 402, themethod 400 involves receiving a first depth measurement between a depth sensor and a ground surface. The specific location of the ground surface may or may not be known before the first depth measurement is received. In some situations, the robot may be configured to assess the ground surfaces proximate the robot, in which case the ground surface's location may be selected as a part of a sweep across a larger area of ground surfaces, for example. In other situations, the ground surface's specific location may be selected because it is a step location that lies on the robot's planned trajectory. Regardless of the particular circumstance, the first depth measurement of the ground surface may be recorded by the depth sensor and provided to the robot's control system. - In some implementations, the first depth measurement is a single distance value indicative of a distance between the depth sensor and the ground surface (or a movable element present thereon). In other implementations, the first depth measurement includes a set of distance values that collectively form a depth map of the ground surface (and/or any movable elements present thereon). For the purposes of this application, “depth measurement” refers to any kind of depth information that may be collected by the depth sensor, including single measurements and/or depth maps.
- At
block 404, themethod 400 involves causing at least one transducer to emit a directional pressure wave toward the ground surface. For the purposes of this application, a “pressure wave” may be any vibration within a medium (e.g., air, among other possible mediums). A pressure wave may possess one or more frequencies within the infrasound, acoustic, or ultrasonic range. Although the description herein may generally refer to pressure waves as ultrasound waves, a variety of frequencies may be utilized depending upon the particular implementation. - In some implementations the robot may utilize a monolithic transducer that is designed to emit a focused pressure wave. The monolithic transducer may be aimed toward a desired location on the ground in a number of ways. For example, the transducer may be mounted on a body of the robot, may be turned, rotated, or otherwise moved in order to aim the transducer toward a desired location on the ground. In another example, the transducer may be coupled to an actuated mount, which can be controlled to physically orient the transducer to a desired position.
- In other implementations, the robot may utilize an array of transducers designed to produce a focused pressure wave by operating the transducers according to a specific phase timing. In one example, a control system of the robot may, based on a known three-dimensional position of a ground surface, generate a phase timing. When the transducer array is operated in accordance with the phase timing, the pressure waves emitted by each transducer in the array constructively interfere at or near the known three-dimensional position of the ground surface. In this manner, the transducer array may precisely “aim” the pressure wave at a desired location, without the need for any additional robotic movements or actuated mounts.
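The phase-timing computation described above can be sketched as follows. This is a minimal illustration, not the patent's implementation; the array geometry, focal point, and speed of sound are assumed example values.

```python
import math

SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 degrees C (assumed medium)

def phase_delays(transducer_positions, focal_point):
    """Compute per-transducer emission delays (seconds) so that the
    individual pressure waves constructively interfere at focal_point.
    Elements farther from the focus fire earlier; the element with the
    longest path to the focus gets zero delay."""
    distances = [math.dist(p, focal_point) for p in transducer_positions]
    longest = max(distances)
    # Delay each element by its travel-time advantage over the farthest one.
    return [(longest - d) / SPEED_OF_SOUND for d in distances]

# Hypothetical five-element linear array along the y-axis,
# focused 1 m ahead and 0.5 m below the array plane.
array = [(0.0, 0.1 * i, 0.0) for i in range(5)]
focus = (1.0, 0.2, -0.5)
delays = phase_delays(array, focus)
```

With this symmetric layout the outermost elements fire first (zero delay) and the middle element, being closest to the focus, fires last, matching the firing order described for FIG. 5.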
- In some implementations, the pressure wave or waves produced by the at least one transducer are ultrasound waves. Unlike acoustic waves, some ultrasound waves may not be audible to humans, thereby allowing the robot to perform the techniques of the present application without producing a human-audible tone.
- Regardless of the particular implementation, the pressure wave may be directed toward, or otherwise formed at or near, the ground surface. The pressure wave may then interact with any movable features present on the ground surface. This interaction may perturb the movable feature, which may change its position. In some scenarios, many movable features may be present on the ground surface (e.g., many blades of grass), some of which are moved while others are not.
- At
block 406, the method 400 involves receiving a second depth measurement between the depth sensor and the ground surface after emitting the directional pressure wave. If a movable feature is present on the ground surface and is perturbed through its interaction with the pressure wave, the second depth measurement may capture the depth of this feature in its new position. - In some embodiments, the point in time at which the second depth measurement is captured may be determined based on the first depth measurement. The first depth measurement may be indicative of a distance between the depth sensor and the ground surface. Based on a known speed of the pressure wave and the distance to the ground surface, the control system may determine the length of time between the emission of the pressure wave and the point in time when the pressure wave reaches the ground surface. At or near the time when the pressure wave is expected to reach the ground surface, the depth sensor may capture the second depth measurement. Note that, in some cases, the distance between the at least one transducer and the ground surface may first be determined trigonometrically based on a known relative position of the at least one transducer and the depth sensor.
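The timing determination described above can be sketched as below. The speed of sound and the distances are assumed example values, and a real system would also account for sensor latency.

```python
SPEED_OF_SOUND = 343.0  # m/s, assumed propagation speed of the pressure wave in air

def second_measurement_time(emit_time, distance_to_surface, settle_margin=0.0):
    """Return the time at which to capture the second depth measurement:
    the emission time plus the pressure wave's travel time from the
    transducer to the ground surface, plus an optional margin allowing
    the movable element to respond. All units are seconds and meters."""
    travel_time = distance_to_surface / SPEED_OF_SOUND
    return emit_time + travel_time + settle_margin

# A surface 3.43 m away is reached about 10 ms after emission.
t_capture = second_measurement_time(emit_time=0.0, distance_to_surface=3.43)
```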
- In some embodiments, two or more second depth measurements may be recorded at different points in time after the pressure wave has been emitted. Each of the two or more second depth measurements may be compared to the first depth measurement to determine whether any differences can be identified between them at
block 408. - At
block 408, the method 400 involves identifying one or more differences between the first depth measurement and the second depth measurement indicating that the ground surface includes a movable element.
- Alternatively, the two distances may be compared to determine whether they differ by at least a threshold amount. The depth sensor's measurements may be noisy, which might produce slight variations in the distance measurements. To account for this, a threshold distance difference may be set, which must be exceeded before the ground surface is determined to include a movable element. The threshold may be predetermined based on a number of factors, such as an expected environment, a tolerance of the controller, and/or a desired behavior of the robot, among other possible factors.
- In other embodiments, a difference in depth between the two distance measurements may not lead to a binary decision as to whether or not the ground surface contains a movable element. Rather, in some embodiments, the extent of difference between the two distance measurements may serve as a basis for informing the robot of the degree to which the ground surface is unstable. For example, a short-fiber carpet may only produce a small difference (if any) between the two distance measurements, whereas a wild grass may produce a large difference between the two distance measurements. In this example, the small difference may indicate to a control system that the ground surface is only slightly unstable, while the large difference may indicate to the control system that the ground surface is very unstable.
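As a concrete sketch of both the thresholded and the graded comparison described above, consider the following. The 2 cm threshold is an arbitrary illustration for absorbing sensor noise, not a value from the disclosure.

```python
def classify_surface(first_depth, second_depth, threshold=0.02):
    """Compare two single-point depth measurements (meters).
    Returns a (movable, instability) pair: a binary decision against the
    noise threshold, plus the raw depth change as a graded measure of how
    unstable the ground surface is."""
    instability = abs(second_depth - first_depth)
    return instability > threshold, instability

# Depth to the grass grew by 12 cm after the pressure wave (hypothetical values).
movable, instability = classify_surface(first_depth=1.50, second_depth=1.62)
```

A large `instability` value (e.g., wild grass) and a small one (e.g., short-fiber carpet) can then drive different downstream behaviors rather than a single binary decision.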
- In embodiments where the depth measurements are depth maps, identifying one or more differences between them may involve determining a number of differences between the two depth maps. A given depth map may contain a set of distance measurements mapped to a respective set of spatial positions. Identifying a difference between the two depth maps may involve comparing a distance measurement in the first depth map corresponding to a particular spatial position to the distance measurement in the second depth map corresponding to that same spatial position. Such a comparison between the two depth maps may be repeated for each spatial position.
- In some instances, the control system may determine that the ground surface contains a movable feature if a single difference is identified between the depth maps. In other instances, the control system may determine that the ground surface contains a movable feature if a threshold number of differences are identified between the depth maps. More complex comparisons also may be performed to quantitatively determine the extent of difference between two depth maps.
- In some implementations, a given comparison between corresponding depth measurements of the two depth maps may only count as an “identified difference” if that difference exceeds a threshold amount. The number of “identified differences” may then be counted, the sum of which may serve as a basis for determining whether or not the ground surface includes thereon a movable feature.
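The counting scheme described above might look like the following sketch. The map contents, the per-point threshold, and the minimum-difference count are all illustrative assumptions.

```python
def count_identified_differences(map_a, map_b, per_point_threshold=0.01):
    """Count spatial positions whose depth changed by more than
    per_point_threshold between two equally sized depth maps,
    given as lists of rows of depths in meters."""
    count = 0
    for row_a, row_b in zip(map_a, map_b):
        for depth_a, depth_b in zip(row_a, row_b):
            if abs(depth_a - depth_b) > per_point_threshold:
                count += 1
    return count

def includes_movable_feature(map_a, map_b, min_differences=3):
    """Declare a movable feature when enough points changed depth."""
    return count_identified_differences(map_a, map_b) >= min_differences

# Tiny 2x2 depth maps before and after the pressure wave (hypothetical).
before = [[1.50, 1.52], [1.51, 1.49]]
after_wave = [[1.50, 1.65], [1.64, 1.66]]
```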
- Regardless of any threshold values and/or how a particular implementation defines an “identified difference,” the results of the comparison between the depth measurements or depth maps may serve as a basis for characterizing the ground surface. The characterization may be whether or not the ground surface has thereon a movable element or feature. Other characterizations may indicate a degree to which the ground surface is unstable.
- At
block 410, themethod 400 involves providing navigation instructions to a mobile robot based on the identified one or more differences between the first depth measurement and the second depth measurement. - In some embodiments, the navigation instructions may cause the robot to alter its trajectory or planned step path. For example, the navigation instructions may cause the robot to avoid stepping onto a ground surface determined to include a movable element. As another example, the navigation instructions may cause the robot to step over the ground surface. In instances where the robot has a trajectory planning system or other computing system, the navigation instructions may be supplied to that system of the robot in order to inform it of the characterization determined at
block 408. - In some embodiments, the navigation instructions may modify the robot's stepping behavior, gait, speed, or other aspects of its movement. For example, the navigation instructions may cause the robot to step cautiously onto a ground surface determined to include a movable element. As another example, the navigation instructions may cause the robot to reduce its speed. As yet another example, the robot may alter its gait (e.g., from running to walking).
- In some scenarios, the navigation instructions may not alter the robot's trajectory or behavior in any way. For example, the robot's control system may determine that the ground surface only has a small degree of instability which does not necessitate a change to the robot's gait, speed, stepping behavior, or trajectory. As another example, the robot's control system may determine that the ground surface is stable.
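The graded use of instability described above, from maintaining the current plan through cautious stepping to avoidance, might be reduced to a simple banded policy. The band boundaries and behavior labels below are hypothetical illustrations, not values from the disclosure.

```python
def navigation_instruction(instability, slight_limit=0.02, severe_limit=0.15):
    """Map a measured instability (meters of depth change between the
    first and second measurements) to a coarse behavior: keep the current
    plan, slow down and step cautiously, or avoid/step over the surface."""
    if instability < slight_limit:
        return "maintain_plan"          # stable enough: no change needed
    if instability < severe_limit:
        return "slow_and_step_cautiously"
    return "avoid_or_step_over"
```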
-
FIG. 5 is aconceptual illustration 500 of an operation of a phased transducer array, according to an example embodiment. In some embodiments, a robot may have coupled thereto a phased transducer array operated by a control system of the robot. The phased transducer array may be operated according to a phase timing in order to produce directional waves aimed toward a focal point.FIG. 5 illustrates a conceptual, two-dimensional illustration 500 of the operating principle behind a phased transducer array. - As shown in
FIG. 5, a set of signals 510 are provided to a corresponding set of transducers 520. The signals 510 have a specific phase timing as shown in illustration 500. The top and bottom signals of signals 510 arrive at the respective top and bottom transducers of transducers 520 first. Then, after a delay, the signal that is second from the top and the signal that is second from the bottom arrive at their respective transducers. After yet another delay, the signal that is third from the top and the signal that is third from the bottom arrive at their respective transducers. Finally, after another delay, the middle signal arrives at the middle transducer. - When a signal arrives at its respective transducer, a pressure wave is emitted from that respective transducer. After all of the signals have reached their respective transducers, a set of pressure waves 530 are emitted that collectively form a
wavefront 540. Because the pressure waves were emitted according to the phase timing, the wavefront 540 focuses over time to the focal point 550. The focal point 550 represents a position at which the entire set of pressure waves 530 constructively interfere. The resulting pressure wave at the focal point 550 may have a greater amplitude than each individual pressure wave of the set of pressure waves 530. - Note that the
conceptual illustration 500 depicts one example of how phase timing can be used to generate a collective pressure wave at a particular location using a set of individual pressure waves. The phase timing can be varied in order to select a variety of focal points at various locations in space. Additionally, while theconceptual illustration 500 shows a two-dimensional example of a phased transducer array, any two- or three-dimensional arrangement of transducers may be utilized to implement a phased transducer array. As an example, employing a phase timing for a three-dimensional arrangement of transducers may generate a set of pressure waves that constructively interfere at a three-dimensional point in space. It should be understood that, although some examples and figures described herein may refer to or illustrate two-dimensional examples, the techniques of the present application may be applied in three dimensions. - Note that the
conceptual illustration 500 is merely illustrative and may not necessarily be drawn to an accurate scale. Theillustration 500 ofFIG. 5 is provided to facilitate understanding of implementations of the present application that utilize a phased transducer array. -
FIG. 6A and FIG. 6B depict a scenario in which techniques of the present application may be used at two different points in time. The robot 610 may be similar to robot 200 of FIG. 2, robot 300 of FIG. 3, and/or may include any combination of components of robot 200, robot 300, a transducer or transducer array as described herein, and/or a depth sensor as described herein. In the depicted scenario, a robot 610 encounters an environment containing grass 622 on ground surface 620. The robot 610 utilizes techniques of the present application in order to detect the presence of movable elements—in this example, the grass 622—on the ground surface 620. Note that, while the following describes the robot 610 performing a variety of actions and controlling the transducer array 612 and depth sensor 616, a control system and/or computing device of the robot 610 may also carry out some or all of these operations. -
FIG. 6A illustrates an example operation of a robot at a first time 600, according to an example embodiment. First, the robot 610 uses the depth sensor 616 to record the depth measurement 630 of the grass 622. In some instances, the robot 610 also determines the height 640 of the grass 622 with respect to the depth sensor 616 based on a trigonometric relationship between the depth sensor 616 and the known position of the grass 622. - Then, as shown in
FIG. 6B, the robot 610 controls the transducer array 612 to emit a pressure wave 614 directed toward the grass 622 at a second time 650. In some embodiments, the pressure wave is directed toward the grass 622 using phase timing techniques as described herein, or using other kinds of beamforming techniques. In this example, the pressure wave 614 interacts with the grass 622, perturbing it and causing the grass 622 to separate at the focal point of the pressure wave 614. - Then, the
robot 610 uses the depth sensor 616 to measure the depth measurement 632 of the grass 622 while or after it has interacted with the pressure wave 614. In this example, because the grass 622 separated and revealed the ground surface 620 beneath it, the depth measurement 632 is larger than the depth measurement 630. In some instances, the robot 610 also determines the height 642 of the grass 622 after it has been perturbed, which may be similar to or the same as the height of the ground surface 620. - The
robot 610 may then determine whether or not the location at which the depth sensor 616 was capturing depth measurements (in this example, the grass 622) includes a movable element. In one embodiment, the robot 610 compares the depth measurement 630 to the depth measurement 632 to determine if they are different. As previously described, the depth measurement comparison may simply determine whether the two depth measurements are the same, or determine whether the two depth measurements differ by a threshold amount. In another embodiment, the robot 610 may calculate a difference between the depth measurement 630 and the depth measurement 632 to determine the degree to which grass 622 is unstable. - In other embodiments, the
depth sensor 616 gathers depth maps of thegrass 622 and compares a depth map captured at thefirst time 600 to another depth map captured at thesecond time 650. An example depth map comparison is illustrated inFIG. 7 and described in more detail below. - In instances where the
robot 610 determines theheight 640 and/or theheight 642, therobot 610 may also use that determined height of thegrass 622 as an input into a stepping controller or other control system. For example, upon detecting the movable feature ofgrass 622, therobot 610 may modify its stepping behavior to step more slowly onto thegrass 622. The control system that implements this more cautious stepping behavior may move a foot of the robot quickly to some height at or above thedetermined height 640 and/orheight 642, and then proceed to slowly lower the foot until it touches down onto theground surface 620. In other examples, therobot 610 may implement other stepping controllers that utilize an estimated ground surface height (e.g., height 642) as a basis for controlling the robot's stepping behavior. - The
depth sensor 616 may be any device capable of measuring distances between the depth sensor 616 and surfaces within an environment. The depth sensor 616 may be a stereoscopic camera, an imaging device coupled with a structured light projector, a RADAR device, and/or a LIDAR device, among other possible depth sensors. The depth sensor 616 may also be any other device utilizing stereo triangulation, structured or patterned light, time-of-flight, and/or interferometry techniques. - It should be understood that, although the scenario of
FIG. 6A and FIG. 6B describes detecting grass 622 as a movable element, other movable elements or unstable features may be detected using the techniques disclosed herein. A “movable element” may be any object, item, material, artifact, or other feature that may interact with, be perturbed by, or otherwise react to a pressure wave. The movable element may be partially fixed or secured onto the ground surface, in some instances. In other instances, the movable element may be on the ground surface, but is not affixed to it in any way, such as a loose object that landed onto the ground surface. - Note that, although the
first time 600 and thesecond time 650 depict particular instances in time of the scenario shown inFIG. 6A andFIG. 6B , thefirst time 600 and/or thesecond time 650 may represent multiple operations of therobot 610 that occur over some period of time. It should be understood that the “first time” and “second time” distinguish between two states of the environment and may not necessarily represent a single point in time. -
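A minimal sketch of the cautious touchdown described earlier for the robot 610, in which the foot moves quickly to the detected grass-top height and is then lowered slowly to the estimated ground, is shown below. The heights and increment are assumed example values; integer millimeters are used to avoid floating-point drift.

```python
def cautious_step_profile(grass_top_mm, ground_mm, slow_step_mm=5):
    """Return a descending sequence of foot heights (millimeters): the
    fast phase ends at the detected grass-top height, then the foot is
    lowered in small increments toward the estimated ground surface."""
    heights = list(range(grass_top_mm, ground_mm, -slow_step_mm))
    heights.append(ground_mm)  # finish exactly at the ground estimate
    return heights

# Grass top detected 100 mm above the estimated ground surface (hypothetical).
profile = cautious_step_profile(grass_top_mm=100, ground_mm=0)
```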
FIG. 7 illustrates a comparison 700 between a pair of depth maps, according to an example embodiment. Referring to FIG. 6, the depth sensor 616 of the robot 610 may capture depth map 710 at the first time 600, and depth map 720 at the second time 650 in this example. Depth map 710 depicts blades of grass 622 as seen from the perspective of the robot 610. The depth map 710 includes some lightly shaded blades of grass 622 which represent the blades of grass closest to the depth sensor 616, some medium shaded blades of grass 622 which represent blades of grass behind the lightly shaded blades of grass 622, and some darkly shaded blades of grass 622 behind the medium shaded blades of grass 622. - The comparison between the
depth map 710 and the depth map 720 may be performed in a variety of ways. Each depth map may have a spatial resolution indicative of a discrete number of depth measurements included therein. Each individual depth measurement within depth map 710 may be compared against its corresponding depth measurement within depth map 720 for each spatial position. A control system may count the number of corresponding depth measurements between the depth map 710 and the depth map 720 that differ (by any amount, or by at least a threshold amount). - In some embodiments, a region of the
depth map 710 and the depth map 720 may be compared. For example, as shown in FIG. 7, the region 712 of the depth map 710 may be compared against the corresponding region 722 of the depth map 720. In this example, the blades of grass 622 have moved about such that the region 712 and the region 722 no longer appear to be the same or similar, while other regions of the entire depth map 710 and depth map 720 appear unchanged between the two depth maps. - By comparing one or more regions of the
depth map 710 to a corresponding set of regions in thedepth map 720, a control system or computing device may be able to determine a number of regions (if any) between the two depth maps that differ. In some cases, one different region may be indicative of the presence of a movable element. In other cases, a threshold number of differing regions may indicate the presence of a movable element. - Regardless of the exact manner in which the two depth maps are compared, the present application discloses a robot, computing device, or control system identifying differences between the
depth map 710 and thedepth map 720, and using those identified differences as a basis for maintaining the robot's current control behavior, for modifying the robot's behavior, speed, step path, or trajectory, and/or for informing a control system of either the presence of a movable element and/or of a degree of instability of a ground surface. - Note that, although three distinct shading levels are depicted in
FIG. 7, the depths of the blades of grass 622 may be determined with higher granularity. Additionally, a given blade of grass 622 may have multiple depths associated with it, such as if a given blade of grass bends away from the perspective of robot 610. The distinct shading levels and simplified depth representations of each blade of grass 622 are provided for the purposes of explanation and may not necessarily correspond to an actual depth map or scenario. -
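A region-level comparison like the one between region 712 and region 722 described above could be sketched as follows, using the mean absolute depth change over a rectangle. The region bounds, threshold, and map contents are illustrative assumptions.

```python
def region_differs(map_a, map_b, region, mean_threshold=0.05):
    """Compare one rectangular region of two depth maps by mean absolute
    depth change. region is (row_start, row_end, col_start, col_end)
    with half-open bounds; maps are lists of rows of depths in meters."""
    r0, r1, c0, c1 = region
    diffs = [abs(map_a[r][c] - map_b[r][c])
             for r in range(r0, r1) for c in range(c0, c1)]
    return sum(diffs) / len(diffs) > mean_threshold

# Tiny 2x3 depth maps; only the top-right region changed (hypothetical values).
map_before = [[1.5, 1.5, 1.5], [1.5, 1.5, 1.5]]
map_after = [[1.5, 1.7, 1.8], [1.5, 1.5, 1.5]]
changed = region_differs(map_before, map_after, region=(0, 1, 1, 3))
```

Counting how many such regions differ, and comparing that count against a threshold, then parallels the per-point comparison described earlier.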
FIG. 8 illustrates an example computer-readable medium configured according to at least some implementations described herein. In example implementations, the example system can include one or more processors, one or more forms of memory, one or more input devices/interfaces, one or more output devices/interfaces, and machine-readable instructions that when executed by the one or more processors cause a robotic device to carry out the various operations, tasks, capabilities, etc., described above. - As noted above, the disclosed procedures can be implemented by computer program instructions encoded on a computer-readable storage medium in a machine-readable format, or on other media or articles of manufacture.
FIG. 8 is a schematic illustrating a conceptual partial view of a computer program product that includes a computer program for executing a computer process on a computing device, arranged according to at least some implementations disclosed herein. - In some implementations, the example
computer program product 800 may include one or more program instructions 802 that, when executed by one or more processors, may provide functionality or portions of the functionality described above with respect to FIGS. 1-7. In some examples, the computer program product 800 may include a computer-readable medium 804, such as, but not limited to, a hard disk drive, a Compact Disc (CD), a Digital Video Disk (DVD), a digital tape, memory, etc. In some implementations, the computer program product 800 may include a computer recordable medium 806, such as, but not limited to, memory, read/write (R/W) CDs, R/W DVDs, etc. - The one or
more program instructions 802 can be, for example, computer executable and/or logic implemented instructions. In some examples, a computing device is configured to provide various operations, or actions in response to theprogram instructions 802 conveyed to the computing device by the computer readable medium 804 and/or thecomputer recordable medium 806. In other examples, the computing device can be an external device in communication with a device coupled to the robotic device. - The computer readable medium 804 can also be distributed among multiple data storage elements, which could be remotely located from each other. The computing device that executes some or all of the stored instructions could be an external computer, or a mobile computing platform, such as a smartphone, tablet device, personal computer, or a wearable device, among others. Alternatively, the computing device that executes some or all of the stored instructions could be a remotely located computer system, such as a server. For example, the
computer program product 800 can implement operations discussed in reference toFIGS. 4-7 . - It should be understood that arrangements described herein are for purposes of example only. As such, those skilled in the art will appreciate that other arrangements and other elements (e.g. machines, interfaces, operations, orders, and groupings of operations, etc.) can be used instead, and some elements may be omitted altogether according to the desired results. Further, many of the elements that are described are functional entities that may be implemented as discrete or distributed components or in conjunction with other components, in any suitable combination and location, or other structural elements described as independent structures may be combined.
- While various aspects and implementations have been disclosed herein, other aspects and implementations will be apparent to those skilled in the art. The various aspects and implementations disclosed herein are for purposes of illustration and are not intended to be limiting, with the true scope being indicated by the following claims, along with the full scope of equivalents to which such claims are entitled. It is also to be understood that the terminology used herein is for the purpose of describing particular implementations only, and is not intended to be limiting.
Claims (20)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/618,822 US20170274529A1 (en) | 2015-08-10 | 2017-06-09 | Detection of Movable Ground Areas of a Robot's Environment Using a Transducer Array |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/822,009 US9701016B1 (en) | 2015-08-10 | 2015-08-10 | Detection of movable ground areas of a robot's environment using a transducer array |
US15/618,822 US20170274529A1 (en) | 2015-08-10 | 2017-06-09 | Detection of Movable Ground Areas of a Robot's Environment Using a Transducer Array |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/822,009 Continuation US9701016B1 (en) | 2015-08-10 | 2015-08-10 | Detection of movable ground areas of a robot's environment using a transducer array |
Publications (1)
Publication Number | Publication Date |
---|---|
US20170274529A1 true US20170274529A1 (en) | 2017-09-28 |
Family
ID=59257498
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/822,009 Expired - Fee Related US9701016B1 (en) | 2015-08-10 | 2015-08-10 | Detection of movable ground areas of a robot's environment using a transducer array |
US15/618,822 Abandoned US20170274529A1 (en) | 2015-08-10 | 2017-06-09 | Detection of Movable Ground Areas of a Robot's Environment Using a Transducer Array |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/822,009 Expired - Fee Related US9701016B1 (en) | 2015-08-10 | 2015-08-10 | Detection of movable ground areas of a robot's environment using a transducer array |
Country Status (1)
Country | Link |
---|---|
US (2) | US9701016B1 (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110389586A (en) * | 2018-04-19 | 2019-10-29 | Faraday Future (法拉第未来公司) | System and method for ground and free space detection
Families Citing this family (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
GB2538779B (en) * | 2015-05-28 | 2017-08-30 | Dyson Technology Ltd | A method of controlling a mobile robot |
US9701016B1 (en) * | 2015-08-10 | 2017-07-11 | X Development Llc | Detection of movable ground areas of a robot's environment using a transducer array |
TW201727418A (en) * | 2016-01-26 | 2017-08-01 | Hon Hai Precision Industry Co., Ltd. (鴻海精密工業股份有限公司) | Ground texture analysis combined with a data recording system, and method of analysis
CN107943059B (en) * | 2017-12-29 | 2024-03-15 | 南京工程学院 | Heavy-load multi-foot robot based on depth visual navigation and motion planning method thereof |
KR102629036B1 (en) * | 2018-08-30 | 2024-01-25 | 삼성전자주식회사 | Robot and the controlling method thereof |
CN109822597B (en) * | 2019-04-14 | 2021-01-19 | 北京中大科慧科技发展有限公司 | Full-automatic intelligent inspection robot of data center |
US11194044B2 (en) * | 2020-02-13 | 2021-12-07 | Tymphany Acoustic Technology (Huizhou) Co., Ltd. | Object movement detection based on ultrasonic sensor data analysis |
Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9701016B1 (en) * | 2015-08-10 | 2017-07-11 | X Development Llc | Detection of movable ground areas of a robot's environment using a transducer array |
Family Cites Families (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4821206A (en) * | 1984-11-27 | 1989-04-11 | Photo Acoustic Technology, Inc. | Ultrasonic apparatus for positioning a robot hand |
US5131392A (en) * | 1990-02-13 | 1992-07-21 | Brigham & Women's Hospital | Use of magnetic field of magnetic resonance imaging devices as the source of the magnetic field of electromagnetic transducers |
US6154134A (en) * | 1999-07-16 | 2000-11-28 | Carmen; Norman | Head activated vehicle horn controller |
US9053222B2 (en) * | 2002-05-17 | 2015-06-09 | Lawrence A. Lynn | Patient safety processor |
US20030014199A1 (en) * | 2001-07-12 | 2003-01-16 | Patrick Toomey | System and methods for detecting fault in structure |
US20080109115A1 (en) | 2006-11-03 | 2008-05-08 | Michael Zin Min Lim | Dynamic force controller for multilegged robot |
JP4998506B2 (en) | 2009-04-22 | 2012-08-15 | トヨタ自動車株式会社 | Robot control device, robot control method, and legged robot |
KR101247761B1 (en) | 2011-07-15 | 2013-04-01 | 삼성중공업 주식회사 | Method for finding the movement area of a mobile robot on hull surface, a mobile robot, and recording medium |
US8842495B2 (en) * | 2011-09-23 | 2014-09-23 | Rethink Robotics, Inc. | Ultrasonic motion detection |
US20130081479A1 (en) * | 2011-09-30 | 2013-04-04 | Craig Miller | Sensor system to detect persence of a person on an object and monitoring system comprising a sensor system to detect persence of a person |
CA2854829C (en) | 2011-11-15 | 2019-07-02 | Manickam UMASUTHAN | Method of real-time tracking of moving/flexible surfaces |
US9283677B2 (en) * | 2012-04-05 | 2016-03-15 | Rethink Robotics, Inc. | Visual indication of target tracking |
- 2015-08-10: US application US14/822,009, patent US9701016B1 (en), not active: Expired - Fee Related
- 2017-06-09: US application US15/618,822, publication US20170274529A1 (en), not active: Abandoned
Also Published As
Publication number | Publication date |
---|---|
US9701016B1 (en) | 2017-07-11 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US9701016B1 (en) | Detection of movable ground areas of a robot's environment using a transducer array | |
US11426875B2 (en) | Natural pitch and roll | |
US11188081B2 (en) | Auto-swing height adjustment | |
US10456916B2 (en) | Determination of robotic step path | |
US9908240B1 (en) | Ground plane compensation for legged robots | |
US20210309310A1 (en) | Control of Robotic Devices with Non-Constant Body Pitch | |
US9821461B1 (en) | Determining a trajectory for a walking robot to prevent motor overheating | |
US10196104B1 (en) | Terrain Evaluation for robot locomotion | |
US10000248B1 (en) | Rotatable robot foot with perpendicular soles | |
US11247344B2 (en) | Continuous slip recovery | |
US10179619B1 (en) | Robotic foot sensor | |
US9931753B1 (en) | Methods and devices for automatic gait transition | |
US10215852B1 (en) | Robotic radar assistance | |
US9994269B1 (en) | Rotatable extension for robot foot | |
Barak et al. | Two Dimensional Mapping by using Single Ultrasonic Sensor. | |
US10160505B1 (en) | Variable-compliance, slip-resistant foot for legged mechanisms |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: GOOGLE INC., CALIFORNIA
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:URATA, JUNICHI;ITO, YOSHITO;SIGNING DATES FROM 20150902 TO 20150903;REEL/FRAME:042747/0723

Owner name: X DEVELOPMENT LLC, CALIFORNIA
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:GOOGLE INC.;REEL/FRAME:042887/0114
Effective date: 20160901
STPP | Information on status: patent application and granting procedure in general |
Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
AS | Assignment |
Owner name: GOOGLE LLC, CALIFORNIA
Free format text: CHANGE OF NAME;ASSIGNOR:GOOGLE INC.;REEL/FRAME:044567/0001
Effective date: 20170929
STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |