US20190384314A1 - Detecting objects near an autonomous device

Detecting objects near an autonomous device

Info

Publication number
US20190384314A1
US20190384314A1 (application US16/009,414)
Authority
US
United States
Prior art keywords
short
autonomous device
sensors
range
field
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US16/009,414
Inventor
Niels Jul JACOBSEN
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Mobile Industrial Robots AS
Original Assignee
Mobile Industrial Robots AS
Application filed by Mobile Industrial Robots AS
Priority to US16/009,414
Assigned to MOBILE INDUSTRIAL ROBOTS APS (assignment of assignors' interest from JACOBSEN, NIELS JUL; see document for details)
Priority to PCT/EP2019/065766 (published as WO2019238958A1)
Assigned to MOBILE INDUSTRIAL ROBOTS A/S (change of name from MOBILE INDUSTRIAL ROBOTS APS; see document for details)
Publication of US20190384314A1
Priority to US17/161,977 (published as US20210223786A1)
Current legal status: Abandoned

Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/0088 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots, characterized by the autonomous decision making process, e.g. artificial intelligence, predefined behaviours
    • G05D1/02 Control of position or course in two dimensions
    • G05D1/021 Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0231 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D1/0238 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using obstacle or wall sensors
    • G05D1/0242 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using non-visible light signals, e.g. IR or UV signals
    • G05D1/0246 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means
    • G05D1/0259 Control of position or course in two dimensions specially adapted to land vehicles using magnetic or electromagnetic means
    • G05D1/20 Control system inputs
    • G05D1/22 Command input arrangements
    • G05D1/221 Remote-control arrangements
    • G05D1/227 Handing over between remote control and on-board control; Handing over between remote control arrangements
    • G05D1/24 Arrangements for determining position or orientation
    • G05D1/247 Arrangements for determining position or orientation using signals provided by artificial sources external to the vehicle, e.g. navigation beacons
    • G05D1/249 Arrangements for determining position or orientation using signals provided by artificial sources external to the vehicle from positioning sensors located off-board the vehicle, e.g. from cameras
    • G05D1/60 Intended control result
    • G05D1/617 Safety or protection, e.g. defining protection zones around obstacles or avoiding hazards
    • G05D1/622 Obstacle avoidance
    • G05D1/628 Obstacle avoidance following the obstacle profile, e.g. a wall or undulated terrain

Definitions

  • A computer program can be written in any form of programming language, including compiled or interpreted languages, and can be deployed in any form, including as a stand-alone program or as a module, component, subroutine, or other unit suitable for use in a computing environment.
  • A computer program can be deployed to be executed on one computer or on multiple computers at one site, or distributed across multiple sites and interconnected by a network.
  • Actions associated with implementing at least part of the robot can be performed by one or more programmable processors executing one or more computer programs to perform the functions described herein.
  • At least part of the robot can be implemented using special-purpose logic circuitry, e.g., an FPGA (field-programmable gate array) and/or an ASIC (application-specific integrated circuit).
  • Processors suitable for the execution of a computer program include, by way of example, both general- and special-purpose microprocessors, and any one or more processors of any kind of digital computer. Generally, a processor will receive instructions and data from a read-only storage area or a random-access storage area or both.
  • Elements of a computer include one or more processors for executing instructions and one or more storage devices for storing instructions and data. Generally, a computer will also include, or be operatively coupled to receive data from or transfer data to, one or more machine-readable storage media, such as mass storage devices for storing data, e.g., magnetic, magneto-optical, or optical disks.
  • Machine-readable storage media suitable for embodying computer program instructions and data include all forms of non-volatile storage, including, by way of example, semiconductor storage devices such as EPROM, EEPROM, and flash storage devices; magnetic disks such as internal hard disks or removable disks; magneto-optical disks; and CD-ROM and DVD-ROM disks.
  • Any connection involving electrical circuitry that allows signals to flow, unless stated otherwise, is an electrical connection and not necessarily a direct physical connection, regardless of whether the word "electrical" is used to modify "connection".
  • Elements of different implementations described herein may be combined to form other embodiments not specifically set forth above. Elements may be left out of the structures described herein without adversely affecting their operation. Furthermore, various separate elements may be combined into one or more individual elements to perform the functions described herein.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Electromagnetism (AREA)
  • Business, Economics & Management (AREA)
  • Health & Medical Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Evolutionary Computation (AREA)
  • Game Theory and Decision Science (AREA)
  • Medical Informatics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
  • Manipulator (AREA)

Abstract

An example autonomous device is configured to detect objects within a vicinity of the autonomous device. The autonomous device is configured to move along a surface. The autonomous device includes a body, at least one long-range sensor on the body configured for detection in a first field, and at least one short-range sensor on the body. Each short-range sensor is configured for detection in a second field directed towards the surface. The second field is smaller than the first field. Each short-range sensor is configured to output signals based on detection of an object within the second field. A control system is configured to control movement of the autonomous device based, at least in part, on the signals.

Description

    TECHNICAL FIELD
  • This specification relates generally to an autonomous device configured to detect objects within a vicinity of the autonomous device.
  • BACKGROUND
  • Autonomous devices, such as mobile robots, include sensors, such as scanners or three-dimensional (3D) cameras, to detect objects in their path or in their vicinity. These sensors have a limited field of view. As a result, autonomous devices may be unable to detect objects in their immediate vicinity. For example, sensors on an autonomous device may be unable to detect objects close to the ground and near to the autonomous device, particularly at its corners. This can be problematic, especially in a manufacturing environment where ground-level objects, such as forklifts, can move into the path of the autonomous device.
  • SUMMARY
  • An example autonomous device is configured to detect objects within a vicinity of the autonomous device. The autonomous device is configured to move along a surface. The autonomous device includes a body, at least one long-range sensor on the body configured for detection in a first field, and at least one short-range sensor on the body. Each short-range sensor is configured for detection in a second field directed towards the surface. The second field is smaller than the first field. Each short-range sensor is configured to output signals based on detection of an object within the second field. A control system is configured to control movement of the autonomous device based, at least in part, on the signals. The autonomous device may include one or more of the following features, either alone or in combination.
  • The at least one short-range sensor may comprise proximity sensors. The at least one short-range sensor may comprise near-field sensors. The autonomous device may be, or include, a mobile robot.
  • The body may comprise one or more corners. A group of short-range sensors may be arranged at each corner so that second fields of at least some of the short-range sensors in each group overlap at least in part. There may be four or more short-range sensors arranged at each of the four corners so that second fields of adjacent short-range sensors among the four or more short-range sensors overlap at least in part. Each corner may comprise an intersection of two edges. Each edge of each corner may comprise three short-range sensors. Adjacent ones of the three short-range sensors may have second fields that overlap at least in part.
  • The body may have a circular perimeter. Short-range sensors may be arranged along the circular perimeter so that second fields of at least some of the short-range sensors overlap at least in part. The body may have a curved perimeter. Short-range sensors may be arranged along the curved perimeter so that second fields of at least some of the short-range sensors overlap at least in part.
  • The body may comprise a top part and a bottom part. The bottom part may be closer to the surface during movement of the autonomous device than the top part. Short-range sensors may be located on the body closer to the top part than to the bottom part. At least one short-range sensor may be located adjacent to the top part.
  • The at least one short-range sensor on the body may be angled towards the surface such that a second field of the at least one short-range sensor is directed towards the surface. A horizontal plane extends from the body at 0°, and the surface is at −90° relative to the horizontal plane. Short-range sensors may be directed towards the surface such that the second field of at least some of the short-range sensors is between −1° and −90° relative to the horizontal plane. Short-range sensors may be directed towards the surface such that the second field of all of the short-range sensors is between −1° and −90° relative to the horizontal plane.
  • The at least one short-range sensor may be configured to output signals in response to detecting the object. The at least one short-range sensor may be configured to use non-visible light to detect the object. The at least one short-range sensor may be configured to use infrared light to detect the object. The at least one short-range sensor may be configured to use electromagnetic signals to detect the object. The at least one short-range sensor may comprise photoelectric sensors.
  • Each second field may be 30 centimeters (cm) in diameter at most. Each second field may be 20 centimeters (cm) in diameter at most.
  • The body may include corners. A group of short-range sensors may be arranged at each of the corners. Adjacent ones of the short-range sensors may have second fields that overlap at least in part such that, for a corner among the corners, there are no blind spots for the autonomous device in a partial circumference of a circle centered at the corner. The autonomous device may include a bumper that is comprised of an elastic material. The bumper may be around at least part of a perimeter of the autonomous device. The at least one short-range sensor may be located underneath the bumper. Sensors may be arranged around at least part of a perimeter of the body.
  • Short-range sensors may be directed towards the surface on which the device travels such that a second field of each short-range sensor extends at least from 15 centimeters (cm) above the surface to the surface.
  • An example autonomous device is configured to detect objects within a vicinity of the autonomous device. The autonomous device includes a body for supporting weight of an object, wheels on the body to enable the body to travel across a surface, and a camera on the body to obtain images in front of the autonomous device. The camera has a first field that extends from the body. Sensors may be disposed along at least part of a perimeter of the body. The sensors may have a second field that extends from the surface to at least a location below the first field. The autonomous device may be, or include, a mobile robot. The autonomous device may include one or more of the following features, either alone or in combination.
  • The second field may intersect the first field in part. At least two of the sensors that are adjacent to each other may have fields that overlap at least partly. The sensors may be configured to use non-visible light to detect an object. The sensors may be configured to use infrared light to detect an object. The sensors may be configured to use electromagnetic signals to detect an object. The sensors may comprise photoelectric sensors. The sensors may comprise proximity sensors configured to sense an object within at most 20 centimeters. The sensors may comprise proximity sensors configured to sense an object within at most 30 centimeters.
  • The body may comprise one or more corners. At least one of the corners may be defined by edges that support a group of the sensors. The group of sensors may have fields that overlap at least in part such that, for the at least one corner, there are no blind spots for the autonomous device in a partial circumference of a circle centered at the at least one corner. The autonomous device may include a rubber bumper along at least part of the perimeter. The sensors may be underneath the rubber bumper.
  • Any two or more of the features described in this specification, including in this summary section, can be combined to form implementations not specifically described herein.
  • The systems and processes described herein, or portions thereof, can be controlled by a computer program product that includes instructions that are stored on one or more non-transitory machine-readable storage media, and that are executable on one or more processing devices to control (e.g., coordinate) the operations described herein. The systems and processes described herein, or portions thereof, can be implemented as an apparatus or method. The systems and processes described herein can include one or more processing devices and memory to store executable instructions to implement various operations.
  • The details of one or more implementations are set forth in the accompanying drawings and the description below. Other features and advantages will be apparent from the description and drawings, and from the claims.
  • DESCRIPTION OF THE DRAWINGS
  • FIG. 1 is a side view of an example autonomous robot.
  • FIG. 2 is a side view of the example autonomous robot, which shows ranges of long-range sensors included on the robot.
  • FIG. 3 is a top view of the example autonomous robot, which shows ranges of the long-range sensors included on the robot.
  • FIG. 4 is a top view of the example autonomous robot, which shows short-range sensors arranged around parts of the robot.
  • FIG. 5 is a side view of the example autonomous robot, which shows the short-range sensors arranged around parts of the robot and their fields of view.
  • FIG. 6 is a top view of the example autonomous robot, which shows the short-range sensors arranged around parts of the robot and their fields of view.
  • FIG. 7 is a side view of the example autonomous robot, which shows a bumper and the short-range sensors underneath or behind the bumper.
  • Like reference numerals in different figures indicate like elements.
  • DETAILED DESCRIPTION
  • Described herein are examples of autonomous devices or vehicles, such as a mobile robot. An example autonomous device (or simply “device”) is configured to move along a surface, such as the floor of a factory. The example device includes a body for supporting the weight of an object and wheels on the body to enable the body to travel across the surface. The example device includes long-range sensors on the body configured for detection in a first field of view (FOV) or simply “field”. For example, the device may include a three-dimensional (3D) camera that is capable of detecting an object within its FOV. The example device also includes short-range sensors on the body. Each short-range sensor may be configured for detection in a second FOV that is smaller than, or different from, the FOV of each long-range sensor. The short-range sensors may include near-field sensors or proximity sensors for detecting within the second FOV. The second FOV may be directed toward the surface to enable detection of objects in the immediate vicinity of the device. For example, the short-range sensors may be configured to detect objects close to the ground and near to the device, particularly at its corners. Each short-range sensor may be configured to output signals based on—for example, in response to—detection of an object within its FOV. A control system may be configured to control movement of the device based, at least in part, on those signals. The control system may be, or include, one or more processing devices, such as a microprocessor. The control system can also include computing resources distributed to a remote—for example, a cloud—service and, therefore, the control system need not be on-board the robot. In response to detection of the object, the control system may take appropriate action, such as changing the device's path or stopping movement or other operation of the robot.
  • The following description includes values relating to sensor parameters, such as FOV. These values are examples only. Different sensors may have different values, and different devices may use different types, numbers, or configurations of sensors.
  • An example of an autonomous device is autonomous robot 10 of FIG. 1. In this example, autonomous robot 10 is a mobile robot, and is referred to simply as “robot”. Robot 10 includes a body 12 having wheels 13 to enable robot 10 to travel across a surface 14, such as the floor of a factory or other terrain. Robot 10 also includes a support area 15 configured to support the weight of an object. In this example, robot 10 may be controlled to transport the object from one location to another location. Robot 10 includes a sensor configuration of the type described herein. However, the sensor configuration is not limited to robots of this type. Rather, the sensor configuration may be used with any appropriate type of autonomous device, robot, or vehicle.
  • In this example, robot 10 includes two types of long-range sensors: a three-dimensional (3D) camera and a light detection and ranging (LIDAR) scanner. However, the robot is not limited to this configuration. For example, the robot may include a single long-range sensor or a single type of long-range sensor, or it may include more than two types of long-range sensors.
  • Referring to FIG. 2, robot 10 includes 3D camera 16 at a front 17 of the robot. In this example, the front of the robot faces the direction of travel of the robot. The back of the robot faces terrain that the robot has already traversed. In this example, 3D camera 16 has a FOV 18 of 16° off of horizontal plane 20. The placement of 3D camera 16 is such that there is about a 350 millimeter (mm) range 21 before the 3D camera can detect an object proximate to the robot, and about a 410 mm range 22 before the 3D camera can detect the surface 14 on which the robot is traveling. In this example, the 3D camera has a sensing range 31 of about 1900 mm and can see about 750 mm above surface 14. Robot 10 also includes a LIDAR scanner 24 at its back 25. In this example, the LIDAR scanner is positioned at a back corner of the robot. The LIDAR scanner is configured to detect objects within a sensing plane 26. In this example, the sensing plane is about 200 mm above surface 14. The LIDAR scanner is not capable of detecting objects less than 200 mm above surface 14. A similar LIDAR scanner is included at the diagonally opposite front corner of the robot, which has the same scanning range and limitations.
  • FIG. 3 is a top view of robot 10. LIDAR scanners 24 and 23 are located at back corner 28 and at front corner 27, respectively. In this example, each LIDAR scanner has a scanning range 29 of about 1000 mm over an arc of about 270°. In some implementations, each LIDAR scanner may have a scanning range of about 12,000 mm over an arc of about 270°. As shown in FIG. 3, the range 31 of 3D camera 16 is about 1900 mm over an arc 33 of about 56°. However, after a plane 34, the width of the field of view of 3D camera 16 decreases from about 1400 mm to about 1000 mm at the maximum range of the 3D camera.
  • As is evident from FIGS. 2 and 3, in this example configuration, robot 10 includes several blind spots, including at corners 27 and 28. In this example, a blind spot is an area that is not visible to the long-range sensors. As a result, the long-range sensors cannot accurately detect objects within that area. For example, robot 10 cannot accurately detect objects that are less than 200 mm above surface 14. For example, robot 10 cannot accurately detect objects that are less than 350 mm from its front 17. Accordingly, short-range sensors are incorporated into the robot to sense in the areas that cannot be sensed by the long-range sensors. Thus, the short-range sensors are able to detect objects that would otherwise go undetected.
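  • The blind-spot boundaries above follow from simple mounting geometry. The sketch below is a rough flat-floor model, not the patent's method: the mount height and FOV-edge angle are illustrative assumptions, and the 350 mm and 410 mm figures quoted above are measured properties of this robot rather than outputs of this formula.

```python
import math

def camera_ground_dead_zone_mm(mount_height_mm: float, lower_edge_angle_deg: float) -> float:
    """Horizontal distance before a camera's lower FOV edge reaches the floor.

    lower_edge_angle_deg is the downward angle of the FOV's lower edge below
    horizontal; anything on the floor closer than this distance is invisible.
    """
    return mount_height_mm / math.tan(math.radians(lower_edge_angle_deg))

# Illustrative numbers only: a camera ~120 mm up whose lower FOV edge dips
# 16 degrees below horizontal first sees the floor at roughly:
print(round(camera_ground_dead_zone_mm(120, 16)))  # ~419 mm

# A LIDAR scanning a horizontal plane ~200 mm above the floor never
# intersects objects shorter than the plane height:
LIDAR_PLANE_MM = 200

def lidar_can_see(object_height_mm: float) -> bool:
    return object_height_mm >= LIDAR_PLANE_MM

print(lidar_can_see(150))  # False: a 150 mm tall object goes undetected
```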
  • In some implementations, each short-range sensor is a member of a group of short-range sensors that is arranged around, or adjacent to, each corner of the robot. The FOVs of at least some of the short-range sensors in each group overlap in whole or in part to provide substantially consistent sensor coverage in areas near the robot that are not visible by the long-range sensors. In some cases, complete overlap of the FOVs of some short range sensors may provide sensing redundancy.
  • In the example of FIG. 4, robot 10 includes four corners 27, 28, 35, and 36. In some implementations, there are two or more short-range sensors arranged at each of the four corners so that FOVs of adjacent short-range sensors overlap in part. For example, there may be two short-range sensors arranged at each corner; there may be three short-range sensors arranged at each corner; there may be four short-range sensors arranged at each corner; there may be five short-range sensors arranged at each corner; there may be six short-range sensors arranged at each corner; there may be seven short-range sensors arranged at each corner; there may be eight short-range sensors arranged at each corner, and so forth. In the example of FIG. 4, there are six short-range sensors 38 arranged at each of the four corners so that FOVs of adjacent short-range sensors overlap in part. In this example, each corner comprises an intersection of two edges. Each edge of each corner includes three short-range sensors arranged in series. Adjacent short-range sensors have FOVs that overlap in part. At least some of the overlap may be at the corners so that there are no blind spots for the mobile device in a partial circumference of a circle centered at each corner.
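  • To make the corner arrangement concrete, a hypothetical layout generator is sketched below: it places three sensors along each of the two edges that meet at a corner and checks that neighboring floor footprints overlap. The 60 mm spacing and 100 mm footprint radius are assumed values chosen for illustration, not dimensions from the patent.

```python
import math
from dataclasses import dataclass

@dataclass
class SensorPose:
    x_mm: float         # position in the robot frame
    y_mm: float
    heading_deg: float  # outward-facing direction in the floor plane

def corner_sensor_poses(corner_xy, edge_a_deg, edge_b_deg, spacing_mm=60.0, n_per_edge=3):
    """Place n_per_edge sensors along each of the two edges meeting at a corner."""
    cx, cy = corner_xy
    poses = []
    for edge_deg in (edge_a_deg, edge_b_deg):
        ux, uy = math.cos(math.radians(edge_deg)), math.sin(math.radians(edge_deg))
        outward = edge_deg - 90.0  # assumed outward normal of this edge
        for i in range(1, n_per_edge + 1):
            poses.append(SensorPose(cx + ux * spacing_mm * i, cy + uy * spacing_mm * i, outward))
    return poses

def footprints_overlap(p1: SensorPose, p2: SensorPose, footprint_radius_mm=100.0) -> bool:
    """True if two sensors' (assumed circular) floor footprints intersect."""
    d = math.hypot(p1.x_mm - p2.x_mm, p1.y_mm - p2.y_mm)
    return d < 2 * footprint_radius_mm

poses = corner_sensor_poses((0.0, 0.0), edge_a_deg=0.0, edge_b_deg=90.0)
print(len(poses))                              # 6 sensors at this corner
print(footprints_overlap(poses[0], poses[1]))  # True: adjacent fields overlap
```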
  • FIGS. 1, 4, 5, and 6 show different views of the example sensor configuration of robot 10. As explained above, in this example, there are six short-range sensors 38 arranged around each of the four corners 27, 28, 35, and 36 of robot 10. As shown in FIGS. 5 and 6, FOVs 40 and 41 of adjacent short-range sensors 42 and 43 overlap in part to cover all, some, or portions of blind spots on the robot that are outside—for example, below—the FOVs of the long-range sensors.
  • Referring to FIG. 5, the short-range sensors 38 are arranged so that their FOVs are directed at least partly towards surface 14 on which the robot travels. In an example, assume that horizontal plane 44 extending from body 12 is at 0° and that the direction towards surface 14 is at −90° relative to horizontal plane 44. The short-range sensors 38 may be directed (e.g., pointed) toward surface 14 such that the FOVs of all, or of at least some, of the short-range sensors are in a range between −1° and −90° relative to horizontal plane 44. For example, the short-range sensors may be angled downward between −1° and −90° relative to horizontal plane 44 so that their FOVs extend across the surface in areas near to the robot, as shown in FIGS. 5 and 6. The FOVs of the adjacent short-range sensors overlap partly. This is depicted in FIGS. 5 and 6, which show adjacent short-range sensor FOVs overlapping in areas, such as area 45 of FIG. 6, to create combined FOVs that cover the entirety of the front 17 of the robot, the entirety of the back 25 of the robot, and parts of sides 46 and 47 of the robot. In some implementations, the short-range sensors may be arranged to combine FOVs that cover the entirety of sides 46 and 47.
  • In some implementations, the FOVs of individual short-range sensors cover areas on surface 14 having, at most, a diameter of 10 centimeters (cm), a diameter of 20 cm, a diameter of 30 cm, a diameter of 40 cm, or a diameter of 50 cm, for example. In some examples, each short-range sensor may have a sensing range of at least 200 mm; however, other examples may have different sensing ranges.
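  • The footprint diameters above can be related to mount height, downward tilt, and the sensor's cone angle with basic trigonometry. The following is an idealized flat-floor model with assumed parameters; real footprints depend on the sensor's optics and the surface.

```python
import math

def floor_footprint_mm(mount_height_mm: float, tilt_deg: float, half_cone_deg: float):
    """Near edge, far edge, and length of the patch a tilted sensing cone cuts on the floor.

    tilt_deg is the downward angle of the cone axis below horizontal, i.e. a
    value in the -1 to -90 degree range described above, taken as positive here.
    """
    near = mount_height_mm / math.tan(math.radians(tilt_deg + half_cone_deg))
    far = mount_height_mm / math.tan(math.radians(tilt_deg - half_cone_deg))
    return near, far, far - near

# Illustrative: a sensor 150 mm up, tilted 45 degrees down, with a 12.5 degree
# half cone covers a floor patch from ~96 mm to ~235 mm out, about 140 mm long,
# comfortably within the "30 cm at most" footprint described above.
near, far, length = floor_footprint_mm(150.0, 45.0, 12.5)
print(round(near), round(far), round(length))  # 96 235 140
```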
  • In some implementations, the short-range sensors are, or include, time-of-flight (ToF) laser-ranging modules, an example of which is the VL53L0X manufactured by STMicroelectronics®. This particular sensor is based on a 940 nanometer (nm) “class 1” laser and receiver. However, other types of short-range sensors may be used in place of, or in addition to, this type of sensor. In some implementations, the short-range sensors may be of the same type or of different types. Likewise, each group of short-range sensors—for example, at each corner of the robot—may have the same composition of sensors or different compositions of sensors. One or more short-range sensors may be configured to use non-visible light, such as laser light, to detect an object. One or more short-range sensors may be configured to use infrared light to detect the object. One or more short-range sensors may be configured to use electromagnetic signals to detect the object. One or more short-range sensors may be, or include, photoelectric sensors to detect the object. One or more short-range sensors may be, or include, appropriately-configured 3D cameras to detect the object. In some implementations, combinations of two or more of the preceding types of sensors may be used on the same robot. The short-range sensors on the robot may be configured to output one or more signals in response to detecting an object.
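  • A hedged sketch of how readings from such ToF modules might be turned into detection signals follows. The `read_range_mm` callable stands in for whatever driver binding the actual sensor uses (no driver API is specified in the text), and the 200 mm threshold and three-sample debounce are assumptions.

```python
from typing import Callable, List

DETECT_THRESHOLD_MM = 200  # mirrors the ~200 mm short-range sensing range above
DEBOUNCE_SAMPLES = 3       # assumed: consecutive hits required, to reject noise

class ShortRangeSensor:
    """Wraps a driver-provided range reader and debounces its readings."""

    def __init__(self, read_range_mm: Callable[[], int]):
        self._read = read_range_mm  # hypothetical driver hook for a ToF module
        self._hits = 0

    def object_detected(self) -> bool:
        distance_mm = self._read()
        self._hits = self._hits + 1 if distance_mm < DETECT_THRESHOLD_MM else 0
        return self._hits >= DEBOUNCE_SAMPLES

def any_detection(sensors: List[ShortRangeSensor]) -> bool:
    # A list comprehension (not a generator) so every sensor is polled each
    # cycle and each one's debounce counter stays current.
    return any([s.object_detected() for s in sensors])

# Usage with a fake driver that always reports an object 120 mm away:
sensors = [ShortRangeSensor(lambda: 120) for _ in range(6)]
for _ in range(DEBOUNCE_SAMPLES):
    hit = any_detection(sensors)
print(hit)  # True after three consecutive below-threshold readings
```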
  • Signals from the short-range sensors and from the long-range sensors may be processed by a control system, such as a computing system, to identify an object near to, or in the path of, the robot. If necessary, navigational corrections to the path of the robot may be made, and the robot's movement system may be controlled based on those corrections. The control system may be local. For example, the control system may include an on-board computing system located on the robot itself. The control system may be remote. For example, the control system may be a computing system external to the robot. In this example, signals and commands may be exchanged wirelessly to control operation of the robot. Examples of control systems that may be used are described herein and may include one or more processing devices, such as a microprocessor, and memory storing instructions that are executable by the microprocessor to interpret data based on signals from sensors, to determine navigational paths of the robot based on those signals, and to control the movement of the robot based on the determined navigational paths.
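  • The behavior described above, identify an object near or in the path and then correct the path or stop, might reduce to a loop like the one below. The planner and drive interfaces, speeds, and thresholds are hypothetical placeholders; the patent does not specify them.

```python
from dataclasses import dataclass

@dataclass
class Detections:
    long_range_obstacle: bool   # e.g., from the 3D camera or LIDAR scanners
    short_range_obstacle: bool  # e.g., from the corner-mounted short-range sensors

def velocity_command(d: Detections, cruise_mm_s: float = 500.0) -> float:
    """Pick a forward speed from fused detections (a sketch, not the patented logic).

    A short-range hit means something is already in the immediate vicinity, so
    stop; a long-range hit leaves room to slow down while a new path is planned.
    """
    if d.short_range_obstacle:
        return 0.0                # stop: object inside the near-field coverage zone
    if d.long_range_obstacle:
        return cruise_mm_s * 0.3  # slow while the navigational correction is made
    return cruise_mm_s

print(velocity_command(Detections(False, False)))  # 500.0: path is clear
print(velocity_command(Detections(True, False)))   # 150.0: slow and replan
print(velocity_command(Detections(False, True)))   # 0.0: immediate stop
```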
  • The short-range sensors are not limited to placement at the corners of the robot. For example, the sensors may be distributed around the entire perimeter of the robot. For example, in example robots that have circular or other non-rectangular bodies, the short-range sensors may be distributed around the circular or non-rectangular perimeter and spaced at regular or irregular distances from each other in order to achieve overlapping FOV coverage of the type described herein. Likewise, the short-range sensors may be at any appropriate locations—for example, elevations—relative to the surface on which the robot travels.
  • Referring to FIG. 1, for example, body 12 includes a top part 50 and a bottom part 51. The bottom part is closer to surface 14 during movement of the robot than is the top part. The short-range sensors may be located on the body closer to the top part than to the bottom part. The short-range sensors may be located on the body closer to the bottom part than to the top part. The short-range sensors may be located on the body such that a second field of each short-range sensor extends at least from 15 centimeters (cm) above the surface down to the surface. The location of the short-range sensors may be based, at least in part, on the FOVs of the sensors. In some implementations, all of the sensors may be located at the same elevation relative to the surface on which the robot travels. In some implementations, some of the sensors may be located at different elevations relative to the surface on which the robot travels. For example, sensors having different FOVs may be appropriately located relative to the surface to enable coverage of blind spots near to the surface.
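The 15 cm coverage condition can be checked geometrically. The sketch below computes the vertical band a tilted conical FOV covers at a given horizontal distance; the mounting height, tilt, and half-angle are assumed values, not dimensions taken from the disclosure.

```python
import math

def fov_height_span(mount_height_m, tilt_deg, half_angle_deg, dist_m):
    """Vertical extent (low, high), in meters above the surface, covered by the
    cone at horizontal distance dist_m from the sensor; clamped at the surface."""
    def edge_height(angle_deg):
        return mount_height_m + dist_m * math.tan(math.radians(angle_deg))
    low = max(0.0, edge_height(tilt_deg - half_angle_deg))
    high = max(0.0, edge_height(tilt_deg + half_angle_deg))
    return low, high

# Hypothetical sensor mounted 30 cm up, tilted -40 degrees, 12.5-degree half-angle,
# evaluated 25 cm out: the field spans from the surface up to about 17 cm,
# covering the 0-15 cm band described above.
print(fov_height_span(0.30, -40.0, 12.5, 0.25))  # -> (0.0, ~0.17)
```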
  • In some implementations, such as that shown in FIG. 7, robot 10 may include a bumper 52. The bumper may act as a shock absorber and may be at least partially elastic. The short-range sensors 38 may be located behind or underneath the bumper. In some implementations, the short-range sensors may be located underneath structures on the robot that are hard and, therefore, protective.
  • In some implementations, the direction in which the short-range sensors point may be changed via the control system. For example, in some implementations, the short-range sensors may be mounted on body 12 for pivotal motion, translational motion, rotational motion, or a combination thereof. The control system may output signals to the robot to position or to reposition the short-range sensors, as desired. For example, if one short-range sensor fails, the other short-range sensors may be reconfigured to cover the FOV previously covered by the failed short-range sensor. In some implementations, the FOVs of the short-range sensors and of the long-range sensors may intersect in part to provide thorough coverage in the vicinity of the robot.
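One possible reconfiguration strategy is sketched below: the two neighbors of a failed, pivot-mounted sensor are retargeted halfway toward its former pointing direction so that their overlapping FOVs close the gap. The halving rule and the azimuth bookkeeping are assumptions, and angle wraparound is ignored for brevity.

```python
def cover_failed_sensor(azimuths_deg, failed_index):
    """Retarget the two neighbours of a failed sensor halfway toward its
    azimuth so their overlapping FOVs close the coverage gap."""
    new = list(azimuths_deg)
    n = len(new)
    left, right = (failed_index - 1) % n, (failed_index + 1) % n
    target = azimuths_deg[failed_index]
    new[left] += (target - azimuths_deg[left]) / 2.0
    new[right] += (target - azimuths_deg[right]) / 2.0
    new[failed_index] = None  # mark the failed sensor as out of service
    return new

# Four sensors along one edge; sensor 1 fails:
print(cover_failed_sensor([0.0, 30.0, 60.0, 90.0], failed_index=1))
# -> [15.0, None, 45.0, 90.0]
```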
  • The dimensions and sensor ranges presented herein are for illustration only. Other types of autonomous devices may have different numbers, types, or both numbers and types of sensors than those presented herein. Other types of autonomous devices may have different sensor ranges that cause blind spots that are located at different positions relative to the robot or that have different dimensions than those presented. The short-range sensors described herein may be arranged to accommodate these blind spots.
  • The example robot described herein may include, and/or be controlled using, a control system comprised of one or more computer systems comprising hardware or a combination of hardware and software. For example, a robot may include various controllers and/or processing devices located at various points in the system to control operation of its elements. A central computer may coordinate operation among the various controllers or processing devices. The central computer, controllers, and processing devices may execute various software routines to effect control and coordination of the various automated elements.
  • The example robot described herein can be controlled, at least in part, using one or more computer program products, e.g., one or more computer programs tangibly embodied in one or more information carriers, such as one or more non-transitory machine-readable media, for execution by, or to control the operation of, one or more data processing apparatus, e.g., a programmable processor, a computer, multiple computers, and/or programmable logic components.
  • A computer program can be written in any form of programming language, including compiled or interpreted languages, and it can be deployed in any form, including as a stand-alone program or as a module, component, subroutine, or other unit suitable for use in a computing environment. A computer program can be deployed to be executed on one computer or on multiple computers at one site or distributed across multiple sites and interconnected by a network.
  • Actions associated with implementing at least part of the robot can be performed by one or more programmable processors executing one or more computer programs to perform the functions described herein. At least part of the robot can be implemented using special purpose logic circuitry, e.g., an FPGA (field programmable gate array) and/or an ASIC (application-specific integrated circuit).
  • Processors suitable for the execution of a computer program include, by way of example, both general and special purpose microprocessors, and any one or more processors of any kind of digital computer. Generally, a processor will receive instructions and data from a read-only storage area or a random access storage area or both. Elements of a computer (including a server) include one or more processors for executing instructions and one or more storage area devices for storing instructions and data. Generally, a computer will also include, or be operatively coupled to receive data from, or transfer data to, or both, one or more machine-readable storage media, such as mass storage devices for storing data, e.g., magnetic, magneto-optical disks, or optical disks. Machine-readable storage media suitable for embodying computer program instructions and data include all forms of non-volatile storage area, including by way of example, semiconductor storage area devices, e.g., EPROM, EEPROM, and flash storage area devices; magnetic disks, e.g., internal hard disks or removable disks; magneto-optical disks; and CD-ROM and DVD-ROM disks.
  • Any connection involving electrical circuitry that allows signals to flow, unless stated otherwise, is an electrical connection and not necessarily a direct physical connection regardless of whether the word “electrical” is used to modify “connection”. Elements of different implementations described herein may be combined to form other embodiments not specifically set forth above. Elements may be left out of the structures described herein without adversely affecting their operation. Furthermore, various separate elements may be combined into one or more individual elements to perform the functions described herein.

Claims (36)

What is claimed is:
1. An autonomous device configured to move along a surface, the autonomous device comprising:
a body;
at least one long-range sensor on the body configured for detection in a first field;
at least one short-range sensor on the body, each short-range sensor being configured for detection in a second field directed towards the surface, the second field being smaller than the first field, each short-range sensor being configured to output signals based on detection of an object within the second field; and
a control system to control movement of the autonomous device based, at least in part, on the signals.
2. The autonomous device of claim 1, wherein the at least one short-range sensor comprises proximity sensors.
3. The autonomous device of claim 1, wherein the at least one short-range sensor comprises near-field sensors.
4. The autonomous device of claim 1, wherein the body comprises one or more corners; and
wherein a group of short-range sensors is arranged at each corner so that second fields of at least some of the short-range sensors in each group overlap at least in part.
5. The autonomous device of claim 1, wherein the body comprises four corners; and
wherein there are four or more short-range sensors arranged at each of the four corners so that second fields of adjacent short-range sensors among the four or more short-range sensors overlap at least in part.
6. The autonomous device of claim 5, wherein each corner comprises an intersection of two edges, each edge of each corner comprising three short-range sensors, adjacent ones of the three short-range sensors having second fields that overlap at least in part.
7. The autonomous device of claim 1, wherein the body has a circular perimeter; and
wherein short-range sensors are arranged along the circular perimeter so that second fields of at least some of the short-range sensors overlap at least in part.
8. The autonomous device of claim 1, wherein the body has a curved perimeter; and
wherein short-range sensors are arranged along the curved perimeter so that second fields of at least some of the short-range sensors overlap at least in part.
9. The autonomous device of claim 1, wherein the body comprises a top part and a bottom part, the bottom part being closer to the surface during movement of the autonomous device than the top part; and
wherein short-range sensors are located on the body closer to the top part than to the bottom part.
10. The autonomous device of claim 9, wherein the at least one short-range sensor is located adjacent to the top part.
11. The autonomous device of claim 1, wherein the at least one short-range sensor on the body is angled towards the surface such that a second field of the at least one short-range sensor is directed towards the surface.
12. The autonomous device of claim 1, wherein a horizontal plane extending from the body is at 0° and the surface is at −90° relative to the horizontal plane; and
wherein short-range sensors are directed towards the surface such that the second field of at least some of the short-range sensors is between −1° and −90° relative to the horizontal plane.
13. The autonomous device of claim 1, wherein a horizontal plane extending from the body is at 0° and the surface is at −90° relative to the horizontal plane; and
wherein short-range sensors are directed towards the surface such that the second field of all of the short-range sensors is between −1° and −90° relative to the horizontal plane.
14. The autonomous device of claim 1, wherein the at least one short-range sensor is configured to output signals in response to detecting the object, the at least one short-range sensor being configured to use non-visible light to detect the object.
15. The autonomous device of claim 1, wherein the at least one short-range sensor is configured to output signals in response to detecting the object, the at least one short-range sensor being configured to use infrared light to detect the object.
16. The autonomous device of claim 1, wherein the at least one short-range sensor is configured to output signals in response to detecting the object, the at least one short-range sensor being configured to use electromagnetic signals to detect the object.
17. The autonomous device of claim 1, wherein the at least one short-range sensor is configured to output signals in response to detecting the object, the at least one short-range sensor comprising photoelectric sensors.
18. The autonomous device of claim 1, wherein each second field is 30 centimeters (cm) in diameter at most.
19. The autonomous device of claim 1, wherein each second field is 20 centimeters (cm) in diameter at most.
20. The autonomous device of claim 1, wherein the body comprises corners;
wherein a group of short-range sensors is arranged at each of the corners; and
wherein adjacent ones of the short-range sensors have second fields that overlap at least in part such that, for a corner among the corners, there are no blind spots for the autonomous device in a partial circumference of a circle centered at the corner.
21. The autonomous device of claim 1, further comprising:
a bumper that is comprised of an elastic material, the bumper being around at least part of a perimeter of the autonomous device;
wherein the at least one short-range sensor is located underneath the bumper.
22. The autonomous device of claim 1, wherein sensors are arranged around at least part of a perimeter of the body.
23. The autonomous device of claim 18, wherein short-range sensors are directed towards the surface such that a second field of each short-range sensor extends at least from 15 centimeters (cm) above the surface to the surface.
24. The autonomous device of claim 1, wherein the autonomous device comprises a mobile robot.
25. An autonomous device comprising:
a body for supporting weight of an object;
wheels on the body to enable the body to travel across a surface;
a camera on the body to obtain images in front of the autonomous device, the camera having a first field that extends from the body; and
sensors disposed along at least part of a perimeter of the body, the sensors having a second field that extends from the surface to at least a location below the first field.
26. The autonomous device of claim 25, wherein the second field intersects the first field.
27. The autonomous device of claim 25, wherein at least two of the sensors that are adjacent to each other have fields that overlap at least partly.
28. The autonomous device of claim 25, wherein the sensors are configured to use non-visible light to detect an object.
29. The autonomous device of claim 25, wherein the sensors are configured to use infrared light to detect an object.
30. The autonomous device of claim 25, wherein the sensors are configured to use electromagnetic signals to detect an object.
31. The autonomous device of claim 25, wherein the sensors comprise photoelectric sensors.
32. The autonomous device of claim 25, wherein the body comprises one or more corners, at least one of the corners being defined by edges that support a group of the sensors, the group of sensors having fields that overlap at least in part such that, for the at least one corner, there are no blind spots for the autonomous device in a partial circumference of a circle centered at the at least one corner.
33. The autonomous device of claim 32, further comprising a rubber bumper along at least part of the perimeter, the sensors being underneath the rubber bumper.
34. The autonomous device of claim 25, wherein the sensors comprise proximity sensors configured to sense an object within at most 20 centimeters.
35. The autonomous device of claim 25, wherein the sensors comprise proximity sensors configured to sense an object within at most 30 centimeters.
36. The autonomous device of claim 25, wherein the autonomous device comprises a mobile robot.
US16/009,414 2018-06-15 2018-06-15 Detecting objects near an autonomous device Abandoned US20190384314A1 (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
US16/009,414 US20190384314A1 (en) 2018-06-15 2018-06-15 Detecting objects near an autonomous device
PCT/EP2019/065766 WO2019238958A1 (en) 2018-06-15 2019-06-14 Detecting objects near an autonomous device
US17/161,977 US20210223786A1 (en) 2018-06-15 2021-01-29 Detecting objects near an autonomous device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
US16/009,414 US20190384314A1 (en) 2018-06-15 2018-06-15 Detecting objects near an autonomous device

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US17/161,977 Division US20210223786A1 (en) 2018-06-15 2021-01-29 Detecting objects near an autonomous device

Publications (1)

Publication Number Publication Date
US20190384314A1 true US20190384314A1 (en) 2019-12-19

Family

ID=66998386

Family Applications (2)

Application Number Title Priority Date Filing Date
US16/009,414 Abandoned US20190384314A1 (en) 2018-06-15 2018-06-15 Detecting objects near an autonomous device
US17/161,977 Pending US20210223786A1 (en) 2018-06-15 2021-01-29 Detecting objects near an autonomous device

Family Applications After (1)

Application Number Title Priority Date Filing Date
US17/161,977 Pending US20210223786A1 (en) 2018-06-15 2021-01-29 Detecting objects near an autonomous device

Country Status (2)

Country Link
US (2) US20190384314A1 (en)
WO (1) WO2019238958A1 (en)

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11231501B2 (en) 2019-09-26 2022-01-25 Baidu Usa Llc Front and side three-LIDAR design for autonomous driving vehicles
US11305786B2 (en) 2020-05-01 2022-04-19 Autoguide, LLC. Maintaining consistent sensor output
US11459221B2 (en) 2020-04-24 2022-10-04 Autoguide, LLC Robot for stacking elements
US11592299B2 (en) 2020-03-19 2023-02-28 Mobile Industrial Robots A/S Using static scores to control vehicle operations
US11656625B2 (en) 2018-05-18 2023-05-23 Mobile Industrial Robots A/S System for evacuating one or more mobile robots
US11835949B2 (en) 2020-11-24 2023-12-05 Mobile Industrial Robots A/S Autonomous device safety system
US11845415B2 (en) 2018-09-13 2023-12-19 Mobile Industrial Robots A/S AGV having dynamic safety zone
US11977392B2 (en) 2020-05-11 2024-05-07 Mobile Industrial Robots Inc. Identifying elements in an environment

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102020209843B3 (en) * 2020-08-05 2021-09-30 BSH Hausgeräte GmbH Method for determining a distance between a cleaning robot and an obstacle

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104245244B (en) * 2012-09-21 2016-01-20 艾罗伯特公司 Degree of approach sensing on mobile robot
US9483055B2 (en) * 2012-12-28 2016-11-01 Irobot Corporation Autonomous coverage robot
JP6132659B2 (en) * 2013-02-27 2017-05-24 シャープ株式会社 Ambient environment recognition device, autonomous mobile system using the same, and ambient environment recognition method
US9963155B2 (en) * 2015-05-29 2018-05-08 Clearpath Robotics, Inc. Method, system and apparatus for path control in unmanned vehicles
US9919425B2 (en) * 2015-07-01 2018-03-20 Irobot Corporation Robot navigational sensor system
US9746852B1 (en) * 2015-08-17 2017-08-29 X Development Llc Using laser sensors to augment stereo sensor readings for robotic devices
US10168711B2 (en) * 2015-09-16 2019-01-01 Omron Adept Technologies, Inc. Method and apparatus for autonomous conveyance of transport carts
US10108193B2 (en) * 2016-05-27 2018-10-23 Glen C Wernersbach Mover system
US10585440B1 (en) * 2017-01-23 2020-03-10 Clearpath Robotics Inc. Systems and methods for using human-operated material-transport vehicles with fleet-management systems

Also Published As

Publication number Publication date
US20210223786A1 (en) 2021-07-22
WO2019238958A1 (en) 2019-12-19

Similar Documents

Publication Publication Date Title
US20210223786A1 (en) Detecting objects near an autonomous device
US10674885B2 (en) Robot navigational sensor system
US11693422B2 (en) Sensor array for an autonomously operated utility vehicle and method for surround-view image acquisition
EP2791842B1 (en) Positive and negative obstacle avoidance system for a mobile robot
JP2020126691A (en) Movable robot movement restriction
ES2717794T3 (en) System and method of inspection to perform inspections in a storage facility
US20200004247A1 (en) Controlling movement of autonomous device
US10481270B2 (en) Device for detecting an obstacle by means of intersecting planes and detection method using such a device
JP6828579B2 (en) Environmental maintenance robot and its control program
US20170082751A1 (en) Device for detection of obstacles in a horizontal plane and detection method implementing such a device
CN110815202A (en) Obstacle detection method and device
JP5461494B2 (en) Automated traveling vehicle and control method for automated traveling vehicle
US11613321B1 (en) Method and system for a vehicle decking process associated with manufacturing a vehicle
US20220100195A1 (en) Vehicle object-engagement scanning system and method
Nakamura et al. Validation of SLAM without odometry in outdoor environment
JP7300413B2 (en) Control device, moving body, movement control system, control method and program
JP2019519013A (en) Powered autonomous robots for cargo transport
CN116339299A (en) Obstacle avoidance equipment, obstacle avoidance method, obstacle avoidance device, electronic equipment and medium
CN113353173A (en) Automatic guided vehicle
US20210341930A1 (en) Obstacle avoidance method and apparatus, and warehousing robot
CN218255156U (en) Transfer robot
US20230128651A1 (en) Determining scanner error
JP7334708B2 (en) Autonomous mobile
US20230400858A1 (en) Identifying transport structures
US11305786B2 (en) Maintaining consistent sensor output

Legal Events

Date Code Title Description
AS Assignment

Owner name: MOBILE INDUSTRIAL ROBOTS APS, DENMARK

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:JACOBSEN, NIELS JUL;REEL/FRAME:046648/0477

Effective date: 20180615

AS Assignment

Owner name: MOBILE INDUSTRIAL ROBOTS A/S, DENMARK

Free format text: CHANGE OF NAME;ASSIGNOR:MOBILE INDUSTRIAL ROBOTS APS;REEL/FRAME:049477/0589

Effective date: 20190424

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION