WO2021252425A1 - Systems and methods for wire detection and avoidance of such wires by robots - Google Patents

Systems and methods for wire detection and avoidance of such wires by robots

Info

Publication number
WO2021252425A1
Authority
WO
WIPO (PCT)
Prior art keywords
robot
wire
controller
image
computer readable
Application number
PCT/US2021/036302
Other languages
English (en)
Inventor
Cristian TRONCOSO
Ryan LUSTIG
Original Assignee
Brain Corporation
Application filed by Brain Corporation
Publication of WO2021252425A1

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J 9/00 Programme-controlled manipulators
    • B25J 9/16 Programme controls
    • B25J 9/1694 Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
    • B25J 9/1697 Vision controlled systems
    • B25J 9/1628 Programme controls characterised by the control loop
    • B25J 9/163 Programme controls characterised by the control loop learning, adaptive, model based, rule based expert control
    • B25J 9/1674 Programme controls characterised by safety, monitoring, diagnostic
    • B25J 9/1676 Avoiding collision or forbidden zones

Definitions

  • the present application relates generally to robotics, and more specifically to systems and methods for wire detection and avoidance for robots.
  • the present disclosure provides, inter alia, systems and methods for wire detection and avoidance for robots.
  • the present disclosure is directed towards a practical implementation of neural network and computer vision technology to enable robots to detect wires and navigate away from the wires.
  • a robot may generally refer to an autonomous vehicle or object that travels a route, executes a task, or otherwise moves automatically upon executing or processing computer-readable instructions.
  • a method for detecting and avoiding wires along a path traveled by a robot comprises: receiving, via at least one sensor coupled to the robot, at least one image; transmitting, via at least one controller coupled to the robot, the at least one image to a trained model, the trained model being configured to detect at least one wire within the at least one image; and localizing, by the at least one controller, the at least one wire based on the detection of the at least one wire within the at least one image.
  • the method may further comprise: performing an avoidance maneuver by the robot subsequent to the detection of the at least one wire by the at least one controller.
  • the avoidance maneuver comprises at least one of stopping the robot or changing of the path of the robot to a different path.
  • the method may further comprise: localizing, via the at least one controller, the at least one wire onto a computer-readable map of an environment of the robot.
  • the trained model is derived from a neural network, the neural network being trained to identify a plurality of wires within a plurality of images of a training set, the training being executed by either at least one controller coupled to the robot or at least one processor of a server communicatively coupled to the robot.
  • the at least one sensor comprises a field of view encompassing at least in part an area in front of the robot.
  • a non-transitory computer-readable storage medium comprising a plurality of computer-readable instructions embodied thereon, that when executed by at least one controller configure the at least one controller to: receive, via a sensor coupled to a robot, at least one image; transmit the at least one image to a trained model, the trained model being configured to detect a wire within a path of the robot using the at least one image; and localize the wire onto a computer-readable map based on the location of the wire within the at least one image.
  • the non-transitory computer-readable memory further comprises instructions which configure the at least one controller to: cause the robot to perform an avoidance maneuver subsequent to the detection of the wire, the avoidance maneuver being configured via the at least one controller producing commands to one or more actuators of the robot.
  • the avoidance maneuver comprises at least one of stopping the robot or changing of the path of the robot to a different second path.
  • the non-transitory computer-readable memory further comprises instructions which configure the at least one controller to localize the wire onto a computer-readable map of an environment of the robot.
  • the model is derived from a trained neural network, the neural network being trained to identify a plurality of wires within a plurality of images of a training set, the training being executed by either the at least one controller coupled to the robot or by one or more processors of an external server communicatively coupled to the robot.
  • a robotic system comprises: a non-transitory memory comprising computer-readable instructions stored thereon; and at least one controller configured to execute the computer-readable instructions to: receive at least one image from a camera sensor coupled to the robotic system; detect at least one wire within a path of the robotic system using the image, the detection being based on a model, the model being configured to identify wires within the image; and localize the wire based on detection of the at least one wire within the at least one image.
  • the at least one controller is further configured to execute the computer-readable instructions to: cause the robotic system to perform an avoidance maneuver subsequent to the detection of the wire.
  • the avoidance maneuver comprises at least one of stopping the robot or changing of the path of the robot to a different second path.
  • the at least one controller is further configured to execute the computer-readable instructions to: localize the at least one wire onto a computer-readable map of an environment of the robot.
  • the model is derived from a trained neural network, the neural network being trained to identify wires within images of a training set, the training being executed by either the at least one controller of the robotic system or at least one processor of a server communicatively coupled to the robotic system.
  • a robotic system comprises: a non-transitory memory comprising computer-readable instructions stored thereon; and a controller configured to execute the computer-readable instructions to: receive an image from a camera sensor coupled to the robotic system; detect a wire within a path of the robotic system using the image, the detection being based on a model; localize the wire onto a computer-readable map of an environment of the robot; perform an avoidance maneuver subsequent to the detection of the wire; wherein, the avoidance maneuver comprises at least one of stopping the robot or changing of the path of the robot to a different second path; and the model is derived from a trained neural network, the neural network being trained to identify wires within images of a training set.
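  • As an illustration only, the detect-localize-avoid flow summarized in the above aspects could be sketched as follows; the class, function names, and return values below (WireAvoidingRobot, capture_image, pixel_to_map, "stop_or_replan", etc.) are hypothetical placeholders and not part of the disclosure, and a real implementation would depend on the robot's actual controller, sensors, and trained model.

```python
# Hypothetical sketch of the claimed flow: receive an image, detect wire(s) with a
# trained model, localize them onto a computer-readable map, then stop or re-plan.
from dataclasses import dataclass, field
from typing import Callable, List, Optional, Tuple

BoundingBox = Tuple[int, int, int, int]  # (x_min, y_min, x_max, y_max) in pixels
MapPoint = Tuple[float, float]           # (x, y) in map coordinates


@dataclass
class WireAvoidingRobot:
    capture_image: Callable[[], object]                    # stand-in for the camera sensor
    trained_model: Callable[[object], List[BoundingBox]]   # wire detector (e.g., a trained CNN)
    pixel_to_map: Callable[[BoundingBox], MapPoint]        # localization onto the map
    wire_map: List[MapPoint] = field(default_factory=list) # computer-readable map entries

    def step(self) -> Optional[str]:
        """One control iteration: detect, localize, and decide on a maneuver."""
        image = self.capture_image()
        detections = self.trained_model(image)
        if not detections:
            return None  # no wire detected; continue along the current path
        for box in detections:
            # Localize each detected wire onto the computer-readable map.
            self.wire_map.append(self.pixel_to_map(box))
        # Avoidance maneuver: stop the robot or change the path to a different path.
        return "stop_or_replan"
```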
  • FIG. 1A is a functional block diagram of a main robot in accordance with some embodiments of this disclosure.
  • FIG. 1B is a functional block diagram of a controller or processor in accordance with some embodiments of this disclosure.
  • FIG. 2 illustrates a neural network, according to an exemplary embodiment.
  • FIG. 3A illustrates a robot capturing an image using a camera sensor, according to an exemplary embodiment.
  • FIG. 3B illustrates detection of a wire within an image captured in FIG. 3A, according to an exemplary embodiment.
  • FIG. 3C illustrates localization of the wire detected in FIG. 3A-B and a robot navigating around the localized wire, according to an exemplary embodiment.
  • FIG. 4 is a process flow diagram illustrating a method for a controller or processor of a robot to detect a wire using a camera sensor, according to an exemplary embodiment.
  • FIG. 5 illustrates a computer-readable map comprising a localized wire and an avoidance maneuver for a robot to avoid the wire, according to an exemplary embodiment.
  • FIG. 6 illustrates a robot approaching a hanging or suspended wire according to an exemplary embodiment.
  • FIGS. 7A-B illustrate two sequential images captured by a camera sensor of a robot 102 illustrated in FIG. 6, according to an exemplary embodiment.
  • FIG. 8 illustrates an alternative method for determining a distance to a wire using a depth camera, according to an exemplary embodiment.
  • Wires are typically thin (e.g., approximately 0.5 inches across or less), making them difficult to detect using sensors such as scanning/sweeping light detection and ranging (“LiDAR”) sensors, depth cameras, and other time-of-flight sensors, especially at long distances.
  • a robot navigating over a wire may cause the wire to become stuck or entangled within mechanical parts of the robot, such as its wheel axles or drive shafts.
  • robots navigating over wires may damage the wires themselves. Accordingly, there is a need in the art for systems and methods for detection of wires for robots.
  • a robot may include mechanical and/or virtual entities configured to carry out a complex series of tasks or actions autonomously.
  • robots may be machines that are guided and/or instructed by computer programs and/or electronic circuitry.
  • robots may include electro-mechanical components that are configured for navigation, where the robot may move from one location to another.
  • Such robots may include autonomous and/or semi-autonomous cars, floor cleaners, rovers, drones, planes, boats, carts, trams, wheelchairs, industrial equipment, stocking machines, mobile platforms, personal transportation devices (e.g., hover boards, scooters, self-balancing vehicles such as those manufactured by Segway, etc.), trailer movers, vehicles, and the like.
  • Robots may also include any autonomous and/or semi-autonomous machine for transporting items, people, animals, cargo, freight, objects, luggage, and/or anything desirable from one location to another.
  • a wire may comprise electrical wires (e.g., copper wires), strings, ropes, cords, cables, threads, or other thin material drawn out into a thin flexible thread.
  • a feature may comprise one or more numeric values (e.g., floating point, decimal, a tensor of values, etc.) characterizing an input or output from a sensor unit including, but not limited to, detection of an object (e.g., humans, couches, cars, cats, etc. represented in point clouds, RGB images, etc.), parameters of the object (e.g., size, shape, color, orientation, edges, etc.), color values of pixels of an image, depth values of pixels of a depth image, brightness of an image, the image as a whole, and/or changes of features over time (e.g., velocity, trajectory, etc.).
  • a feature may be abstracted to any level; for example, an item on a shelf may be a feature of the shelf, the shelf may be a feature of a store, the store may be a feature of a city, and so forth, wherein each of these features may be characterized by data collected by a sensor.
  • a model may comprise a mathematical transformation, function(s), and/or operations which produce an output based on an input.
  • models may refer to mathematical operations derived from a trained neural network, as described in more detail below.
  • Models may be configured to, inter alia, detect features within images, predict movements of objects or of a robot, predict future values of time-dependent parameters, and/or produce a desired output corresponding to an input based on training.
  • network interfaces may include any signal, data, or software interface with a component, network, or process including, without limitation, those of the FireWire (e.g., FW400, FW800, FWS800T, FWS1600, FWS3200, etc.), universal serial bus (“USB”) (e.g., USB 1.X, USB 2.0, USB 3.0, USB Type-C, etc.), Ethernet (e.g., 10/100, 10/100/1000 (Gigabit Ethernet), 10-Gig-E, etc.), multimedia over coax alliance technology (“MoCA”), Coaxsys (e.g., TVNET™), radio frequency tuner (e.g., in-band or OOB, cable modem, etc.), Wi-Fi (802.11), WiMAX (e.g., WiMAX (802.16)), PAN (e.g., PAN/802.15), cellular (e.g., 3G, LTE/LTE-A/TD-LTE/TD-LTE-A, etc.), and/or other wireless and wired interfaces.
  • Wi-Fi may include one or more of IEEE-Std. 802.11, variants of IEEE-Std. 802.11, standards related to IEEE-Std. 802.11 (e.g., 802.11 a/b/g/n/ac/ad/af/ah/ai/aj/aq/ax/ay), and/or other wireless standards.
  • processor, microprocessor, and/or digital processor may include any type of digital processing device such as, without limitation, digital signal processors (“DSPs”), reduced instruction set computers (“RISC”), complex instruction set computers (“CISC”), microprocessors, gate arrays (e.g., field programmable gate arrays (“FPGAs”)), programmable logic device (“PLDs”), reconfigurable computer fabrics (“RCFs”), array processors, secure microprocessors, specialized processors (e.g., neuromorphic processors), and application-specific integrated circuits (“ASICs”).
  • computer program and/or software may include any sequence of machine-cognizable steps which perform a function.
  • Such computer program and/or software may be rendered in any programming language or environment including, for example, C/C++, C#, Fortran, COBOL, MATLAB™, PASCAL, GO, RUST, SCALA, Python, assembly language, markup languages (e.g., HTML, SGML, XML, VoXML), and the like, as well as object-oriented environments such as the Common Object Request Broker Architecture (“CORBA”), JAVA™ (including J2ME, Java Beans, etc.), Binary Runtime Environment (e.g., “BREW”), and the like.
  • connection, link, and/or wireless link may include a causal link between any two or more entities (whether physical or logical/virtual), which enables information exchange between the entities.
  • computer and/or computing devices may include, but are not limited to, personal computers (“PCs”) and minicomputers, whether desktop, laptop, or otherwise, mainframe computers, workstations, servers, personal digital assistants (“PDAs”), handheld computers, embedded computers, programmable logic devices, personal communicators, tablet computers, mobile devices, portable navigation aids, J2ME equipped devices, cellular telephones, smart phones, personal integrated communication or entertainment devices, and/or any other device capable of executing a set of instructions and processing an incoming data signal.
  • the systems and methods of this disclosure at least: (i) enable robots to detect thin wires; (ii) enhance autonomy of robots by enabling them to avoid hazardous and difficult-to-detect wires; (iii) improve safety of operating robots within environments comprising wires; and (iv) reduce a risk of robots navigating over wires, potentially causing damage thereto or to electrical components coupled to the wires.
  • Other advantages are readily discernible by one having ordinary skill in the art given the contents of the present disclosure.
  • FIG. 1A is a functional block diagram of a robot 102 in accordance with some principles of this disclosure.
  • robot 102 may include controller 118, memory 120, user interface unit 112, sensor units 114, navigation units 106, actuator unit 108, and communications unit 116, as well as other components and subcomponents (e.g., some of which may not be illustrated).
  • Although a specific embodiment is illustrated in FIG. 1A, it is appreciated that the architecture may be varied in certain embodiments as would be readily apparent to one of ordinary skill given the contents of the present disclosure.
  • robot 102 may be representative at least in part of any robot described in this disclosure.
  • Controller 118 may control the various operations performed by robot 102. Controller 118 may include one or more processors (e.g., microprocessors) and other peripherals.
  • processors may include any type of digital processing device such as, without limitation, digital signal processors (“DSPs”), reduced instruction set computers (“RISC”), complex instruction set computers (“CISC”) processors, microprocessors, gate arrays (e.g., field programmable gate arrays (“FPGAs”)), programmable logic device (“PLDs”), reconfigurable computer fabrics (“RCFs”), array processors, secure microprocessors, and application-specific integrated circuits (“ASICs”).
  • Peripherals may include hardware accelerators configured to perform a specific function using hardware elements such as, without limitation, encryption/decryption hardware, algebraic processing devices (e.g., tensor processing units, quadratic problem solvers, multipliers, etc.), data compressors, encoders, arithmetic logic units (“ALU”), and the like.
  • Such digital processors may be contained on a single unitary integrated circuit die, or distributed across multiple components.
  • Controller 118 may be operatively and/or communicatively coupled to memory 120.
  • Memory 120 may include any type of integrated circuit or other storage device configurable to store digital data including, without limitation, read-only memory (“ROM”), random access memory (“RAM”), non-volatile random access memory (“NVRAM”), programmable read-only memory (“PROM”), electrically erasable programmable read-only memory (“EEPROM”), dynamic random-access memory (“DRAM”), Mobile DRAM, synchronous DRAM (“SDRAM”), double data rate SDRAM (“DDR/2 SDRAM”), extended data output (“EDO”) RAM, fast page mode RAM (“FPM”), reduced latency DRAM (“RLDRAM”), static RAM (“SRAM”), flash memory (e.g., NAND/NOR), memristor memory, pseudostatic RAM (“PSRAM”), etc.
  • Memory 120 may provide instructions and data to controller 118.
  • memory 120 may be a non-transitory, computer-readable storage apparatus and/or medium having a plurality of instructions stored thereon, the instructions being executable by a processing apparatus (e.g., controller 118) to operate robot 102.
  • the instructions may be configurable to, when executed by the processing apparatus, cause the processing apparatus to perform the various methods, features, and/or functionality described in this disclosure.
  • controller 118 may perform logical and/or arithmetic operations based on program instructions stored within memory 120.
  • the instructions and/or data of memory 120 may be stored in a combination of hardware, some located locally within robot 102, and some located remote from robot 102 (e.g., in a cloud, server, network, etc.).
  • a processor may be on board and/or internal to robot 102 and/or external to robot 102 and be communicatively coupled to controller 118 of robot 102 utilizing communication units 116 wherein the external processor may receive data from robot 102, process the data, and transmit computer-readable instructions back to controller 118.
  • the processor may be on a remote server (not shown).
  • memory 120 may store a library of sensor data.
  • the sensor data may be associated at least in part with objects and/or people.
  • this library may include sensor data related to objects and/or people in different conditions, such as sensor data related to objects and/or people with different compositions (e.g., materials, reflective properties, molecular makeup, etc.), different lighting conditions, angles, sizes, distances, clarity (e.g., blurred, obstructed/occluded, partially off frame, etc.), colors, surroundings, and/or other conditions.
  • the sensor data in the library may be taken by a sensor (e.g., a sensor of sensor units 114 or any other sensor) and/or generated automatically, such as with a computer program that is configurable to generate/simulate (e.g., in a virtual world) library sensor data (e.g., which may generate/simulate these library data entirely digitally and/or beginning from actual sensor data) from different lighting conditions, angles, sizes, distances, clarity (e.g., blurred, obstructed/occluded, partially off frame, etc.), colors, surroundings, and/or other conditions.
  • the number of images in the library may depend at least in part on one or more of the amount of available data, the variability of the surrounding environment in which robot 102 operates, the complexity of objects and/or people, the variability in appearance of objects, physical properties of robots, the characteristics of the sensors, and/or the amount of available storage space (e.g., in the library, memory 120, and/or local or remote storage).
  • the library may be stored on a network (e.g., cloud, server, distributed network, etc.) and/or may not be stored completely within memory 120.
  • various robots may be networked so that data captured by individual robots are collectively shared with other robots.
  • these robots may be configurable to learn and/or share sensor data in order to facilitate the ability to readily detect and/or identify errors and/or assist events.
  • operative units 104 may be coupled to controller 118, or any other controller, to perform the various operations described in this disclosure.
  • One, more, or none of the modules in operative units 104 may be included in some embodiments.
  • In some instances, reference may be made to various controllers and/or processors, wherein a single controller (e.g., controller 118) may serve as the various controllers and/or processors described.
  • In other embodiments, different controllers and/or processors may be used, such as controllers and/or processors used particularly for one or more operative units 104.
  • Controller 118 may send and/or receive signals, such as power signals, status signals, data signals, electrical signals, and/or any other desirable signals, including discrete and analog signals to operative units 104. Controller 118 may coordinate and/or manage operative units 104, and/or set timings (e.g., synchronously or asynchronously), turn off/on control power budgets, receive/send network instructions and/or updates, update firmware, send interrogatory signals, receive and/or send statuses, and/or perform any operations for running features of robot 102.
  • operative units 104 may include various units that perform functions for robot 102.
  • operative units 104 include at least navigation units 106, actuator units 108, user interface units 112, sensor units 114, and communication units 116.
  • Operative units 104 may also comprise other units that provide the various functionality of robot 102.
  • operative units 104 may be instantiated in software, hardware, or both software and hardware.
  • units of operative units 104 may comprise computer-implemented instructions executed by a controller.
  • units of operative units 104 may comprise hardcoded logic (e.g., ASICs).
  • units of operative units 104 may comprise both computer-implemented instructions executed by a controller and hardcoded logic. Where operative units 104 are implemented in part in software, operative units 104 may include units/modules of code configurable to provide one or more functionalities.
  • navigation units 106 may include systems and methods that may computationally construct and update a map of an environment, localize robot 102 (e.g., find its position) in a map, and navigate robot 102 to/from destinations. The mapping may be performed by imposing data obtained in part by sensor units 114 into a computer-readable map representative at least in part of the environment.
  • a map of an environment may be uploaded to robot 102 through user interface units 112, uploaded wirelessly or through wired connection, or taught to robot 102 by a user.
  • navigation units 106 may include components and/or software configurable to provide directional instructions for robot 102 to navigate. Navigation units 106 may process maps, routes, and localization information generated by mapping and localization units, data from sensor units 114, and/or other operative units 104.
  • actuator units 108 may include actuators such as electric motors, gas motors, driven magnet systems, solenoid/ratchet systems, piezoelectric systems (e.g., inchworm motors), magnetostrictive elements, gesticulation, and/or any way of driving an actuator known in the art.
  • actuator unit 108 may include systems that allow movement of robot 102, such as motorized propulsion.
  • motorized propulsion may move robot 102 in a forward or backward direction, and/or be used at least in part in turning robot 102 (e.g., left, right, and/or any other direction).
  • actuator unit 108 may control if robot 102 is moving or is stopped and/or allow robot 102 to navigate from one location to another location.
  • such actuators may actuate the wheels for robot 102 to navigate a route, navigate around obstacles or move the robot as it conducts a task.
  • Other actuators may reposition cameras and sensors.
  • actuator unit 108 may include systems that allow in part for task execution by the robot 102 such as, for example, actuating features of robot 102 (e.g., moving a robotic arm feature to manipulate objects within an environment).
  • sensor units 114 may comprise systems and/or methods that may detect characteristics within and/or around robot 102.
  • Sensor units 114 may comprise a plurality and/or a combination of sensors.
  • Sensor units 114 may include sensors that are internal to robot 102 or external, and/or have components that are partially internal and/or partially external.
  • sensor units 114 may include one or more exteroceptive sensors, such as sonars, light detection and ranging (“LiDAR”) sensors, radars, lasers, cameras (including video cameras (e.g., red-blue-green (“RGB”) cameras, infrared cameras, three-dimensional (“3D”) cameras, thermal cameras, etc.), time of flight (“TOF”) cameras, structured light cameras), antennas, motion detectors, microphones, and/or any other sensor known in the art.
  • sensor units 114 may collect raw measurements (e.g., currents, voltages, resistances, gate logic, etc.) and/or transformed measurements (e.g., distances, angles, detected points in obstacles, etc.).
  • measurements may be aggregated and/or summarized.
  • Sensor units 114 may generate data based at least in part on distance or height measurements. Such data may be stored in data structures, such as matrices, arrays, queues, lists, stacks, bags, etc.
  • sensor units 114 may include sensors that may measure internal characteristics of robot 102.
  • sensor units 114 may measure temperature, power levels, statuses, and/or any characteristic of robot 102.
  • sensor units 114 may be configurable to determine the odometry of robot 102.
  • sensor units 114 may include proprioceptive sensors, which may comprise sensors such as accelerometers, inertial measurement units (“IMU”), odometers, gyroscopes, speedometers, cameras (e.g. using visual odometry), clock/timer, and the like. Odometry may facilitate autonomous navigation and/or autonomous actions of robot 102.
  • This odometry may include robot 102’s position (e.g., where position may include robot’s location, displacement and/or orientation, and may sometimes be interchangeable with the term pose as used herein) relative to the initial location.
  • Such data may be stored in data structures, such as matrices, arrays, queues, lists, stacks, bags, etc.
  • the data structure of the sensor data may be called an image.
  • user interface units 112 may be configurable to enable a user to interact with robot 102.
  • user interface units 112 may include touch panels, buttons, keypads/keyboards, ports (e.g., universal serial bus (“USB”), digital visual interface (“DVI”), Display Port, E-Sata, Firewire, PS/2, Serial, VGA, SCSI, audioport, high-definition multimedia interface (“HDMI”), personal computer memory card international association (“PCMCIA”) ports, memory card ports (e.g., secure digital (“SD”) and miniSD), and/or ports for computer-readable medium), mice, rollerballs, consoles, vibrators, audio transducers, and/or any interface for a user to input and/or receive data and/or commands, whether coupled wirelessly or through wires.
  • User interface units 112 may include a display, such as, without limitation, liquid crystal displays (“LCDs”), light-emitting diode (“LED”) displays, LED LCD displays, in-plane-switching (“IPS”) displays, cathode ray tubes, plasma displays, high definition (“HD”) panels, 4K displays, retina displays, organic LED displays, touchscreens, surfaces, canvases, and/or any displays, televisions, monitors, panels, and/or devices known in the art for visual presentation.
  • user interface units 112 may be positioned on the body of robot 102.
  • user interface units 112 may be positioned away from the body of robot 102 but may be communicatively coupled to robot 102 (e.g., via communication units including transmitters, receivers, and/or transceivers) directly or indirectly (e.g., through a network, server, and/or a cloud).
  • user interface units 112 may include one or more projections of images on a surface (e.g., the floor) proximally located to the robot, e.g., to provide information to the occupant or to people around the robot.
  • the information could be the direction of future movement of the robot, such as an indication of moving forward, left, right, back, at an angle, and/or any other direction. In some cases, such information may utilize arrows, colors, symbols, etc.
  • communications unit 116 may include one or more receivers, transmitters, and/or transceivers. Communications unit 116 may be configurable to send/receive a transmission protocol, such as BLUETOOTH®, ZIGBEE®, Wi-Fi, induction wireless data transmission, radio frequencies, radio transmission, radio-frequency identification (“RFID”), near-field communication (“NFC”), infrared, network interfaces, cellular technologies such as 3G (3GPP/3GPP2), high-speed downlink packet access (“HSDPA”), high-speed uplink packet access (“HSUPA”), time division multiple access (“TDMA”), code division multiple access (“CDMA”) (e.g., IS-95A, wideband code division multiple access (“WCDMA”), etc.), frequency hopping spread spectrum (“FHSS”), direct sequence spread spectrum (“DSSS”), global system for mobile communication (“GSM”), Personal Area Network (“PAN”) (e.g., PAN/802.15), worldwide interoperability for microwave access (“WiMAX”), and/or other wireless transmission protocols.
  • Communications unit 116 may also be configurable to send/receive signals utilizing a transmission protocol over wired connections, such as any cable that has a signal line and ground.
  • cables may include Ethernet cables, coaxial cables, Universal Serial Bus (“USB”), FireWire, and/or any connection known in the art.
  • Such protocols may be used by communications unit 116 to communicate to external systems, such as computers, smart phones, tablets, data capture systems, mobile telecommunications networks, clouds, servers, or the like.
  • Communications unit 116 may be configurable to send and receive signals comprising numbers, letters, alphanumeric characters, and/or symbols.
  • signals may be encrypted, using algorithms such as 128-bit or 256-bit keys and/or other encryption algorithms complying with standards such as the Advanced Encryption Standard (“AES”), RSA, Data Encryption Standard (“DES”), Triple DES, and the like.
  • Communications unit 116 may be configurable to send and receive statuses, commands, and other data/information.
  • communications unit 116 may communicate with a user operator to allow the user to control robot 102.
  • Communications unit 116 may communicate with a server/network (e.g., a network) in order to allow robot 102 to send data, statuses, commands, and other communications to the server.
  • the server may also be communicatively coupled to computer(s) and/or device(s) that may be used to monitor and/or control robot 102 remotely.
  • Communications unit 116 may also receive updates (e.g., firmware or data updates), data, statuses, commands, and other communications from a server for robot 102.
  • operating system 110 may be configurable to manage memory 120, controller 118, power supply 122, modules in operative units 104, and/or any software, hardware, and/or features of robot 102.
  • operating system 110 may include device drivers to manage hardware resources for robot 102.
  • power supply 122 may include one or more batteries, including, without limitation, lithium, lithium ion, nickel-cadmium, nickel-metal hydride, nickel-hydrogen, carbon-zinc, silver-oxide, zinc-carbon, zinc-air, mercury oxide, alkaline, or any other type of battery known in the art. Certain batteries may be rechargeable, such as wirelessly (e.g., by resonant circuit and/or a resonant tank circuit) and/or plugging into an external power source. Power supply 122 may also be any supplier of energy, including wall sockets and electronic devices that convert solar, wind, water, nuclear, hydrogen, gasoline, natural gas, fossil fuels, mechanical energy, steam, and/or any power source into electricity.
  • One or more of the units described with respect to FIG. 1A may be integrated onto robot 102, such as in an integrated system.
  • one or more of these units may be part of an attachable module.
  • This module may be attached to an existing apparatus to automate it so that it behaves as a robot. Accordingly, the features described in this disclosure with reference to robot 102 may be instantiated in a module that may be attached to an existing apparatus and/or integrated onto robot 102 in an integrated system.
  • a robot 102, a controller 118, or any other controller, processor, or robot performing a task illustrated in the figures below comprises a controller executing computer- readable instructions stored on a non-transitory computer-readable storage apparatus, such as memory 120, as would be appreciated by one skilled in the art.
  • the architecture of a processor or processing device 138 is illustrated in FIG. 1B according to an exemplary embodiment.
  • the processing device 138 includes a data bus 128, a receiver 126, a transmitter 134, at least one processor 130, and a memory 132.
  • the receiver 126, the processor 130 and the transmitter 134 all communicate with each other via the data bus 128.
  • the processor 130 is configurable to access the memory 132 which stores computer code or computer-readable instructions in order for the processor 130 to execute the specialized algorithms.
  • memory 132 may comprise some, none, different, or all of the features of memory 120 previously illustrated in FIG. 1A. The algorithms executed by the processor 130 are discussed in further detail below.
  • the receiver 126 as shown in FIG. 1B is configurable to receive input signals 124.
  • the input signals 124 may comprise signals from a plurality of operative units 104 illustrated in FIG. 1A including, but not limited to, sensor data from sensor units 114, user inputs, motor feedback, external communication signals (e.g., from a remote server), and/or any other signal from an operative unit 104 requiring further processing.
  • the receiver 126 communicates these received signals to the processor 130 via the data bus 128.
  • the data bus 128 is the means of communication between the different components — receiver, processor, and transmitter — in the processing device.
  • the processor 130 executes the algorithms, as discussed below, by accessing specialized computer-readable instructions from the memory 132.
  • the memory 132 is a storage medium for storing computer code or instructions.
  • the storage medium may include optical memory (e.g., CD, DVD, HD-DVD, Blu-Ray Disc, etc.), semiconductor memory (e.g., RAM, EPROM, EEPROM, etc.), and/or magnetic memory (e.g., hard-disk drive, floppy-disk drive, tape drive, MRAM, etc.), among others.
  • Storage medium may include volatile, nonvolatile, dynamic, static, read/write, read-only, random-access, sequential-access, location-addressable, file-addressable, and/or content-addressable devices.
  • the processor 130 may communicate output signals to transmitter 134 via data bus 128 as illustrated.
  • the transmitter 134 may be configurable to further communicate the output signals to a plurality of operative units 104 illustrated by signal output 136.
  • FIG. 1B may also illustrate an external server architecture configurable to effectuate the control of a robotic apparatus from a remote location, such as a server. That is, the server may also include a data bus, a receiver, a transmitter, a processor, and a memory that stores specialized computer-readable instructions thereon.
  • a controller 118 of a robot 102 may include one or more processing devices 138 and may further include other peripheral devices used for processing information, such as ASICs, DSPs, proportional-integral-derivative (“PID”) controllers, hardware accelerators (e.g., encryption/decryption hardware), and/or other peripherals (e.g., analog to digital converters) described above in FIG. 1A.
  • peripheral devices are used as a means for intercommunication between the controller 118 and operative units 104 (e.g., digital to analog converters and/or amplifiers for producing actuator signals).
  • the controller 118 executing computer-readable instructions to perform a function may include one or more processing devices 138 thereof executing computer-readable instructions and, in some instances, the use of any hardware peripherals known within the art.
  • Controller 118 may be illustrative of various processing devices 138 and peripherals integrated into a single circuit die or distributed to various locations of the robot 102 which receive, process, and output information to/from operative units 104 of the robot 102 to effectuate control of the robot 102 in accordance with instructions stored in a memory 120, 132.
  • controller 118 may include a plurality of processing devices 138 for performing high-level tasks (e.g., planning a route to avoid obstacles) and processing devices 138 for performing low-level tasks (e.g., producing actuator signals in accordance with the route).
  • FIG. 2 illustrates a neural network 200, according to an exemplary embodiment.
  • the neural network 200 may comprise a plurality of input nodes 202, intermediate nodes 206, and output nodes 210.
  • the input nodes 202 are connected via links 204 to one or more intermediate nodes 206.
  • Some intermediate nodes 206 may be respectively connected via links 208 to one or more adjacent intermediate nodes 206.
  • Some intermediate nodes 206 may be connected via links 212 to output nodes 210.
  • Links 204, 208, 212 illustrate inputs/outputs to/from the nodes 202, 206, and 210 in accordance with Equation 1 below.
  • the intermediate nodes 206 may form an intermediate layer 214 of the neural network 200.
  • a neural network 200 may comprise a plurality of intermediate layers 214, intermediate nodes 206 of each intermediate layer 214 being linked to one or more intermediate nodes 206 of adjacent layers, unless an adjacent layer is an input layer (i.e., input nodes 202) or an output layer (i.e., output nodes 210).
  • the two intermediate layers 214 illustrated may correspond to a hidden layer of neural network 200.
  • a hidden layer may comprise more or fewer intermediate layers 214 or intermediate nodes 206.
  • Each node 202, 206, and 210 may be linked to any number of nodes, wherein linking all nodes together as illustrated is not intended to be limiting.
  • the input nodes 202 may be directly linked to one or more output nodes 210.
  • the input nodes 202 may receive a numeric value x_i of a sensory input of a feature, i being an integer index.
  • x_i may represent color values of an i-th pixel of a color image.
  • the input nodes 202 may output the numeric value x_i to one or more intermediate nodes 206 via links 204.
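  • Based on the surrounding description (an intermediate value k_{i,j} formed from input values x_i weighted by learned constants a, b, c, and d), Equation 1 plausibly takes the form shown below, assuming four input links 204 per intermediate node 206; the exact form in the original disclosure may differ:

```latex
k_{i,j} = a\,x_0 + b\,x_1 + c\,x_2 + d\,x_3 \qquad \text{(Eqn. 1)}
```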
  • index i corresponds to a node number within a layer (e.g., x_0 denotes the first input node 202 of the input layer, indexing from zero).
  • Index j corresponds to a layer, wherein j would be equal to one for the first intermediate layer 214-1 of the neural network 200 illustrated; however, j may be any number corresponding to a neural network 200 comprising any number of intermediate layers 214.
  • Constants a, b, c, and d represent weights to be learned in accordance with a training process. The number of constants of Equation 1 may depend on a number of input links 204 to a respective intermediate node 206.
  • intermediate nodes 206 are linked to all input nodes 202, but this is not intended to be limiting.
  • Intermediate nodes 206 of the second (rightmost) intermediate layer 214-2 may output values k_{i,2} to respective links 212 following Equation 1 above. It is appreciated that constants a, b, c, d may be of different values for each intermediate node 206.
  • While Equation 1 utilizes addition of inputs multiplied by respective learned coefficients, other operations are applicable, such as convolution operations, thresholds for input values for producing an output, and/or biases, wherein the above equation is intended to be illustrative and non-limiting.
  • Output nodes 210 may be configured to receive at least one numeric value k_{i,j} from at least an i-th intermediate node 206 of a final (i.e., rightmost) intermediate layer 214. As illustrated, for example, each output node 210 receives numeric values k_{0,2} through k_{7,2} from the eight intermediate nodes 206 of the second intermediate layer 214-2.
  • the output of the output nodes 210 may comprise a classification of a feature of the input nodes 202.
  • the output c_i of the output nodes 210 may be calculated following a substantially similar equation as Equation 1 above (i.e., based on learned weights and inputs from connections 212).
  • the output nodes 210 may output a classification c_i of each input pixel (e.g., pixel i is a car, train, dog, person, background, soap, or any other classification).
  • Other outputs of the output nodes 210 are considered, such as, for example, output nodes 210 predicting a temperature within an environment at a future time based on temperature measurements provided to input nodes 202 at prior times and/or at different locations.
  • the training process comprises providing the neural network 200 with both input and output pairs of values to the input nodes 202 and output nodes 210, respectively, such that weights of the intermediate nodes 206 may be determined.
  • An input and output pair comprise a ground truth data input comprising values for the input nodes 202 and corresponding correct values for the output nodes 210 (e.g., an image and corresponding annotations or labels).
  • the determined weights configure the neural network 200 to receive input to input nodes 202 and determine a correct output at the output nodes 210.
  • annotated (i.e., labeled) images may be utilized to train a neural network 200 to identify objects or features within the image based on the annotations and the image itself; the annotations may comprise, e.g., pixels encoded with “cat” or “not cat” information if the training is intended to configure the neural network 200 to identify cats within an image.
  • the unannotated images of the training pairs may be provided to input nodes 202 and the annotations of the image (i.e., classifications for each pixel) may be provided to the output nodes 210, wherein weights of the intermediate nodes 206 may be adjusted such that the neural network 200 generates the annotations of the image based on the provided pixel color values to the input nodes 202.
  • This process may be repeated using a substantial number of labeled images (e.g., hundreds or more) such that ideal weights of each intermediate node 206 may be determined.
  • the training process is complete when predictions made by the neural network 200 fall below a threshold error rate, which may be defined using a cost function.
  • a training pair may comprise any set of information provided to input and output of the neural network 200 for use in training the neural network 200.
  • a training pair may comprise an image and one or more labels of the image (e.g., an image depicting a cat and a bounding box associated with a region occupied by the cat within the image).
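  • The training loop described above (adjust weights over many labeled input/output pairs until a cost function falls below a threshold) can be sketched minimally as follows; this is an illustrative gradient-descent example on a tiny network of weighted sums in the spirit of Equation 1, not the patent's actual training procedure, and the layer sizes, learning rate, and threshold are arbitrary assumptions.

```python
# Minimal illustrative training loop: a tiny two-layer network of weighted sums
# (cf. Eqn. 1) fit to input/output training pairs by gradient descent, stopping
# once the cost function falls below a threshold.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical training pairs: 4-value inputs X and binary labels y (e.g., wire / not wire).
X = rng.normal(size=(64, 4))
y = (X.sum(axis=1) > 0).astype(float).reshape(-1, 1)

W1 = rng.normal(scale=0.1, size=(4, 8))   # weights of the intermediate layer (nodes 206)
W2 = rng.normal(scale=0.1, size=(8, 1))   # weights of the output layer (nodes 210)
lr, threshold = 0.5, 0.05                 # assumed learning rate and cost threshold


def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))


for epoch in range(10_000):
    k = sigmoid(X @ W1)                   # intermediate values (weighted sums, cf. Eqn. 1)
    c = sigmoid(k @ W2)                   # output values of the output nodes
    cost = float(np.mean((c - y) ** 2))   # cost function over the training pairs
    if cost < threshold:                  # training considered complete below threshold
        break
    grad_c = 2.0 * (c - y) / len(y) * c * (1.0 - c)   # backpropagated output error
    grad_k = grad_c @ W2.T * k * (1.0 - k)            # error at the intermediate layer
    W2 -= lr * k.T @ grad_c                           # adjust the learned weights
    W1 -= lr * X.T @ grad_k

print(f"stopped after {epoch + 1} epochs with cost {cost:.4f}")
```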
  • Neural network 200 may be configured to receive any set of numeric values representative of any feature and provide an output set of numeric values representative of the feature.
  • the inputs may comprise color values of a color image and outputs may comprise classifications for each pixel of the image.
  • inputs may comprise numeric values for a time-dependent trend of a parameter (e.g., temperature fluctuations within a building measured by a sensor) and output nodes 210 may provide a predicted value for the parameter at a future time based on the observed trends, wherein the trends may be utilized to train the neural network 200.
  • Training of the neural network 200 may comprise providing the neural network 200 with a sufficiently large number of training input/output pairs comprising ground truth (i.e., highly accurate) training data.
  • audio information may be provided to input nodes 202 and a meaning of the audio information may be provided to output nodes 210 to train the neural network 200 to identify words and speech patterns.
  • a neural network 200 may be configured to perform a certain task (e.g., classify a certain type of object within an image) based on training pairs provided, wherein the neural networks 200 may fail at other tasks due to a lack of sufficient training data and other computational factors (e.g., processing power). For example, a neural network 200 may be trained to identify cereal boxes within images, however the same neural network 200 may fail to identify soap bars within the images.
  • one or more outputs k_{i,j} from intermediate nodes 206 of a j-th intermediate layer 214 may be utilized as inputs to one or more intermediate nodes 206 of an m-th intermediate layer 214, wherein index m may be greater than or less than j (e.g., a recurrent or feed-forward neural network).
  • a neural network 200 may comprise N dimensions for an N-dimensional feature (e.g., a 2-dimensional input image or point cloud), wherein only one dimension has been illustrated for simplicity.
  • One of ordinary skill in the art may appreciate a plurality of other embodiments of a neural network 200, wherein the neural network 200 illustrated represents a simplified embodiment intended to illustrate the structure, utility, and training of neural networks and is not intended to be limiting.
  • the exact configuration of the neural network used may depend on (i) processing resources available, (ii) training data available, (iii) quality of the training data, and/or (iv) difficulty or complexity of the classification/problem.
  • programs such as AutoKeras utilize automatic machine learning (“AutoML”) to enable one of ordinary skill in the art to optimize a neural network 200 design to a specified task or data set.
  • the neural network 200 may comprise a convolutional neural network, wherein one or more nodes 206 and/or layers 214 may perform convolution operations on their respective inputs.
  • the inputs may be convolved with filters, wherein the filters are configured based on the training pairs provided to train the neural network 200. That is, the filter parameters are adjusted in a similar manner as the coefficients or weights of Equation 1 as described above.
  • convolutional neural networks may further utilize a sliding window to analyze certain sections of a multi-dimensional input, such as a 2D image, to detect patterns; the analysis may comprise filtering, convolving, or other operations described herein.
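  • For illustration only, a sliding-window convolution of the kind referenced above can be written in a few lines; this sketch assumes a single-channel 2D image and a small filter with arbitrary values, and is not tied to any particular network of the disclosure.

```python
# Illustrative sliding-window 2D convolution: a small filter is swept across a
# single-channel image and a weighted sum is computed at every window position.
import numpy as np


def convolve2d(image: np.ndarray, kernel: np.ndarray) -> np.ndarray:
    kh, kw = kernel.shape
    ih, iw = image.shape
    out = np.zeros((ih - kh + 1, iw - kw + 1))
    for r in range(out.shape[0]):
        for c in range(out.shape[1]):
            window = image[r:r + kh, c:c + kw]          # current sliding-window section
            out[r, c] = float(np.sum(window * kernel))  # weighted sum (filter response)
    return out


# Example: a vertical-edge filter responding strongly to a thin vertical stripe
# (a crude stand-in for a wire-like structure in a synthetic image).
image = np.zeros((8, 8))
image[:, 4] = 1.0                          # thin vertical "wire" stripe
kernel = np.array([[-1.0, 0.0, 1.0]] * 3)  # simple 3x3 edge filter (arbitrary values)
print(convolve2d(image, kernel))
```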
  • a model or trained model may comprise a fixed input-output system configured to receive an input and generate a corresponding output.
  • a model may be derived from, for example, the weights of intermediate nodes 206 and output nodes 210 after a sufficient training process of a neural network 200.
  • the model may be analogous to a neural network 200 with fixed weights (e.g., constants a, b, c, d of Equation 1), wherein the values of the fixed weights are learned during the training process.
  • a model may be referred to herein as being “trained”, wherein the model was trained at a prior instance in time and is being utilized without further training.
  • Models may further comprise other image pattern recognition algorithms known within the art and described herein.
  • a model used to identify features or objects within images may comprise an image comparison system, wherein a given input image is compared to a library of images depicting the feature/object to be detected.
  • a processing device 138 utilizing such a model may, for any input image, compare the input image to a library of images.
  • the comparison may compare the color values, luminance, distance/shape (e.g., for depth imagery), circularity, edges, contours, other salient portions of the image, and/or the image in its entirety to the library of images.
  • a similarity measure may be determined corresponding to the similarity between one or more parameters of the input image and one or more parameters of the images of the library.
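  • A very simple version of such an image-comparison similarity measure is sketched below; it compares normalized intensity histograms of an input image against a small library, which is only one of the possible parameters (color, edges, contours, shape, etc.) mentioned above, and the library contents and function names here are hypothetical placeholders.

```python
# Illustrative library-comparison model: score an input image against a library of
# reference images using a normalized-histogram similarity (histogram intersection).
import numpy as np


def intensity_histogram(image: np.ndarray, bins: int = 16) -> np.ndarray:
    """Normalized histogram of pixel intensities (greyscale or flattened RGB)."""
    hist, _ = np.histogram(image.ravel(), bins=bins, range=(0, 256))
    return hist / max(hist.sum(), 1)


def similarity(image_a: np.ndarray, image_b: np.ndarray) -> float:
    """Histogram intersection in [0, 1]; higher means more similar."""
    return float(np.minimum(intensity_histogram(image_a), intensity_histogram(image_b)).sum())


def best_match(input_image: np.ndarray, library: dict):
    """Compare the input image to every library image and return the best match."""
    scores = {name: similarity(input_image, ref) for name, ref in library.items()}
    name = max(scores, key=scores.get)
    return name, scores[name]


# Hypothetical usage with randomly generated stand-in images.
rng = np.random.default_rng(1)
library = {"wire_example": rng.integers(0, 256, (32, 32)),
           "background_example": rng.integers(0, 256, (32, 32))}
print(best_match(rng.integers(0, 256, (32, 32)), library))
```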
  • FIG. 3A illustrates a robot 102 utilizing a camera sensor 302 to capture an image, according to an exemplary embodiment.
  • the camera sensor 302 may comprise a sensor unit 114 of the robot 102 configured to capture images (e.g., red green blue (“RGB”), greyscale images, chrominance-luminance images, etc.) within a field of view illustrated by vision lines 304.
  • the camera sensor 302 may be orientated such that the field of view of the camera sensor 302 encompasses at least in part a region in front of the robot 102 along a forward direction of the robot 102, the forward direction in this embodiment being toward the right side of the page along route 300. Other orientations of the camera sensor 302 may be utilized without limitation, but it may be advantageous to detect wires ahead of the robot 102 to avoid navigating into the wires.
  • the camera sensor 302 may capture an image 308 illustrated next in FIG. 3B comprising, at least in part, a wire 306 represented therein.
  • the robot 102 may be currently navigating along the route 300 which, if continued, would cause the robot 102 to navigate over the wire 306. Navigation over wire 306 may risk damage to either the wire 306 or robot 102.
  • a wire 306 may correspond to a thin piece of metal or cord with a certain length.
  • wire 306 may be hanging from an object (e.g., a ceiling) or suspended between two objects.
  • Wire 306 being on the ground or floor in front of the robot 102 as shown is not intended to be limiting.
  • hanging or dangling wires may also be detected in a similar manner as described herein.
  • FIG. 3B illustrates an image 308 captured by a camera sensor 302 of a robot 102, according to an exemplary embodiment.
  • the image 308 may comprise a plurality of pixels, each comprising at least one color value associated thereto (e.g., greyscale value, RGB values, HSV values, etc.).
  • to determine if a wire 306 is present within the image 308, a model may be utilized.
  • the model may be derived from a trained neural network 200, as discussed above with respect to Fig. 2, based on learned weights of nodes 202, 206, and/or 210 during a training process, wherein the training process comprises training the neural network 200 using annotated (i.e., labeled) training images depicting wires.
  • the training images may be annotated using bounding boxes surrounding wires depicted in the training images, or each pixel of the training images may be encoded using labels such as, for example, “wire” or “not wire.”
  • Controller 118 of the robot 102 may utilize the trained model to identify wire 306 within the image 308.
  • controller 118 may provide color value(s) of a pixel of the image 308 to a respective input node 202, wherein output nodes 210 may output a classification of the respective pixel, the classification comprising one of “background” and “wire” classes, or similar classes (e.g., “not wire” and “wire”).
  • a bounding box 310 may be determined by output nodes 210 of neural network 200.
  • the bounding box 310 comprises an area of pixels, which encompasses the wire 306. That is, the model may output either a bounding box 310 comprising an area encompassed by wire 306 or may perform semantic segmentation to label each pixel of the image 308 as either “wire” or “background/not wire” classes.
  • a bounding box 310 may comprise two or more boxes, squares, rectangles, discretized functions, convex hulls, or a kernelized function which enclose pixels depicting a wire 306.
  • use of bounding boxes instead of semantic segmentation may reduce computational resources utilized to detect the wire 306 (i.e., reduce time to process the image 308 to detect the wire 306) and provide additional area around the wire 306 for the robot 102 to avoid as an additional safety measure when navigating around the wire 306.
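  • The relation between the two output forms discussed above (per-pixel “wire” labels versus a bounding box 310) can be sketched as follows, where the conversion pads the box to provide the additional safety margin around the wire; the padding value and array conventions are illustrative assumptions.

```python
import numpy as np

def mask_to_bounding_box(wire_mask: np.ndarray, pad: int = 10):
    """Convert a boolean per-pixel "wire" mask (semantic segmentation output)
    into a single padded bounding box (row_min, col_min, row_max, col_max).
    Returns None if no pixel was classified as wire."""
    rows, cols = np.nonzero(wire_mask)
    if rows.size == 0:
        return None
    height, width = wire_mask.shape
    return (max(int(rows.min()) - pad, 0),
            max(int(cols.min()) - pad, 0),
            min(int(rows.max()) + pad, height - 1),
            min(int(cols.max()) + pad, width - 1))
```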
  • FIG. 3C illustrates a robot 102 utilizing a camera sensor 302 and a trained model to detect a wire 306, according to an exemplary embodiment.
  • wire 306 may be localized onto a computer-readable map, which, in turn, translates to locating the wire 306 within the environment, using the camera matrix (i.e., camera projection matrix) and the pin-hole camera model.
  • the pin-hole camera model is a well-known model for spatial mapping of imaging sensors within the art and is not to be confused with models derived from neural networks 200.
  • differential motion between two or more sequential images captured by the sensor 302 in conjunction with the robot 102 motion may be utilized to localize the wire 306 via a binocular disparity, presuming the wire is stationary.
  • controller 118 may modify route 300 to route 312, or an alternate route, to avoid the wire 306. Accordingly, controller 118 may activate one or more actuator units 108 to navigate the robot 102 along the route 312 to avoid the detected wire 306.
  • the route 312 is not limited to turning the robot in one or more forward directions, such as shown for illustration in Fig. 3C, but may also comprise stopping, rotating and/or reversing the robot as one or more avoidance maneuvers in a plurality of maneuvers to navigate the robot 102 around the detected wire.
  • camera sensor 302 may be illustrative of a pair of spatially separated imaging cameras configured to measure depth of a visual scene. Depth or distance information may be determined based on a binocular disparity between two contemporaneously captured images (i.e., parallax). Accordingly, the wire 306 may be localized based on a distance from the sensors to the wire 306 depicted in one or both contemporaneously captured images and the pose of the camera sensors 302 on the robot 102.
  • differential motion may be utilized to localize the wire based on image 308. Similar to the use of two camera sensors to measure distance to the wire 306 using parallax or binocular disparity, two sequential images from the single camera sensors 302 may be utilized to determine distance to the wire 306 (assuming the wire 306 is stationary) as the robot 102 moves along the route 300. Data from navigation units 106 may be utilized by the controller 118 to estimate the change in position of the camera sensors 302 during acquisition of the sequential images.
  • the change in apparent size, shape, and position of the wire 306 within the sequential images captured by the camera sensor 302 as the robot 102 moves and views the wire 306 at different angles may yield information relating to the distance of the wire 306 from the camera sensors 302, and thereby yield the location of the wire 306.
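  • For the binocular-disparity case described above, depth recovery reduces to the standard pinhole relation Z = f·B / d between focal length f (in pixels), baseline B, and pixel disparity d. The sketch below encodes that relation with assumed calibration values; it is illustrative and not tied to any particular camera sensor 302.

```python
def depth_from_disparity(disparity_px: float, focal_length_px: float, baseline_m: float) -> float:
    """Depth (meters) of a wire point seen by two horizontally separated cameras.

    disparity_px is the horizontal pixel offset of the same point between the
    two contemporaneously captured images.
    """
    if disparity_px <= 0:
        raise ValueError("point must have positive disparity to be triangulated")
    return focal_length_px * baseline_m / disparity_px

# Example: a wire pixel shifted 24 px between cameras 10 cm apart with f = 600 px
# yields a depth of 600 * 0.10 / 24 = 2.5 m.
```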
  • the intrinsic parameters of the camera sensor 302 (e.g., lens distortion, focal length, etc.) and its position on the robot 102 may be utilized to localize a two-dimensional (2D) point of an image 308 (i.e., a pixel) to a three-dimensional (3D) location in space.
  • This may require the controller 118 of the robot 102 to localize itself and the camera 302 during acquisition of every image from the camera 302.
  • the controller 118 may utilize a pinhole camera model and a camera projection matrix to map the points of the images to points in 3D space.
  • the camera projection matrix is typically provided by a manufacturer of the camera 302 or measured, and is based in part on intrinsic parameters such as lens distortion and focal length.
  • the camera 302 may comprise a depth camera, wherein distance to the wire may be calculated based on (i) distance measurements within the bounding box 310, and (ii) the position of the camera 302 on the robot 102.
  • the location of the bounding box 310 in 3D space may be determined based on data from other sensor units 114 of the robot 102 such as, for example, LiDAR sensors, depth cameras, structured light sensors, etc., which include a field of view that overlaps, at least in part, with the field of view of the camera 302.
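  • A minimal sketch of the pinhole-camera mapping mentioned above is given below, assuming a known 3x3 intrinsic matrix K, a known camera-to-world rotation R and camera position t (i.e., the pose of the camera 302 on the robot 102 combined with the robot pose), and a depth estimate for the pixel; these symbols and values are assumptions for illustration.

```python
import numpy as np

def pixel_to_world(u: float, v: float, depth_m: float,
                   K: np.ndarray, R: np.ndarray, t: np.ndarray) -> np.ndarray:
    """Back-project image pixel (u, v) with a known depth to a 3D world point."""
    ray_cam = np.linalg.inv(K) @ np.array([u, v, 1.0])  # ray in the camera frame (z = 1)
    point_cam = ray_cam * depth_m                       # scale the ray by the measured depth
    return R @ point_cam + t                            # express the point in the world frame
```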
  • a wire 306 is not required to be resting upon a floor for a trained model to identify the wire 306 within an image captured by a camera sensor 302.
  • the wire may be hanging from a ceiling, strung between two objects, or otherwise suspended above the floor.
  • camera sensor 302 may comprise a greyscale depth camera configurable or configured to produce greyscale images encoded with depth information.
  • a model may be trained to identify wires using only greyscale images such that depth images may be utilized to both identify wires and further localize their distance from the robot 102 using the encoded depth information.
  • Other imagery methods may additionally be used, such as chrominance-luminance imagery, YUV color encoded images, and/or any other form of imagery for which the neural network 200 is capable of being trained to utilize as an input and which can represent thin objects such as wires 306 with sufficient detail for the wires 306 to be discernible.
  • FIG. 4 is a process flow diagram illustrating a method 400 for a controller 118, or processor 130, of a robot 102 to detect a wire 306 and plan its route accordingly, according to an exemplary embodiment. It may be appreciated by a skilled artisan that any steps of method 400 performed by the controller 118 may be effectuated by the controller 118 executing computer-readable instructions from a memory 120.
  • Block 402 comprises the controller 118 training a model to detect wires within images.
  • the model may be trained or developed using a neural network 200, described above in regard to FIG. 2.
  • training of the model may comprise providing the neural network 200 with a plurality of training images.
  • the training images may each comprise pictures depicting, in part, wires.
  • the wires depicted may comprise various colors (e.g., black, white, yellow, etc.), shapes (e.g., coiled wires, hanging wires, wires along a floor, etc.), thicknesses, lengths, and so forth.
  • the pixels comprising the wires are annotated or encoded with a “wire” classification, or similar classification (e.g., “object”) and the remaining pixels are classified as “background,” “not-wire,” or similar classification.
  • the training images may include bounding boxes 310, which encompass pixels depicting wires.
  • the training images (e.g., color values of pixels of the training images) may be provided to input nodes 202 of the neural network 200 and the annotations of the training images for respective pixels may be provided to output nodes 210 such that weights of intermediate nodes 206 may be configured to correlate the input training images to produce the output annotations.
  • the training of the model may be performed separately from the robot 102.
  • a neural network 200 may be trained, using annotated training images, to identify wires within an image, wherein the neural network 200 is embodied by a processor separate from the robot 102 executing computer-readable instructions.
  • the model may be derived based on weights of the intermediate nodes 206 learned during the training process.
  • the number of images required to sufficiently train the neural network 200 to reliably identify wires within images may depend on a plurality of factors including, but not limited to, the number of nodes of the network 200, processing resources (e.g., processor cycles, runtime available, memory available, etc.) of the processor, image resolution/clarity, image modality (e.g., RGB, YUV, HSV, greyscale, depth images, etc.), size of receptive fields of input nodes 202 (e.g., if neural network 200 is a convolutional neural network), and the like.
  • at least 100 images may be used; however, use of additional images may improve the performance of the trained model.
  • the model may then be communicated to the robot 102 via wired or wireless communications and stored in a memory 120.
  • execution of a trained model may use substantially fewer computing resources as compared to training of a neural network 200 to derive the model. Accordingly, it may be beneficial to train the model separate from the robot 102 if the robot 102 comprises limited computing resources.
  • training of the model to identify wires within images separately from a robot 102 may enable communicating the trained model to a plurality of robots 102 for each to use in identifying and navigating around wires.
  • the trained model may be included in the initial programming of a plurality of robots 102
  • model may be trained further after instantiation within a robot 102 using additional training images generated by a robot 102 operating within a specific environment.
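  • As a hedged sketch of the off-robot training of block 402 (the disclosure does not prescribe a particular framework or architecture), a small fully convolutional network can be trained in PyTorch to label each pixel as wire or not-wire; the layer sizes, hyper-parameters, and data-loader format below are assumptions for illustration only.

```python
import torch
import torch.nn as nn

# Tiny fully convolutional segmentation network: RGB image in, one "wire"
# logit per pixel out. A deployed model would likely be deeper.
model = nn.Sequential(
    nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(),
    nn.Conv2d(16, 16, kernel_size=3, padding=1), nn.ReLU(),
    nn.Conv2d(16, 1, kernel_size=1),
)

optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.BCEWithLogitsLoss()  # per-pixel wire / not-wire objective

def train(loader, epochs: int = 10):
    """loader yields (images, masks): images are Nx3xHxW floats in [0, 1] and
    masks are Nx1xHxW with 1.0 for annotated "wire" pixels and 0.0 otherwise."""
    model.train()
    for _ in range(epochs):
        for images, masks in loader:
            optimizer.zero_grad()
            loss = loss_fn(model(images), masks)
            loss.backward()
            optimizer.step()

# After training, the fixed weights constitute the "trained model"; they could be
# exported (e.g., torch.save(model.state_dict(), "wire_model.pt")) and copied to
# the memory 120 of one or more robots 102 for inference only.
```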
  • Block 404 comprises the controller 118 receiving an image from a sensor coupled to the robot 102 as the robot 102 navigates a route.
  • the sensor may comprise a camera sensor configured to capture images of an environment surrounding the robot 102, wherein the images may be encoded with color values using, for example, RGB, greyscale, HSV, YUV, or similar encoding methods.
  • the sensor preferably does not comprise a time of flight (“ToF”) sensor because ToF sensors often struggle to detect wires due to their small width. Depth images from depth cameras may be utilized, however the depth information (e.g., distance measurements) should not be used for detecting the presence of thin objects such as wires.
  • the camera sensor may be configured in a forward-facing direction such that wires ahead of the robot 102 (i.e., along the path of the robot 102) are detected prior to the robot 102 navigating over the wire, but the camera sensor may be positioned anywhere without limitation.
  • Block 406 comprises the controller 118 determining if a wire is detected within the image received in block 404.
  • the model may be trained to perform semantic segmentation of the image, wherein the detection of the wire may comprise at least one pixel of the image comprising a “wire” classification based on the trained model.
  • the detection of the wire is based on a presence of a bounding box encompassing the wire (e.g., bounding box 310 encompassing wire 306 of FIG. 3B).
  • the detection of the wire is a binary determination, wherein the trained model may output either a positive indication that the wire is detected or a negative indication that no wire is detected.
  • if no wire is detected, the controller 118 may move to block 408 to continue navigating along the route; if a wire is detected, the controller 118 may move to block 410.
  • Block 410 comprises the controller 118 localizing the wire.
  • the controller 118 may consider, for example, (i) a position of the robot 102 during acquisition of the image, (ii) intrinsic parameters (i.e., focal length, lens parameters, projection matrix, etc.) of the camera, and (iii) extrinsic parameters (i.e., the (x, y, z, yaw, pitch, roll) pose of the camera on the robot 102) to determine the distance from the detected wire.
  • the distance from the detected wire may be determined using, for example, differential motion of the wire within two or more sequential images captured by the sensor, the differential motion being caused by motion of the robot 102; the position of the sensor is calculated for each of the two or more images to provide stereo imagery of the static wire.
  • the robot 102 may comprise two spatially separated image sensors such that a binocular disparity may be utilized to determine distance to the wire.
  • the distance, in conjunction with the pose of the sensor and position of the robot 102, may be utilized by the controller 118 to localize the wire in the environment, wherein the wire may then be localized onto a computer-readable map.
  • Controller 118 may localize the wire onto the map using, for example, a wire object or mapping an object encompassing a region corresponding to a bounding box of the wire.
  • the mapped wire object may additionally be classified as an obstacle in the map to enable the controller to determine one or more avoidance maneuvers to avoid the wire obstacle.
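  • A minimal sketch of the localization-and-mapping step of block 410 is given below, assuming the wire's position has already been estimated in the robot frame and the robot's planar pose (x, y, yaw) is known; the grid resolution and obstacle value are illustrative assumptions.

```python
import math
import numpy as np

def robot_to_world(point_robot_xy, robot_pose):
    """Transform a point (x, y) expressed in the robot frame into the world frame,
    given the robot pose (x, y, yaw) on the planar map."""
    px, py = point_robot_xy
    rx, ry, yaw = robot_pose
    wx = rx + px * math.cos(yaw) - py * math.sin(yaw)
    wy = ry + px * math.sin(yaw) + py * math.cos(yaw)
    return wx, wy

def mark_wire_on_map(grid: np.ndarray, wire_points_robot, robot_pose,
                     resolution_m: float = 0.05, obstacle: int = 100) -> np.ndarray:
    """Mark each localized wire point as an obstacle cell on a 2D occupancy grid
    whose origin is assumed to coincide with the world origin."""
    for point in wire_points_robot:
        wx, wy = robot_to_world(point, robot_pose)
        row, col = int(wy / resolution_m), int(wx / resolution_m)
        if 0 <= row < grid.shape[0] and 0 <= col < grid.shape[1]:
            grid[row, col] = obstacle
    return grid
```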
  • Block 412 comprises the controller 118 performing an avoidance maneuver.
  • the avoidance maneuver may configure the robot 102 to move away from or around the detected wire.
  • the avoidance maneuver may configure the robot 102 to stop and hail or wait for human assistance (e.g., using an auditory noise, a visual signal, or transmitting a signal via communication units 116 to a user interface unit 112) prior to navigating over the wire if no route around the wire is available without collisions with nearby objects.
  • the avoidance maneuver may comprise the controller 118 navigating the robot 102 around the wire if a route around the wire is available, as shown in FIGS. 3A-C.
  • the avoidance maneuver configures the robot 102 to change its route from a first route to a different second route to avoid the localized wire.
  • the specific maneuver may depend on a state of the robot 102, such as its speed, distance to the wire, distance to nearby objects, and/or other similar environmental parameters.
  • the avoidance maneuver desirably includes rejoining the second route with the first route after the wire has been avoided.
  • the controller 118 may return to block 404 and continue navigating along the route navigated prior to execution of the avoidance maneuver (i.e. the first route).
  • the avoidance maneuver may include stopping of the robot 102 if no collision-free path around the wire 306 is possible.
  • the determination may be based on a computer-readable map comprising the localized wire 306 and objects surrounding the wire 306, which restrict movements of the robot 102.
  • the robot 102 may stop and call for assistance using, for example, communications units 116 coupled to a user device (e.g., a cell phone) of an operator of the robot. Stopping of the robot 102 upon detection of any wire may be further advantageous, from a safety perspective, since wires may be too long to be depicted entirely within images captured by the camera sensor.
  • the robot 102 may turn around, for example, if the robot 102 comprises a differential drive as means for locomotion (i.e., is able to turn in place) or sufficient space to perform a U-turn.
  • the region occupied by the wire 306 may correspond with a region wherein a task is to be performed (e.g., cleaning a floor under the wire 306), wherein the robot 102 may call for assistance for an operator to remove the wire 306.
  • the controller 118 may determine that the task cannot be performed and send notification to an operator that the task has been cancelled, wherein the robot 102 may move to a separate task.
  • the notification may comprise an image of the wire to allow the human receiving the notification to determine if the robot 102 may safely navigate over the wire, wherein the human may view the notification from a remote location using, e.g., a smartphone.
  • the avoidance maneuver may include the robot 102 deactivating one or more actuator units 108 prior to navigating over the wire, provided the wire is detected as being on the floor.
  • the actuators 108 deactivated correspond to actuatable features of the robot 102, such as a vacuum, a scrubbing deck, or other actuatable features. If these features are actuated while the robot 102 passes over the wire, the wire may become entangled within these features. Accordingly, the actuators 108 may be deactivated to allow the robot 102 to pass safely over the wire as an avoidance maneuver.
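  • The choice among the maneuvers discussed for block 412 can be sketched as a simple decision helper; the flag names and returned labels below are hypothetical, since the disclosure leaves the exact policy to the controller 118 and the configuration of the particular robot 102.

```python
def choose_avoidance_maneuver(route_around_exists: bool,
                              wire_on_floor: bool,
                              has_entangling_actuators: bool) -> str:
    """Pick one of the avoidance maneuvers discussed above for block 412."""
    if route_around_exists:
        # Detour onto a second route and rejoin the first route afterwards.
        return "navigate_around_wire"
    if wire_on_floor and has_entangling_actuators:
        # Deactivate the vacuum/scrubbing deck, then pass over the wire.
        return "deactivate_actuators_and_pass_over"
    # Otherwise, stop and hail or wait for human assistance.
    return "stop_and_call_for_assistance"
```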
  • FIG. 5 illustrates various avoidance maneuvers on a computer-readable map, according to an exemplary embodiment.
  • the computer-readable map depicted may include a footprint 504 corresponding to a position and area occupied by the robot 102 (e.g., as depicted in FIG. 3A or 3C above). That is, footprint 504 is a virtual representation of an area occupied and location of the robot 102 on the computer-readable map.
  • a wire 306 may be detected and localized as shown by region 502 corresponding to a region occupied by a bounding box 310, which surrounds the detected wire 306.
  • Robot 102, represented by its footprint 504 in FIG. 5, is initially following route 300 which, if continued, would cause the robot 102 to navigate over the detected wire 306.
  • a controller 118 of a robot 102 may map a region 502 corresponding to a region occupied by the wire 306.
  • region 502 may comprise a region occupied by a bounding box 310 projected onto a computer-readable map, the computer-readable map depicted in FIG. 5 comprising a top-down (i.e., bird’s eye view) of an environment of the robot 102.
  • Distance 506 may comprise a distance between the robot 102 and region 502 comprising a detected wire 306.
  • Distance 506 may indicate which avoidance maneuver the robot 102 may take.
  • distance 506 may be longer than a maximum stopping distance of the robot 102 (i.e., a stopping distance when the robot 102 is moving at a maximum velocity along route 300). Accordingly, there may be ample distance for the robot 102 to perform an avoidance maneuver of stopping.
  • robot 102 may comprise a differential drive for locomotion; the differential drive enables the robot 102 to turn in place. If distance 506 is equal to or greater than the maximum stopping distance for the robot 102, the avoidance maneuver may comprise stopping the robot 102, turning in place, and subsequently navigating around the region 502. Alternatively, if the robot 102 does not utilize a differential drive, the controller 118 may first determine if there is sufficient room to perform a U-turn without colliding with the wire or nearby objects.
  • robot 102 may comprise a tricycle, four-wheeled, or other non-holonomic locomotion means.
  • the avoidance maneuver may include the controller 118 navigating route 508.
  • Route 508 may comprise a modification to a portion 510 (dotted lines) of route 300; the modification includes extending the route 300 to include turns around the region 502 and returning to route 300 beyond region 502.
  • Route 508 may be the shortest path around the region 502 occupied by the wire.
  • region 502 may be surrounded by additional objects.
  • region 502 may include a wire 306 strung between two walls or shelves of a store.
  • the avoidance maneuver may include the robot 102 stopping.
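  • Whether stopping is even an option in the scenario of FIG. 5 depends on distance 506 relative to the maximum stopping distance; the sketch below computes a worst-case stopping distance from an assumed maximum speed, deceleration, and controller reaction time, none of which are specified in the disclosure.

```python
def maximum_stopping_distance(max_speed_mps: float,
                              max_deceleration_mps2: float,
                              reaction_time_s: float = 0.1) -> float:
    """Worst-case stopping distance at maximum velocity: distance travelled during
    the controller's reaction time plus the braking distance v^2 / (2a)."""
    return max_speed_mps * reaction_time_s + max_speed_mps ** 2 / (2.0 * max_deceleration_mps2)

def can_stop_before_region_502(distance_506_m: float,
                               max_speed_mps: float,
                               max_deceleration_mps2: float) -> bool:
    """True when distance 506 to region 502 is at least the maximum stopping
    distance, i.e., a stopping maneuver is a safe option."""
    return distance_506_m >= maximum_stopping_distance(max_speed_mps, max_deceleration_mps2)
```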
  • FIG. 6 illustrates a robot 102 approaching a hanging or suspended wire 306 to illustrate how the suspended wire 306 may be mapped onto a two-dimensional (“2D”) computer-readable map, according to an exemplary embodiment.
  • the robot 102 operates on a substantially planar surface, such as a floor, wherein computer-readable maps produced by this robot 102 are 2D bird's eye views of the environment.
  • use of a 2D map over a three-dimensional (“3D”) map may be advantageous because processing of the map (e.g., producing one from sensor data) and path planning using a 2D map utilize substantially fewer computing resources of the controller 118 than a 3D map.
  • Robots 102 that operate in three dimensions and/or utilize three dimensional maps may not need to consider the projections shown and described in FIG. 6, as appreciated by one skilled in the art.
  • the wire 306 is shown as being suspended between two objects 602.
  • the wire may represent, for example, an extension cord, a thin light fixture (e.g., Christmas lights), a piece of string, and/or other suspended wires.
  • Wire 306 may further represent a wire that is suspended on only one side, rather than between two objects 602.
  • the wire 306 may connect a laptop resting upon a table (e.g., 602) to a wall socket (e.g., 616), wherein the wire 306 may be suspended in the air at least in part. If the robot 102 navigates into the wire 306, there is a risk that anything attached to the wire may be pulled off objects 602 and damaged in addition to potential damage to the robot 102.
  • camera 302 may comprise a binocular camera configured to capture images and measure distances to objects within the image using parallax motion or binocular disparity.
  • two sequential images captured by a singular camera may be utilized to determine distance within the images using parallax or binocular disparity, wherein the parallax motion is caused by the movement of the camera 302 as the robot 102 moves along a route 608 towards the wire 306 (presuming the wire 306 is stationary).
  • the horizontal width 610 of the wire may be determined from the size of the bounding box 310 used to identify the wire 306 in the image(s) and the distance 614.
  • the bounding box 310 is not shown in FIG. 6 for clarity; a similar bounding box 310 is shown in FIGS. 3B-C above.
  • the length 612 of the wire 306 may be a predetermined value.
  • the predetermined value may range from a width of a common wire (e.g., 5 mm) up to a value determined to provide a safe buffer region around the wire.
  • the range of length 612 may encompass this uncertainty range on either side of the wire 306.
  • the width 612 may be at least 20 cm.
  • the width of region 612 should be larger than the uncertainty of the range 614.
  • the wire 306 may be mapped onto a 2D computer-readable map by projecting the region occupied by the wire 306 onto the plane of the map.
  • the plane of the map comprises the plane of the floor upon which the robot 102 is navigating. That is, the region formed by opposing width/length measures 610, 612 is projected onto the floor. This projected region 606 may then be placed onto the 2D map to allow the robot 102 to plan an avoidance maneuver to avoid colliding with the wire 306.
  • the controller 118 may further check if the height 616 of the wire 306 is above a predetermined threshold value corresponding to the height of the robot 102. That is, if the robot 102 can pass under the wire 306, the controller 118 will not map the region 606 on its computer-readable map.
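  • The projection of FIG. 6 can be sketched as below for an axis-aligned region, assuming the wire's center, width 610, and buffer length 612 have already been expressed in world coordinates; the grid conventions and resolution are illustrative assumptions, and the height check mirrors the threshold on the wire's height described above.

```python
import numpy as np

def project_wire_region_to_map(grid: np.ndarray, wire_center_xy, width_610_m: float,
                               length_612_m: float, wire_height_m: float,
                               robot_height_m: float, resolution_m: float = 0.05,
                               obstacle: int = 100) -> np.ndarray:
    """Project the region spanned by width 610 and length 612 onto the floor plane
    and mark it on the 2D map, unless the robot can pass under the wire."""
    if wire_height_m > robot_height_m:
        return grid  # robot fits underneath; region 606 is not mapped
    cx, cy = wire_center_xy
    half_w, half_l = width_610_m / 2.0, length_612_m / 2.0
    row0 = int((cy - half_l) / resolution_m)
    row1 = int((cy + half_l) / resolution_m)
    col0 = int((cx - half_w) / resolution_m)
    col1 = int((cx + half_w) / resolution_m)
    grid[max(row0, 0):row1 + 1, max(col0, 0):col1 + 1] = obstacle  # projected region 606
    return grid
```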
  • FIG. 7A-B illustrate two sequential images 702-1, 702-2 captured by a camera sensor 302 of a robot 102 previously shown in FIG. 6 as encountering a hanging wire 306, according to an exemplary embodiment.
  • FIG. 7A illustrates an image 702-1 captured by the camera 302 where the wire 306 is first detected. Accordingly, upon detecting the wire 306, a bounding box 310-1 is placed around the wire 306.
  • key points 704 may be identified in the first image 702-1.
  • Key points, also commonly referred to as trackers, track points, or salient points, comprise points 704 within the image determined to be salient.
  • points 704 may represent an edge, a contour, a color value, or any other notable feature within the image 702-1.
  • six key points 704 are shown: two at the corners of each object 602, and two located on a feature 616 of the object 602.
  • Feature 616 may represent, for example, a wall outlet or portion of the object 602 comprising a salient texture or color.
  • the robot 102 moves forward towards the wire 306 and captures another image 702-2 using the camera sensor 302, according to an exemplary embodiment.
  • the same key points 704 are tracked in image 702-2 and, due to the motion of the robot 102, the image-space locations of the key points 704 have moved.
  • Key points 704 farther from the camera 302 may move less than key points 704 closer to the camera 302, as shown by the varying lengths of rays 706, which show the motion of each respective key point 704 between image 702-1 to image 702-2.
  • the apparent size of the bounding box 310-2 is larger than the size of the bounding box 310-1, wherein the corners of the bounding box 310-1 have grown along vectors 706 into bounding box 310-2 due to the motion of the camera 302 closer towards the wire 306.
  • This inter-frame motion of points 704 yields depth information within the visual scene, thereby allowing the controller 118 to calculate the approximate depth 614 of the wire 306 from the camera 302, more specifically the depth 614 of the bounding box 310.
  • the controller 118 may track the apparent motion of the corners of the bounding box 310-1, 310-2 to determine the distance 614 to the bounding box 310-2, and thereby the distance 614 to the wire 306.
  • the distance to each corner of the bounding box 310 may further indicate an orientation of the wire 306 relative to the camera 302 orientation. For example, if the left corners move less than the right corners between frames 702-1 and 702-2, the wire may be at a skewed angle. As another example, if both corners move approximately the same amount, the wire 306 may be perpendicular to the camera 302 (i.e., span across the field of view from left to right).
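  • For the forward-motion case of FIGS. 7A-B, the growth of the bounding box between frames can be converted into an approximate depth 614: assuming roughly straight forward motion toward a near fronto-parallel wire, the apparent size scales by s = Z1 / (Z1 − d) after the robot advances a distance d, which rearranges to Z1 = s·d / (s − 1). The sketch below encodes that relation; it is an approximation under the stated assumptions, not the exact computation of the disclosure.

```python
def depth_from_scale_change(box_width_1_px: float,
                            box_width_2_px: float,
                            forward_motion_m: float) -> float:
    """Approximate distance from the camera at the first frame to the wire.

    box_width_1_px and box_width_2_px are the apparent bounding-box widths in the
    first and second frames; forward_motion_m is the distance driven between the
    frames (e.g., reported by navigation units 106).
    """
    scale = box_width_2_px / box_width_1_px
    if scale <= 1.0:
        raise ValueError("bounding box must grow as the robot approaches the wire")
    return scale * forward_motion_m / (scale - 1.0)

# Example: a box growing from 80 px to 100 px over 0.5 m of travel gives
# scale = 1.25 and an estimated depth of 1.25 * 0.5 / 0.25 = 2.5 m at the first frame.
```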
  • FIG. 8 illustrates a second method for determining a distance 614 to a wire 306 using a depth camera, according to an exemplary embodiment.
  • depth cameras and other time of flight sensors may have difficulty detecting wires 306 due to the thinness of the wire 306.
  • a controller 118 of the robot 102 may produce a histogram 800 of distance measurements which lie within a bounding box 310 of a detected wire.
  • the horizontal axis of the histogram represents distance, and the vertical axis represents the integer number of beams that measure a given distance.
  • the histogram 800 may represent distance measurements within the bounding box 310 shown in either images 702-1 or 702-2 shown in FIG. 7A-B above.
  • Each bar of the histogram may represent a range of distances (e.g., 1 cm ranges) and comprise a height equal to the number of beams which fall within that range.
  • distance 614 may correspond to the average distance of measurements 802.
  • the histogram 800 may not include outlier measurements 802, which directly sense the wire 306. Accordingly, if no outlier measurements 802 are detected within the bounding box 310, the controller 118 may determine that the wire 306 is on a floor, rather than suspended.
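  • A minimal sketch of the histogram analysis of FIG. 8 follows, assuming the depth readings inside bounding box 310 are available as a flat array; the 1 cm bin width matches the example above, while the outlier rule (readings well short of the dominant background distance) and its margin are illustrative assumptions.

```python
import numpy as np

def wire_distance_from_depth(depths_in_box_m, bin_width_m: float = 0.01, margin_m: float = 0.30):
    """Estimate distance 614 to a suspended wire from the few beams that hit it.

    Builds a histogram of the depth readings inside bounding box 310, takes the most
    populated bin as the background behind the wire, and treats readings at least
    margin_m closer than that as wire hits (outlier measurements 802). Returns the
    mean outlier distance, or None if no outliers are found, in which case the wire
    may be assumed to be lying on the floor rather than suspended.
    """
    depths = np.asarray(depths_in_box_m, dtype=float)
    num_bins = max(int(np.ptp(depths) / bin_width_m), 1)
    counts, edges = np.histogram(depths, bins=num_bins)
    background_m = edges[int(np.argmax(counts))]          # dominant (farther) distance
    outliers = depths[depths < background_m - margin_m]   # beams that directly sense the wire
    return float(outliers.mean()) if outliers.size else None
```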
  • the term “including” should be read to mean “including, without limitation,” “including but not limited to,” or the like; the term “comprising” as used herein is synonymous with “including,” “containing,” or “characterized by,” and is inclusive or open-ended and does not exclude additional, unrecited elements or method steps; the term “having” should be interpreted as “having at least;” the term “such as” should be interpreted as “such as, without limitation;” the term ‘includes” should be interpreted as “includes but is not limited to;” the term “example” or the abbreviation “e.g.” is used to provide exemplary instances of the item in discussion, not an exhaustive or limiting list thereof, and should be interpreted as “example, but without limitation;” the term “illustration” is used to provide illustrative instances of the item in discussion, not an exhaustive or limiting list thereof, and should be interpreted as “illustration, but without limitation.” Adjectives such as “known,” “normal
  • a group of items linked with the conjunction “and” should not be read as requiring that each and every one of those items be present in the grouping, but rather should be read as “and/or” unless expressly stated otherwise.
  • a group of items linked with the conjunction “or” should not be read as requiring mutual exclusivity among that group, but rather should be read as “and/or” unless expressly stated otherwise.
  • the terms “about” or “approximate” and the like are synonymous and are used to indicate that the value modified by the term has an understood range associated with it, where the range may be ±20%, ±15%, ±10%, ±5%, or ±1%.
  • where a result (e.g., a measurement value) is described as being close to a particular value, “close” may mean, for example, that the result is within 80% of the value, within 90% of the value, within 95% of the value, or within 99% of the value.
  • “defined” or “determined” may include “predefined” or “predetermined” and/or otherwise determined values, conditions, thresholds, measurements, and the like.

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Manipulator (AREA)

Abstract

Disclosed herein are systems and methods for wire detection for robots. According to at least one non-limiting illustrative embodiment, a model is trained to identify wires within images captured by sensors coupled to a robot, the model being derived from the training of a neural network. Detection of a wire may cause the robot to perform an avoidance maneuver to avoid navigating over or into the detected wire.
PCT/US2021/036302 2020-06-08 2021-06-08 Systems and methods for wire detection and avoidance of such wires by robots WO2021252425A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US202063035939P 2020-06-08 2020-06-08
US63/035,939 2020-06-08

Publications (1)

Publication Number Publication Date
WO2021252425A1 true WO2021252425A1 (fr) 2021-12-16

Family

ID=78846505

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2021/036302 WO2021252425A1 (fr) Systems and methods for wire detection and avoidance of such wires by robots

Country Status (1)

Country Link
WO (1) WO2021252425A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20220061941A1 (en) * 2020-09-02 2022-03-03 Auris Health, Inc. Robotic collision boundary determination

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20060184274A1 (en) * 2003-03-14 2006-08-17 Matsushita Electric Works, Ltd. Autonomously moving robot
US8793069B2 (en) * 2008-04-03 2014-07-29 Honda Motor Co., Ltd. Object recognition system for autonomous mobile body
US20150073646A1 (en) * 2010-05-20 2015-03-12 Irobot Corporation Mobile Human Interface Robot
US20180259971A1 (en) * 2017-03-08 2018-09-13 Nec Corporation Autonomous mobile robot, and method and program for controlling the same
US10452071B1 (en) * 2016-02-29 2019-10-22 AI Incorporated Obstacle recognition method for autonomous robots

Similar Documents

Publication Publication Date Title
US11613016B2 (en) Systems, apparatuses, and methods for rapid machine learning for floor segmentation for robotic devices
US20220269943A1 (en) Systems and methods for training neural networks on a cloud server using sensory data collected by robots
JP7462891B2 (ja) System, apparatus, and method for detecting escalators
US20210354302A1 (en) Systems and methods for laser and imaging odometry for autonomous robots
US11886198B2 (en) Systems and methods for detecting blind spots for robots
US11865731B2 (en) Systems, apparatuses, and methods for dynamic filtering of high intensity broadband electromagnetic waves from image data from a sensor coupled to a robot
US20210232136A1 (en) Systems and methods for cloud edge task performance and computing using robots
US11951629B2 (en) Systems, apparatuses, and methods for cost evaluation and motion planning for robotic devices
US20230168689A1 (en) Systems and methods for preserving data and human confidentiality during feature identification by robotic devices
US20210232149A1 (en) Systems and methods for persistent mapping of environmental parameters using a centralized cloud server and a robotic network
US20230083293A1 (en) Systems and methods for detecting glass and specular surfaces for robots
US20240077882A1 (en) Systems and methods for configuring a robot to scan for features within an environment
US20240085916A1 (en) Systems and methods for robotic detection of escalators and moving walkways
WO2021252425A1 (fr) Systems and methods for wire detection and avoidance of such wires by robots
WO2020123612A1 (fr) Systems and methods for improved control of non-holonomic robotic systems
US20240168487A1 (en) Systems and methods for detecting and correcting diverged computer readable maps for robotic devices
US20240096103A1 (en) Systems and methods for constructing high resolution panoramic imagery for feature identification on robotic devices
US20230350420A1 (en) Systems and methods for precisely estimating a robotic footprint for execution of near-collision motions
WO2023076576A1 (fr) Systems and methods for automatic route generation for robotic devices
US20210220996A1 (en) Systems, apparatuses and methods for removing false positives from sensor detection
US20220163644A1 (en) Systems and methods for filtering underestimated distance measurements from periodic pulse-modulated time-of-flight sensors
WO2022132880A1 (fr) Systems and methods for floor detection from noisy depth measurements for robots
WO2022087014A1 (fr) Systems and methods for producing occupancy maps for robotic devices

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21821744

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21821744

Country of ref document: EP

Kind code of ref document: A1