US20170329347A1 - Systems and methods for training a robot to autonomously travel a route - Google Patents
- Publication number
- US20170329347A1 (publication of U.S. application Ser. No. 15/152,425)
- Authority
- US
- United States
- Prior art keywords
- robot
- map
- route
- implementations
- navigable route
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/40—Control within particular dimensions
- G05D1/43—Control of position or course in two dimensions
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0268—Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means
- G05D1/0274—Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means using mapping information stored in a memory device
-
- A—HUMAN NECESSITIES
- A47—FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
- A47L—DOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
- A47L11/00—Machines for cleaning floors, carpets, furniture, walls, or wall coverings
- A47L11/40—Parts or details of machines not provided for in groups A47L11/02 - A47L11/38, or not restricted to one of these groups, e.g. handles, arrangements of switches, skirts, buffers, levers
- A47L11/4011—Regulation of the cleaning machine by electric means; Control systems and remote control systems therefor
-
- A—HUMAN NECESSITIES
- A47—FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
- A47L—DOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
- A47L11/00—Machines for cleaning floors, carpets, furniture, walls, or wall coverings
- A47L11/40—Parts or details of machines not provided for in groups A47L11/02 - A47L11/38, or not restricted to one of these groups, e.g. handles, arrangements of switches, skirts, buffers, levers
- A47L11/4061—Steering means; Means for avoiding obstacles; Details related to the place where the driver is accommodated
-
- A—HUMAN NECESSITIES
- A47—FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
- A47L—DOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
- A47L9/00—Details or accessories of suction cleaners, e.g. mechanical means for controlling the suction or for effecting pulsating action; Storing devices specially adapted to suction cleaners or parts thereof; Carrying-vehicles specially adapted for suction cleaners
- A47L9/28—Installation of the electric equipment, e.g. adaptation or attachment to the suction cleaner; Controlling suction cleaners by electric means
- A47L9/2805—Parameters or conditions being sensed
- A47L9/2826—Parameters or conditions being sensed the condition of the floor
-
- A—HUMAN NECESSITIES
- A47—FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
- A47L—DOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
- A47L9/00—Details or accessories of suction cleaners, e.g. mechanical means for controlling the suction or for effecting pulsating action; Storing devices specially adapted to suction cleaners or parts thereof; Carrying-vehicles specially adapted for suction cleaners
- A47L9/28—Installation of the electric equipment, e.g. adaptation or attachment to the suction cleaner; Controlling suction cleaners by electric means
- A47L9/2836—Installation of the electric equipment, e.g. adaptation or attachment to the suction cleaner; Controlling suction cleaners by electric means characterised by the parts which are controlled
- A47L9/2852—Elements for displacement of the vacuum cleaner or the accessories therefor, e.g. wheels, casters or nozzles
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J11/00—Manipulators not otherwise provided for
- B25J11/008—Manipulators for service tasks
- B25J11/0085—Cleaning
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1602—Programme controls characterised by the control system, structure, architecture
- B25J9/161—Hardware, e.g. neural networks, fuzzy logic, interfaces, processor
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1628—Programme controls characterised by the control loop
- B25J9/163—Programme controls characterised by the control loop learning, adaptive, model based, rule based expert control
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1656—Programme controls characterised by programming, planning systems for manipulators
- B25J9/1664—Programme controls characterised by programming, planning systems for manipulators characterised by motion, path, trajectory planning
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1679—Programme controls characterised by the tasks executed
- B25J9/1692—Calibration of manipulator
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/0088—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots characterized by the autonomous decision making process, e.g. artificial intelligence, predefined behaviours
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0212—Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory
- G05D1/0221—Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory involving a learning process
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0231—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
- G05D1/0246—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/20—Control system inputs
- G05D1/24—Arrangements for determining position or orientation
- G05D1/243—Means capturing signals occurring naturally from the environment, e.g. ambient optical, acoustic, gravitational or magnetic signals
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/20—Control system inputs
- G05D1/24—Arrangements for determining position or orientation
- G05D1/246—Arrangements for determining position or orientation using environment maps, e.g. simultaneous localisation and mapping [SLAM]
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/60—Intended control result
- G05D1/646—Following a predefined trajectory, e.g. a line marked on the floor or a flight path
-
- A—HUMAN NECESSITIES
- A47—FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
- A47L—DOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
- A47L2201/00—Robotic cleaning machines, i.e. with automatic control of the travelling movement or the cleaning operation
- A47L2201/04—Automatic control of the travelling movement; Automatic obstacle detection
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D2101/00—Details of software or hardware architectures used for the control of position
- G05D2101/10—Details of software or hardware architectures used for the control of position using artificial intelligence [AI] techniques
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D2111/00—Details of signals used for control of position, course, altitude or attitude of land, water, air or space vehicles
- G05D2111/10—Optical signals
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D2111/00—Details of signals used for control of position, course, altitude or attitude of land, water, air or space vehicles
- G05D2111/50—Internal signals, i.e. from sensors located in the vehicle, e.g. from compasses or angular sensors
-
- G05D2201/0203—
Definitions
- the robot further has a processor configured to associate a position on the map with actuation of the first actuator unit. In another variant, the robot includes a processor configured to associate a position on the map with actuation of the second actuator unit.
- the robot further includes a communication unit configured to communicate with a server, wherein the robot sends the map to the server and receives a verification of the quality of the map.
- non-transitory computer-readable storage medium includes instructions that when executed, further cause the processing apparatus to associate the map of the navigable route and surrounding environment with the initialization location.
- the navigation unit of the robot is also configured to determine not to autonomously navigate at least a portion of the navigable route. This determination includes a determination to avoid an obstacle in the environment.
- some implementations include a non-transitory computer-readable storage medium having a plurality of instructions stored thereon, the instructions being executable by a processing apparatus to operate a robot, the instructions configured to, when executed by the processing apparatus, cause the processing apparatus to create a map of a navigable route and surrounding environment during a demonstration of the navigable route to the robot beginning from an initialization location.
- the errors include at least overlapping objects. In some implementations, the errors include failure to form a closed loop. In some implementations, the errors include predetermined patterns in the map.
- FIG. 1C is an overhead view of an alternative example route autonomously navigated by the robot shown in FIGS. 1A and 1B , where the robot avoids objects in accordance with the principles of the present disclosure.
- FIG. 2 is a process flow diagram of an exemplary method for training a robot to autonomously navigate an example route in accordance with the principles of the present disclosure.
- FIG. 3 is a functional block diagram of one exemplary robot in accordance with some implementations of the present disclosure.
- FIG. 4 is a process flow diagram of an exemplary method in which an exemplary robot learns and then travels an example route in accordance with the principles of the present disclosure.
- FIG. 5A is one exemplary user interface for receiving an input from a user in order to begin teaching or choosing an example route in accordance with the principles of the present disclosure.
- FIGS. 5B-5D are overhead views of an exemplary robot detecting an initialization location and initializing an example orientation and example position in accordance with the principles of the present disclosure.
- FIG. 6C illustrates various side elevation views of exemplary body forms for a robot in accordance with the principles of the present disclosure.
- FIG. 6D is an overhead view of a user controlling a robot while the robot senses its surroundings in accordance with the principles of the present disclosure.
- FIGS. 8A-8B illustrate various example mapped objects as they may appear in a map, where FIG. 8A demonstrates one set of example objects that are substantially parallel with one another, while FIG. 8B demonstrates another set of example objects that are not substantially parallel with one another in accordance with the principles of the present disclosure.
- FIG. 9A is an overhead view of an exemplary route discontinuity between route portions of a map in accordance with the principles of the present disclosure.
- FIG. 10 is an overhead view of a mapped portion having exemplary overlapping objects in accordance with the principles of the present disclosure.
- FIG. 11A is an overhead view of a robot travelling in an exemplary closed loop route, where the example initialization location is substantially similar to the example end location in accordance with the principles of the present disclosure.
- FIG. 11B is an exemplary mapping error where a robot associates the mapping error with a corrected route in accordance with the principles of the present disclosure.
- FIG. 12 is an example user interface that can be used for route selection in accordance with the principles of the present disclosure.
- FIG. 13 is a process flow diagram of an exemplary method for operating a robot in accordance with the principles of the present disclosure.
- a robot can include mechanical or virtual entities configured to carry out complex series of actions automatically.
- robots can be electro-mechanical machines that are guided by computer programs or electronic circuitry.
- robots can include electro-mechanical machines that are configured for autonomous navigation, where the robot can move from one location to another with little to no user control.
- Such autonomously navigating robots can include autonomous cars, floor cleaners (e.g., floor scrubbers, vacuums, etc.), rovers, drones, and the like.
- some of the systems and methods described in this disclosure can be implemented in a virtual environment, where a virtual robot can learn demonstrated routes in a simulated environment (e.g., in a computer simulation) with characteristics of the physical world. After learning those routes, the robot can then autonomously navigate the learned routes in the simulated environment and/or in the real world using systems and methods disclosed in this disclosure.
- the systems and methods of this disclosure at least: (i) reduce or eliminate the need for environment-specific programming; (ii) reduce or eliminate the need for highly skilled technicians to program a robot; (iii) provide application-specific performance from a generally programmed robot; (iv) obviate or reduce the need for task-specific programming (e.g., how close to navigate to obstacles for cleaning); and (v) enable effective autonomous navigation of robots.
- a user does not have to program every route beforehand.
- this can allow a user to train a robot to navigate environments that the user had not anticipated beforehand.
- a user may not utilize any particular expertise to train the robot.
- a user may not have to know computer science and/or be educated on how to program the robot.
- a user may just know how to perform the task that he/she desires the robot to do. For example, where the robot is a floor cleaner, the user may just know how to clean the floor, which he/she can demonstrate to the robot.
- training robots to travel routes can allow robots to perform specific tasks to specification without having to identify and program in each of those specifications.
- where a robot is a floor scrubbing unit, a user can demonstrate desired distances (e.g., how closely to clean near obstacles) as the user trains the robot, and the robot, in some cases, can repeat those distances.
- training a robot that can learn a navigable route can allow a robot to be specifically programmed to efficiently navigate a particular environment while also being generally programmed to perform in many environments.
- this allows such robots to have the benefit of both being optimized in particular applications, yet having the ability, and flexibility, to perform in a variety of applications.
- map and routes can be verified and/or validated before navigation. This verification and/or validation can prevent accidents and/or situations where a robot may crash into walls and/or obstacles because of a poor quality map and/or route.
- FIG. 1A illustrates an overhead view of an example route 106 autonomously navigated by robot 102 through implementations of this disclosure.
- Robot 102 can autonomously navigate through environment 100 , which can comprise various objects 108 , 110 , 112 , 118 .
- Robot 102 can start at an initialization location 104 and end at an end location 114 .
- robot 102 can be a robotic floor cleaner, such as a robotic floor scrubber, vacuum cleaner, steamer, mop, sweeper, and the like.
- Environment 100 can be a space having floors that are desired to be cleaned.
- Environment 100 can be a store, warehouse, office building, home, storage facility, etc.
- objects 108 , 110 , 112 , 118 can be shelves, displays, objects, items, people, animals, or any other entity or thing that may be on the floor or otherwise impede the robot's ability to navigate through the environment.
- Route 106 can be the cleaning path traveled by robot 102 .
- Route 106 can follow a path that weaves between objects 108 , 110 , 112 , 118 as illustrated in example route 106 .
- objects 108 , 110 , 112 , 118 are shelves in a store
- robot 102 can go along the aisles of the store and clean the floors of the aisles.
- routes 106, 116, 126 illustrated in FIGS. 1A, 1B, and 1C can appear different from how they are illustrated; they are meant merely as illustrative examples.
- environment 100 is shown; however, it should be appreciated that environment 100 can take on any number of forms and arrangements (e.g., of any size, configuration, and layout of a room or building) and is not limited by this disclosure.
- robot 102 uses its sensors (e.g., sensors 560A-D and/or sensors 568A-B as will be described with reference to FIGS. 5B-E) to sense where it is in relationship to its surroundings. Such sensing may be imprecise in some instances, which may cause robot 102 to not navigate the precise route that had been demonstrated and that robot 102 had been trained to follow. In some cases, small changes to environment 100, such as the moving of shelves and/or changes in the items on the shelves, can cause robot 102 to deviate from route 116 when it autonomously navigates route 106.
- portion 206 includes positioning robot 102 in initialization location 104 once again. This second placement of robot 102 into initialization location 104 can occur at a later point in time after portion 204 , such as substantially right after the demonstration of portion 204 , or at some later time, such as hours later, days later, weeks later, or whenever the user 604 desires to clean the floor.
- FIG. 3 illustrates a functional block diagram of example robot 102 in some implementations.
- robot 102 includes controller 304 , memory 302 , power supply 306 , and operative units 308 , each of which can be operatively and/or communicatively coupled to each other and each other's components and/or subcomponents.
- Controller 304 controls the various operations performed by robot 102 .
- Though a specific architecture is illustrated in FIG. 3, it is appreciated that the architecture may be varied in certain implementations as would be readily apparent to one of ordinary skill given the contents of the present disclosure.
- Controller 304 can include one or more processors (e.g., microprocessors) and other peripherals.
- the terms processor, microprocessor, and digital processor can include any type of digital processing devices such as, without limitation, digital signal processors (“DSPs”), reduced instruction set computers (“RISC”), general-purpose (“CISC”) processors, microprocessors, gate arrays (e.g., field programmable gate arrays (“FPGAs”)), programmable logic devices (“PLDs”), reconfigurable computer fabrics (“RCFs”), array processors, secure microprocessors, and application-specific integrated circuits (“ASICs”).
- Memory 302 can provide instructions and data to controller 304 .
- memory 302 can be a non-transitory, computer-readable storage medium having a plurality of instructions stored thereon, the instructions being executable by a processing apparatus (e.g., controller 304 ) to operate robot 102 .
- the instructions can be configured to, when executed by the processing apparatus, cause the processing apparatus to perform the various methods, features, and/or functionality described in this disclosure.
- controller 304 can perform logical and arithmetic operations based on program instructions stored within memory 302 .
- Operative units 308 can be coupled to controller 304 , or any other controller, to perform the various operations described in this disclosure.
- One or more, or none, of the modules in operative units 308 can be included in some implementations.
- controller 304 can serve as the various controllers and/or processors described.
- different controllers and/or processors can be used, such as controllers and/or processors used particularly for one or more of operative units 308 .
- Controller 304 can send and/or receive signals, such as power signals, control signals, sensor signals, interrogatory signals, status signals, data signals, electrical signals and/or any other desirable signals, including discrete and analog signals to operative units 308 .
- Controller 304 can coordinate and/or manage operative units 308 , and/or set timings (e.g., synchronously or asynchronously), turn on/off, control power budgets, receive/send network instructions and/or updates, update firmware, send interrogatory signals, receive and/or send statuses, and/or perform any operations for running features of robot 102 .
- Operative units 308 can include various units that perform functions for robot 102 .
- units of operative units 308 can include mapping and localization units 312 , sensor units 314 , map evaluation units 324 , actuator units 318 , communication units 316 , navigation units 326 , and user interface units 322 .
- Operative units 308 can also comprise other units that provide the various functionality of robot 102 .
- the units of operative units 308 can be instantiated in software or hardware or both software and hardware.
- units of operative unit 308 can comprise computer-implemented instructions executed by a controller.
- units of operative unit 308 can comprise hardcoded logic.
- units of operative unit 308 can comprise both computer-implemented instructions executed by a controller and hardcoded logic. Where operative units 308 are implemented at least in part in software, operative units 308 can include units/modules of code configured to provide one or more functionalities.
- sensor units 314 can comprise systems that can detect characteristics within and/or around robot 102 .
- Sensor units 314 can include sensors that are internal to robot 102 or external, and/or have components that are partially internal and/or partially external.
- Sensor units 314 can include exteroceptive sensors such as sonar, lidar, radar, lasers, video cameras, infrared cameras, 3D sensors, 3D cameras, and/or any other sensor known in the art.
- Sensor units 314 can also include proprioceptive sensors, such as accelerometers, inertial measurement units, odometers, gyroscopes, speedometers, and the like.
- sensor units 314 can collect raw measurements (e.g., currents, voltages, resistances, gate logic, etc.) and/or transformed measurements (e.g., distances, angles, detected points in obstacles, etc.).
- mapping and localization units 312 can include systems and methods that can computationally construct and update map 700 (as will be described with reference to FIGS. 7A-7B) of environment 100 (or any other generated map of any environment) as robot 102 navigates environment 100 (or any other environment). Mapping and localization units 312 can both map environment 100 and localize robot 102 (e.g., find the position of robot 102) in map 700. At the same time, mapping and localization units 312 can record a demonstrated route (e.g., route 116) in map 700 (e.g., mapped route 716).
- the mapping can be performed by imposing data obtained at least in part by sensor units 314 into a two-dimensional (“2D”), three-dimensional (“3D”), and/or four-dimensional (“4D”) map representative at least in part of the environment 100 .
- map 700 can include depictions representative at least in part of obstacles and/or objects detected by robot 102 .
- Map 700 can also record demonstrated routes, such as mapped route 716 as will be described with reference to FIGS. 7A-7B .
- mapped route 716 can include coordinates (e.g., x and y in a 2D map and x, y, and z in a 3D map) based at least in part on the relative position of robot 102 (e.g., including one or more of location, displacement, and orientation) to a reference, such as initialization location 104 .
- the coordinates can include an orientation (e.g., a displacement angle) of robot 102 at any given point relative to a reference, such as initialization location 104 .
- the term position has its ordinary and customary meaning. For example, in some cases, position can include a location in terms of displacement, coordinates, etc. of an object, robot 102 , etc.
- position can also include an orientation of an object, robot 102 , etc. Accordingly, in some cases, the terms position and pose may be used interchangeably to include one or more of location, displacement, and orientation.
- Map 700 created through the demonstration process, can record substantially the whole environment that robot 102 sensed in one or more demonstrations/trainings. For this reason, some may call map 700 a global map. In some cases, map 700 can be static in that after the demonstration, map 700 is substantially not updated. In some implementations, map 700 and mapped route 716 can also be generated separately (e.g., by a user using a computer) and uploaded onto robot 102 .
- Mapping and localization units 312 can also receive sensor data from sensor units 314 to localize (e.g., position) robot 102 in map 700 .
- mapping and localization units 312 can include localization systems and methods that allow robot 102 to localize itself in the coordinates of map 700. Based at least in part on data from sensor units 314, mapping and localization units 312 can infer the position of robot 102 in the coordinates of map 700 of environment 100. The ability to localize robot 102 with coordinates of map 700 can allow robot 102 to navigate environment 100 using map 700 and approximate where robot 102 is on mapped route 716.
- communication units 316 can include one or more receivers, transmitters, and/or transceivers. Communication units 316 can be configured to send/receive a transmission protocol, such as BLUETOOTH®, ZIGBEE®, Wi-Fi, induction wireless data transmission, radio frequencies, radio transmission, radio-frequency identification (“RFID”), near-field communication (“NFC”), global system for mobile communications (“GSM”), infrared, network interfaces, cellular technologies such as 3G (3GPP/3GPP2), high-speed downlink packet access (“HSDPA”), high-speed uplink packet access (“HSUPA”), time division multiple access (“TDMA”), code division multiple access (“CDMA”) (e.g., IS-95A, wideband code division multiple access (“WCDMA”), etc.), frequency hopping spread spectrum (“FHSS”), direct sequence spread spectrum (“DSSS”), Personal Area Network (“PAN”) (e.g., PAN/802.15), worldwide interoperability for microwave access (“WiMAX”), and/or any other wireless data transmission protocol.
- network interfaces can include any signal, data, or software interface with a component, network, or process including, without limitation, those of FireWire (e.g., FW400, FW800, FWS800T, FWS1600, FWS3200, etc.), universal serial bus (“USB”) (e.g., USB 1.X, USB 2.0, USB 3.0, USB Type-C, etc.), Ethernet (e.g., 10/100, 10/100/1000 (Gigabit Ethernet), 10-Gig-E, etc.), multimedia over coax alliance technology (“MoCA”), Coaxsys (e.g., TVNET™), radio frequency tuner (e.g., in-band or OOB, cable modem, etc.), Wi-Fi (802.11), WiMAX (802.16), PAN (e.g., PAN/802.15), cellular (e.g., 3G, LTE/LTE-A/TD-LTE, GSM, etc.), and/or other network interfaces known in the art.
- Wi-Fi can include one or more of IEEE-Std. 802.11, variants of IEEE-Std. 802.11, standards related to IEEE-Std. 802.11 (e.g., 802.11 a/b/g/n/ac/ad/af/ah/ai/aj/aq/ax/ay), and/or other wireless standards.
- Communication units 316 can also be configured to send/receive a transmission protocol over wired connections, such as any cable that has a signal line and ground.
- cables can include Ethernet cables, coaxial cables, Universal Serial Bus (“USB”), FireWire, and/or any connection known in the art.
- Such protocols can be used by communication units 316 to communicate to external systems, such as computers, smart phones, tablets, data capture systems, mobile telecommunications networks, clouds, servers, or the like.
- Communication units 316 can be configured to send and receive signals comprising numbers, letters, alphanumeric characters, and/or symbols.
- signals can be encrypted using algorithms such as those employing 128-bit or 256-bit keys and/or other encryption algorithms complying with standards such as the Advanced Encryption Standard (“AES”), RSA, Data Encryption Standard (“DES”), Triple DES, and the like.
- Communication units 316 can be configured to send and receive statuses, commands, and other data/information.
- communication units 316 can communicate with a user controller to allow the user to control robot 102 .
- Communication units 316 can communicate with a server/network in order to allow robot 102 to send data, statuses, commands, and other communications to the server.
- the server can also be communicatively coupled to computer(s) and/or device(s) that can be used to monitor and/or control robot 102 remotely.
- Communication units 316 can also receive updates (e.g., firmware or data updates), data, statuses, commands, and other communications from a server for robot 102 and/or its operative units 308 .
- actuator units 318 can include actuators such as electric motors, gas motors, driven magnet systems, solenoid/ratchet systems, piezoelectric systems (e.g., inchworm motors), magnetostrictive elements, gesticulation, and/or any way of driving an actuator known in the art.
- actuators can actuate wheels or other displacement enabling drivers (e.g., mechanical legs, jet engines, propellers, hydraulics, etc.) for robot 102 to navigate through environment 100 or any other environment.
- actuators units 318 can include actuators configured for actions and/or action-specific tasks, such as mobilizing brushes for floor cleaning, moving (e.g., moving up, down, left, right, forward, back) squeegees, turning on/off water, spraying water, turning on/off vacuums, moving vacuum hose positions, gesticulating an arm, raising/lowering a lift, turning a camera and/or any sensor of sensor units 314 , and/or any movement desired for robot 102 to perform an action.
- user interface units 322 can be configured to enable a user (e.g., user 604 or any other user) to interact with robot 102 .
- user interface units 322 can include touch panels, buttons, keypads/keyboards, ports (e.g., USB, DVI, Display Port, E-Sata, Firewire, PS/2, Serial, VGA, SCSI, audioport, HDMI, PCMCIA ports, memory card ports (e.g., SD and miniSD), and/or ports for computer-readable media), mice, rollerballs, consoles, vibrators, audio transducers, and/or any interface for a user to input and/or receive data and/or commands, whether coupled wirelessly or through wires (including, without limitation, any of the wireless or wired connections described in this disclosure, such as with reference to communication units 316 ).
- User interface units 322 can include a display, such as, without limitation, LCDs, LED displays, LED LCD displays, IPSs, cathode ray tubes, plasma displays, HD panels, 4K displays, retina displays, organic LED displays, touchscreens, surfaces, canvases, and/or any displays, televisions, monitors, panels, and/or devices known in the art for visual presentation.
- user interface units 322 can be positioned on the body of robot 102 .
- user interface units 322 can be positioned away from the body of robot 102 , but can be communicatively coupled to robot 102 (e.g., via communication units 316 ) directly or indirectly (e.g., through a network or a cloud).
- map evaluation units 324 can include comparators, signal processors, image processors, and other software or hardware components. As will be described with reference to FIGS. 7A-7B, 8A-8C, 9A-9C, 10, and 11, map evaluation units 324 can analyze and evaluate map 700 (or any other map) to detect mapping errors, determine the quality of map 700 (e.g., high, good, acceptable, poor, and/or any other designation), and/or determine the usability of map 700 for autonomous navigation. In some cases, in analyzing the quality of map 700 or any other map, map evaluation units 324 can determine that there has been a mapping error and/or that the map is of poor quality.
- robot 102 can prompt a user (e.g., user 604 ) using user interface units 322 or through communication units 316 to re-demonstrate a route (e.g., route 116 ), or otherwise re-map environment 100 .
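As an illustration of one such quality check (not part of the patent's disclosure), a map evaluation step might flag the "failure to form a closed loop" error mentioned earlier: a route demonstrated as a loop whose mapped end lies far from its mapped start. The function name and tolerance below are assumptions; a minimal sketch in Python:

```python
import math

def closed_loop_error(route, tolerance=0.5):
    """Heuristic map-quality check (illustrative sketch): flag a mapping
    error when a route that should form a closed loop ends far from its
    initialization location. route is a list of (x, y) map coordinates,
    the first entry being the initialization location; tolerance is an
    assumed threshold in map units."""
    (x0, y0), (xn, yn) = route[0], route[-1]
    gap = math.hypot(xn - x0, yn - y0)
    return gap > tolerance  # True -> likely mapping error

# A demonstrated "loop" that drifts 2.2 map units from its start is
# flagged, and the robot could prompt the user to re-demonstrate the route.
demo_route = [(0.0, 0.0), (4.0, 0.0), (4.0, 3.0), (0.0, 3.0), (2.2, 0.0)]
print(closed_loop_error(demo_route))  # True
```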
- navigation units 326 can include components and/or software configured to provide directional instructions for robot 102 to navigate. Navigation units 326 can process maps and localization information generated by mapping and localization units 312 , sensor data from sensor units 314 , and/or other operative units 308 . For example, navigation units 326 can receive map 700 from mapping and localization units 312 . Navigation units 326 can also receive localization information from mapping and localization units 312 , which can be indicative at least in part of the location of robot 102 within map 700 , including route 716 . Navigation units 326 can also receive sensor data from sensor units 314 which can be indicative at least in part of objects around robot 102 . Using one or more of the map, location, and sensor data, navigation units 326 can instruct robot 102 where to navigate (e.g., go forward, left, right, back, etc.).
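A minimal sketch of the kind of directional instruction navigation units 326 might produce from the robot's localized pose and the next mapped route point; the function, thresholds, and command vocabulary are assumptions, not the patent's implementation:

```python
import math

def steer_toward(pose, waypoint, heading_tolerance=math.radians(10)):
    """Return a coarse command ("forward", "left", or "right") given the
    robot's localized pose (x, y, theta) and the next waypoint (x, y) on
    the mapped route. Thresholds are assumed values."""
    x, y, theta = pose
    wx, wy = waypoint
    bearing = math.atan2(wy - y, wx - x)
    # Smallest signed angle between current heading and bearing to waypoint.
    error = (bearing - theta + math.pi) % (2 * math.pi) - math.pi
    if abs(error) <= heading_tolerance:
        return "forward"
    return "left" if error > 0 else "right"

print(steer_toward((0.0, 0.0, 0.0), (1.0, 1.0)))  # -> "left"
```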
- power supply 306 can include one or more batteries, including, without limitation, lithium, lithium ion, nickel-cadmium, nickel-metal hydride, nickel-hydrogen, carbon-zinc, silver-oxide, zinc-carbon, zinc-air, mercury oxide, alkaline, or any other type of battery known in the art. Certain batteries can be rechargeable, such as wirelessly (e.g., by a resonant circuit and/or a resonant tank circuit) and/or by plugging into an external power source. Power supply 306 can also be any supplier of energy, including wall sockets and electronic devices that convert solar, wind, water, nuclear, hydrogen, gasoline, natural gas, fossil fuels, mechanical energy, steam, and/or any power source into electricity.
- operating system 310 can be configured to manage memory 302 , controller 304 , power supply 306 , modules in operative units 308 , and/or any software, hardware and/or features of robot 102 .
- operating system 310 can include device drivers to manage hardware resources for robot 102 .
- any of the aforementioned components of robot 102 can be instantiated in software and/or hardware.
- a unit/module can be a piece of hardware and/or a piece of code run on a computer.
- FIG. 4 illustrates a process flow diagram of an exemplary method 400 where robot 102 learns a route and then travels that route.
- robot 102 can learn route 116 demonstrated by user 604 .
- robot 102 can autonomously navigate along route 106 or route 126 .
- robot 102 can begin teaching phase 414 by receiving an input from input 574 in user interface 500 illustrated in FIG. 5A .
- User interface 500 can appear on display 576 , which can be a mobile device, specialized device, or any other device with a screen and configured to accept a user input.
- display 576 can be part of user interface units 322 of robot 102 .
- display 576 can be a separate display communicatively coupled to robot 102 , such as, without limitation, communicatively coupled through communication units 316 of robot 102 .
- Input 574 can include buttons, radio buttons, pull-down menus, text input, and/or any way for a user to put in information and/or commands known in the art.
- User interface 500 can also include input 572 , which can be used to initiate autonomous phase 416 , which will be described later in this disclosure.
- Input 572 can include buttons, radio buttons, pull-down menus, text input, or any way for a user to input information and/or commands known in the art.
- robot 102 can detect initialization location 104 and initialize position and/or orientation of robot 102 .
- initialization location 104 is a position relative to the floor and/or floor plan.
- initialization location 104 can be demarcated by a user (e.g., drawn and/or marked physically or digitally) so that robot 102 can use the initialization position of the route training for later route initialization (e.g., in recalling learned routes).
- robot 102 can detect that robot 102 is in initialization location 104 based at least in part on where the user stopped robot 102 .
- in some implementations, initialization location 104 can be demarcated by a transmitter (e.g., a transmitter that transmits communications using RFID, NFC, BLUETOOTH®, radio transmission, radio frequency field, and/or any other communication protocol described in this disclosure). Such a transmitter can have an operable range such that robot 102 can detect a communication from the transmitter only when it is in the starting location.
- the transmission range of NFC can be ten centimeters or less. Accordingly, when robot 102 receives a transmission via NFC, robot 102 can detect that it is positioned in initialization location 104 . In some implementations, robot 102 can receive the transmission from the transmitter and calculate the distance to the transmitter based at least in part on the attenuation of the signal strength. In this way, robot 102 can detect how close it is to the transmitter, and consequently, the position of robot 102 relative to the transmitter and/or initialization location 104 . In some implementations, robot 102 can determine its location by triangulating the signal strength of a plurality of transmitters.
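As a hedged sketch of the attenuation-based distance estimate described above, a log-distance path-loss model is one common way to turn received signal strength into an approximate range; the reference RSSI and path-loss exponent below are assumed calibration values, not figures from the patent:

```python
def distance_from_rssi(rssi_dbm, rssi_at_1m=-45.0, path_loss_exponent=2.0):
    """Estimate the distance (meters) to a transmitter from received
    signal strength using a log-distance path-loss model. rssi_at_1m and
    path_loss_exponent are assumed, environment-specific calibration
    values."""
    return 10 ** ((rssi_at_1m - rssi_dbm) / (10 * path_loss_exponent))

# Under these assumptions, a -65 dBm reading implies roughly 10 meters.
print(round(distance_from_rssi(-65.0), 1))  # 10.0
```

With several transmitters at known positions, the same per-transmitter distance estimates could feed the triangulation mentioned above.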
- initialization location 104 can be demarcated by a sign (e.g., markings, symbols, lines, etc.) on the floor. When robot 102 detects that sign with its sensors, robot 102 can detect that it is positioned in initialization location 104.
- in some implementations, a camera can be positioned on the ceiling, where the camera can be communicatively coupled (e.g., through communication units 316) to robot 102.
- the camera can be part of sensor units 314 .
- the camera can determine the position/pose of robot 102 through image processing and/or machine learning and communicate the position/pose to robot 102 .
- the camera can recognize when robot 102 is in initialization location 104 through image processing and/or machine learning and communicate to robot 102 that robot 102 is in initialization location 104.
- robot 102 can detect the presence of robot 102 in initialization position 104 and/or determine robot's 102 relative positioning and/or orientation to one or more surrounding objects.
- in order to detect robot's 102 presence in initialization position 104 and initialize its orientation and/or position, robot 102 can use, at least in part, its sensors (e.g., sensor units 314) to sense its surroundings. These sensors can sense characteristics of the surrounding environment, such as objects (e.g., items, walls, etc.), floors, ceilings, persons and things, signs, surfaces, etc. The relative position and/or orientation of sensed objects in its surroundings can allow the robot to get its bearings relative to its initialization location.
- robot 102 can have other sides as well, corresponding to the surfaces of robot 102 , which can vary by shape (e.g., rectangular, pyramidal, humanoid, or any other designed shape).
- front side 502 can be positioned on the forward-facing side of robot 102 , where the forward-facing side is forward in the direction of forward movement of robot 102 .
- Back side 504 can be positioned on the backward-facing side of robot 102 , where the backward-facing side is the side facing in substantially the opposite direction of the forward facing side.
- Right side 508 can be the right-hand side relative to front side 502
- left side 506 can be the left-hand side relative to front side 502 .
- Sensors 560 A- 560 D can be positioned orthogonal to a side (e.g., front side 502 , right side 508 , left side 506 , and back side 504 , top side 564 , bottom side (not pictured), and/or any other side) or be placed at an angle.
- the angle can be determined by the desired objects to be sensed and the range, focal plane, region-of-interest, and/or other characteristics of each of sensors 560 A- 560 D.
- a sonar sensor can emit acoustic signals that fan out in a spread (e.g., a multi-lobed pattern, fan, or other characteristic shape of the sensor) from the sonar sensor.
- sensor 560 B can also use filters to see reflected ambient infrared light.
- sensor 560 B can be a 3D sensor configured to emit and receive energy to sense the environment in three dimensions.
- energy pattern 580 B can represent at least in part the characteristic energy emitted, reflected, and/or received by sensor 560 B.
- the one or more sensors 560 A- 560 D can measure or approximate distance 516 to point 590 of object 512 .
- sensor 560 A can be a sonar sensor that can measure distance by measuring the time difference of an original emitted sound wave and the reflection of that sound wave back to sensor 560 A, where the temporal difference between the emitted and reflected sound waves can be scaled to distance using the speed of sound.
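The time-of-flight scaling just described reduces to halving the round-trip time and multiplying by the speed of sound; a small worked sketch (the speed of sound is assumed for air at room temperature):

```python
SPEED_OF_SOUND = 343.0  # m/s in air at ~20 degrees C (assumed)

def sonar_distance(round_trip_seconds):
    """Convert the time between an emitted pulse and its received echo
    into a one-way distance: the pulse travels out and back, so the
    round-trip time is halved before scaling by the speed of sound."""
    return SPEED_OF_SOUND * round_trip_seconds / 2.0

print(sonar_distance(0.01))  # 10 ms round trip -> 1.715 m
```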
- the one or more sensors 560 A- 560 D can create a map 700 , as will later be described, where map 700 includes object 512 as well as, in some implementations, a learned route.
- Distance 516 can be approximated based at least in part on approximate measurements taken on map 700 , such as by using relative units on map 700 or scaling the relative units of the map 700 to absolute distance measurements.
- robot 102 can record its position and/or orientation (e.g., distance 516 and/or angle 514 ) relative to object 512 , and/or point 590 therein in memory 302 and associate its position with respect to object 512 and/or point 590 with initialization position 104 . In this way, robot 102 can later both detect initialization position 104 and initialize position with respect to object 512 and/or point 590 when subsequently returning to initialization position 104 .
- the detection of initialization position 104 and the initialization of position can be performed by mapping and localization units 312 .
- FIG. 5C illustrates an overhead view of robot 102 positioned at an angle in initialization location 104 .
- sensor 560 A of robot 102 can measure distance 524 at angle 518 to point 591 of object 512 using systems and methods substantially similar to how sensor 560 A measured distance 516 and angle 514 described with reference to FIG. 5B .
- FIG. 5C illustrates that a plurality of sensors 560 A- 560 D can independently measure distances and angles to object 512 .
- sensor 560 B can measure distance 522 and angle 520 to point 592 of object 512 using systems and methods substantially similar to how sensor 560 A of robot 102 measured distance 516 and angle 514 described with reference to FIG. 5B .
- robot 102 can detect initialization position 104 and/or initialize the position and/or orientation of robot 102 with respect to object 512 .
- robot 102 can record robot's 102 position and/or orientation (e.g., one or more of distances 516 , 522 and angle 514 , 520 ) relative to object 512 , and/or points 591 , 592 therein, in memory 302 and associate robot's 102 position and/or orientation with respect to object 512 and/or points 591 , 592 with initialization position 104 .
- robot 102 can later both detect initialization position 104 and initialize robot's 102 position and/or orientation with respect to object 512 and/or points 591 , 592 when subsequently returning to initialization position 104 .
- FIG. 5D illustrates an overhead view of example robot 102 where a plurality of example objects 512 , 546 , 548 , 550 are used to detect initialization location 104 and/or initialize the orientation and/or position of robot 102 .
- robot 102 can also measure distance 558 and angle 540 relative to point 594 of object 546 , distance 554 and angle 542 relative to point 596 of object 548 , and distance 556 and angle 544 relative to point 598 of object 550 .
- robot 102 can detect initialization position 104 and initialize robot 102 's position and/or orientation with respect to one or more of objects 512 , 546 , 548 , 550 , and/or points 590 , 594 , 596 , 598 therein.
- robot 102 can record its position and/or orientation (e.g., distances 516 , 558 , 554 , 556 and/or angles 514 , 540 , 542 , 544 ) relative to one or more points 590 , 594 , 596 , 598 of objects 512 , 546 , 548 , 550 in memory 302 and associate robot's 102 position and/or orientation with respect to one or more of objects 512 , 546 , 548 , 550 , and/or points 590 , 594 , 596 , 598 therein, with initialization position 104 . Accordingly, robot 102 can later both detect initialization position 104 and initialize robot's 102 position and/or orientation when subsequently returning to initialization position 104 .
- Using a plurality of objects 512 , 546 , 548 , 550 to detect initialization location 104 can be advantageous in allowing robot 102 to more precisely locate initialization location 104 .
- Using a plurality of objects 512, 546, 548, 550 can also provide additional uniqueness to initialization location 104, which can aid robot 102 in detecting initialization location 104 and/or reduce the chances that robot 102 mistakes a different location for initialization location 104.
- robot 102 can initialize exteroceptive sensors 568 A- 568 B.
- Initialization of sensors 568 A- 568 B can comprise zeroing sensors 568 A- 568 B, setting sensors 568 A- 568 B to an initial value, or storing in memory 302 the current value of sensors 568 A- 568 B.
- exteroceptive sensors 568 A- 568 B can initialize relative to a reference point.
- robot 102 can initialize exteroceptive sensors 568 A- 568 B relative to point 590 such that point 590 is treated as the origin (e.g., (0, 0) in a 2D map or (0, 0, 0) in a 3D map). Accordingly, robot 102 can measure distance 516 and angle 514 to point 590 and determine the initial position and/or orientation of robot 102 relative to the origin. This determination can be performed by mapping and localization units 312 .
- robot 102 can then determine its coordinates (e.g., (x, y) in a 2D map or (x, y, z) in a 3D map) using trigonometry on the vector (e.g., distance 516 and angle 514 ).
- the x-coordinate can be the cosine of angle 514 multiplied by distance 516 in some cases.
- the y-coordinate can be the sine of angle 514 multiplied by distance 516 in some cases.
- Another point such as, without limitation, one of points 591, 592, 594, 596, 598, can similarly be used as the origin, and trigonometry used with the corresponding vector (e.g., distances 516, 524, 522, 558, 554, 556 and/or angles 514, 518, 520, 540, 542, 544) as illustrated and/or described with respect to FIGS. 5B-5D.
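A minimal sketch of the coordinate initialization described in the preceding bullets, treating a sensed point (e.g., point 590) as the origin; the function name is an assumption:

```python
import math

def position_from_landmark(distance, angle_rad):
    """Initialize the robot's (x, y) coordinates relative to a landmark
    treated as the map origin: x = distance * cos(angle) and
    y = distance * sin(angle), per the trigonometry described above."""
    return distance * math.cos(angle_rad), distance * math.sin(angle_rad)

# Example: a landmark measured 2.0 map units away at a 30-degree bearing
# yields coordinates of roughly (1.732, 1.0) relative to that origin
# (sign conventions depend on the frame chosen).
x, y = position_from_landmark(2.0, math.radians(30))
print(round(x, 3), round(y, 3))  # 1.732 1.0
```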
- there can be multiple origins so that a plurality of points (e.g., two or more of points 590 , 591 , 592 , 594 , 596 , 598 ) can initialize robot 102 .
- Using multiple origins may be desirable to create multiple maps, provide multiple origins from which to choose for computational simplicity, provide a check of sensors in case one or more have an incorrect reading, and other benefits.
- IMUs can include accelerometers, magnetometers, angular rate sensors, and the like.
- where sensor 568A includes a lidar, the displacement (and corresponding position) can be determined based on position differences of different images at different times. In some cases, scan matching can be used to determine position.
- Sensors 568 A- 568 B can also include one or more odometers to measure the distance traveled by robot 102 .
- Robot 102 can be trained to associate (e.g., and later perform) an action and/or actuation with a position and/or trajectory on map 700 .
- brush 608 can be actuated by actuator units 318 , wherein brush 608 can turn on/off and/or be raised/lowered by actuator units 318 .
- Robot 102 can learn actuations of brush 608 as the user controls brush 608 while recording route 716 and map 700 .
- map 700 can comprise actuator instructions for actuation of brush 608 at one or more positions and/or trajectories on map 700 and/or route 716 therein.
- robot 102 can also have one or more squeegee 616 .
- Squeegee 616 can be a rubber piece, such as a rubber-edged blade, to clean or scrape the floor. Actuator units 318 can also be used to raise/lower squeegee 616 . Accordingly, robot 102 can learn actuations of squeegee 616 as the user controls it while recording route 116 and map 700 .
- map 700 can comprise actuator instructions for actuation of squeegee 616 at one or more locations and/or trajectories on map 700 .
- actuation of other instruments of a scrubber, or any other robot form can also be similarly learned, such as turning on/off water, spraying water, turning on/off vacuums, moving vacuum hose positions, gesticulating an arm, raising/lowering a lift, turning a camera and/or any sensor of sensor units 314 , and/or any movement desired for robot 102 to perform an action.
- robot 102 would not perform those actions and/or actuator instructions each time it passes a position (e.g., where it loops around and passes the same physical location multiple times), but would only perform such actions and/or actuator instructions when it passes the position (e.g., location) either in a particular direction or at particular instance(s) in the route.
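To make the position-and-direction condition concrete, here is a hedged sketch of actuator instructions keyed to route positions, as might be recorded while the user demonstrates brush control; all names, records, and thresholds are hypothetical:

```python
import math

# Hypothetical records: actuate the brush only when the robot passes the
# recorded map position while heading in roughly the recorded direction.
actuation_points = [
    {"pos": (5.0, 2.0), "heading": math.radians(90), "command": "brush_on"},
    {"pos": (9.0, 2.0), "heading": math.radians(90), "command": "brush_off"},
]

def due_actuations(pose, radius=0.3, heading_tol=math.radians(30)):
    """Yield commands whose recorded position and pass direction match the
    robot's current pose (x, y, theta); thresholds are assumed values."""
    x, y, theta = pose
    for point in actuation_points:
        px, py = point["pos"]
        near = math.hypot(px - x, py - y) <= radius
        err = (point["heading"] - theta + math.pi) % (2 * math.pi) - math.pi
        if near and abs(err) <= heading_tol:
            yield point["command"]

# Passing (5.1, 2.0) while heading ~90 degrees triggers only "brush_on";
# passing the same spot in the opposite direction would trigger nothing.
print(list(due_actuations((5.1, 2.0, math.radians(88)))))  # ['brush_on']
```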
- Body form 654 can be motorized, enabling it to move with little to no user exertion upon body form 654 besides steering. The user may steer body form 654 as it moves.
- Body form 656 can include a seat, pedals, and a steering wheel, where a user can drive body form 656 like a vehicle as body form 656 cleans.
- Body form 658 can have a shape that is larger than body form 656 and can have a plurality of brushes.
- Body form 660 can have a partial or fully encased area where a user sits as he/she drives body form 660 .
- Body form 662 can have a platform where a user stands while he/she drives body form 662 .
- FIG. 6C illustrates some additional examples of body forms of robot 102 .
- body form 664 illustrates an example where robot 102 is a stand-up shop vacuum.
- Body form 666 illustrates an example where robot 102 is a humanoid robot having an appearance substantially similar to a human body.
- Body form 668 illustrates an example where robot 102 is a drone having propellers.
- Body form 670 illustrates an example where robot 102 has a vehicle shape having wheels and a passenger cabin.
- Body form 672 illustrates an example where robot 102 is a rover.
- robot 102 can be configured in any number of ways for control by user 604 . As illustrated, user 604 can walk behind robot 102 and steer robot 102 using steering wheel 610 . In other implementations, robot 102 can be a ride-on floor cleaner (not pictured) where user 604 can ride on a seat or standing platform of robot 102 and control robot 102 . In some implementations, user 604 can control robot 102 remotely with a remote control, such as a radio remote, mobile device, joystick, or any other apparatus for navigation known in the art.
- This control can include turning left, turning right, moving forward (e.g., using a gas pedal or telling robot 102 to go in a forward direction), moving backwards (e.g., using a reverse pedal or telling robot 102 to go in a backward direction), turn on/off, raise/lower brush, turn on/off water, etc.
- user 604 may control actuator units 318 , which drives movement of robot 102 , raises/lowers brushes, turns on/off water, etc.
- robot 102 may not be a floor cleaner, but may be any of the other robots described in this disclosure.
- FIG. 6D illustrates a top down view as user 604 controls example robot 102 , and robot 102 senses its surroundings.
- Robot 102 can use one or more of sensors 560 A- 560 D and other sensors to detect objects and map the surroundings of robot 102 as robot 102 navigates route 116 .
- robot 102 can emit energy waves 580 A- 580 C.
- Energy 580 B was described earlier in this disclosure with reference to FIG. 5E as well as elsewhere throughout the disclosure.
- Energy waves 580 A, 580 C can be substantially similar to energy wave 580 B, where energy wave 580 A corresponds to sensor 560 A and energy wave 580 C corresponds to sensor 560 C.
- pixels of map 700 can have one or more states, where the pixel state is indicative at least in part of a characteristic of the position/location in environment 100 represented by that pixel.
- pixels of map 700 can be binary, where a first pixel state (e.g., pixel value) is indicative at least in part of a clear (e.g., navigable) location, and a second pixel state is indicative at least in part of a blocked (e.g., not navigable) location.
- a pixel value of zero (0) can be indicative at least in part of a clear location and a pixel value of one (1) can be indicative at least in part of a blocked location.
- Pixels of map 700 can also store more than a single value, or pixel state.
- each pixel of map 700 can store a plurality of values such as values stored in a vector or matrix. These values can include values indicative at least in part of the position/pose (e.g., including location and/or orientation) of robot 102 when the position is measured at a point (e.g., pixel) along route 716 . These values can also include whether robot 102 should clean or not clean a position/location, or other actions that should be taken by robot 102 .
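- By way of illustration and not limitation, the following Python sketch shows one way such pixel states could be represented; the data layout, names, and values are hypothetical and are not prescribed by this disclosure.

```python
import numpy as np

# Hypothetical binary pixel states: 0 = clear (navigable), 1 = blocked.
CLEAR, BLOCKED = 0, 1

# A small 100 x 100 occupancy grid, initialized as clear.
occupancy = np.zeros((100, 100), dtype=np.uint8)
occupancy[40:60, 20] = BLOCKED  # e.g., a wall segment detected by a sensor

# A pixel can also store a plurality of values: here, a pose
# (x, y, theta) and an action flag (1 = clean at this pixel).
route_values = {(50, 30): {"pose": (5.0, 3.0, 1.57), "clean": 1}}

def is_navigable(grid, row, col):
    """Return True if the pixel state indicates a clear location."""
    return grid[row, col] == CLEAR

print(is_navigable(occupancy, 50, 20))   # False: part of the wall
print(route_values[(50, 30)]["clean"])   # 1: clean at this route pixel
```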
- Robot 102 can travel along route 116 (pictured in FIG. 1B ), which can be reflected in map 700 as route 716 .
- Robot 102 can be represented by robot indicator 702 on map 700 , where the position of robot indicator 702 in map 700 can reflect at least in part the relative location of robot 102 in environment 100 .
- robot 102 can determine its position and/or orientation relative to initialization location 104 , or another reference point (e.g., objects 512 , 546 , 548 , 550 , points 590 , 591 , 592 , 594 , 596 , 598 , and/or any other reference point robot 102 used during initialization at initialization location 104 ).
- Initialization location 104 can be represented on map 700 as mapped position 724 .
- End location 114 can be represented on map 700 as mapped position 726 .
- robot 102 can measure or approximate its distance from initialization location 104 (or another reference point) using odometry, where it uses proprioceptive sensors 568 A- 568 B (e.g., wheel encoders (e.g., rotary encoders), visual odometry, IMUs (including accelerometers, magnetometers, angular rate sensors, and the like), etc.) to track its movements since its initialization at initialization location 104 .
- proprioceptive sensors 568 A- 568 B can be wheel encoders that measure or estimate distance based on the revolution of the wheels of robot 102 .
- visual odometers can be used to measure or estimate the distance traveled and/or orientation of robot 102 through sequential images taken by a camera.
- the visual odometers can construct an optical flow field (e.g., using Lucas-Kanade methods or other methods) and estimate camera motion, such as by using Kalman filters or projection.
- IMUs can be used to measure or estimate the position and/or orientation of robot 102 .
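- By way of illustration and not limitation, the following is a minimal Python sketch of wheel-encoder odometry for a differential-drive base; the encoder resolution, wheel radius, and track width are assumed values, not parameters of robot 102 .

```python
import math

TICKS_PER_REV = 1024   # hypothetical encoder resolution (ticks/revolution)
WHEEL_RADIUS = 0.1     # meters (assumed)
TRACK_WIDTH = 0.5      # distance between the wheels, meters (assumed)

def ticks_to_dist(ticks):
    """Convert encoder ticks to distance traveled by one wheel."""
    return 2 * math.pi * WHEEL_RADIUS * ticks / TICKS_PER_REV

def update_pose(x, y, theta, left_ticks, right_ticks):
    """Integrate one pair of encoder readings into the pose estimate."""
    d_left = ticks_to_dist(left_ticks)
    d_right = ticks_to_dist(right_ticks)
    d_center = (d_left + d_right) / 2.0
    d_theta = (d_right - d_left) / TRACK_WIDTH
    x += d_center * math.cos(theta + d_theta / 2.0)
    y += d_center * math.sin(theta + d_theta / 2.0)
    return x, y, theta + d_theta

# Track movement since initialization at the initialization location.
pose = (0.0, 0.0, 0.0)
for left, right in [(100, 100), (100, 120), (80, 80)]:  # fake readings
    pose = update_pose(*pose, left, right)
print(pose)
```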
- Robot 102 can record route 716 in map 700 , as robot indicator 702 progresses along map 700 in a substantially similar way as robot 102 navigates through environment 100 .
- map 700 and route 716 are created together, wherein robot 102 maps the environment 100 and records route 716 at substantially similar times.
- map 700 and route 716 can be paired together wherein each recorded route is stored only with a particular map.
- robot 102 can change a corresponding pixel on route 716 in map 700 to a pixel state indicating the pixel is part of a navigable route.
- robot 102 can also measure robot's 102 position and/or orientation relative to objects using one or more sensors 560 A- 560 D, using systems and methods substantially similar to those described with reference to sensors 560 A- 560 D with respect to FIGS. 5A-5E .
- robot 102 can detect and/or measure robot's 102 position and/or orientation relative to objects, such as shelves or walls, in order to populate map 700 , where robot 102 can change pixel states based at least in part on these measurements and detections by robot 102 .
- robot 102 can use sensors 560 A- 560 D to detect and/or measure the position and/or orientation of those objects in a plurality of directions relative to robot 102 .
- robot 102 can use sensors 568 A- 568 B to estimate robot's 102 position (e.g., distance traveled) and/or orientation.
- sensor 560 B which can be positioned on front side 502 of robot 102 , can have range 704 .
- robot 102 can detect objects at front side 502 up to range 704 .
- sensors 560 A, 560 C, 560 D can each have ranges and detect objects within those ranges.
- robot 102 can indicate on map 700 the location of pixels that correspond to detected objects.
- Such pixels can be turned to a state that is indicative at least in part that those pixels correspond to objects (e.g., a pixel state indicative of a blocked location or an object).
- map 700 can have certain artifacts. For example, walls that appear smooth can appear jagged based at least in part on the signals received by the sensors. For instance, where sensors 560 A- 560 D include sonars, lidars, or other sensors that depend on the reflectance of sound, light, or other elements from surfaces, there can be variability within the surface. There can also be motion artifacts and other artifacts and/or distortions.
- sensors 560 A- 560 D may not sense certain areas.
- an object can impede the ability of robot 102 to sense an area, or the area may appear in a blind spot (e.g., a place not covered by the measuring range of the sensors).
- box 706 highlights, on map 700 , measurements taken by robot 102 as it made turn 708 .
- sensors 560 A- 560 D measured the area marked white (e.g., as navigable locations) in box 706 ; however, certain objects impeded the range of the sensors, creating the elongated, fractured appearance illustrated in box 706 .
- robot 102 can generate map 700 comprising a representation of route 116 and the surrounding environment 100 of route 116 within the range of the sensors of robot 102 .
- FIG. 7B illustrates example map 700 once completed.
- robot 102 can record mapped route 716 and map the surrounding environment of mapped route 716 in map 700 in one demonstration. Accordingly, map 700 can allow robot 102 to navigate route 116 (or a route substantially similar to route 116 ) again autonomously in as few as one demonstration.
- robot 102 can determine mapping errors in map 700 . This determination can be performed by map evaluation units 324 .
- Because robot 102 desirably travels route 116 autonomously (e.g., in autonomous phase 416 ) after a single demonstration generating map 700 , determining if there have been mapping errors in map 700 can allow robot 102 to avoid, e.g., collisions, errors, and/or any negative consequences of inaccurate or incorrect mapping.
- robot 102 can send (e.g., via user interface units 322 ) an alert, alarm, prompt and/or other indication to a user (e.g., user 604 or another user) indicating that the map is poor quality.
- robot 102 can send an alert, alarm, prompt or other indication to the user to re-demonstrate a route (e.g., by performing portions 402 , 404 again).
- determining errors and/or evaluating the quality of map 700 prior to autonomous navigation can save time and prevent damage by ensuring that robot 102 does not crash into an obstacle or become stuck due to robot 102 's mapping.
- There are a number of ways in which robot 102 can detect mapping errors and/or evaluate the quality of map 700 (including route 716 ), each way implemented alone or in combination.
- not every mapping error or the presence of mapping errors means that map 700 is of poor quality and/or cannot be used to navigate autonomously.
- map 700 can have many errors and still be fit for use for autonomous navigation. Rather, portion 406 can be used to determine if map 700 is sufficiently flawed such that robot 102 cannot or should not navigate autonomously based at least in part on map 700 .
- The following gives some illustrative examples of ways robot 102 can make such an evaluation.
- robot 102 in detecting mapping errors and/or evaluating the quality of map 700 , can take into account at least in part characteristics of errors in map 700 .
- robot 102 can detect mapping errors and/or evaluate the quality of map 700 with little or no input and/or effort by user 604 . This can create a seamless experience that further emphasizes and reinforces the autonomy of robot 102 to user 604 .
- robot 102 can transmit map 700 to a server, control center, mobile device, and/or any interface for a user/viewer to verify map 700 and/or route 716 .
- the viewer can view map 700 on a display, such as a screen, computer monitor, television, and the like, and/or any display in user interface units 322 .
- the viewer can also communicate back to robot 102 , where such communication can be indicative at least in part of whether map 700 and/or route 716 are acceptable for autonomous navigation.
- robot 102 can transmit map 700 using communication units 316 , which can send map 700 and receive communications indicative at least in part of whether map 700 and/or route 716 are acceptable to use for autonomous navigation.
- an interface for the user (e.g., user interface units 322 ) can be on robot 102 , wherein the user can view map 700 and/or route 716 and provide an input indicative at least in part of whether map 700 and/or route 716 are acceptable for autonomous navigation.
- objects 108 , 110 , 112 may appear parallel as mapped objects 808 , 810 , 812 , as illustrated in FIG. 8A . Accordingly, where robot 102 instead maps mapped objects 858 , 860 , 862 , as illustrated in FIG. 8B , robot 102 may find that there has been an error in map 700 .
- Robot 102 can detect such particular patterns on a pixel-by-pixel or region-by-region basis. In some cases, robot 102 can use image processing, such as segmentation, edge detection, shape recognition, and/or other techniques to identify one or more objects 858 , 860 , 862 in map 700 . Once objects 858 , 860 , 862 are identified, robot 102 can use various methods to determine whether objects 858 , 860 , 862 are approximately parallel to others of objects 858 , 860 , 862 . Robot 102 can then measure the orientations and/or positions of objects 858 , 860 , 862 , such as the distances and/or relative angles between objects 858 , 860 , 862 . Based at least in part on the measured orientations and/or positions, robot 102 can determine if objects 858 , 860 , 862 are approximately parallel or not.
- Robot 102 can measure the distance between each of points 864 , 866 , 868 of object 862 and points 890 , 892 , 894 of object 860 , and compare those distances to determine, at least in part, if objects 860 , 862 are approximately parallel. For example, if the differences of the distances between point 866 and point 892 , and between point 868 and point 894 , are above a predetermined threshold (e.g., a threshold indicative of possible deviations in measurements or in the actual location of approximately parallel shelves, such as, without limitation, a 5%, 10%, or 15% difference), robot 102 can find that objects 860 , 862 are not approximately parallel.
- the predetermined threshold can be stored in memory 302 .
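- By way of illustration and not limitation, the following sketch compares distances between corresponding points of two mapped objects against such a threshold; the sampled points and the 10% figure are hypothetical.

```python
import math

def approximately_parallel(points_a, points_b, threshold=0.10):
    """Compare distances between corresponding points on two mapped
    objects; if the distances vary from the first distance by more than
    the threshold fraction, treat the objects as not approximately
    parallel."""
    dists = [math.dist(a, b) for a, b in zip(points_a, points_b)]
    ref = dists[0]
    return all(abs(d - ref) / ref <= threshold for d in dists)

# Hypothetical sampled points on two mapped shelf edges.
object_a = [(0.0, 0.0), (0.0, 1.0), (0.0, 2.0)]
object_b = [(2.0, 0.0), (2.1, 1.0), (2.6, 2.0)]    # drifts apart
print(approximately_parallel(object_a, object_b))  # False
```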
- FIG. 8C illustrates an example mask 870 that can be used to search map 700 for parallel objects, such as objects 808 , 810 , 812 .
- Mask 870 can be a structural template that can be visualized as a matrix, wherein each cell of the matrix represents pixels or groups of pixels of map 700 , and their corresponding pixel states. As used in certain applications in the art, mask 870 can also be referred to as a filter.
- Mask 870 can be stored in memory 302 and/or part of software configured to process map 700 .
- mask 870 can be sized (e.g., as an m × n matrix with m pixels in the x direction and n pixels in the y direction) based at least in part on map 700 and the size of objects 808 , 810 , 812 .
- the size of mask 870 can be predetermined based at least in part on a percentage of the total pixel dimensions (e.g., 5%, 10%, 15%, 20%, 25%, or more) of map 700 , or based at least in part on known approximate measurements of objects 808 , 810 , 812 .
- mask 870 can change in size through iterations of search methods, where mask 870 begins searching map 700 as a first size, and then searches map 700 again as a second size, and searches map 700 again as a third size, and so on and so forth for a predetermined number of times. For example, mask 870 can begin as a larger mask and in subsequent iterations become a smaller mask. Note, the size of mask 870 illustrated in FIG. 8C is for illustration purposes and may not be to scale.
- Cell 872 of the matrix can align sequentially with one or more or all of the pixels of map 700 .
- the other cells of mask 870 can also align with the surrounding pixels in map 700 .
- Each pixel aligned from map 700 can be compared to the corresponding pixel of mask 870 to detect the similarities between mask 870 and the region of map 700 to which it is aligned.
- mask 870 defines structures 876 , 878 , which can be indicative at least in part of parallel objects (e.g., two of objects 808 , 810 , 812 ).
- each of the cells of structures 876 , 878 can have a value indicative at least in part of an object of map 700 (e.g., indicative at least in part of the pixel state for an object in map 700 ).
- Between structures 876 , 878 can be structure 880 , whose cells can have values indicative of a clear location.
- robot 102 can generate an indication (e.g., message, value, or command) that robot 102 has found matches between mask 870 and map 700 and/or the location of such matches. In some cases, where too few matches are found (e.g., based on a predetermined number of expected items to be found), robot 102 can detect mapping errors in map 700 and/or determine that map 700 is not of good quality.
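- By way of illustration and not limitation, the following sketch slides a small hypothetical mask over a binary map and reports positions where the mask and the underlying pixels agree above a chosen fraction; the mask shape, agreement fraction, and expected match count are all assumed values.

```python
import numpy as np

CLEAR, BLOCKED = 0, 1

# Hypothetical 3 x 5 mask: two parallel rows of object pixels
# (structures) separated by a clear corridor.
mask = np.array([[BLOCKED] * 5,
                 [CLEAR] * 5,
                 [BLOCKED] * 5])

def find_matches(grid, mask, min_fraction=0.9):
    """Slide the mask over the map and record positions where at least
    min_fraction of the mask cells agree with the underlying pixels."""
    mh, mw = mask.shape
    matches = []
    for r in range(grid.shape[0] - mh + 1):
        for c in range(grid.shape[1] - mw + 1):
            window = grid[r:r + mh, c:c + mw]
            if np.mean(window == mask) >= min_fraction:
                matches.append((r, c))
    return matches

grid = np.zeros((10, 10), dtype=int)
grid[2, 2:7] = BLOCKED   # two parallel shelf-like objects
grid[4, 2:7] = BLOCKED

matches = find_matches(grid, mask)
print(matches)           # [(2, 2)]

EXPECTED_MATCHES = 1     # hypothetical expectation for this environment
if len(matches) < EXPECTED_MATCHES:
    print("possible mapping error: too few parallel structures found")
```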
- Route discontinuity 904 can be indicative at least in part of an error because robot 102 likely did not go from route portion 902 A to route portion 902 B, or vice versa, without going into any space in-between. In some cases, route discontinuity 904 may not be an issue for robot 102 to navigate mapped route 716 because robot 102 can travel across clear space 908 from route portion 902 A to route portion 902 B without issue. However, route discontinuity 904 , by itself or in combination with other route discontinuities and/or errors, can be indicative at least in part of mapping errors and/or the quality of map 700 (e.g., that map 700 is of poor quality).
- robot 102 can consider the size of route discontinuity 904 (e.g., the number of pixels, the distance, etc. of route discontinuity 904 ) and also if there are other route discontinuities elsewhere in map 700 . In some cases, where route discontinuity 904 is of a size above a predetermined size threshold (e.g., stored in memory 302 ), robot 102 can detect mapping errors and/or determine that map 700 is of poor quality.
- the predetermined size threshold can be measured in absolute distance measurements using standard units, such as inches, feet, meters, or any other unit of measurement (e.g., measurements in the metric, US, or other system of measurement) or measured in relative (or non-absolute) units, such as ticks, pixels, percentage of range of a sensor, and the like.
- This predetermined size threshold can be determined at least in part on one or more factors including: the signal resolution and/or fidelity of sensors (e.g., of sensor units 314 ) of robot 102 ; the complexity of environment 100 ; empirical correlations between route discontinuities with robot 102 and mapping errors/poor map quality; the ability of robot 102 to navigate with route discontinuity 904 ; and/or other factors.
- a highly complex environment 100 may strain the mapping and localizing capabilities (e.g., of mapping and localization units 312 ) of robot 102 , and discontinuity 904 may be expected, thus the predetermined size threshold may be relatively high.
- a relatively simple environment 100 may not strain the mapping and localizing capabilities of robot 102 , and route discontinuity 904 may not be expected, thus the predetermined size threshold may be relatively low.
- robot 102 may have prior maps (or maps aggregated on a server) whose map quality (and/or lack of mapping errors) have been independently evaluated (e.g., by a user or other person).
- Robot 102 can then consider the correlation between the size of route discontinuities in those maps and their evaluated quality when determining the predetermined size threshold used to detect mapping errors and/or evaluate the quality of map 700 based at least in part on discontinuity 904 and/or other route discontinuities.
- the predetermined size threshold may be based at least in part on the ability of robot 102 to navigate map 700 . After route discontinuity 904 becomes larger than a predetermined size threshold, robot 102 may no longer be able to navigate map 700 , thus robot 102 can detect mapping errors and/or determine map 700 is of poor quality. In any case of detected error and/or determination of poor quality, robot 102 can then prompt user 604 to demonstrate the route again (e.g., via user interface units 322 ).
- route discontinuity 904 may be one of a plurality of route discontinuities of map 700 .
- Robot 102 can consider these other route discontinuities. If the number of route discontinuities is above a predetermined number threshold (e.g., stored in memory 302 ), robot 102 can detect mapping errors and/or determine that map 700 is of poor quality. For example, this predetermined number threshold can be determined at least in part on one or more factors including: the signal resolution and/or fidelity of sensors (e.g., of sensor units 314 ) of robot 102 ; the complexity of environment 100 ; empirical correlations between route discontinuities with robot 102 and mapping errors/map quality; the ability of robot 102 to navigate with route discontinuity 904 ; and/or other factors.
- route discontinuity 904 For example, if the signal resolution and/or fidelity of sensors of robot 102 are low, robot 102 can expect that there will be some route discontinuity in mapping (e.g., route discontinuity 904 ). The presence of these route discontinuities might not be indicative at least in part of mapping errors and/or poor map quality, thus the predetermined number threshold could be relatively high. In contrast, where the signal resolution and/or fidelity of sensors of robot 102 are high, discontinuity 904 may be unexpected, and the presence of route discontinuities might be indicative at least in part of mapping errors and/or poor map quality, thus the predetermined number threshold could be relatively low.
- a highly complex environment 100 may strain the mapping and localizing capabilities (e.g., of mapping and localization units 312 ) of robot 102 , and route discontinuity 904 may be expected, thus the predetermined number threshold may be relatively high.
- a relatively simple environment 100 may not strain the mapping and localizing capabilities of robot 102 , and route discontinuity 904 may not be expected, thus the predetermined number threshold may be relatively low.
- robot 102 may have prior maps (or maps aggregated on a server) whose map quality (and/or lack of mapping errors) have been independently evaluated (e.g., by a user or other person).
- Robot 102 can then consider the correlation between the number of route discontinuities in those maps and their evaluated quality when determining the predetermined number threshold used to detect mapping errors and/or evaluate the quality of map 700 based at least in part on route discontinuity 904 and/or other route discontinuities.
- the predetermined number threshold may be based at least in part on the ability of robot 102 to navigate map 700 . After the number of route discontinuities substantially like route discontinuity 904 exceeds the predetermined number threshold, robot 102 may no longer be able to navigate map 700 , thus robot 102 can detect mapping errors and/or determine map 700 is of poor quality. In any case of detected error and/or determination of poor quality, robot 102 can then prompt the user 604 to demonstrate the route again (e.g., via user interface units 322 ).
- hybrid thresholds can be used where the above described predetermined size threshold and predetermined number threshold are used in combination.
- the predetermined number threshold, above which map 700 is determined to contain mapping errors and/or be of poor quality, may be based at least in part on the number of route discontinuities above the predetermined size threshold.
- robot 102 can then prompt user 604 to demonstrate the route again (e.g., via user interface units 322 ).
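- By way of illustration and not limitation, the following sketch finds gaps between consecutive route pixels and applies both a size threshold and a number threshold of the kind described above; the thresholds and route pixels are hypothetical.

```python
import math

def route_discontinuities(route_pixels, max_step=1.5):
    """Report gaps between consecutive route pixels; adjacent pixels
    (including diagonals, ~1.41 apart) are treated as continuous."""
    gaps = []
    for (r1, c1), (r2, c2) in zip(route_pixels, route_pixels[1:]):
        step = math.hypot(r2 - r1, c2 - c1)
        if step > max_step:
            gaps.append(((r1, c1), (r2, c2), step))
    return gaps

SIZE_THRESHOLD = 5    # pixels; hypothetical predetermined size threshold
COUNT_THRESHOLD = 3   # hypothetical predetermined number threshold

route = [(0, 0), (0, 1), (0, 2), (0, 9), (0, 10)]  # one gap of 7 pixels
gaps = route_discontinuities(route)
too_big = [g for g in gaps if g[2] > SIZE_THRESHOLD]
if too_big or len(gaps) > COUNT_THRESHOLD:
    print("possible mapping errors: map may be of poor quality")
```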
- FIG. 9B illustrates example object discontinuity 924 between example object portion 926 A and example object portion 926 B of example mapped portion 920 .
- Mapped portion 920 can be a portion of map 700 .
- route portion 922 may not have any route discontinuities.
- object discontinuity 924 can be indicative of an error because object discontinuity 924 is likely an unmapped portion of mapped portion 920 in a position where it should have been mapped.
- object discontinuity 924 may not be an issue for robot 102 to navigate because robot 102 could detect the presence of the object with its sensors as it navigates through route portion 922 .
- object discontinuity 924 by itself or in combination with other discontinuities and/or other characteristics of mapping errors, can be indicative of mapping errors and/or a poor quality map.
- robot 102 can consider the size of object discontinuity 924 (e.g., the number of pixels, the distance, etc. of object discontinuity 924 ) and also if there are other object discontinuities elsewhere in map 700 . In some cases, where object discontinuity 924 is of a size above a predetermined size threshold (e.g., stored in memory 302 ), robot 102 can determine that map 700 has mapping errors and/or is of poor quality.
- this predetermined size threshold can be determined at least in part on one or more factors including: the signal resolution and/or fidelity of sensors (e.g., of sensor units 314 ) of robot 102 ; the complexity of environment 100 ; empirical correlations between object discontinuities with robot 102 and mapping errors/map quality; the ability of robot 102 to navigate with object discontinuity 924 ; and/or other factors. For example, if the signal resolution and/or fidelity of sensors of robot 102 are low, robot 102 can expect that there will be some object discontinuity in mapping (e.g., object discontinuity 924 ) and such object discontinuities could be of a larger size.
- Accordingly, the predetermined size threshold could be relatively high. In contrast, where the signal resolution and/or fidelity of sensors of robot 102 are high, object discontinuity 924 may be unexpected, and even a discontinuity of a small size might be indicative at least in part of mapping errors and/or poor map quality, thus the predetermined size threshold could be relatively low.
- a highly complex environment 100 may strain the mapping and localizing capabilities (e.g., of mapping and localization units 312 ) of robot 102 , and object discontinuity 924 may be expected, thus the predetermined size threshold may be relatively high.
- a relatively simple environment 100 may not strain the mapping and localizing capabilities of robot 102 , and object discontinuity 924 may not be expected, thus the predetermined size threshold may be relatively low.
- robot 102 may have prior maps (or maps aggregated on a server) whose map quality (and/or lack of mapping errors) have been independently evaluated (e.g., by a user or other person). Robot 102 can then consider the correlation between the size of object discontinuities in those maps and their evaluated quality when determining the predetermined size threshold used to detect mapping errors and/or evaluate the quality of map 700 based at least in part on object discontinuity 924 and other object discontinuities.
- the predetermined size threshold may be based at least in part on the ability of robot 102 to navigate map 700 . After object discontinuity 924 becomes larger than a predetermined size, robot 102 may no longer be able to navigate map 700 , thus robot 102 can detect mapping errors and/or determine map 700 is of poor quality. In any case of detected error and/or determination of poor quality, robot 102 can then prompt the user to demonstrate the route again (e.g., via user interface units 322 ).
- object discontinuity 924 may be one of a plurality of object discontinuities of map 700 .
- Robot 102 can consider these other object discontinuities. If the number of object discontinuities is above a predetermined number threshold (e.g., stored in memory 302 ), robot 102 can detect mapping errors and/or determine that map 700 is of poor quality. For example, this predetermined number threshold can be determined at least in part on one or more factors including: the signal resolution and/or fidelity of sensors (e.g., of sensor units 314 ) of robot 102 ; the complexity of environment 100 ; empirical correlations between object discontinuities with robot 102 and mapping errors/map quality; the ability of robot 102 to navigate with object discontinuity 924 ; and/or other factors.
- For example, where the signal resolution and/or fidelity of sensors of robot 102 are low, some object discontinuities can be expected, thus the predetermined number threshold could be relatively high. In contrast, where the signal resolution and/or fidelity are high, object discontinuity 924 may be unexpected, and the presence of object discontinuities might be indicative at least in part of mapping errors and/or poor map quality, thus the predetermined number threshold could be relatively low.
- a highly complex environment 100 may strain the mapping and localizing capabilities (e.g., of mapping and localization units 312 ) of robot 102 , and object discontinuity 924 may be expected, thus the predetermined number threshold may be relatively high.
- a relatively simple environment 100 may not strain the mapping and localizing capabilities of robot 102 , and object discontinuity 924 may not be expected, thus the predetermined number threshold may be relatively low.
- robot 102 may have prior maps (or maps aggregated on a server) whose map quality (and/or lack of mapping errors) have been independently evaluated (e.g., by a user or other person).
- Robot 102 can then consider the correlation between the number of object discontinuities in those maps and their evaluated quality when determining the predetermined number threshold used to detect mapping errors and/or evaluate the quality of map 700 based at least in part on object discontinuity 924 and other discontinuities.
- the predetermined number threshold may be based at least in part on the ability of robot 102 to navigate map 700 . After the number of object discontinuities substantially like object discontinuity 924 exceeds the predetermined number threshold, robot 102 may no longer be able to navigate map 700 , thus robot 102 can detect mapping errors and/or determine map 700 is of poor quality. In any case of detected error and/or determination of poor quality, robot 102 can then prompt the user to demonstrate the route again (e.g., via user interface units 322 ).
- hybrid thresholds can be used where the above described predetermined size threshold and predetermined number threshold are used in combination.
- the predetermined number threshold, above which map 700 is determined to have mapping errors and/or be of poor quality, may be based at least in part on the number of object discontinuities above the predetermined size threshold.
- robot 102 can then prompt user 604 to demonstrate the route again (e.g., via user interface units 322 ).
- FIG. 9C illustrates an example mapped portion 920 that has discontinuity 934 , which includes both a route discontinuity and an object discontinuity.
- Mapped portion 920 can be a portion of map 700 .
- Discontinuity 934 can be a discontinuity between route portion 930 and route portion 932 .
- Discontinuity 934 can also be a discontinuity in object 936 .
- both route discontinuities and object discontinuities can be indicative at least in part of mapping errors and/or poor map quality.
- When robot 102 evaluates map 700 , robot 102 can consider either route discontinuities or object discontinuities, or both together, in detecting mapping errors and/or determining the quality of map 700 .
- robot 102 can evaluate the amount of overlap between items (e.g., routes, obstacles, or other objects) in map 700 in detecting mapping errors and/or determining the quality of map 700 .
- FIG. 10 illustrates example mapped portion 1000 having overlapping objects 1002 , 1004 , 1006 .
- Mapped portion 1000 can be a portion of map 700 .
- objects 1002 , 1004 , 1006 can be walls, objects, shelves, etc. that robot 102 detected while creating map 700 .
- Based at least in part on the measured position and orientation of robot 102 , robot 102 mapped objects 1002 , 1004 , 1006 .
- Where mapped objects 1002 , 1004 , 1006 overlap in ways that would not be expected in environment 100 , robot 102 can determine that there has been an error in mapping. In identifying such areas of overlap, robot 102 can examine map 700 pixel-by-pixel or region-by-region. In some cases, robot 102 can use a mask and/or filter to find predetermined shapes within map 700 (e.g., substantially similar to mask 870 modified to look for the predetermined shape). The predetermined shapes can be based at least in part on known errors of robot 102 in mapping, such as previously observed transformations of object locations and/or sensor errors.
- Overlap can also be identified at least in part by a heavy density of detected objects 1002 , 1004 , 1006 in and/or around a pixel or region of pixels.
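- By way of illustration and not limitation, the following sketch flags regions of a map whose object-pixel density exceeds a chosen threshold, one simple proxy for unexpected overlap; the block size and density threshold are assumed values.

```python
import numpy as np

def dense_regions(grid, block=4, density_threshold=0.8):
    """Report block x block regions whose object-pixel density exceeds
    the threshold; unusually dense clumps can indicate overlapping
    mapped objects."""
    hits = []
    for r in range(0, grid.shape[0] - block + 1, block):
        for c in range(0, grid.shape[1] - block + 1, block):
            density = grid[r:r + block, c:c + block].mean()
            if density >= density_threshold:
                hits.append((r, c, density))
    return hits

grid = np.zeros((12, 12), dtype=int)
grid[0:4, 0:4] = 1          # several objects mapped on top of each other
print(dense_regions(grid))  # [(0, 0, 1.0)]
```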
- robot 102 can also detect irregularities in the shapes appearing in map 700 .
- robot 102 can detect entrapped spaces, such as space 1008 .
- space 1008 may be a clear, traveled to, and/or navigable space. Space 1008 would not normally occur between objects 1002 , 1004 , 1006 because robot 102 would not have access to space 1008 as mapped. Accordingly, robot 102 can determine that map 700 has mapping errors and/or is of poor quality if it detects space 1008 .
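- By way of illustration and not limitation, the following sketch detects such entrapped spaces by flood-filling the clear pixels reachable from a known-navigable pixel and flagging any clear pixels left over; the grid and starting pixel are hypothetical.

```python
from collections import deque

def reachable_clear(grid, start):
    """Flood-fill the clear (0) pixels reachable from start."""
    rows, cols = len(grid), len(grid[0])
    seen, queue = {start}, deque([start])
    while queue:
        r, c = queue.popleft()
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if (0 <= nr < rows and 0 <= nc < cols
                    and grid[nr][nc] == 0 and (nr, nc) not in seen):
                seen.add((nr, nc))
                queue.append((nr, nc))
    return seen

# 1 = object pixel, 0 = clear. The clear pixel at (2, 2) is walled in.
grid = [[0, 0, 0, 0, 0],
        [0, 1, 1, 1, 0],
        [0, 1, 0, 1, 0],
        [0, 1, 1, 1, 0],
        [0, 0, 0, 0, 0]]

reachable = reachable_clear(grid, (0, 0))
entrapped = [(r, c) for r in range(5) for c in range(5)
             if grid[r][c] == 0 and (r, c) not in reachable]
print(entrapped)  # [(2, 2)] -> flag a possible mapping error
```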
- robot 102 can detect jagged overhangs 1010 , 1012 .
- the irregularity of the shape can allow robot 102 to determine that there has been a mapping error in one or more of objects 1002 , 1004 , 1006 because such overhangs would not normally occur in environment 100 . Accordingly, based at least in part on the irregularity of overhangs 1010 , 1012 , robot 102 can detect mapping errors and/or determine that map 700 is of poor quality.
- robot 102 (and/or the route robot 102 travels) can be represented in map 700 as passing through objects. Because it is unlikely that robot 102 would pass through objects, such an occurrence can be indicative at least in part of a mapping error.
- robot 102 can identify mapping errors and/or the quality of map 700 by comparing map 700 with data from at least one of robot 102 's sensors.
- map 700 was generated using at least in part one or more of sensors 560 A- 560 D and one or more of sensors 568 A- 568 B.
- a check on the accuracy of map 700 can compare map 700 to data recorded by fewer than all of sensors 560 A- 560 D and sensors 568 A- 568 B.
- one or more of sensors 568 A-B can determine the odometry of robot 102 .
- a representation of a route of robot 102 based only on the odometry can be considered a map in the odometry frame.
- This map in the odometry frame can be compared to map 700 , such as using a comparator, subtraction, and/or any other method of comparing maps in this disclosure. If the deviation between the map in the odometry frame and map 700 exceeds a predetermined threshold (e.g., more than 40%, 50%, 60%, or any percentage determined based at least in part on empirical determinations of a correlation to poor map quality), robot 102 can determine that there were mapping errors and/or map 700 was of poor quality.
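- By way of illustration and not limitation, the following sketch compares a route in the odometry frame against the corresponding route in the map and applies a deviation threshold; the routes, the normalization by route extent, and the 40% figure are all assumptions.

```python
import math

def route_deviation(route_map, route_odom):
    """Mean pointwise deviation between the route as recorded in the
    map and the route in the odometry frame, expressed as a fraction
    of the route's bounding extent."""
    dists = [math.dist(p, q) for p, q in zip(route_map, route_odom)]
    xs = [p[0] for p in route_map]
    ys = [p[1] for p in route_map]
    extent = max(max(xs) - min(xs), max(ys) - min(ys)) or 1.0
    return sum(dists) / len(dists) / extent

route_map = [(0, 0), (5, 0), (10, 0), (10, 5)]
route_odom = [(0, 0), (5, 3), (10, 7), (14, 12)]  # odometry drifts away

DEVIATION_THRESHOLD = 0.40  # hypothetical, e.g., 40%
if route_deviation(route_map, route_odom) > DEVIATION_THRESHOLD:
    print("possible mapping errors: map may be of poor quality")
```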
- robot 102 can be configured to travel in a closed loop (e.g., the end location is substantially similar to the initialization location). It should be noted that robot 102 may not always travel in a closed loop.
- FIG. 1A illustrated a route that did not form a closed loop because initialization location 104 was illustrated as not in substantially the same location as end location 114 .
- FIG. 11A illustrates robot 102 travelling in example closed loop route 1104 , where location 1102 is both the initialization location and the end location. In this case, if the map of route 1104 did not have the initialization location and end location approximately at location 1102 , robot 102 can detect mapping errors and/or determine that the map was of poor quality.
- a predetermined distance threshold (e.g., stored in memory 302 ) can be used. If the mapped initialization location and end location are not within the predetermined distance threshold of one another (e.g., if the distance between the initialization location and end location exceeds the predetermined distance threshold), robot 102 can detect mapping errors and/or determine the map is of poor quality.
- This predetermined distance threshold can be determined based at least in part on the size of the map (e.g., the predetermined distance threshold can be a percentage of map size), sensor resolution and/or fidelity, and/or other factors.
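- By way of illustration and not limitation, the following sketch applies such a predetermined distance threshold to the mapped initialization and end locations of a closed loop; the threshold value is hypothetical.

```python
import math

DISTANCE_THRESHOLD = 0.5  # meters; hypothetical, e.g., a fraction of map size

def closed_loop_ok(init_location, end_location,
                   threshold=DISTANCE_THRESHOLD):
    """For a route demonstrated as a closed loop, the mapped end
    location should land near the mapped initialization location."""
    return math.dist(init_location, end_location) <= threshold

print(closed_loop_ok((0.0, 0.0), (0.2, 0.1)))   # True: loop closes
print(closed_loop_ok((0.0, 0.0), (3.0, -2.0)))  # False: flag map errors
```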
- robot 102 can have an uploaded map of the environment stored in memory 302 .
- Robot 102 can then compare map 700 to the uploaded map.
- robot 102 can utilize one or more comparators of map evaluation units 324 that compares map 700 with an uploaded map on a pixel-by-pixel or region-by-region basis.
- The uploaded map and/or map 700 may be resized to facilitate the comparison.
- Where the comparison reveals substantial differences, robot 102 can determine that there have been mapping errors and/or that map 700 is of poor quality. Consequently, robot 102 can prompt the user 604 to demonstrate the route again (e.g., robot 102 can perform portion 404 again).
- a percentage similarity can be computed between the uploaded map and map 700 , where the percentage similarity reflects, at least in part, how similar the uploaded map is to map 700 .
- If the percentage similarity falls below a predetermined threshold (e.g., 70%, 80%, 90%, or any percentage indicative at least in part of substantial similarity between the uploaded map and map 700 ), robot 102 can determine that there has been a mapping error and/or that map 700 is of poor quality. Consequently, robot 102 can prompt (e.g., via user interface units 322 ) user 604 to demonstrate the route again (e.g., robot 102 can perform portion 404 again).
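- By way of illustration and not limitation, the following sketch computes a pixel-by-pixel percentage similarity between two equally sized maps and applies a hypothetical 90% threshold.

```python
import numpy as np

def percent_similarity(map_a, map_b):
    """Pixel-by-pixel agreement between two equally sized maps."""
    return float(np.mean(map_a == map_b)) * 100.0

uploaded = np.zeros((50, 50), dtype=np.uint8)
recorded = uploaded.copy()
recorded[5:30, 5:30] = 1  # a large region mapped differently

SIMILARITY_THRESHOLD = 90.0  # hypothetical, e.g., 90%
if percent_similarity(uploaded, recorded) < SIMILARITY_THRESHOLD:
    print("possible mapping error: prompt the user to re-demonstrate")
```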
- the uploaded map can be analyzed for shapes (e.g., shapes of objects or clear spaces).
- Map 700 can be analyzed for those same shapes to determine, at least in part, if those same shapes are present in map 700 .
- a mask and/or filter can be used for the search in some implementations (e.g., substantially similar to mask 870 modified to look for the shapes). If the shapes from the uploaded map are not found in map 700 , then robot 102 can determine that there has been a mapping error and/or that map 700 is of poor quality. Consequently, robot 102 can prompt (e.g., via user interface units 322 ) the user 604 to demonstrate the route again (e.g., robot 102 can perform portion 404 again).
- map 700 can be analyzed for shapes (e.g., shapes of objects or clear spaces), and the uploaded map analyzed to see if those same shapes are present.
- If the shapes from map 700 are not found in the uploaded map, robot 102 can determine that there has been a mapping error and/or that map 700 is of poor quality and prompt (e.g., via user interface units 322 ) the user 604 to demonstrate the route again (e.g., robot 102 can perform portion 404 again).
- robot 102 can analyze map 700 for certain expected characteristics/features of environment 100 .
- robot 102 might expect aisles and/or rows of shelves. Where robot 102 does not detect objects indicative of aisles and/or rows of shelves, or detects too few or too many, robot 102 can determine that map 700 may be of poor quality and/or contain mapping errors. As another example, there may be a certain level of expectation about the complexity of an environment. Where map 700 has too many turns or too few turns, robot 102 can determine that map 700 may be of poor quality and/or contain mapping errors. As another example, environment 100 can have an expected size.
- Where the size of map 700 deviates substantially from that expected size, robot 102 can determine that map 700 may be of poor quality and/or contain mapping errors.
- robot 102 can prompt a user (e.g., user 604 or a user with access to the map on a server) to verify map 700 . Accordingly, robot 102 can send map 700 to the server and receive a verification of the quality of the map.
- machine learning algorithms can be used, wherein robot 102 (e.g., controller 304 of robot 102 ) learns to identify good maps and bad maps.
- robot 102 can have a library of maps that have been identified (e.g., hand labeled or machine labeled) as good maps and bad maps.
- robot 102 can then learn to associate characteristics robot 102 determines across its library as being indicative of a good map or a bad map.
- Where such learned associations indicate a bad map, robot 102 can determine that there has been a mapping error and/or that map 700 is of poor quality and prompt (e.g., via user interface units 322 ) the user 604 to demonstrate the route again (e.g., robot 102 can perform portion 404 again).
- robot 102 can also correct errors in a map 700 of poor quality. For example, in some cases, where robot 102 did not travel exactly in a closed loop (e.g., closed loop route 1104 ), the difference between the initialization location and end location can be used to correct the odometry of robot 102 . For example, robot 102 can take the difference between the initialization location and end location and determine that the difference is indicative of how much the odometry drifted from the actual route traveled. Accordingly, robot 102 can adjust a recorded route to take into account that determined drift.
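- By way of illustration and not limitation, the following sketch redistributes the measured end-point drift of a closed loop linearly along the recorded route; linear redistribution is one simple assumption, not a correction method prescribed by this disclosure.

```python
def correct_drift(route, true_end):
    """Linearly redistribute the odometry drift along a recorded route.
    The recorded end point should coincide with true_end (e.g., the
    initialization location of a closed loop); the difference is taken
    as accumulated drift and removed proportionally along the route."""
    (ex, ey), (rx, ry) = true_end, route[-1]
    dx, dy = rx - ex, ry - ey  # total drift
    n = len(route) - 1
    return [(x - dx * i / n, y - dy * i / n)
            for i, (x, y) in enumerate(route)]

recorded = [(0, 0), (4, 0.5), (8, 1.0), (4, 1.5), (0.8, 2.0)]
corrected = correct_drift(recorded, true_end=(0, 0))
print(corrected[-1])  # (0.0, 0.0): the loop now closes
```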
- mapping errors can result in patterns that robot 102 can associate with at least a portion of a corrected map, which can be a version of map 700 with one or more errors corrected.
- FIG. 11B illustrates an example where example robot 102 associates an example mapping error with an example corrected route 1108 .
- map 700 can contain a series of drifted routes of substantially similar shapes, such as mapped routes 1106 A- 1106 N, where N is indicative that any number of mapped routes 1106 A- 1106 N can be mapped.
- Robot 102 can determine that such drifted mapped routes 1106 A- 1106 N can be indicative at least in part of a user 604 navigating the same route over and over again.
- robot 102 can then correct mapped routes 1106 A- 1106 N to mapped route 1108 , which is indicative of user 604 navigating the same route repeatedly.
- Where map 700 contains mapped routes 1106 A- 1106 N, robot 102 can correct mapped routes 1106 A- 1106 N to mapped route 1108 in map 700 .
- By learning to recognize error patterns (e.g., drifts and/or other errors), robot 102 can correct errors of map 700 .
- Robot 102 can also use machine learning to learn to associate errors with corrections of those errors.
- robot 102 can store in memory 302 and/or on a server maps with errors.
- user 604 can first demonstrate a route.
- the map created of the route and the surrounding environment can contain mapping errors.
- user 604 may remap the environment and/or route.
- robot 102 can have a version of a poor quality map (e.g., with mapping errors that would prevent successful navigation) and a version that is not of poor quality (e.g., without mapping errors that would prevent successful navigation).
- Robot 102 can then associate at least a portion of the poor quality map with a corresponding portion of the remapped version that is not of poor quality. Based on one or more substantially similar associations, robot 102 can learn to identify a mapping error that has occurred and then produce at least a portion of the corrected map once it has recognized the mapping error.
- robot 102 can then enter autonomous phase 416 .
- robot 102 can detect initialization location 104 and initialize the position and/or orientation of robot 102 .
- a user can bring robot 102 to initialization location 104 by driving robot 102 , remote controlling robot 102 , steering robot 102 , pushing robot 102 , and/or any other control, such as any control that drives actuator units 318 .
- robot 102 can return to initialization location 104 autonomously.
- robot 102 can store in memory 302 the location of initialization location 104 (e.g., as previously described with reference to FIGS. 5B-5E ) and return to that location.
- robot 102 can detect initialization location 104 in a way substantially similar to the systems and methods it used to detect initialization location 104 in portion 402 described with reference to FIGS. 5B-5E as well as elsewhere throughout this disclosure.
- robot's 102 position relative to, for example, one or more of objects 512 , 546 , 548 , 550 will have been stored in memory 302 (e.g., from portion 402 ).
- robot 102 can determine that robot 102 is in initialization location 104 .
- robot 102 can detect it is in initialization location 104 based at least in part on where the user stopped robot 102 . As such, robot 102 can assume that where the user stopped, and subsequently selected a route (as will be described with reference to portion 410 ), is initialization location 104 .
- In some implementations, there can be a transmitter at or near initialization location 104 (e.g., a transmitter that transmits communications using RFID, NFC, BLUETOOTH®, radio transmission, radio frequency field, and/or any other communication protocol described in this disclosure), and robot 102 can detect that it is on top of, or substantially close to, the transmitter.
- the transmitter can have an operable range such that robot 102 can detect a communication from the transmitter only when it is in the starting location.
- the transmission range of NFC can be ten centimeters or less. Accordingly, when robot 102 receives a transmission via NFC, robot 102 can detect that it is positioned in initialization location 104 .
- robot 102 can receive the transmission from the transmitter and calculate the distance to the transmitter based at least in part on the attenuation of the signal strength. In this way, robot 102 can detect how close it is to the transmitter, and consequently, the position of robot 102 relative to the transmitter and/or initialization location 104 .
- robot 102 can determine its location by triangulating the signal strength of a plurality of transmitters.
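- By way of illustration and not limitation, the following sketch estimates distance from signal attenuation using a log-distance path-loss model and triangulates a 2D position from three transmitters by least squares; the transmit power, path-loss exponent, and transmitter locations are assumed values.

```python
import numpy as np

def distance_from_rssi(rssi_dbm, tx_power_dbm=-40.0, path_loss_exp=2.0):
    """Estimate distance (meters) from signal attenuation using a
    log-distance path-loss model: rssi = tx_power - 10*n*log10(d)."""
    return 10 ** ((tx_power_dbm - rssi_dbm) / (10.0 * path_loss_exp))

def trilaterate(anchors, dists):
    """Least-squares 2D position from three or more transmitter
    locations and estimated distances (linearized circle equations)."""
    (x0, y0), d0 = anchors[0], dists[0]
    A, b = [], []
    for (xi, yi), di in zip(anchors[1:], dists[1:]):
        A.append([2 * (xi - x0), 2 * (yi - y0)])
        b.append(d0**2 - di**2 + xi**2 - x0**2 + yi**2 - y0**2)
    pos, *_ = np.linalg.lstsq(np.array(A), np.array(b), rcond=None)
    return pos

print(distance_from_rssi(-60.0))  # 10.0 meters under the assumed model

anchors = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]
true_pos = np.array([3.0, 4.0])
dists = [np.linalg.norm(true_pos - np.array(a)) for a in anchors]
print(trilaterate(anchors, dists))  # approximately [3. 4.]
```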
- initialization location 104 can be demarcated by a sign (e.g., markings, symbols, lines, etc.) on the floor.
- When robot 102 detects such a sign (e.g., using one or more of sensor units 314 ), robot 102 can detect that robot 102 is positioned in initialization location 104 .
- robot 102 can then select a recorded route to navigate autonomously.
- the selection of the recorded route (e.g., route 116 ) by robot 102 can be based at least in part on user input.
- a user can select input 572 on user interface 500 (illustrated in FIG. 5A ) on display 576 , where input 572 can allow a user to select a recorded route of robot 102 .
- FIG. 12 illustrates example interface 1200 , which can be used for route selection.
- Interface 1200 can present a plurality of routes for selection displayed as selectable inputs 1202 A- 1202 F.
- a user may select one of selectable inputs 1202 A- 1202 F via touch (e.g., in the case display 576 includes a touch screen) and/or any other input mechanism of user interface units 322 .
- input 1202 F can correspond with mapped route 716 learned by robot 102 .
- robot 102 can then select map 700 and mapped route 716 (which is based upon the user's demonstration of route 116 ) based at least in part on the user's selection.
- robot 102 can automatically select a recorded route based on the initialization location it detected in portion 408 .
- initialization location 104 can be associated with only demonstrated route 116 (or as mapped as mapped route 716 ).
- robot 102 can have other initialization locations associated with other demonstrated routes.
- having a plurality of initialization locations can allow a user to demonstrate, and allow robot 102 to move autonomously through, a variety of routes.
- robot 102 can more quickly begin autonomous navigation with minimal additional user input.
- robot 102 can then travel autonomously along the selected recorded route in portion 410 .
- robot 102 can travel autonomously using map 700 and mapped route 716 .
- robot 102 can actuate various instruments on robot 102 , such as brush 608 and/or squeegee 616 as learned during portion 404 and/or recorded in map 700 .
- the actuation of learned actions of instruments of a scrubber, or any other robot form, can also similarly be performed, such as turning on/off water, spraying water, turning on/off vacuums, moving vacuum hose positions, gesticulating an arm, raising/lowering a lift, turning a camera and/or any sensor of sensor units 314 , and/or any movement desired for robot 102 to perform an action.
- computer and/or computing device can include, but are not limited to, personal computers (“PCs”) and minicomputers, whether desktop, laptop, or otherwise, mainframe computers, workstations, servers, personal digital assistants (“PDAs”), handheld computers, embedded computers, programmable logic devices, personal communicators, tablet computers, mobile devices, portable navigation aids, J2ME equipped devices, cellular telephones, smart phones, personal integrated communication or entertainment devices, and/or any other device capable of executing a set of instructions and processing an incoming data signal.
- computer program and/or software can include any sequence of human or machine cognizable steps which perform a function.
- Such computer program and/or software may be rendered in any programming language or environment including, for example, C/C++, C#, Fortran, COBOL, MATLAB™, PASCAL, Python, assembly language, markup languages (e.g., HTML, SGML, XML, VoXML), and the like, as well as object-oriented environments such as the Common Object Request Broker Architecture (“CORBA”), JAVA™ (including J2ME, Java Beans, etc.), Binary Runtime Environment (e.g., BREW), and the like.
- the term “including” should be read to mean “including, without limitation,” “including but not limited to,” or the like; the term “comprising” as used herein is synonymous with “including,” “containing,” or “characterized by,” and is inclusive or open-ended and does not exclude additional, unrecited elements or method steps; the term “having” should be interpreted as “having at least;” the term “such as” should be interpreted as “such as, without limitation;” the term “includes” should be interpreted as “includes but is not limited to;” the term “example” is used to provide exemplary instances of the item in discussion, not an exhaustive or limiting list thereof, and should be interpreted as “example, but without limitation;” adjectives such as “known,” “normal,” “standard,” and terms of similar meaning should not be construed as limiting the item described to a given time period or to an item available as of a given time, but instead should be read to encompass known, normal, or standard technologies that may be available or known now or at any time in the future.
- a group of items linked with the conjunction “and” should not be read as requiring that each and every one of those items be present in the grouping, but rather should be read as “and/or” unless expressly stated otherwise.
- a group of items linked with the conjunction “or” should not be read as requiring mutual exclusivity among that group, but rather should be read as “and/or” unless expressly stated otherwise.
- the terms “about” or “approximate” and the like are synonymous and are used to indicate that the value modified by the term has an understood range associated with it, where the range can be ±20%, ±15%, ±10%, ±5%, or ±1%.
- a result (e.g., measurement value) that is described as close to a value can mean, for example, that the result is within 80% of the value, within 90% of the value, within 95% of the value, or within 99% of the value.
- the terms “defined” or “determined” can include “predefined” or “predetermined” and/or otherwise determined values, conditions, thresholds, measurements, and the like.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Radar, Positioning & Navigation (AREA)
- Automation & Control Theory (AREA)
- Aviation & Aerospace Engineering (AREA)
- Remote Sensing (AREA)
- General Physics & Mathematics (AREA)
- Mechanical Engineering (AREA)
- Robotics (AREA)
- Artificial Intelligence (AREA)
- Evolutionary Computation (AREA)
- Multimedia (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Electromagnetism (AREA)
- Business, Economics & Management (AREA)
- Health & Medical Sciences (AREA)
- Game Theory and Decision Science (AREA)
- Medical Informatics (AREA)
- Software Systems (AREA)
- Mathematical Physics (AREA)
- Fuzzy Systems (AREA)
- Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
- Electric Vacuum Cleaner (AREA)
Priority Applications (9)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US15/152,425 US20170329347A1 (en) | 2016-05-11 | 2016-05-11 | Systems and methods for training a robot to autonomously travel a route |
| EP17796888.0A EP3454705A4 (en) | 2016-05-11 | 2017-05-11 | SYSTEMS AND METHOD FOR TRAINING A ROBOT TO AUTONOMOUSLY DRIVE A ROUTE |
| KR1020187035942A KR102355750B1 (ko) | 2016-05-11 | 2017-05-11 | Systems and methods for training a robot to autonomously travel a route |
| JP2019511831A JP6949107B2 (ja) | 2016-05-11 | 2017-05-11 | Systems and methods for training a robot to autonomously travel a route |
| CN201780041655.5A CN109414142B (zh) | 2016-05-11 | 2017-05-11 | Systems and methods for training a robot to autonomously travel a route |
| PCT/US2017/032273 WO2017197190A1 (en) | 2016-05-11 | 2017-05-11 | Systems and methods for training a robot to autonomously travel a route |
| CA3023552A CA3023552A1 (en) | 2016-05-11 | 2017-05-11 | Systems and methods for training a robot to autonomously travel a route |
| US16/168,368 US11467602B2 (en) | 2016-05-11 | 2018-10-23 | Systems and methods for training a robot to autonomously travel a route |
| US17/961,926 US12393196B2 (en) | 2016-05-11 | 2022-10-07 | Systems and methods for training a robot to autonomously travel a route |
Applications Claiming Priority (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US15/152,425 US20170329347A1 (en) | 2016-05-11 | 2016-05-11 | Systems and methods for training a robot to autonomously travel a route |
Related Child Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US16/168,368 Continuation US11467602B2 (en) | 2016-05-11 | 2018-10-23 | Systems and methods for training a robot to autonomously travel a route |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| US20170329347A1 true US20170329347A1 (en) | 2017-11-16 |
Family
ID=60267336
Family Applications (3)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US15/152,425 Abandoned US20170329347A1 (en) | 2016-05-11 | 2016-05-11 | Systems and methods for training a robot to autonomously travel a route |
| US16/168,368 Active 2037-12-13 US11467602B2 (en) | 2016-05-11 | 2018-10-23 | Systems and methods for training a robot to autonomously travel a route |
| US17/961,926 Active 2036-12-22 US12393196B2 (en) | 2016-05-11 | 2022-10-07 | Systems and methods for training a robot to autonomously travel a route |
Family Applications After (2)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US16/168,368 Active 2037-12-13 US11467602B2 (en) | 2016-05-11 | 2018-10-23 | Systems and methods for training a robot to autonomously travel a route |
| US17/961,926 Active 2036-12-22 US12393196B2 (en) | 2016-05-11 | 2022-10-07 | Systems and methods for training a robot to autonomously travel a route |
Country Status (7)
| Country | Link |
|---|---|
| US (3) | US20170329347A1 (en) |
| EP (1) | EP3454705A4 (en) |
| JP (1) | JP6949107B2 (en) |
| KR (1) | KR102355750B1 (en) |
| CN (1) | CN109414142B (en) |
| CA (1) | CA3023552A1 (en) |
| WO (1) | WO2017197190A1 (en) |
Cited By (59)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US9945677B1 (en) * | 2015-07-23 | 2018-04-17 | X Development Llc | Automated lane and route network discovery for robotic actors |
| US10001780B2 (en) * | 2016-11-02 | 2018-06-19 | Brain Corporation | Systems and methods for dynamic route planning in autonomous navigation |
| US20180299899A1 (en) * | 2017-04-13 | 2018-10-18 | Neato Robotics, Inc. | Localized collection of ambient data |
| US20180354132A1 (en) * | 2017-06-09 | 2018-12-13 | Lg Electronics Inc. | Moving robot and control method thereof |
| US20190082102A1 (en) * | 2017-09-13 | 2019-03-14 | Fuji Xerox Co.,Ltd. | Information processing apparatus and non-transitory computer readable medium |
| US10241514B2 (en) | 2016-05-11 | 2019-03-26 | Brain Corporation | Systems and methods for initializing a robot to autonomously travel a trained route |
| US10274325B2 (en) | 2016-11-01 | 2019-04-30 | Brain Corporation | Systems and methods for robotic mapping |
| US10282849B2 (en) | 2016-06-17 | 2019-05-07 | Brain Corporation | Systems and methods for predictive/reconstructive visual object tracker |
| US10335949B2 (en) * | 2016-01-20 | 2019-07-02 | Yujin Robot Co., Ltd. | System for operating mobile robot based on complex map information and operating method thereof |
| US10336469B2 (en) | 2016-09-30 | 2019-07-02 | Sony Interactive Entertainment Inc. | Unmanned aerial vehicle movement via environmental interactions |
| US10357709B2 (en) | 2016-09-30 | 2019-07-23 | Sony Interactive Entertainment Inc. | Unmanned aerial vehicle movement via environmental airflow |
| CN110045735A (zh) * | 2019-04-08 | 2019-07-23 | 北京优洁客创新科技有限公司 | Method, device, medium, and electronic apparatus for a floor scrubber to autonomously learn its travel path |
| US10377484B2 (en) * | 2016-09-30 | 2019-08-13 | Sony Interactive Entertainment Inc. | UAV positional anchors |
| US10410320B2 (en) | 2016-09-30 | 2019-09-10 | Sony Interactive Entertainment Inc. | Course profiling and sharing |
| US10416669B2 (en) | 2016-09-30 | 2019-09-17 | Sony Interactive Entertainment Inc. | Mechanical effects by way of software or real world engagement |
| US10437254B2 (en) * | 2017-08-02 | 2019-10-08 | Ankobot (Shenzhen) Smart Technologies Co., Ltd. | Control method, device and system of robot and robot using the same |
| US20190317520A1 (en) * | 2018-04-16 | 2019-10-17 | Baidu Usa Llc | Learning based speed planner for autonomous driving vehicles |
| US20200029172A1 (en) * | 2016-10-11 | 2020-01-23 | Samsung Electronics Co., Ltd. | Monitoring system control method and electronic device for supporting same |
| EP3613321A1 (de) * | 2018-08-23 | 2020-02-26 | Vorwerk & Co. Interholding GmbH | Floor treatment device that moves automatically within an environment |
| CN110858074A (zh) * | 2018-08-09 | 2020-03-03 | 科沃斯机器人股份有限公司 | Abnormality prompting method, system, device, and storage medium |
| WO2020046699A1 (en) * | 2018-08-30 | 2020-03-05 | Irobot Corporation | Map based training and interface for mobile robots |
| CN111158475A (zh) * | 2019-12-20 | 2020-05-15 | 华中科技大学鄂州工业技术研究院 | Method and device for generating training paths in a virtual scene |
| US10671874B2 (en) * | 2016-08-03 | 2020-06-02 | X Development Llc | Generating a model for an object encountered by a robot |
| US10679511B2 (en) | 2016-09-30 | 2020-06-09 | Sony Interactive Entertainment Inc. | Collision detection and avoidance |
| US10723018B2 (en) | 2016-11-28 | 2020-07-28 | Brain Corporation | Systems and methods for remote operating and/or monitoring of a robot |
| WO2020176838A1 (en) * | 2019-02-28 | 2020-09-03 | Brain Corporation | Systems, and methods for merging disjointed map and route data with respect to a single origin for autonomous robots |
| US10795367B2 (en) * | 2018-01-11 | 2020-10-06 | Uatc, Llc | Mapped driving paths for autonomous vehicle |
| US10850838B2 (en) | 2016-09-30 | 2020-12-01 | Sony Interactive Entertainment Inc. | UAV battery form factor and insertion/ejection methodologies |
| CN112008690A (zh) * | 2019-05-30 | 2020-12-01 | 精工爱普生株式会社 | Robot system and portable teaching device |
| US10852730B2 (en) | 2017-02-08 | 2020-12-01 | Brain Corporation | Systems and methods for robotic mobile platforms |
| US10899008B2 (en) * | 2017-03-30 | 2021-01-26 | Brain Corporation | Systems and methods for robotic path planning |
| US10945577B2 (en) * | 2017-08-17 | 2021-03-16 | Beijing Xiaomi Mobile Software Co., Ltd. | Timed cleaning method, device and storage medium |
| US10980387B2 (en) * | 2016-08-31 | 2021-04-20 | Murata Machinery, Ltd. | Autonomously traveling floor washer |
| US20210147202A1 (en) * | 2018-07-05 | 2021-05-20 | Brain Corporation | Systems and methods for operating autonomous tug robots |
| US20210220996A1 (en) * | 2018-08-10 | 2021-07-22 | Brain Corporation | Systems, apparatuses and methods for removing false positives from sensor detection |
| US20210223779A1 (en) * | 2018-09-19 | 2021-07-22 | Brain Corporation | Systems and methods for rerouting robots to avoid no-go zones |
| US11092458B2 (en) * | 2018-10-30 | 2021-08-17 | Telenav, Inc. | Navigation system with operation obstacle alert mechanism and method of operation thereof |
| US11131996B2 (en) * | 2018-11-29 | 2021-09-28 | Shenzhen Silver Star Intelligent Technology Co., Ltd. | Area partitioning method, partition cleaning method and robot thereof |
| US20210318687A1 (en) * | 2020-04-13 | 2021-10-14 | Boston Dynamics, Inc. | Online Authoring of Robot Autonomy Applications |
| US20210331312A1 (en) * | 2019-05-29 | 2021-10-28 | Lg Electronics Inc. | Intelligent robot cleaner for setting travel route based on video learning and managing method thereof |
| US11189007B2 (en) * | 2019-12-03 | 2021-11-30 | Imagry (Israel) Ltd | Real-time generation of functional road maps |
| US11311163B2 (en) * | 2018-11-06 | 2022-04-26 | Nihon Business Data Processing Center Co., Ltd. | Self-propelling cleaning robot |
| US11340630B2 (en) * | 2018-03-30 | 2022-05-24 | Brain Corporation | Systems and methods for robust robotic mapping |
| CN114728417A (zh) * | 2019-12-17 | 2022-07-08 | X开发有限责任公司 | Robot autonomous object learning triggered by a remote operator |
| US11466901B2 (en) * | 2017-12-28 | 2022-10-11 | Weismacher Eco Private Limited | Self powered and timer based solar panel cleaning system |
| US11480974B2 (en) * | 2018-04-05 | 2022-10-25 | Electronics And Telecommunications Research Institute | Topological map generation apparatus for navigation of robot and method thereof |
| CN115248590A (zh) * | 2021-07-01 | 2022-10-28 | 浙江大唐国际绍兴江滨热电有限责任公司 | Navigation method for an inspection robot in a natural gas power plant |
| US20220390944A1 (en) * | 2017-08-22 | 2022-12-08 | Waymo Llc | Context aware stopping for autonomous vehicles |
| US20230004155A1 (en) * | 2020-02-27 | 2023-01-05 | Panasonic Intellectual Property Management Co., Ltd. | Information presentation method, information presentation device, and recording medium |
| US20230004166A1 (en) * | 2020-03-13 | 2023-01-05 | Brain Corporation | Systems and methods for route synchronization for robotic devices |
| US20230114211A1 (en) * | 2020-06-30 | 2023-04-13 | Amicro Semiconductor Co., Ltd. | Edgewise Path Selection Method for Robot Obstacle Crossing, Chip, and Robot |
| US20230146810A1 (en) * | 2016-11-22 | 2023-05-11 | The Toro Company | Autonomous path treatment systems and methods |
| US11835343B1 (en) * | 2004-08-06 | 2023-12-05 | AI Incorporated | Method for constructing a map while performing work |
| US11958183B2 (en) | 2019-09-19 | 2024-04-16 | The Research Foundation For The State University Of New York | Negotiation-based human-robot collaboration via augmented reality |
| EP4434422A1 (de) * | 2023-03-24 | 2024-09-25 | Vorwerk & Co. Interholding GmbH | Method for operating a self-propelled cleaning device, and self-propelled cleaning device |
| EP3795056B1 (de) * | 2019-09-23 | 2025-01-08 | Hako GmbH | Method for cleaning a floor surface with a floor cleaning machine |
| US20250028332A1 (en) * | 2021-11-18 | 2025-01-23 | Mobile Industrial Robots A/S | A method for navigating an autonomous mobile robot |
| US20250076894A1 (en) * | 2021-01-29 | 2025-03-06 | Beijing Jingdong Qianshi Technology Co., Ltd. | Method and apparatus for movable robot to adjust pose of goods rack |
| US12466075B2 (en) | 2021-06-04 | 2025-11-11 | Boston Dynamics, Inc. | Autonomous and teleoperated sensor pointing on a mobile robot |
Families Citing this family (16)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US10809071B2 (en) * | 2017-10-17 | 2020-10-20 | AI Incorporated | Method for constructing a map while performing work |
| FR3077721B1 (fr) * | 2018-02-15 | 2020-03-06 | Mv Robot | Method for controlling the operation of a floor-treatment robot, and equipment and system for implementing this method |
| JP2020027574A (ja) * | 2018-08-17 | 2020-02-20 | 株式会社東芝 | Autonomous mobile device, method, and three-dimensional modeling system |
| DE102019101337A1 (de) * | 2019-01-18 | 2020-07-23 | Vorwerk & Co. Interholding Gmbh | System having a first floor treatment device and a second floor treatment device, and method for operating such a system |
| CN110568848B (zh) * | 2019-09-10 | 2022-09-23 | 东风商用车有限公司 | Teaching-based automated driving operation system for a road sweeper |
| KR102778546B1 (ko) * | 2019-10-01 | 2025-03-07 | 엘지전자 주식회사 | Robot cleaner and method for determining a cleaning path |
| KR20210040613A (ko) | 2019-10-04 | 2021-04-14 | 삼성전자주식회사 | Electronic device and control method thereof |
| KR102125538B1 (ko) * | 2019-12-13 | 2020-06-22 | 주식회사 토르 드라이브 | Efficient map matching method for autonomous driving and apparatus therefor |
| KR102348963B1 (ko) * | 2020-03-10 | 2022-01-11 | 엘지전자 주식회사 | Robot cleaner and control method thereof |
| JP7426508B2 (ja) * | 2020-05-21 | 2024-02-01 | ハイ ロボティクス カンパニー リミテッド | Navigation method, navigation device, storage medium, and program |
| KR102595387B1 (ko) * | 2021-03-09 | 2023-10-27 | 동의대학교 산학협력단 | Virtual-grid-based A-star path search method and system therefor |
| WO2022256475A1 (en) * | 2021-06-02 | 2022-12-08 | The Toro Company | System facilitating user arrangement of paths for use by autonomous work vehicle |
| CN113878577B (zh) * | 2021-09-28 | 2023-05-30 | 深圳市海柔创新科技有限公司 | Robot control method, robot, control terminal, and control system |
| CN114794959B (zh) * | 2022-06-28 | 2023-03-03 | 山西嘉世达机器人技术有限公司 | Control method and device for a cleaning machine, cleaning machine, and storage medium |
| US12327482B2 (en) | 2022-07-08 | 2025-06-10 | Honeywell International Inc. | Radio frequency interference database for vehicle navigation planning |
| KR20240110310A (ko) * | 2023-01-06 | 2024-07-15 | 주식회사 케이티 | Method and apparatus for analyzing robot failures |
Family Cites Families (22)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US5204814A (en) * | 1990-11-13 | 1993-04-20 | Mobot, Inc. | Autonomous lawn mower |
| US6718258B1 (en) * | 2000-06-08 | 2004-04-06 | Navigation Technologies Corp | Method and system for obtaining user feedback regarding geographic data |
| DE10149115A1 (de) * | 2001-10-05 | 2003-04-17 | Bosch Gmbh Robert | Object detection device |
| US8843244B2 (en) * | 2006-10-06 | 2014-09-23 | Irobot Corporation | Autonomous behaviors for a remote vehicle |
| US7957900B2 (en) * | 2008-02-08 | 2011-06-07 | Gaurav Chowdhary | Tracking vehicle locations in a parking lot for definitive display on a GUI |
| US8447463B1 (en) * | 2008-02-08 | 2013-05-21 | Gaurav Chowdhary | Tracking vehicle locations in a parking lot for definitive display on a GUI |
| JP5215740B2 (ja) * | 2008-06-09 | 2013-06-19 | 株式会社日立製作所 | Mobile robot system |
| US8774970B2 (en) * | 2009-06-11 | 2014-07-08 | S.C. Johnson & Son, Inc. | Trainable multi-mode floor cleaning device |
| CN101620802B (zh) * | 2009-08-05 | 2011-06-01 | 北京四维图新科技股份有限公司 | Method and device for checking an electronic map |
| CN102155948A (zh) * | 2010-02-11 | 2011-08-17 | 北京四维图新科技股份有限公司 | Random-sampling inspection and evaluation method and device for the quality of navigation electronic maps |
| KR101395089B1 (ko) * | 2010-10-01 | 2014-05-16 | 안동대학교 산학협력단 | Obstacle detection system and method |
| DE102012109004A1 (de) * | 2012-09-24 | 2014-03-27 | RobArt GmbH | Robot and method for the autonomous inspection or treatment of floor surfaces |
| US8949016B1 (en) * | 2012-09-28 | 2015-02-03 | Google Inc. | Systems and methods for determining whether a driving environment has changed |
| JP6132659B2 (ja) * | 2013-02-27 | 2017-05-24 | シャープ株式会社 | Ambient environment recognition device, autonomous mobile system using the same, and ambient environment recognition method |
| US9816823B2 (en) * | 2013-03-15 | 2017-11-14 | Hewlett Packard Enterprise Development Lp | Updating road maps |
| JP6136543B2 (ja) * | 2013-05-01 | 2017-05-31 | 村田機械株式会社 | Autonomous mobile body |
| JP6227948B2 (ja) * | 2013-09-18 | 2017-11-08 | 村田機械株式会社 | Autonomously traveling floor scrubber, cleaning-schedule data structure, storage medium, cleaning-schedule generation method, and program |
| JP6200822B2 (ja) * | 2014-01-30 | 2017-09-20 | シャープ株式会社 | Learning remote control device, self-propelled electronic apparatus provided with the same, and remote control learning method |
| US9538702B2 (en) * | 2014-12-22 | 2017-01-10 | Irobot Corporation | Robotic mowing of separated lawn areas |
| CN105167716A (zh) * | 2015-08-21 | 2015-12-23 | 王震渊 | Intelligent sweeping robot |
| KR20180079428A (ko) * | 2015-11-02 | 2018-07-10 | 스타쉽 테크놀로지스 오 | Device and method for automatic localization |
| US10545229B2 (en) * | 2016-04-22 | 2020-01-28 | Huawei Technologies Co., Ltd. | Systems and methods for unified mapping of an environment |
2016
- 2016-05-11 US US15/152,425 patent/US20170329347A1/en not_active Abandoned
2017
- 2017-05-11 EP EP17796888.0A patent/EP3454705A4/en not_active Withdrawn
- 2017-05-11 KR KR1020187035942A patent/KR102355750B1/ko active Active
- 2017-05-11 CN CN201780041655.5A patent/CN109414142B/zh not_active Expired - Fee Related
- 2017-05-11 CA CA3023552A patent/CA3023552A1/en not_active Abandoned
- 2017-05-11 JP JP2019511831A patent/JP6949107B2/ja active Active
- 2017-05-11 WO PCT/US2017/032273 patent/WO2017197190A1/en not_active Ceased
2018
- 2018-10-23 US US16/168,368 patent/US11467602B2/en active Active
2022
- 2022-10-07 US US17/961,926 patent/US12393196B2/en active Active
Cited By (90)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US11835343B1 (en) * | 2004-08-06 | 2023-12-05 | AI Incorporated | Method for constructing a map while performing work |
| US9945677B1 (en) * | 2015-07-23 | 2018-04-17 | X Development Llc | Automated lane and route network discovery for robotic actors |
| US10335949B2 (en) * | 2016-01-20 | 2019-07-02 | Yujin Robot Co., Ltd. | System for operating mobile robot based on complex map information and operating method thereof |
| US10241514B2 (en) | 2016-05-11 | 2019-03-26 | Brain Corporation | Systems and methods for initializing a robot to autonomously travel a trained route |
| US10282849B2 (en) | 2016-06-17 | 2019-05-07 | Brain Corporation | Systems and methods for predictive/reconstructive visual object tracker |
| US11195041B2 (en) | 2016-08-03 | 2021-12-07 | X Development Llc | Generating a model for an object encountered by a robot |
| US11691273B2 (en) | 2016-08-03 | 2023-07-04 | X Development Llc | Generating a model for an object encountered by a robot |
| US10671874B2 (en) * | 2016-08-03 | 2020-06-02 | X Development Llc | Generating a model for an object encountered by a robot |
| US12103178B2 (en) | 2016-08-03 | 2024-10-01 | Google Llc | Generating a model for an object encountered by a robot |
| US10980387B2 (en) * | 2016-08-31 | 2021-04-20 | Murata Machinery, Ltd. | Autonomously traveling floor washer |
| US10410320B2 (en) | 2016-09-30 | 2019-09-10 | Sony Interactive Entertainment Inc. | Course profiling and sharing |
| US10336469B2 (en) | 2016-09-30 | 2019-07-02 | Sony Interactive Entertainment Inc. | Unmanned aerial vehicle movement via environmental interactions |
| US10850838B2 (en) | 2016-09-30 | 2020-12-01 | Sony Interactive Entertainment Inc. | UAV battery form factor and insertion/ejection methodologies |
| US10377484B2 (en) * | 2016-09-30 | 2019-08-13 | Sony Interactive Entertainment Inc. | UAV positional anchors |
| US11222549B2 (en) | 2016-09-30 | 2022-01-11 | Sony Interactive Entertainment Inc. | Collision detection and avoidance |
| US10416669B2 (en) | 2016-09-30 | 2019-09-17 | Sony Interactive Entertainment Inc. | Mechanical effects by way of software or real world engagement |
| US10692174B2 (en) | 2016-09-30 | 2020-06-23 | Sony Interactive Entertainment Inc. | Course profiling and sharing |
| US10679511B2 (en) | 2016-09-30 | 2020-06-09 | Sony Interactive Entertainment Inc. | Collision detection and avoidance |
| US10540746B2 (en) | 2016-09-30 | 2020-01-21 | Sony Interactive Entertainment Inc. | Course profiling and sharing |
| US11288767B2 (en) | 2016-09-30 | 2022-03-29 | Sony Interactive Entertainment Inc. | Course profiling and sharing |
| US10357709B2 (en) | 2016-09-30 | 2019-07-23 | Sony Interactive Entertainment Inc. | Unmanned aerial vehicle movement via environmental airflow |
| US20200029172A1 (en) * | 2016-10-11 | 2020-01-23 | Samsung Electronics Co., Ltd. | Monitoring system control method and electronic device for supporting same |
| US11012814B2 (en) * | 2016-10-11 | 2021-05-18 | Samsung Electronics Co., Ltd. | Monitoring system control method and electronic device for supporting same |
| US10274325B2 (en) | 2016-11-01 | 2019-04-30 | Brain Corporation | Systems and methods for robotic mapping |
| US10379539B2 (en) * | 2016-11-02 | 2019-08-13 | Brain Corporation | Systems and methods for dynamic route planning in autonomous navigation |
| US10001780B2 (en) * | 2016-11-02 | 2018-06-19 | Brain Corporation | Systems and methods for dynamic route planning in autonomous navigation |
| US20230146810A1 (en) * | 2016-11-22 | 2023-05-11 | The Toro Company | Autonomous path treatment systems and methods |
| US12346111B2 (en) * | 2016-11-22 | 2025-07-01 | The Toro Company | Autonomous path treatment systems and methods |
| US10723018B2 (en) | 2016-11-28 | 2020-07-28 | Brain Corporation | Systems and methods for remote operating and/or monitoring of a robot |
| US10852730B2 (en) | 2017-02-08 | 2020-12-01 | Brain Corporation | Systems and methods for robotic mobile platforms |
| US20210220995A1 (en) * | 2017-03-30 | 2021-07-22 | Brain Corporation | Systems and methods for robotic path planning |
| US10899008B2 (en) * | 2017-03-30 | 2021-01-26 | Brain Corporation | Systems and methods for robotic path planning |
| US11701778B2 (en) * | 2017-03-30 | 2023-07-18 | Brain Corporation | Systems and methods for robotic path planning |
| US20180299899A1 (en) * | 2017-04-13 | 2018-10-18 | Neato Robotics, Inc. | Localized collection of ambient data |
| US10717193B2 (en) * | 2017-06-09 | 2020-07-21 | Lg Electronics Inc. | Artificial intelligence moving robot and control method thereof |
| US20180354132A1 (en) * | 2017-06-09 | 2018-12-13 | Lg Electronics Inc. | Moving robot and control method thereof |
| US10437254B2 (en) * | 2017-08-02 | 2019-10-08 | Ankobot (Shenzhen) Smart Technologies Co., Ltd. | Control method, device and system of robot and robot using the same |
| US10945577B2 (en) * | 2017-08-17 | 2021-03-16 | Beijing Xiaomi Mobile Software Co., Ltd. | Timed cleaning method, device and storage medium |
| US20220390944A1 (en) * | 2017-08-22 | 2022-12-08 | Waymo Llc | Context aware stopping for autonomous vehicles |
| US20190082102A1 (en) * | 2017-09-13 | 2019-03-14 | Fuji Xerox Co.,Ltd. | Information processing apparatus and non-transitory computer readable medium |
| US10819902B2 (en) * | 2017-09-13 | 2020-10-27 | Fuji Xerox Co., Ltd. | Information processing apparatus and non-transitory computer readable medium |
| US11466901B2 (en) * | 2017-12-28 | 2022-10-11 | Weismacher Eco Private Limited | Self powered and timer based solar panel cleaning system |
| US10795367B2 (en) * | 2018-01-11 | 2020-10-06 | Uatc, Llc | Mapped driving paths for autonomous vehicle |
| US11454973B2 (en) | 2018-01-11 | 2022-09-27 | Uatc, Llc | Mapped driving paths for autonomous vehicle |
| US11340630B2 (en) * | 2018-03-30 | 2022-05-24 | Brain Corporation | Systems and methods for robust robotic mapping |
| US11480974B2 (en) * | 2018-04-05 | 2022-10-25 | Electronics And Telecommunications Research Institute | Topological map generation apparatus for navigation of robot and method thereof |
| US11126199B2 (en) * | 2018-04-16 | 2021-09-21 | Baidu Usa Llc | Learning based speed planner for autonomous driving vehicles |
| US20190317520A1 (en) * | 2018-04-16 | 2019-10-17 | Baidu Usa Llc | Learning based speed planner for autonomous driving vehicles |
| US20210147202A1 (en) * | 2018-07-05 | 2021-05-20 | Brain Corporation | Systems and methods for operating autonomous tug robots |
| US12030757B2 (en) * | 2018-07-05 | 2024-07-09 | Brain Corporation | Systems and methods for operating autonomous tug robots |
| CN110858074A (zh) * | 2018-08-09 | 2020-03-03 | 科沃斯机器人股份有限公司 | Abnormality prompting method, system, device, and storage medium |
| US20210220996A1 (en) * | 2018-08-10 | 2021-07-22 | Brain Corporation | Systems, apparatuses and methods for removing false positives from sensor detection |
| US12053892B2 (en) * | 2018-08-10 | 2024-08-06 | Brain Corporation | Systems, apparatuses and methods for removing false positives from sensor detection |
| EP3613321A1 (de) * | 2018-08-23 | 2020-02-26 | Vorwerk & Co. Interholding GmbH | Floor treatment device that moves automatically within an environment |
| WO2020046699A1 (en) * | 2018-08-30 | 2020-03-05 | Irobot Corporation | Map based training and interface for mobile robots |
| US20210113050A1 (en) * | 2018-08-30 | 2021-04-22 | Irobot Corporation | Map based training and interface for mobile robots |
| US11703857B2 (en) * | 2018-08-30 | 2023-07-18 | Irobot Corporation | Map based training and interface for mobile robots |
| CN112367886A (zh) * | 2018-08-30 | 2021-02-12 | 美国iRobot公司 | Map-based training and interface for mobile robots |
| US10835096B2 (en) * | 2018-08-30 | 2020-11-17 | Irobot Corporation | Map based training and interface for mobile robots |
| US20200069138A1 (en) * | 2018-08-30 | 2020-03-05 | Irobot Corporation | Map based training and interface for mobile robots |
| US20210223779A1 (en) * | 2018-09-19 | 2021-07-22 | Brain Corporation | Systems and methods for rerouting robots to avoid no-go zones |
| US11092458B2 (en) * | 2018-10-30 | 2021-08-17 | Telenav, Inc. | Navigation system with operation obstacle alert mechanism and method of operation thereof |
| US11311163B2 (en) * | 2018-11-06 | 2022-04-26 | Nihon Business Data Processing Center Co., Ltd. | Self-propelling cleaning robot |
| US11131996B2 (en) * | 2018-11-29 | 2021-09-28 | Shenzhen Silver Star Intelligent Technology Co., Ltd. | Area partitioning method, partition cleaning method and robot thereof |
| WO2020176838A1 (en) * | 2019-02-28 | 2020-09-03 | Brain Corporation | Systems, and methods for merging disjointed map and route data with respect to a single origin for autonomous robots |
| CN110045735A (zh) * | 2019-04-08 | 2019-07-23 | 北京优洁客创新科技有限公司 | Method, device, medium, and electronic apparatus for a floor scrubber to autonomously learn its travel path |
| US20210331312A1 (en) * | 2019-05-29 | 2021-10-28 | Lg Electronics Inc. | Intelligent robot cleaner for setting travel route based on video learning and managing method thereof |
| US11565411B2 (en) * | 2019-05-29 | 2023-01-31 | Lg Electronics Inc. | Intelligent robot cleaner for setting travel route based on video learning and managing method thereof |
| US11577402B2 (en) * | 2019-05-30 | 2023-02-14 | Seiko Epson Corporation | Robot system and portable teaching device |
| CN112008690A (zh) * | 2019-05-30 | 2020-12-01 | 精工爱普生株式会社 | Robot system and portable teaching device |
| US11958183B2 (en) | 2019-09-19 | 2024-04-16 | The Research Foundation For The State University Of New York | Negotiation-based human-robot collaboration via augmented reality |
| EP3795056B1 (de) * | 2019-09-23 | 2025-01-08 | Hako GmbH | Method for cleaning a floor surface with a floor cleaning machine |
| US20220067877A1 (en) * | 2019-12-03 | 2022-03-03 | Imagry (Israel) Ltd | Real-time generation of functional road maps |
| US11189007B2 (en) * | 2019-12-03 | 2021-11-30 | Imagry (Israel) Ltd | Real-time generation of functional road maps |
| US11836884B2 (en) * | 2019-12-03 | 2023-12-05 | Imagry (Israel) Ltd | Real-time generation of functional road maps |
| CN114728417A (zh) * | 2019-12-17 | 2022-07-08 | X开发有限责任公司 | Robot autonomous object learning triggered by a remote operator |
| CN111158475A (zh) * | 2019-12-20 | 2020-05-15 | 华中科技大学鄂州工业技术研究院 | Method and device for generating training paths in a virtual scene |
| US20230004155A1 (en) * | 2020-02-27 | 2023-01-05 | Panasonic Intellectual Property Management Co., Ltd. | Information presentation method, information presentation device, and recording medium |
| US12181871B2 (en) * | 2020-02-27 | 2024-12-31 | Panasonic Intellectual Property Management Co., Ltd. | Information presentation method, information presentation device, and recording medium |
| US20230004166A1 (en) * | 2020-03-13 | 2023-01-05 | Brain Corporation | Systems and methods for route synchronization for robotic devices |
| US11797016B2 (en) * | 2020-04-13 | 2023-10-24 | Boston Dynamics, Inc. | Online authoring of robot autonomy applications |
| US20210318687A1 (en) * | 2020-04-13 | 2021-10-14 | Boston Dynamics, Inc. | Online Authoring of Robot Autonomy Applications |
| US12346116B2 (en) | 2020-04-13 | 2025-07-01 | Boston Dynamics, Inc. | Online authoring of robot autonomy applications |
| US20230114211A1 (en) * | 2020-06-30 | 2023-04-13 | Amicro Semiconductor Co., Ltd. | Edgewise Path Selection Method for Robot Obstacle Crossing, Chip, and Robot |
| US12140955B2 (en) * | 2020-06-30 | 2024-11-12 | Amicro Semiconductor Co., Ltd. | Edgewise path selection method for robot obstacle crossing, chip, and robot |
| US20250076894A1 (en) * | 2021-01-29 | 2025-03-06 | Beijing Jingdong Qianshi Technology Co., Ltd. | Method and apparatus for movable robot to adjust pose of goods rack |
| US12466075B2 (en) | 2021-06-04 | 2025-11-11 | Boston Dynamics, Inc. | Autonomous and teleoperated sensor pointing on a mobile robot |
| CN115248590A (zh) * | 2021-07-01 | 2022-10-28 | 浙江大唐国际绍兴江滨热电有限责任公司 | Navigation method for an inspection robot in a natural gas power plant |
| US20250028332A1 (en) * | 2021-11-18 | 2025-01-23 | Mobile Industrial Robots A/S | A method for navigating an autonomous mobile robot |
| EP4434422A1 (de) * | 2023-03-24 | 2024-09-25 | Vorwerk & Co. Interholding GmbH | Method for operating a self-propelled cleaning device, and self-propelled cleaning device |
Also Published As
| Publication number | Publication date |
|---|---|
| US20230021778A1 (en) | 2023-01-26 |
| US20190121365A1 (en) | 2019-04-25 |
| EP3454705A1 (en) | 2019-03-20 |
| KR20190029524A (ko) | 2019-03-20 |
| CN109414142A (zh) | 2019-03-01 |
| EP3454705A4 (en) | 2019-12-11 |
| JP6949107B2 (ja) | 2021-10-13 |
| KR102355750B1 (ko) | 2022-01-26 |
| US11467602B2 (en) | 2022-10-11 |
| CA3023552A1 (en) | 2017-11-16 |
| WO2017197190A1 (en) | 2017-11-16 |
| JP2019522301A (ja) | 2019-08-08 |
| CN109414142B (zh) | 2021-12-28 |
| US12393196B2 (en) | 2025-08-19 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US12393196B2 (en) | | Systems and methods for training a robot to autonomously travel a route |
| US10823576B2 (en) | | Systems and methods for robotic mapping |
| US11803185B2 (en) | | Systems and methods for initializing a robot to autonomously travel a trained route |
| US10379539B2 (en) | | Systems and methods for dynamic route planning in autonomous navigation |
| US11099575B2 (en) | | Systems and methods for precise navigation of autonomous devices |
| US20220026911A1 (en) | | Systems and methods for precise navigation of autonomous devices |
| US11340630B2 (en) | | Systems and methods for robust robotic mapping |
| US11886198B2 (en) | | Systems and methods for detecting blind spots for robots |
| US20240168487A1 (en) | | Systems and methods for detecting and correcting diverged computer readable maps for robotic devices |
| HK40005488B (en) | | Systems and methods for training a robot to autonomously travel a route |
| HK40005488A (en) | | Systems and methods for training a robot to autonomously travel a route |
| EP4487301A2 (en) | | Systems and methods for aligning a plurality of local computer readable maps to a single global map and detecting mapping errors |
| HK40011873A (en) | | Systems and methods for robotic mapping |
| HK40011873B (en) | | Systems and methods for robotic mapping |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | AS | Assignment | Owner name: BRAIN CORPORATION, CALIFORNIA. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:PASSOT, JEAN-BAPTISTE;SMITH, ANDREW;SZATMARY, BOTOND;AND OTHERS;SIGNING DATES FROM 20160525 TO 20160617;REEL/FRAME:039083/0559 |
| | AS | Assignment | Owner name: BRAIN CORPORATION, CALIFORNIA. Free format text: CORRECTIVE ASSIGNMENT TO CORRECT THE ASSIGNOR NAME: ROMBOUTS, JALDERT PREVIOUSLY RECORDED ON REEL 039083 FRAME 0559. ASSIGNOR(S) HEREBY CONFIRMS THE ASSIGNOR NAME: RAMBOUTS, JALDERT;ASSIGNORS:PASSOT, JEAN-BAPTISTE;SMITH, ANDREW;SZATMARY, BOTOND;AND OTHERS;SIGNING DATES FROM 20160525 TO 20160617;REEL/FRAME:042641/0251 |
| | STCB | Information on status: application discontinuation | Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |