CN110023866A - Systems and methods for dynamic route planning in autonomous navigation - Google Patents

Systems and methods for dynamic route planning in autonomous navigation

Info

Publication number
CN110023866A
CN110023866A (application CN201780074759.6A)
Authority
CN
China
Prior art keywords
route
pose
robot
point
poses
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201780074759.6A
Other languages
Chinese (zh)
Other versions
CN110023866B (en)
Inventor
B. I. Gabardos
J. B. Passot
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Yunhai Zhihang Co Ltd
Original Assignee
Yunhai Zhihang Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Yunhai Zhihang Co Ltd filed Critical Yunhai Zhihang Co Ltd
Publication of CN110023866A publication Critical patent/CN110023866A/en
Application granted granted Critical
Publication of CN110023866B publication Critical patent/CN110023866B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • A - HUMAN NECESSITIES
    • A47 - FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
    • A47L - DOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
    • A47L11/00 - Machines for cleaning floors, carpets, furniture, walls, or wall coverings
    • A47L11/40 - Parts or details of machines not provided for in groups A47L11/02 - A47L11/38, or not restricted to one of these groups, e.g. handles, arrangements of switches, skirts, buffers, levers
    • A47L11/4011 - Regulation of the cleaning machine by electric means; Control systems and remote control systems therefor
    • A - HUMAN NECESSITIES
    • A47 - FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
    • A47L - DOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
    • A47L11/00 - Machines for cleaning floors, carpets, furniture, walls, or wall coverings
    • A47L11/40 - Parts or details of machines not provided for in groups A47L11/02 - A47L11/38, or not restricted to one of these groups, e.g. handles, arrangements of switches, skirts, buffers, levers
    • A47L11/4061 - Steering means; Means for avoiding obstacles; Details related to the place where the driver is accommodated
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01C - MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/34 - Route searching; Route guidance
    • G01C21/3407 - Route searching; Route guidance specially adapted for specific applications
    • G01C21/3415 - Dynamic re-routing, e.g. recalculating the route when the user deviates from calculated route or after detecting real-time traffic data or accidents
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01C - MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/34 - Route searching; Route guidance
    • G01C21/3407 - Route searching; Route guidance specially adapted for specific applications
    • G01C21/343 - Calculating itineraries, i.e. routes leading from a starting point to a series of categorical destinations using a global route restraint, round trips, touristic trips
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01C - MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/38 - Electronic maps specially adapted for navigation; Updating thereof
    • G01C21/3804 - Creation or updating of map data
    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05D - SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 - Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/0088 - Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot characterized by the autonomous decision making process, e.g. artificial intelligence, predefined behaviours
    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05D - SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 - Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/02 - Control of position or course in two dimensions
    • G05D1/021 - Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0212 - Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory
    • G05D1/0217 - Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory in accordance with energy consumption, time reduction or distance reduction criteria
    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05D - SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 - Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/02 - Control of position or course in two dimensions
    • G05D1/021 - Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0268 - Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means
    • G05D1/0274 - Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means using mapping information stored in a memory device
    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05D - SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 - Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/02 - Control of position or course in two dimensions
    • G05D1/021 - Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0276 - Control of position or course in two dimensions specially adapted to land vehicles using signals provided by a source external to the vehicle
    • A - HUMAN NECESSITIES
    • A47 - FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
    • A47L - DOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
    • A47L2201/00 - Robotic cleaning machines, i.e. with automatic control of the travelling movement or the cleaning operation
    • A47L2201/04 - Automatic control of the travelling movement; Automatic obstacle detection

Abstract

Systems and methods for dynamic route planning in autonomous navigation are disclosed. In some example implementations, a robot can have one or more sensors configured to collect data about an environment, including data about detected points on one or more objects in the environment. The robot can then plan a route through the environment, where the route can comprise one or more route poses. A route pose can include a footprint indicative of, at least in part, the pose, size, and shape of the robot along the route, and each route pose can have a plurality of points disposed therein. Based on forces exerted on the points of each route pose by, among other things, other route poses and objects in the environment, each route pose can be repositioned. Based at least in part on an interpolation performed between the route poses (some of which may have been repositioned), the robot can dynamically plan its route.
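To make the abstract concrete, the sketch below models a route pose as a planar pose carrying a footprint of points. This is a hypothetical illustration only, not the patent's implementation: the class and method names (`RoutePose`, `footprint_points`) and the four-corner rectangular footprint are assumptions.

```python
import math
from dataclasses import dataclass


@dataclass
class RoutePose:
    """A pose (x, y, theta) on the route plus a footprint of points.

    The footprint approximates the robot's size and shape at this pose,
    here (as an assumed simplification) the corners of a rectangle
    rotated by the heading theta.
    """
    x: float
    y: float
    theta: float        # heading in radians
    half_w: float = 0.3  # assumed half-width of the robot, meters
    half_l: float = 0.5  # assumed half-length of the robot, meters

    def footprint_points(self):
        """Corners of the robot rectangle, rotated into the world frame."""
        c, s = math.cos(self.theta), math.sin(self.theta)
        corners = [(-self.half_l, -self.half_w), (-self.half_l, self.half_w),
                   (self.half_l, self.half_w), (self.half_l, -self.half_w)]
        return [(self.x + c * dx - s * dy, self.y + s * dx + c * dy)
                for dx, dy in corners]
```

A route is then simply a sequence of such poses, and the forces described below act on each pose's footprint points.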

Description

Systems and methods for dynamic route planning in autonomous navigation
Priority
This application claims priority to co-owned and co-pending U.S. patent application No. 15/341,612 of the same title, filed November 2, 2016, the contents of which are incorporated herein by reference in their entirety.
Copyright
A portion of the disclosure of this patent document contains material that is subject to copyright protection. The copyright owner has no objection to the facsimile reproduction by anyone of the patent document or the patent disclosure as it appears in the Patent and Trademark Office patent files or records, but otherwise reserves all copyright rights whatsoever.
Technical field
The present application relates generally to robotics, and more specifically to systems and methods for dynamic route planning in autonomous navigation.
Background technique
Robotic navigation can be a complex problem. In some cases, a robot can determine a route to travel. For example, a robot can learn a route demonstrated by a user (e.g., the user can drive the robot along the route and/or upload a map containing the route). As another example, a robot can plan its own route in an environment based on its knowledge of the environment (e.g., a map). However, a challenge that can arise is that features of the environment can change after the robot has determined its route. For example, an item can fall into the path of the route, and/or portions of the environment can otherwise change. Current robots may not be able to adjust their planned paths in real time in response to such changes (e.g., blockages). In those situations, current robots can stop, collide with objects, and/or make suboptimal adjustments to their routes. Accordingly, there is a need for improved systems and methods for autonomous navigation, including systems and methods for dynamic route planning.
Summary of the invention
The present disclosure satisfies the aforementioned needs by providing, inter alia, apparatus and methods for dynamic route planning in autonomous navigation. The example implementations described herein have innovative features, no single one of which is indispensable or solely responsible for their desirable attributes. Without limiting the scope of the claims, some of the advantageous features will now be summarized.
In a first aspect, a robot is disclosed. In one exemplary implementation, the robot includes: one or more sensors configured to collect data about an environment, including data about detected points on one or more objects in the environment; and a controller configured to: create a map of the environment based at least in part on the collected data; determine a route in the map in which the robot will travel; generate one or more route poses on the route, wherein each route pose comprises a footprint indicative of the pose of the robot along the route and has a plurality of points disposed therein; determine forces on each of the plurality of points of each route pose, the forces comprising repulsive forces from one or more of the detected points on the one or more objects and attractive forces from one or more of the plurality of points on other route poses of the one or more route poses; reposition the one or more route poses in response to the forces on each point of the one or more route poses; and perform an interpolation between the one or more route poses to generate a collision-free path for the robot to travel between the one or more route poses.
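The force determination in this aspect can be sketched as follows. This is a hedged illustration under stated assumptions: the patent leaves the exact force law open, so the inverse-square repulsion, the linear (spring-like) attraction, and all function names below are choices made for the example, not the claimed method.

```python
import math


def repulsion(pose_pt, obstacle_pt, gain=1.0, eps=1e-9):
    """Repulsive force vector pushing pose_pt away from a detected
    obstacle point; magnitude falls off with distance (inverse-square
    assumed here)."""
    dx = pose_pt[0] - obstacle_pt[0]
    dy = pose_pt[1] - obstacle_pt[1]
    d = math.hypot(dx, dy) + eps
    mag = gain / (d * d)
    return (mag * dx / d, mag * dy / d)


def attraction(pose_pt, neighbor_pt, gain=1.0):
    """Attractive force pulling pose_pt toward the matching point on a
    neighboring route pose, keeping the route connected."""
    return (gain * (neighbor_pt[0] - pose_pt[0]),
            gain * (neighbor_pt[1] - pose_pt[1]))


def net_force(pose_pt, obstacle_pts, neighbor_pts):
    """Sum repulsions from detected points and attractions from points
    on other route poses, as the first aspect describes."""
    fx = fy = 0.0
    for o in obstacle_pts:
        rx, ry = repulsion(pose_pt, o)
        fx, fy = fx + rx, fy + ry
    for n in neighbor_pts:
        ax, ay = attraction(pose_pt, n)
        fx, fy = fx + ax, fy + ay
    return (fx, fy)
```

Each footprint point of each route pose would receive such a net force; the repositioning step then moves the pose in response.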
In one variant, the one or more route poses form a sequence in which the robot travels along the route; and the interpolation comprises a linear interpolation between successive route poses of the one or more route poses.
In another variant, the interpolation generates one or more interpolated route poses having footprints substantially similar to the footprint of each route pose. In another variant, determining the forces on each point of the one or more route poses further comprises computing a force function that associates, at least in part, the forces on each point of each route pose with one or more characteristics of objects in the environment.
In another variant, the one or more characteristics include one or more of distance, shape, material, and color. In another variant, the force function associates zero repulsive force exerted by a first detected point on a first object where the distance between the first point and a second point of a first route pose exceeds a predetermined distance threshold.
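The zero-force-beyond-a-threshold variant above can be sketched as a force function that cuts off at a predetermined distance. This is a non-authoritative example: the function name, the linear fade inside the threshold, and the default threshold of 2.0 meters are all assumptions.

```python
import math


def thresholded_repulsion(pose_pt, detected_pt, gain=1.0, max_dist=2.0):
    """Repulsive force that is exactly zero once the detected point is
    farther than the predetermined distance threshold max_dist; inside
    the threshold it fades linearly to zero at the boundary (assumed)."""
    dx = pose_pt[0] - detected_pt[0]
    dy = pose_pt[1] - detected_pt[1]
    d = math.hypot(dx, dy)
    if d >= max_dist or d == 0.0:
        return (0.0, 0.0)  # far objects exert no force on this point
    mag = gain * (max_dist - d) / max_dist
    return (mag * dx / d, mag * dy / d)
```

Cutting the force off this way keeps distant clutter from perturbing the route and bounds the number of detected points each pose must consider.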
In another variant, the footprint of each route pose has a size and shape substantially similar to the footprint of the robot.
In another variant, the robot comprises a floor cleaner.
In a second aspect, a method for dynamic navigation of a robot is disclosed. In one exemplary implementation, the method includes: generating a map of an environment using data from one or more sensors; determining a route on the map, the route including one or more route poses, each route pose comprising a footprint indicative of, at least in part, the pose and shape of the robot along the route and having a plurality of points disposed therein; computing repulsive forces from points on an object in the environment onto the plurality of points of a first route pose of the one or more route poses; repositioning the first route pose at least in response to the repulsive forces; and performing an interpolation between the repositioned first route pose and another route pose of the one or more route poses.
In one variant, the method includes determining attractive forces exerted on the plurality of points of the first route pose from points on another route pose of the one or more route poses. In another variant, the method includes detecting a plurality of objects in the environment with the one or more sensors, each of the plurality of objects having detected points; and defining a force function that computes the repulsive forces exerted by each detected point of the plurality of objects onto the plurality of points of the first route pose, wherein each repulsive force is a vector.
In another variant, repositioning the first route pose comprises computing a minimum of the force function.
In another variant, repositioning the first route pose comprises translating and rotating the first route pose.
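One way to realize "translate and rotate" repositioning is an iterative step along the net force and torque, which also descends toward a minimum of the force function. This is a sketch under assumptions: the step sizes and the single gradient-style update are illustrative, not the patent's prescribed solver.

```python
def reposition(pose, force, torque, step=0.1, step_theta=0.05):
    """One repositioning update of a route pose (x, y, theta):
    translate along the summed net force on the pose's points and
    rotate along the summed net torque. Hypothetical step sizes."""
    x, y, theta = pose
    fx, fy = force
    return (x + step * fx, y + step * fy, theta + step_theta * torque)
```

Repeating this update until the forces (the gradient of the force function) become small is equivalent to seeking a local minimum of that function, consistent with the preceding variant.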
In another variant, the interpolation comprises: generating an interpolated route pose having a footprint substantially similar to the shape of the robot; and determining the translation and rotation of the interpolated route pose based at least on a collision-free path between the translated and rotated first route pose and the other route pose of the one or more route poses.
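A minimal sketch of linear interpolation between two (possibly repositioned) route poses follows, matching the earlier variant in which interpolation is linear between successive poses. The function name and the shortest-angle handling of heading are assumptions; each interpolated pose would inherit the robot-shaped footprint.

```python
import math


def interpolate_poses(p0, p1, n=5):
    """Generate n intermediate poses (x, y, theta) between route poses
    p0 and p1 by linear interpolation; heading is interpolated along
    the shortest angular difference."""
    dtheta = math.atan2(math.sin(p1[2] - p0[2]), math.cos(p1[2] - p0[2]))
    out = []
    for i in range(1, n + 1):
        t = i / (n + 1)
        out.append((p0[0] + t * (p1[0] - p0[0]),
                    p0[1] + t * (p1[1] - p0[1]),
                    p0[2] + t * dtheta))
    return out
```

Checking the footprint of each interpolated pose against the detected points would then verify that the interpolated segment is collision-free, as the aspect requires.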
In another variant, the method further includes computing the magnitude of the repulsive force as directly proportional to the distance between the point on the object and each of the plurality of points of the first route pose where the point on the object is outside the footprint of the first route pose.
In another variant, the magnitude of the repulsive force is computed as inversely proportional to the distance between the point on the object and each of the plurality of points of the first route pose where the point on the object is inside the footprint of the first route pose.
In another variant, the method further includes computing torques exerted on the plurality of points of the first route pose due to the repulsive forces.
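The torque contribution can be sketched as the planar cross product of each point's lever arm with the force acting on it. This is an illustrative helper under assumptions (the name and the choice of the pose center as the pivot are not from the patent):

```python
def torque_about_center(center, points, forces):
    """Net torque (z-component of r x F) that per-point forces exert
    about the route pose's center; positive is counter-clockwise."""
    tau = 0.0
    for (px, py), (fx, fy) in zip(points, forces):
        rx, ry = px - center[0], py - center[1]
        tau += rx * fy - ry * fx
    return tau
```

A nonzero net torque is what drives the rotation component of the repositioning described above, letting a pose swivel away from an obstacle rather than only sliding.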
In a third aspect, a non-transitory computer-readable storage apparatus is disclosed. In one implementation, the non-transitory computer-readable storage apparatus has a plurality of instructions stored thereon, the instructions being executable by a processing apparatus to operate a robot. The instructions are configured to, when executed by the processing apparatus, cause the processing apparatus to: generate a map of an environment using data from one or more sensors; determine a route on the map, the route including one or more route poses, each route pose comprising a footprint indicative of, at least in part, the pose and shape of the robot along the route and having a plurality of points disposed therein; and compute repulsive forces from points on an object in the environment onto the plurality of points of a first route pose of the one or more route poses.
In one variant, the instructions, when executed by the processing apparatus, further cause the processing apparatus to determine attractive forces exerted on the plurality of points of the first route pose from points on another route pose of the one or more route poses.
In another variant, the instructions, when executed by the processing apparatus, further cause the processing apparatus to determine torques exerted on the plurality of points of the first route pose from points on another route pose of the one or more route poses.
These and other objects, features, and characteristics of the present disclosure, as well as the methods of operation and functions of the related elements of structure and the combination of parts and economies of manufacture, will become more apparent upon consideration of the following description and the appended claims with reference to the accompanying drawings, all of which form a part of this specification, wherein like reference numerals designate corresponding parts in the various figures. It is to be expressly understood, however, that the drawings are for the purpose of illustration and description only and are not intended as a definition of the limits of the disclosure. As used in the specification and in the claims, the singular forms "a," "an," and "the" include plural referents unless the context clearly dictates otherwise.
Detailed description of the invention
Disclosed aspect is described hereinafter in conjunction with attached drawing, provide attached drawing be in order to illustrate rather than limitation institute is public The aspect opened, wherein the same symbol indicates similar elements.
FIG. 1 illustrates various side elevation views of exemplary body forms for a robot in accordance with principles of the present disclosure.
FIG. 2A is a diagram of an overhead view of a robot navigating a path in accordance with some implementations of this disclosure.
FIG. 2B illustrates an overhead view of a user demonstrating a route to a robot before the robot autonomously travels the route in an environment.
FIG. 3 is a functional block diagram of a robot in accordance with some principles of this disclosure.
FIG. 4A is a top view diagram illustrating the interaction between a robot and an obstacle in accordance with some implementations of this disclosure.
FIG. 4B is a diagram of a global layer, intermediate layer, and local layer in accordance with implementations of the present disclosure.
FIG. 4C is a process flow diagram of an exemplary method for dynamic route planning in accordance with some implementations of the present disclosure.
FIG. 4D illustrates an overhead view of route poses and the repulsive forces exerted by objects in accordance with some implementations of the present disclosure.
FIG. 4E illustrates an overhead view showing attractive forces between route poses in accordance with some implementations of the present disclosure.
FIG. 5 is an overhead view of a diagram showing interpolation between route poses in accordance with some implementations of the present disclosure.
FIG. 6 is a process flow diagram of an exemplary method for operating a robot in accordance with some implementations of the present disclosure.
FIG. 7 is a process flow diagram of an exemplary method for operating a robot in accordance with some implementations of the present disclosure.
All Figures disclosed herein are © Copyright 2017 Brain Corporation. All rights reserved.
Specific embodiment
Various aspects of the novel systems, apparatuses, and methods disclosed herein are described more fully hereinafter with reference to the accompanying drawings. This disclosure can, however, be embodied in many different forms and should not be construed as limited to any specific structure or function presented throughout this disclosure. Rather, these aspects are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the disclosure to those skilled in the art. Based on the teachings herein, one skilled in the art would appreciate that the scope of the disclosure is intended to cover any aspect of the novel systems, apparatuses, and methods disclosed herein, whether implemented independently of, or combined with, any other aspect of the disclosure. For example, an apparatus can be implemented or a method can be practiced using any number of the aspects set forth herein. In addition, the scope of the disclosure is intended to cover such an apparatus or method that is practiced using other structure, functionality, or structure and functionality in addition to or other than the various aspects of the disclosure set forth herein. It should be understood that any aspect disclosed herein can be implemented by one or more elements of a claim.
Although particular aspects are described herein, many variations and permutations of these aspects fall within the scope of the disclosure. Although some benefits and advantages of the preferred aspects are mentioned, the scope of the disclosure is not intended to be limited to particular benefits, uses, and/or objectives. The detailed description and drawings are merely illustrative of the disclosure rather than limiting, the scope of the disclosure being defined by the appended claims and equivalents thereof.
The present disclosure provides improved systems and methods for dynamic route planning in autonomous navigation. As used herein, a robot can include mechanical and/or virtual entities configured to carry out a series of complex actions automatically. In some cases, robots can be machines that are guided by computer programs or electronic circuitry. In some cases, robots can include electro-mechanical components configured for navigation, where the robot can move from one location to another. Such navigating robots can include autonomous cars, floor cleaners, rovers, drones, carts, and the like.
As referred to herein, floor cleaners can include floor cleaners that are manually controlled (e.g., driven or remote controlled) and/or autonomous (e.g., using little to no user control). For example, floor cleaners can include floor scrubbers that a janitor, custodian, or other person operates, and/or robotic floor scrubbers that autonomously navigate and/or clean an environment. Similarly, floor cleaners can also include vacuums, steamers, buffers, mops, polishers, sweepers, burnishers, and the like.
Detailed descriptions of the various implementations and variants of the systems and methods of the disclosure are now provided. While many of the examples discussed herein are in the context of robotic floor cleaners, it will be appreciated that the systems and methods contained herein can be used with other robots. Myriad other example implementations or uses for the technology described herein would be readily envisaged by those having ordinary skill in the art, given the contents of the present disclosure.
Advantageously, the systems and methods of this disclosure at least: (i) provide for dynamic route planning in autonomously navigating robots; (ii) enhance efficiency in navigating environments, which can allow for improved and/or efficient utilization of resources (e.g., energy, fuel, cleaning fluid, etc.); and (iii) provide computational efficiency that can reduce the consumption of processing power, energy, time, and/or other resources in navigating robots. Other advantages are readily discernible by one having ordinary skill in the art given the contents of the present disclosure.
For example, many robots that can navigate autonomously today are programmed to navigate routes and/or paths to goals. In order to navigate these routes, such robots can create path plans (e.g., global solutions). Also, these robots can have local plans in a small area around them (e.g., on the order of a few meters), where the robot determines how it will navigate around obstacles detected by its sensors (typically with rudimentary commands issued when an object is detected). The robot can then traverse the space in the shape of its plan, avoiding obstacles detected by its sensors by, for example, stopping, slowing down, or deflecting left or right. However, in many current applications, such traversal and avoidance can be relatively primitive, and robots can experience undesirable results (e.g., stoppages or collisions) and/or fail to navigate through more complex situations. In some cases, such current approaches can also be computationally expensive and/or slow to run, preventing the robot from operating naturally.
Advantageously, using systems and methods disclosed herein, robots can deviate from their programming, following more efficient paths and/or making more complex adjustments to avoid obstacles. In some implementations described herein, such movements can be determined in a more efficient, faster way, which can appear more natural as the robot plans more complex paths.
A person having ordinary skill in the art would appreciate that a robot, as referred to herein, can have a number of different appearances/forms. FIG. 1 illustrates various side elevation views of exemplary body forms for a robot in accordance with principles of the present disclosure. These are non-limiting examples meant to further illustrate the variety of body forms, but not to restrict robots described herein to any particular body form. For example, body form 100 illustrates an example where the robot is a stand-up shop vacuum. Body form 102 illustrates an example where the robot is a humanoid robot having an appearance substantially similar to a human body. Body form 104 illustrates an example where the robot is a drone having propellers. Body form 106 illustrates an example where the robot has a vehicle shape with wheels and a passenger cabin. Body form 108 illustrates an example where the robot is a rover.
Body form 110 can be an example where the robot is a motorized floor scrubber. Body form 112 can be a motorized floor scrubber having a seat, pedals, and a steering wheel, where a user can drive body form 112 like a vehicle while it cleans, but body form 112 can also operate autonomously. Other body forms are further contemplated, including industrial machines that can be robotized, such as forklifts, tugs, boats, planes, etc.
FIG. 2A is a diagram of an overhead view of robot 202 navigating path 206 in accordance with some implementations of this disclosure. Robot 202 can autonomously navigate environment 200, which can comprise various objects 208, 210, 212, 218. Robot 202 can start at an initial location and end at an end location. As illustrated, the initial location and the end location are substantially the same, illustrating a substantially closed loop. However, in other cases, the initial location and the end location may not be substantially the same, forming an open loop.
For example, in some implementations, robot 202 can be a robotic floor cleaner, such as a robotic floor scrubber, vacuum cleaner, steamer, mop, burnisher, sweeper, or the like. Environment 200 can be a space having floors desired to be cleaned. For example, environment 200 can be a store, warehouse, office building, home, storage facility, etc. One or more of objects 208, 210, 212, 218 can be shelves, displays, objects, items, people, animals, or any other entity or thing that may be on the floor or otherwise impede the robot's ability to navigate through environment 200. Route 206 can be the cleaning path traveled autonomously by robot 202. Route 206 can follow a path that weaves between objects 208, 210, 212, and 218, as illustrated in example route 206. For example, where objects 208, 210, 212, 218 are shelves in a store, robot 202 can go along the aisles of the store and clean the floors of the aisles. However, other routes are also contemplated, such as, without limitation, weaving back and forth along open floor areas and/or any cleaning path a user could use to clean the floor (e.g., where the user manually operates a floor cleaner). In some cases, robot 202 can pass over a portion of the floor a plurality of times; accordingly, the route can overlap itself. Route 206 is meant merely as an illustrative example and can appear differently than as illustrated. Also, as illustrated, one example of environment 200 is shown; however, it should be appreciated that environment 200 can take on any number of forms and arrangements (e.g., rooms or buildings of any size, configuration, and layout) and is not limited by the example illustrations of this disclosure.
In route 206, robot 202 can begin at the initial location, which can be robot 202's starting point. Robot 202 can then clean along route 206 autonomously (e.g., with little to no control from a user) until it reaches the end location, where it can stop cleaning. The end location can be designated by a user and/or determined by robot 202. In some cases, the end location can be a location in route 206 after which robot 202 has cleaned the desired area of floor. As previously described, route 206 can be a closed loop or an open loop. As an illustrative example, the end location can be a storage location for robot 202, such as a temporary parking spot, a storage room/closet, or the like. In some cases, the end location can be the point where a user stopped training and/or programming tasks for robot 202.
In the context of floor cleaners (e.g., floor scrubbers, vacuum cleaners, etc.), robot 202 may or may not clean at every point along route 206. For example, where robot 202 is a robotic floor scrubber, the cleaning system of robot 202 (e.g., water flow, cleaning brushes, etc.) can be operating in some portions of route 206 and not in others. For example, robot 202 can associate certain actions (e.g., turning, turning on/off water, spraying water, turning on/off vacuums, moving vacuum hose positions, gesticulating an arm, raising/lowering a lift, moving a sensor, turning on/off a sensor, etc.) with particular positions and/or trajectories along the demonstrated route (e.g., while moving in a certain direction or in a particular sequence along route 206). In the context of floor cleaners, such associations can be desirable where only some areas of the floor are to be cleaned but not others, and/or only in some trajectories. In such cases, robot 202 can turn on the cleaning system in areas where the user demonstrated that robot 202 should clean, and turn off the cleaning system otherwise.
FIG. 2B illustrates a top view of a user demonstrating route 216 to robot 202 before robot 202 autonomously travels route 206 in environment 200. In demonstrating route 216, the user can start robot 202 at an initial location. Robot 202 can then weave around objects 208, 210, 212, and 218. Robot 202 can stop at an end location, as previously described. In some cases (and as illustrated), the autonomously navigated route 206 can be exactly the same as the demonstrated route 216. In some cases, route 206 may not be precisely the same as route 216, but can be substantially similar. For example, as robot 202 travels route 206, it uses its sensors to sense its relationship to its surroundings. Such sensing can be imprecise in some instances, which can cause robot 202 not to travel the exact demonstrated path along which robot 202 was trained. In some cases, small changes to environment 200, such as the moving of shelves and/or changes in the items on shelves, can cause robot 202 to deviate from route 216 as it autonomously navigates route 206. As another example, as previously described, robot 202 can avoid objects by turning around them, slowing down, etc., while autonomously navigating route 206. Such objects may not have been present (and avoided) when the user demonstrated route 216. For example, the objects may be temporarily placed and/or transient items, and/or temporary and/or dynamic changes to environment 200.
As another example, the user may have demonstrated route 216 poorly. For example, the user may have bumped and/or brushed against a wall, shelf, object, obstacle, etc. As another example, an obstacle may have been present while the user demonstrated route 216, but no longer present when robot 202 autonomously navigates route 206. In such cases, robot 202 can store in memory (e.g., memory 302) one or more actions that it can correct, such as bumping and/or brushing against a wall, shelf, object, obstacle, etc. When robot 202 later autonomously navigates the demonstrated route 216 (e.g., as route 206), robot 202 can correct for such actions and not perform them (e.g., not bump and/or brush against walls, shelves, objects, obstacles, etc.) in its autonomous navigation. In this way, robot 202 can determine not to autonomously navigate at least a portion of a navigable route, such as the demonstrated route. In some embodiments, determining not to autonomously navigate at least a portion of the navigable route comprises determining when to avoid obstacles and/or objects.
As previously mentioned, as a user demonstrates route 216, the user can turn on and off the cleaning system of robot 202, or perform other actions, in order to train robot 202 where along route 216 (e.g., at what positions) and/or along what trajectories to clean (and consequently where to clean when robot 202 subsequently cleans autonomously along route 206). Robot 202 can record these actions in memory 302 and later perform them while autonomously navigating. These actions can include any actions that robot 202 may perform, such as turning, turning water on/off, spraying water, turning a vacuum on/off, moving a vacuum hose position, gesticulating an arm, raising/lowering a lift, moving a sensor, turning a sensor on/off, etc.
FIG. 3 is a functional block diagram of robot 202 in accordance with some principles of this disclosure. As illustrated in FIG. 3, robot 202 can include controller 304, memory 302, user interface unit 308, exteroceptive sensor unit 306, proprioceptive sensor unit 310, and communications unit 312, as well as other components and subcomponents (e.g., some of which may not be illustrated). A person having ordinary skill in the art would appreciate from the contents of this disclosure that, although a specific embodiment is illustrated in FIG. 3, the architecture can be varied in certain embodiments.
Controller 304 can control the various operations performed by robot 202. Controller 304 can include one or more processors (e.g., microprocessors) and other peripherals. As used herein, processor, microprocessor, and/or digital processor can include any type of digital processing device such as, without limitation, digital signal processors ("DSPs"), reduced instruction set computers ("RISC"), general-purpose ("CISC") processors, microprocessors, gate arrays (e.g., field programmable gate arrays ("FPGAs")), programmable logic devices ("PLDs"), reconfigurable computer fabrics ("RCFs"), array processors, secure microprocessors, specialized processors (e.g., neuromorphic processors), and application-specific integrated circuits ("ASICs"). Such digital processors can be contained on a single unitary integrated circuit die or distributed across multiple components.
Controller 304 can be operatively and/or communicatively coupled to memory 302. Memory 302 can include any type of integrated circuit or other storage device configured to store digital data, including, without limitation, read-only memory ("ROM"), random access memory ("RAM"), non-volatile random access memory ("NVRAM"), programmable read-only memory ("PROM"), electrically erasable programmable read-only memory ("EEPROM"), dynamic random-access memory ("DRAM"), Mobile DRAM, synchronous DRAM ("SDRAM"), double data rate SDRAM ("DDR/2 SDRAM"), extended data output ("EDO") RAM, fast page mode RAM ("FPM"), reduced latency DRAM ("RLDRAM"), static RAM ("SRAM"), flash memory (e.g., NAND/NOR), memristor memory, pseudostatic RAM ("PSRAM"), etc. Memory 302 can provide instructions and data to controller 304. For example, memory 302 can be a non-transitory, computer-readable storage medium having a plurality of instructions stored thereon, the instructions being executable by a processing apparatus (e.g., controller 304) to operate robot 202. In some cases, the instructions can be configured to, when executed by the processing apparatus, cause the processing apparatus to perform the various methods, features, and/or functionality described in this disclosure. Accordingly, controller 304 can perform logical and arithmetic operations based on program instructions stored within memory 302.
In some embodiments, exteroceptive sensor unit 306 can comprise systems and/or methods that can detect characteristics within and/or around robot 202. Exteroceptive sensor unit 306 can comprise a plurality and/or a combination of sensors. Exteroceptive sensor unit 306 can include sensors that are internal or external to robot 202, and/or have components that are partially internal and/or partially external. In some cases, exteroceptive sensor unit 306 can include exteroceptive sensors such as sonars, LIDARs, radars, lasers, cameras (including video cameras, infrared cameras, 3D cameras, etc.), time-of-flight ("TOF") cameras, antennas, microphones, and/or any other sensor known in the art. In some embodiments, exteroceptive sensor unit 306 can collect raw measurements (e.g., currents, voltages, resistances, gate logic, etc.) and/or transformed measurements (e.g., distances, angles, detected points in obstacles, etc.). Exteroceptive sensor unit 306 can generate data based at least in part on the measurements. Such data can be stored in data structures, such as matrices, arrays, etc. In some embodiments, the data structure of the sensor data can be called an image.
In some embodiments, proprioceptive sensor unit 310 can include sensors that can measure internal characteristics of robot 202. For example, proprioceptive sensor unit 310 can measure temperature, power levels, statuses, and/or any other characteristic of robot 202. In some cases, proprioceptive sensor unit 310 can be configured to determine the odometry of robot 202. For example, proprioceptive sensor unit 310 can include sensors such as accelerometers, inertial measurement units ("IMUs"), odometers, gyroscopes, speedometers, cameras (e.g., using visual odometry), clocks/timers, and the like. Odometry can facilitate the autonomous navigation of robot 202. This odometry can include the position of robot 202 relative to an initial location (e.g., where position can include a robot's location, displacement, and/or orientation, and can sometimes be interchangeable with the term pose as used herein). In some embodiments, proprioceptive sensor unit 310 can collect raw measurements (e.g., currents, voltages, resistances, gate logic, etc.) and/or transformed measurements (e.g., distances, angles, detected points in obstacles, etc.). Such data can be stored in data structures, such as matrices, arrays, etc. In some embodiments, the data structure of the sensor data can be called an image.
In some embodiments, user interface unit 308 can be configured to enable a user to interact with robot 202. For example, user interface 308 can include touch panels, buttons, keypads/keyboards, ports (e.g., universal serial bus ("USB"), digital visual interface ("DVI"), Display Port, E-Sata, Firewire, PS/2, Serial, VGA, SCSI, audioport, high-definition multimedia interface ("HDMI"), personal computer memory card international association ("PCMCIA") ports, memory card ports (e.g., secure digital ("SD") and miniSD), and/or ports for computer-readable media), mice, rollerballs, consoles, vibrators, audio transducers, and/or any interface for a user to input and/or receive data and/or commands, whether coupled wirelessly or through wires. User interface unit 308 can include a display, such as, without limitation, liquid crystal displays ("LCDs"), light-emitting diode ("LED") displays, LED LCD displays, in-plane-switching ("IPS") displays, cathode ray tubes, plasma displays, high definition ("HD") panels, 4K displays, retina displays, organic LED displays, touchscreens, surfaces, canvases, and/or any display, television, monitor, panel, and/or device known in the art for visual presentation. In some embodiments, user interface unit 308 can be positioned on the body of robot 202. In some embodiments, user interface unit 308 can be positioned away from the body of robot 202, but can be communicatively coupled to robot 202 (e.g., via communication units including transmitters, receivers, and/or transceivers) directly or indirectly (e.g., through a network, server, and/or a cloud).
In some embodiments, communications unit 312 can include one or more receivers, transmitters, and/or transceivers. Communications unit 312 can be configured to send/receive a transmission protocol, such as Wi-Fi, inductive wireless data transmission, radio frequencies, radio transmission, radio-frequency identification ("RFID"), near-field communication ("NFC"), infrared, network interfaces, cellular technologies such as 3G (3GPP/3GPP2), high-speed downlink packet access ("HSDPA"), high-speed uplink packet access ("HSUPA"), time division multiple access ("TDMA"), code division multiple access ("CDMA") (e.g., IS-95A, wideband code division multiple access ("WCDMA"), etc.), frequency-hopping spread spectrum ("FHSS"), direct-sequence spread spectrum ("DSSS"), global system for mobile communication ("GSM"), Personal Area Network ("PAN") (e.g., PAN/802.15), worldwide interoperability for microwave access ("WiMAX"), 802.20, long term evolution ("LTE") (e.g., LTE/LTE-A), time division LTE ("TD-LTE"), narrowband/frequency-division multiple access ("FDMA"), orthogonal frequency-division multiplexing ("OFDM"), analog cellular, cellular digital packet data ("CDPD"), satellite systems, millimeter wave or microwave systems, acoustic, infrared (e.g., infrared data association ("IrDA")), and/or any other form of wireless data transmission.
As used herein, network interfaces can include any signal, data, or software interface with a component, network, or process, including, without limitation, those of the FireWire (e.g., FW400, FW800, FWS800T, FWS1600, FWS3200, etc.), universal serial bus ("USB") (e.g., USB 1.X, USB 2.0, USB 3.0, USB Type-C, etc.), Ethernet (e.g., 10/100, 10/100/1000 (Gigabit Ethernet), 10-Gig-E, etc.), multimedia over coax alliance technology ("MoCA"), Coaxsys (e.g., TVNET™), radio frequency tuner (e.g., in-band or OOB, cable modem, etc.), Wi-Fi (802.11), WiMAX (e.g., WiMAX (802.16)), PAN (e.g., PAN/802.15), cellular (e.g., 3G, LTE/LTE-A/TD-LTE/TD-LTE, GSM, etc.), IrDA families, etc. As used herein, Wi-Fi can include one or more of: IEEE-Std. 802.11, variants of IEEE-Std. 802.11, standards related to IEEE-Std. 802.11 (e.g., 802.11a/b/g/n/ac/ad/af/ah/ai/aj/aq/ax/ay), and/or other wireless standards.
Communications unit 312 can also be configured to send/receive a transmission protocol over wired connections, such as any cable that has a signal line and ground. For example, such cables can include Ethernet cables, coaxial cables, universal serial bus ("USB") cables, FireWire cables, and/or any connection known in the art. Such protocols can be used by communications unit 312 to communicate with external systems, such as computers, smart phones, tablets, data capture systems, mobile telecommunications networks, clouds, servers, and the like. Communications unit 312 can be configured to send and receive signals comprising numbers, letters, alphanumeric characters, and/or symbols. In some cases, signals can be encrypted using algorithms such as 128-bit or 256-bit keys and/or other encryption algorithms complying with standards such as the Advanced Encryption Standard ("AES"), RSA, Data Encryption Standard ("DES"), Triple DES, and the like. Communications unit 312 can be configured to send and receive statuses, commands, and other data/information. For example, communications unit 312 can communicate with a user operator to allow the user to control robot 202. Communications unit 312 can communicate with a server/network in order to allow robot 202 to send data, statuses, commands, and other communications to the server. The server can also be communicatively coupled to computer(s) and/or device(s) that can be used to monitor and/or control robot 202 remotely. Communications unit 312 can also receive updates (e.g., firmware or data updates), data, statuses, commands, and other communications from a server for robot 202.
In some embodiments, one or more of the components and/or subcomponents can be instantiated remotely from robot 202. For example, a mapping and localization unit 262 may be located in a cloud and/or connected to robot 202 through communications unit 312. Connections can be direct and/or through a server and/or network. Accordingly, embodiments of the functionality of this disclosure should also be understood to include remote interactions, where data can be transferred using communications unit 312 and one or more portions of processes can be completed remotely.
FIG. 4A is a top view diagram illustrating the interaction between robot 202 and an obstacle 402 in accordance with some embodiments of this disclosure. In navigating route 216, robot 202 can encounter obstacle 402. Obstacle 402 can impede the path of robot 202, which is illustrated as route portion 404. If the robot were to continue following route portion 404, it may collide with obstacle 402. However, in some circumstances, using exteroceptive sensor unit 306 and/or proprioceptive sensor unit 310, robot 202 can stop before colliding with obstacle 402.
This interaction with obstacle 402 illustrates advantages of embodiments in accordance with this disclosure. FIG. 4B is a diagram of a global layer 406, an intermediate layer 408, and a local layer 410 in accordance with embodiments of this disclosure. Global layer 406, intermediate layer 408, and local layer 410 can be hardware and/or software layers instantiated in one or more of memory 302 and/or controller 304. Global layer 406 can include software and/or hardware that implements global mapping and routing. For example, the high-level mapping can include a map of environment 200. The map can also include a representation of route 216, allowing robot 202 to navigate the space in environment 200.
In some embodiments, global layer 406 can include a global planner. In this way, global layer 406 can determine one or more of: the location of robot 202 (e.g., in world coordinates such as two-dimensional coordinates, three-dimensional coordinates, four-dimensional coordinates, etc.); the path robot 202 should take to reach its goal; and/or higher-level (e.g., long-range) planning. In this way, robot 202 can determine the general path and/or direction in which to travel from one location to another.
Local layer 410 includes software and/or hardware that implements local planning. For example, local layer 410 can include short-range planning configured to control local motion constraints. Local layer 410 can process data received from exteroceptive sensor unit 306 and determine the presence and/or positioning of obstacles and/or objects close to robot 202. For example, if an object is within the range of a sensor of exteroceptive sensor unit 306 (e.g., a LIDAR, sonar, camera, etc.), robot 202 can detect the object. Local layer 410 can compute and/or control motor functionality to navigate around objects, such as by controlling actuators to turn, move forward, reverse, etc. In some cases, the processing in local layer 410 can be computationally intensive. For example, local layer 410 can receive data from the sensors of exteroceptive sensor unit 306 and/or proprioceptive sensor unit 310. Local layer 410 can then determine motor functions to avoid an object detected by exteroceptive sensor unit 306 (e.g., using a motor to turn a steering column left and right, and/or using a motor to propel the robot forward). The interplay between local layer 410 and global layer 406 allows robot 202 to make local adjustments while still moving generally along its route toward its goal.
However, in some cases it may be desirable to make adjustments at a level finer-grained than the planning computed by global layer 406, yet not at the computationally intensive level of the precise motor functions of local layer 410. Accordingly, intermediate layer 408 can include hardware and/or software that can determine intermediate adjustments of robot 202 as it navigates around objects.
In intermediate layer 408, robot 202 can plan how to avoid objects and/or obstacles in its environment. In some cases, intermediate layer 408 can be initialized with at least a local path and/or route using the global path planner of global layer 406.
Because objects (e.g., obstacles, walls, etc.) are things that robot 202 could collide with, the objects and/or obstacles can exert a repulsive force on robot 202. In some cases, by being repelled by the objects, robot 202 can travel along a collision-free path around those objects and/or obstacles.
FIG. 4C is a process flow diagram of an exemplary method 450 for dynamic route planning in accordance with some embodiments of this disclosure. In some embodiments, method 450 can be performed by intermediate layer 408 and/or by controller 304. Block 452 can include obtaining a route containing one or more route poses. In some cases, this route can be created by robot 202 and/or uploaded onto robot 202. In some cases, the route can be passed from global layer 406 to intermediate layer 408. Block 454 can include selecting a first route pose. Block 456 can include, for the first route pose, determining repulsive forces from objects in the environment. Block 458 can include, for the first route pose, determining attractive forces from other route poses. Block 460 can include determining the translation and/or rotation of the first route pose due to the repulsive and attractive forces. Block 462 can include performing interpolation to account for the translated and/or rotated route pose. This process, and others, will be illustrated throughout this disclosure.
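As a rough illustration of blocks 454–460, the update of a single route pose under repulsive forces from obstacle points and attractive forces from neighboring route poses can be sketched as below. This is a minimal sketch under stated assumptions: the function name, the unit force gains, the step size, and the choice of re-aiming the heading at the next pose are all illustrative, not taken from the patent.

```python
import math

def adjust_route_pose(route, i, obstacle_points, step=0.1):
    """Sketch of blocks 454-460: translate one route pose (x, y, theta)
    under repulsive forces from obstacle points (inverse-distance, block 456)
    and attractive forces from neighboring route poses (block 458),
    then update the heading (block 460). Gains/step are assumptions."""
    x, y, theta = route[i]
    fx = fy = 0.0
    # Block 456: repulsive forces, directed away from each obstacle point
    for ox, oy in obstacle_points:
        d = math.hypot(x - ox, y - oy)
        if d > 1e-6:
            fx += (x - ox) / d**2   # magnitude ~ 1/d times unit direction
            fy += (y - oy) / d**2
    # Block 458: attractive (cohesive) forces toward neighboring route poses
    for j in (i - 1, i + 1):
        if 0 <= j < len(route):
            nx, ny, _ = route[j]
            fx += nx - x
            fy += ny - y
    # Block 460: apply the translation; re-aim the heading at the next pose
    x, y = x + step * fx, y + step * fy
    if i + 1 < len(route):
        nx, ny, _ = route[i + 1]
        theta = math.atan2(ny - y, nx - x)
    return (x, y, theta)
```

For example, with an obstacle point directly above a middle route pose, the pose is pushed downward while its neighbors' attractions keep it from drifting along the route.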
For example, FIG. 4D illustrates route poses 414 and 416, and repulsive forces exerted by an object, in accordance with some embodiments of this disclosure. For example, points on a route can be discretized locations along the path, such as route poses, which illustrate the poses of robot 202 along its route. In some cases, such discretized locations can also have associated probabilities, such as particles or bubbles. Route poses can identify the position and/or orientation in which robot 202 would travel along the route. In a planar application, a route pose can include (x, y, θ) coordinates. In some cases, θ can be the heading of the robot in the plane. The route poses can be spaced evenly or unevenly along the route of robot 202. In some cases, the intermediate layer can obtain a route containing one or more route poses from global layer 406, as described in block 452 of method 450. In some embodiments, route poses can form a sequence, wherein robot 202 travels between sequential route poses along the route. For example, route poses 414 and 416 can be a sequence of route poses in which robot 202 travels to route pose 414 and then on to route pose 416.
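A planar (x, y, θ) route pose and a sequence of such poses can be represented as follows. This is an illustrative data sketch: the class name, the waypoint-based constructor, and the rule of aiming each heading at the next waypoint are assumptions for demonstration; the patent only specifies that a planar route pose can include (x, y, θ) with θ as the heading.

```python
from dataclasses import dataclass
import math

@dataclass
class RoutePose:
    """One discretized location along a route: planar position plus heading."""
    x: float
    y: float
    theta: float  # heading in the plane, in radians

def poses_from_waypoints(waypoints):
    """Build a route-pose sequence from planar waypoints, aiming each
    pose's heading at the next waypoint; the last pose reuses the
    previous heading (an illustrative convention)."""
    poses = []
    for k, (x, y) in enumerate(waypoints):
        if k + 1 < len(waypoints):
            nx, ny = waypoints[k + 1]
            theta = math.atan2(ny - y, nx - x)
        else:
            theta = poses[-1].theta if poses else 0.0
        poses.append(RoutePose(x, y, theta))
    return poses
```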
For example, route poses 414 and 416 illustrate discretized locations along route portion 404. This illustrative example shows route poses 414 and 416 shaped with footprints substantially similar to the footprint of robot 202. The size of the footprints of route poses 414 and 416 can be adjusted according to how conservative one desires to be with respect to expected collisions of the robot. A smaller footprint can present a higher likelihood of collision, but such a smaller footprint can allow robot 202 to pass through more areas than it otherwise should be able to clear in autonomous navigation. A larger footprint can decrease the likelihood of collision, but robot 202 would not autonomously pass through some places it otherwise could. The footprint can be predetermined by a footprint parameter that sets the footprint size (e.g., dimensions) of robot 202, as illustrated in the route poses (e.g., route poses 414 and 416). In some cases, there can be a plurality of footprint parameters that asymmetrically control the size of the route poses of robot 202.
In FIG. 4D, while route poses 414 and 416 are illustrated and described, a person having ordinary skill in the art would appreciate that there can be any number of route poses throughout a route, and the descriptions of the embodiments of this disclosure can be applied to those route poses. Advantageously, making the shapes of route poses 414 and 416 substantially similar to robot 202 (e.g., the footprint of robot 202) can allow robot 202 to determine the places it fits while traveling. The footprint parameter can be used to adjust how robot 202 projects itself. For example, a larger footprint for route poses 414 and/or 416 can be more conservative, in that it can cause, at least in part, robot 202 to travel farther away from objects. In contrast, a smaller footprint can cause, at least in part, robot 202 to travel closer to objects. Route poses (e.g., route poses 414 and 416) can be sized differently from one another. For example, it may be desirable for robot 202 to be more conservative in some scenarios, such as around corners. Accordingly, in such illustrations, the footprints of route poses at a corner can be larger than the footprints of route poses on a straightaway. This dynamic reshaping of route poses can be performed by making the size of a route pose dependent on the rotation of the route pose relative to other route poses, or on the change in translation and/or rotation of the route poses. One or more of the route poses on a route (e.g., route poses 414 and/or 416) can also be shapes other than the shape of robot 202. For example, route poses can be circular, square, triangular, and/or any other shape.
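The dynamic reshaping described above, where a route pose's footprint depends on its rotation relative to neighboring poses, can be sketched as a simple scale factor. The linear growth form, the `base` size, and the `gain` are illustrative assumptions; the patent specifies only that corner poses can have larger footprints than straightaway poses.

```python
import math

def footprint_scale(theta_prev, theta_next, base=1.0, gain=0.5):
    """Illustrative dynamic footprint sizing: grow a route pose's
    footprint with the wrapped turn angle between neighboring poses,
    so poses at corners are more conservative than on straightaways."""
    delta = theta_next - theta_prev
    turn = abs(math.atan2(math.sin(delta), math.cos(delta)))  # wrapped |dtheta|
    return base * (1.0 + gain * turn)
```

On a straightaway (no heading change) the footprint keeps its base size; the sharper the turn, the larger the footprint grows.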
As described in block 454 of method 450, route pose 414 or 416 can be treated as the first route pose. However, for purposes of illustration, and to illustrate the breadth of the embodiments of this disclosure, route poses 414 and 416 will be described together.
Points along an object (e.g., points determined from mapping, from detection by the sensors of exteroceptive sensor unit 306, etc.) can exert repulsive forces on the route poses of robot 202 (e.g., route poses 414 and 416). In this way, the objects can, conceptually, prevent robot 202 from colliding with them. In some cases, these points can be representative, at least in part, of poses and/or sets of poses. For example, arrows 412 illustrate repulsive forces from points along object 210.
In some embodiments, the forces exerted by the points of an object can be uniform, in that a substantially similar force can be exerted on each point of route poses 414 and 416. However, in other embodiments, the forces exerted by the points of an object on route poses 414 and 416 may not be uniform, and can vary based on a force function.
For example, in some cases, a force function (e.g., a repulsive force function) can determine, at least in part, the repulsive force an object exerts on points of route pose 414 or 416. For example, the force function can be used to determine, in block 456 of method 450, the repulsive forces from objects in the environment for a first route pose (e.g., a first one of route poses 414 and 416). In some embodiments, the force function can depend on characteristics that the object presents relative to route poses 414 and 416. The force function can then be indicative of the force experienced by points on route poses 414 and 416 (e.g., one or more points on the surfaces of route poses 414 and 416, the centers of route poses 414 and 416, the centers of mass of route poses 414 and 416, and/or any point within and/or around route poses 414 and 416). Because forces can depend on direction and magnitude, the repulsive forces (and/or attractive forces) can be vectors. In some cases, the repulsive forces can exert a rotational force on a route pose, which can manifest as torque.
For example, repulsive forces and torques can be computed at n different poses along a path. In some cases, these n different poses can be associated with route poses. Each pose can consist of m points within its footprint. In some cases, these m points can be points on the route pose.
In some cases, a plurality of points can define the body of robot 202, as reflected in route poses 414 and 416, in order to provide coverage of a portion of the body of robot 202 and/or a representation of substantially the entire body of robot 202. For example, 15 to 20 points can be distributed throughout the surface and/or interior of robot 202, and reflected in route poses 414 and 416. However, in some cases, there can be fewer points. FIG. 4F illustrates example points on route pose 414, such as point 418. Each point can, at least in part, experience the forces (e.g., repulsive forces) exerted on it by objects in the surroundings of route pose 414.
Advantageously, by having a plurality of points that can experience forces within the bodies of route poses 414 and 416, the points of route poses 414 and 416 can translate and/or rotate relative to one another, causing, at least in part, route poses 414 and 416 to be repositioned (e.g., translated and/or rotated). These translations and/or rotations of route poses 414 and 416 can cause the navigable route of robot 202 to deform.
Torques can arise when different points on a route pose experience forces of different magnitudes and directions. Accordingly, torques can cause a route pose to rotate. In some cases, predetermined parameters can define, at least in part, the torques experienced by route poses 414 and 416. For example, a predetermined torque parameter can include a multiplier for the rotational forces experienced by points on route pose 414 or 416. This predetermined torque parameter can be indicative of forces caused by the misalignment of route pose 414 or 416 with the path. In some cases, the predetermined torque parameter can vary based on whether a force is repulsive or cohesive.
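The net effect of per-point forces on one route pose, a translation plus a torque about the pose's center, can be sketched with a 2-D cross product. The combination rule shown here (a plain sum scaled by a single `torque_param` multiplier) is an assumption standing in for the predetermined torque parameter; the patent does not give an explicit formula.

```python
def net_force_and_torque(points, forces, center, torque_param=1.0):
    """Sum per-point force vectors on a route pose into a net translation
    force and a scalar torque about the pose center. In 2-D the torque
    contribution of each point is the cross product (p - c) x F."""
    cx, cy = center
    fx = sum(f[0] for f in forces)
    fy = sum(f[1] for f in forces)
    torque = 0.0
    for (px, py), (fpx, fpy) in zip(points, forces):
        torque += (px - cx) * fpy - (py - cy) * fpx  # 2-D cross product
    return (fx, fy), torque_param * torque
```

For example, equal and opposite forces on points at either end of the footprint cancel as a translation but add as a torque, producing pure rotation, which matches the asymmetric-force behavior described above.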
Returning to FIG. 4D, a characteristic upon which the force function can depend, in part, can be the positions of points on objects relative to route poses 414 and 416. Distance can be determined based at least in part on the sensors of exteroceptive sensor unit 306. As a first example, the repulsive force exerted on route poses 414 and 416 by points exterior to robot 202 (e.g., points not within the footprints of route poses 414 and 416, such as points on obstacles 210 and 212, or points on objects as described) can be characterized at least in part by the function r(d) ∝ 1/d, where r is the repulsive force of a point on an object, and d is the distance between the point on the object and a point on route pose 414 or route pose 416. In this way, the repulsive force between a point on an object and a point on route pose 414 or route pose 416 is inversely proportional to the distance between them. Advantageously, this function allows objects close to route poses 414 and 416 to exert a larger repulsive force than objects farther away, thereby more strongly influencing the route of robot 202 so as to avoid collisions.
In some cases, there can be a predetermined repulsive distance threshold on the distance between points on route pose 414 or route pose 416 and points on objects. This predetermined repulsive distance threshold can be indicative, at least in part, of the maximum distance between a point on route pose 414 or 416 and a point on an object at which the point on the object can exert a repulsive force (and/or torque) on the point on route pose 414 or 416. Accordingly, when a point on an object is a distance from a point on route pose 414 or route pose 416 that is above (or equal to and/or above, depending on the definition of the threshold) the threshold, the repulsive force and/or torque can be zero or substantially zero. Advantageously, in some cases, having a predetermined repulsive distance threshold can prevent some points on objects from exerting forces on route poses 414 and 416. In this way, with a predetermined repulsive distance in place, robot 202 can pass closer to certain objects and/or be uninfluenced by objects farther away.
As a second example, points interior to route poses 414 and 416 (e.g., within the footprints of route poses 414 and 416) can exert repulsive forces on route poses 414 and 416. For example, obstacle 402 has a portion 420 that appears inside route pose 416. In such cases, a different force function can be applied by the points of obstacle 402 in portion 420 onto the points of route pose 416. In some embodiments, this force can be characterized at least in part by the function r(d) ∝ d, where the variables are as described above. Advantageously, by defining a different force function for interior objects, route pose 416 can move asymmetrically, thereby producing a rotation.
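The two proportionality examples and the repulsive distance threshold can be combined into one piecewise magnitude function, as sketched below. The proportionality constant `k` and the default `threshold` are illustrative assumptions; the patent specifies only the proportionalities r(d) ∝ 1/d (exterior), r(d) ∝ d (interior), and a zero force beyond the threshold.

```python
def repulsive_magnitude(d, inside, threshold=2.0, k=1.0):
    """Piecewise repulsion sketch: interior points (within the footprint)
    repel with r(d) ~ d; exterior points repel with r(d) ~ 1/d up to a
    predetermined repulsive distance threshold, beyond which the force
    is zero. Constants are assumptions for illustration."""
    if inside:
        return k * d                  # interior: grows with distance
    if d >= threshold or d <= 0.0:
        return 0.0                    # beyond the threshold: no influence
    return k / d                      # exterior: inverse distance
```

Note the asymmetry this creates: a nearby exterior point pushes harder than a distant one, while an interior point pushes harder the deeper it sits inside the footprint, which is what lets an intersecting obstacle rotate the pose.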
In some embodiments, the force function can also depend on other characteristics of objects, such as the shape, material, color, and/or any other characteristic of the object. These characteristics can be determined by one or more sensors of exteroceptive sensor unit 306 in accordance with methods known in the art. Advantageously, taking such characteristics into account can also be indicative of how robot 202 should navigate around an object. In some cases, based on these characteristics, additional repulsive force values can be computed using a cost map.
For example, the shape of an object can be at least partly indicative of an associated collision impact. For example, a humanoid shape can be indicative of a person. Accordingly, detecting an object with such a shape can generate a larger repulsive force on route poses 414 and 416, pushing the path further away from the humanoid shape. As another example, the shape of an object can be partly indicative of increased damage (e.g., to the object or to robot 202) if a collision occurs. For example, pointed objects, thin objects, irregularly shaped objects, predetermined shapes (e.g., vases, lamps, displays, etc.), and/or any other shape can at least partly indicate that increased damage would result. Size can be another shape characteristic that is taken into account. For example, smaller objects may be more fragile in a collision, while larger objects can cause more damage to robot 202. In the case of size, the force function can take the sizes of objects into account such that points on those objects repel the points on route poses 414 and 416 proportionally, as desired. For example, if route pose 414 is between a larger object and a smaller object, and the points of the larger object exert a relatively larger repulsive force, as defined at least in part by the force function, then route pose 414 will be pushed relatively closer to the smaller object. If the points of the smaller object exert a relatively larger repulsive force, as defined at least in part by the force function, then route pose 414 will be pushed relatively closer to the larger object. Accordingly, the repulsive forces on route poses 414 and 416 can be adjusted based at least in part on shape. Shape can be detected at least in part by the sensors of sensor units 306. As another illustrative example, a wall can be identified in the cost map, and a repulsive force can be associated with the wall because of the wall's size and shape.
In some implementations, the force function may also depend on the material of an object. For example, certain materials can be at least partly indicative of more damage if a collision occurs. For example, glass, porcelain, mirrors, and/or other fragile materials may prove more susceptible to damage in a collision. In some cases, such as in the case of mirrors, the material can sometimes cause errors in the sensors of sensor units 306. Accordingly, in some cases, it may be desirable for robot 202 to navigate further away from these objects, which can be reflected in the force function (e.g., by increasing the repulsive force applied by points on objects made of some materials as compared to other materials).
In some implementations, color can be detected by the sensors of sensor units 306. The force function can depend at least in part on the color of an object and/or of points on the object. For example, certain objects in an environment may have a particular color (e.g., red, yellow, etc.) to at least partly indicate that robot 202 (or, in some cases, people) should treat those objects with care. Accordingly, in some cases, it may be desirable for robot 202 to navigate further away from these objects, which can be reflected in the force function.
In some implementations, the force function may depend on other factors, such as the position of an object. For example, some regions of a map (e.g., as passed from global layer 406) can have characteristics. For example, some regions of a map (e.g., a cost map) can be regions robot 202 should not pass through. There may also be places that are unreachable by robot 202, because they cannot be entered (e.g., the interior of an object). Accordingly, in some cases, the force function can be adjusted in view of these places. In some implementations, the force function can cause points in those places to apply no force (or substantially no force) to the points on route poses 414 and 416. Advantageously, having no force can reflect regions robot 202 cannot travel to (e.g., the interiors of objects, etc.). In contrast, in some implementations, these places can be treated as obstacles, thereby applying repulsive forces to route poses 414 and 416. Advantageously, having such repulsive forces can prevent robot 202 from attempting to enter such regions.
In some implementations, not all points on route poses 414 and 416 are repulsive. For example, points on a route pose (e.g., route poses 414 and 416) can exert attractive (e.g., cohesive) forces, which can at least partly pull route poses toward one another. FIG. 4E illustrates attractive forces between route poses 414 and 416 in accordance with some implementations of this disclosure. The arrows at least partly indicate that the route poses are pulled toward one another along route portion 404. Advantageously, cohesive forces between route poses can at least partly cause robot 202 to follow a path substantially similar to the path planned by global layer 406 (e.g., a route substantially similar to the original route, such as the initially demonstrated route that robot 202 would follow if there were no objects to travel around).
Cohesive forces can be set by a force function (e.g., a cohesion force function), which may depend on characteristics of the path, such as the separation distance between route poses/particles, the smoothness of the path, how desirable it is for robot 202 to follow the path, and the like. In some cases, the cohesion force function can be based at least in part on a predetermined cohesion force multiplier, which can at least partly determine how strongly the route poses are drawn together. Lower predetermined cohesion force multipliers can reduce the cohesion of route portion 404 (e.g., the attraction drawing the route poses together), and in some cases may cause the travel path of robot 202 to lose smoothness. In some cases, only the points of consecutive route poses exert cohesive forces on one another. In other cases, all route poses exert cohesive forces on one another. In still other cases, some route poses exert cohesive forces on other route poses. Determining which route poses are configured to exert cohesive forces on one another can depend on a number of factors, which can vary on a case-by-case basis. For example, if a route is circular, it may be desirable for all route poses to exert cohesive forces on one another in order to tighten the circle. As another example, if a route is complex, certain complex paths may call for only consecutive route poses to exert cohesive forces on one another. This restriction can allow robot 202 to make more turns and/or produce more predictable results, because route poses positioned elsewhere will not unduly influence it. Examples of routes with complexity between these two extremes can determine which route poses exert cohesive forces. As another example, the number of route poses can also be a factor. If a large number of route poses on a route all exert cohesive forces on one another, they may produce unexpected results. If there are fewer route poses, this may not be a problem, and all or some of the route poses can apply forces. In some cases, there may be a predetermined cohesion force distance threshold, wherein if the distance between a point on a first route pose and a point on a second route pose is greater than (or greater than or equal to, depending on how the threshold is defined) the predetermined cohesion force distance threshold, the cohesive force can be zero or substantially zero.
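A minimal sketch of a cohesive force governed by a predetermined cohesion force multiplier and a cohesion force distance threshold might look like this; the linear pull and the specific constant values are illustrative assumptions:

```python
import math

def cohesive_force(p1, p2, multiplier=0.5, threshold=3.0):
    """Attractive force drawing a point on one route pose toward a point
    on another (e.g., the next consecutive) route pose.  Beyond the
    cohesion force distance threshold the force is zero; otherwise the
    pull scales with the predetermined cohesion force multiplier."""
    dx = p2[0] - p1[0]
    dy = p2[1] - p1[1]
    d = math.hypot(dx, dy)
    if d == 0.0 or d > threshold:
        return (0.0, 0.0)                  # beyond the cohesion threshold
    return (multiplier * dx, multiplier * dy)  # pull p1 toward p2
```

Lowering `multiplier` weakens the pull between route poses, mirroring the text's observation that a lower predetermined cohesion force multiplier reduces cohesion and can cost the path some smoothness.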
In some implementations, the cohesion force function and the repulsion force function can be the same force function. In other implementations, the cohesion force function and the repulsion force function are different. In accordance with block 458 of method 450, the cohesion force function can be used to determine attractive forces from other route poses. In some implementations, the cohesive and repulsive forces can produce torques on the route poses (e.g., rotating the route poses).
As with reference to described by middle layer 408, route pose 414 and 416 can be subjected to different attraction and repulsive force.One In a little embodiments, power can store in the form of an array.For example, the power array of instruction repulsive force, torsion, cohesive force etc. may be present.
In some cases, forces can be switched, for example, by using on/off parameters that can turn on or off any individual force and/or any group of forces from any point. For example, an on/off parameter can be binary, wherein one value turns a force on and the other value turns the force off. In this way, some forces can be turned off, for example, based on the distance between an object and a route pose, whether a point is inside an object or a region that cannot be entered, the distance between route poses, and the like.
In order to balance the net forces on route poses 414 and 416, one or more of route poses 414 and 416 can be repositioned. For example, route poses 414 and 416 can be displaced. Route poses 414 and 416 can be displaced (e.g., translated and/or rotated) until the net forces on them in any direction are substantially zero and/or minimized. In this way, route poses 414 and 416 can be displaced to positions at least partly indicative of an adjusted route for robot 202 to travel in order to avoid objects (e.g., obstacle 402). The translation and/or rotation of the route poses resulting from the repulsive and attractive forces can be determined in accordance with block 460 of method 450.
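The displace-until-the-net-force-is-near-zero idea above can be sketched as a simple iteration (translation only; torque-driven rotation is omitted for brevity). The step size, tolerance, and iteration cap are illustrative assumptions:

```python
import math

def reposition(pose_xy, force_fns, step=0.1, tol=1e-3, max_iter=1000):
    """Iteratively displace a route pose along its net force until the net
    force is approximately zero.  force_fns is a list of callables, each
    returning an (fx, fy) force at the queried position."""
    x, y = pose_xy
    for _ in range(max_iter):
        fx = sum(f((x, y))[0] for f in force_fns)
        fy = sum(f((x, y))[1] for f in force_fns)
        if math.hypot(fx, fy) < tol:   # net force ≈ 0: equilibrium reached
            break
        x += step * fx                 # move a small step along the net force
        y += step * fy
    return (x, y)
```

With a single attractive force toward a target, the pose converges to that target; combining repulsive, cohesive, and other force callables in `force_fns` lets the same loop balance all of them at once.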
Various adjustments can be made in determining the displacement of route poses 414 and 416. For example, in some cases, only the attractive forces, rather than all of the forces on route poses 414 and 416, may be considered. Advantageously, such a system allows robot 202 to stay along a set path. Based at least in part on the displacement of route poses 414 and 416, robot 202 can set a new route for the route planner. In the new route, the points can indicate a tracked location on robot 202, such as the center of robot 202, as robot 202 travels along the path.
After robot 202 determines the displacement of route poses 414 and 416, robot 202 can determine a path to travel. For example, based on the orientations (e.g., positions and/or rotations) of route poses 414 and 416, robot 202 can determine a path to travel from its present location to route pose 414 and/or 416, to any other route pose, and/or between them. In some cases, robot 202 will travel sequentially between connected (e.g., consecutive) route poses, which at least partly define the path. For example, this determination can be based at least in part on interpolation between the route poses, taking into account the path robot 202 can travel between those points. In many cases, linear interpolation can be used. In accordance with block 462 of method 450, by performing interpolation, robot 202 can account for the translated and/or rotated route poses.
FIG. 5 is a top view of a diagram illustrating interpolation between route poses 414 and 416 in accordance with some implementations of this disclosure. As described herein, route poses 414 and 416 have been displaced based on the forces applied to them. As illustrated, route pose 414 has been both translated and rotated. Translation can be measured in standard units, such as inches, feet, meters, or any other unit of measurement (e.g., metric, US, or other systems of measurement), and/or in relative/non-absolute units, such as ticks, pixels, percentage of sensor range, and the like. Rotation can be measured in units such as degrees, radians, and the like. Similarly, route pose 416 has also been translated and/or rotated. Notably, route poses 414 and 416 are clear of obstacle 402. Because route poses 414 and 416 represent discretized locations along the path traveled by robot 202, robot 202 can interpolate between them to determine the path it should take. Interpolated poses 502A-502D illustrate the path traveled between route poses 414 and 416. Notably, robot 202 can also interpolate other paths (not illustrated) for traveling to and/or between route poses.
Interpolated poses 502A-502D can have associated footprints substantially similar to the footprint of one or more of route poses 414 and 416. In some cases, as illustrated in FIG. 5, interpolated poses 502A-502D can be interpolated route poses. Accordingly, interpolated poses 502A-502D can be indicative of the position and/or orientation robot 202 would have along the route. Advantageously, this allows the interpolated path to guide robot 202 into places where robot 202 fits. Moreover, interpolated poses 502A-502D can be determined such that there is no overlap between the footprint of any of interpolated poses 502A-502D and an object (e.g., obstacle 402, object 210, or object 212), thereby avoiding collisions.
Determining the rotation and/or translation of interpolated poses 502A-502D can also take into account the change from route pose 414 to route pose 416. For example, robot 202 can determine the pose of route pose 414 and the pose of route pose 416. Robot 202 can then find the difference between the poses of route poses 414 and 416, and then determine how to go from the pose of route pose 414 to the pose of route pose 416. For example, robot 202 can distribute the rotation and translation among interpolated poses 502A-502D such that robot 202 will rotate and translate from route pose 414 to route pose 416. In some cases, robot 202 can distribute the rotation and translation substantially equally among interpolated poses 502A-502D. For example, if there are N interpolated positions, robot 202 can divide the difference in position and rotation between the poses of route poses 414 and 416 substantially evenly across those N interpolated positions. Alternatively, robot 202 can divide the difference in position and/or rotation of the poses of route poses 414 and 416 unevenly across those N interpolated positions. Advantageously, even division can allow robot 202 to advance smoothly from route pose 414 to route pose 416. However, uneven division can allow robot 202 to move with finer granularity in some regions than in others, making it easier to account for and avoid objects. For example, robot 202 may have to make a sharp turn in order to avoid an object near interpolated poses 502A-502D. Accordingly, more interpolated poses may be needed around that turn in order to account for it. In some cases, the number of interpolated positions can be dynamic, and more or fewer than N interpolated positions can be used as needed.
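The substantially even division of the position and rotation differences across N interpolated poses can be sketched as a linear interpolation; the (x, y, theta) pose representation is an assumption, and angle wrap-around is ignored for brevity:

```python
def interpolate_poses(pose_a, pose_b, n):
    """Generate n interpolated poses between two route poses, dividing the
    difference in position and heading substantially evenly.  Poses are
    (x, y, theta) tuples; theta is a heading in degrees."""
    xa, ya, ta = pose_a
    xb, yb, tb = pose_b
    poses = []
    for i in range(1, n + 1):
        s = i / (n + 1)                # even spacing between the endpoints
        poses.append((xa + s * (xb - xa),
                      ya + s * (yb - ya),
                      ta + s * (tb - ta)))
    return poses
```

An uneven division, as the text suggests for sharp turns, could be obtained by replacing the uniform spacing `s` with a non-uniform schedule that clusters poses where finer motion is needed.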
FIG. 6 is a process flow diagram of an exemplary method 600 for operating a robot in accordance with some implementations of this disclosure. Block 602 includes generating a map of an environment based at least in part on collected data. Block 604 includes determining a route in the map in which the robot will travel. Block 606 includes generating one or more route poses on the route, wherein each route pose includes a footprint indicative of a pose of the robot along the route, and each route pose has a plurality of points therein. Block 608 includes determining forces on each of the plurality of points of each route pose, the forces including repulsive forces from one or more detected points on one or more objects and attractive forces from one or more of the plurality of points on other route poses of the one or more route poses. Block 610 includes repositioning each route pose in response to the forces on each point of each route pose. Block 612 includes performing interpolation between the one or more repositioned route poses to generate a collision-free path between the one or more route poses for the robot to travel.
FIG. 7 is a process flow diagram of an exemplary method 700 for operating a robot in accordance with some implementations of this disclosure. Block 702 includes generating a map of an environment using data from one or more sensors. Block 704 includes determining a route on the map, the route including one or more route poses, each route pose including a footprint at least partly indicative of the pose, size, and shape of the robot along the route, and each route pose having a plurality of points therein. Block 706 includes computing repulsive forces from a point on an object in the environment onto the plurality of points of a first route pose of the one or more route poses. Block 708 includes repositioning the first route pose in response at least to the repulsive forces. Block 710 includes performing interpolation between the repositioned first route pose and another route pose of the one or more route poses.
As used herein, computers and/or computing devices can include, but are not limited to, personal computers ("PC") and minicomputers, whether desktop, laptop, or otherwise, mainframe computers, workstations, servers, personal digital assistants ("PDA"), handheld computers, embedded computers, programmable logic devices, personal communicators, tablet computers, mobile devices, portable navigation aids, J2ME equipped devices, cellular telephones, smart phones, personal integrated communication or entertainment devices, and/or any other device capable of executing a set of instructions and processing an incoming data signal.
As used herein, computer programs and/or software can include any sequence of human- or machine-cognizable steps that perform a function. Such computer programs and/or software can be rendered in any programming language or environment, including, for example, C/C++, C#, Fortran, COBOL, MATLAB™, PASCAL, Python, assembly language, markup languages (e.g., HTML, SGML, XML, VoXML), and the like, as well as object-oriented environments such as the Common Object Request Broker Architecture ("CORBA"), JAVA™ (including J2ME, Java Beans, etc.), Binary Runtime Environment (e.g., BREW), and the like.
As used herein, connections, links, transmission channels, delay lines, and/or wireless can include a causal link between any two or more entities (whether physical or logical/virtual) that enables information exchange between the entities.
It will be recognized that while certain aspects of the disclosure are described in terms of a specific sequence of steps of a method, these descriptions are only illustrative of the broader methods of the disclosure, and can be modified as required by the particular application. Certain steps can be rendered unnecessary or optional under certain circumstances. Additionally, certain steps or functionality can be added to the disclosed implementations, or the order of performance of two or more steps can be permuted. All such variations are considered to be encompassed within the disclosure disclosed and claimed herein.
While the above detailed description has shown, described, and pointed out novel features of the disclosure as applied to various implementations, it will be understood that various omissions, substitutions, and changes in the form and details of the device or process illustrated can be made by those skilled in the art without departing from the disclosure. The foregoing description is of the best mode presently contemplated for carrying out the disclosure. This description is in no way meant to be limiting, but rather should be taken as illustrative of the general principles of the disclosure. The scope of the disclosure should be determined with reference to the claims.
While the disclosure has been illustrated and described in detail in the drawings and foregoing description, such illustration and description are to be considered illustrative or exemplary and not restrictive. The disclosure is not limited to the disclosed embodiments. Variations to the disclosed embodiments can be understood and effected by those skilled in the art in practicing the claimed disclosure, from a study of the drawings, the disclosure, and the appended claims.
It should be noted that the use of particular terminology when describing certain features or aspects of the disclosure should not be taken to imply that the terminology is being redefined herein to be restricted to include any specific characteristics of the features or aspects of the disclosure with which that terminology is associated. Terms and phrases used in this application, and variations thereof, especially in the appended claims, unless otherwise expressly stated, should be construed as open ended as opposed to limiting. As examples of the foregoing, the term "including" should be read to mean "including, without limitation," "including but not limited to," or the like; the term "comprising" as used herein is synonymous with "including," "containing," or "characterized by," and is inclusive or open-ended and does not exclude additional, unrecited elements or method steps; the term "having" should be interpreted as "having at least"; the term "such as" should be interpreted as "such as, without limitation"; the term "includes" should be interpreted as "includes but is not limited to"; the term "example" is used to provide exemplary instances of the item in discussion, not an exhaustive or limiting list thereof, and should be interpreted as "example, but without limitation"; adjectives such as "well-known," "normal," "standard," and terms of similar meaning should not be construed as limiting the item described to a given time period or to an item available as of a given time, but instead should be read to encompass well-known, normal, or standard technologies that may be available or known now or at any time in the future; and the use of terms such as "preferably," "preferred," "desired," or "desirable," and words of similar meaning should not be understood as implying that certain features are critical, essential, or even important to the structure or function of the disclosure, but instead as merely intended to highlight alternative or additional features that may or may not be utilized in a particular embodiment. Likewise, a group of items linked with the conjunction "and" should not be read as requiring that each and every one of those items be present in the grouping, but rather should be read as "and/or" unless expressly stated otherwise. Similarly, a group of items linked with the conjunction "or" should not be read as requiring mutual exclusivity among that group, but rather should be read as "and/or" unless expressly stated otherwise. The terms "about" or "approximately" and the like are synonymous and are used to indicate that the value modified by the term has an understood range associated with it, where the range can be ±20%, ±15%, ±10%, ±5%, or ±1%. The term "substantially" is used to indicate that a result (e.g., a measurement value) is close to a targeted value, where close can mean, for example, that the result is within 80% of the value, within 90% of the value, within 95% of the value, or within 99% of the value. Also, as used herein, "defined" or "determined" can include "predefined" or "predetermined" and/or otherwise determined values, conditions, thresholds, measurements, and the like.

Claims (20)

1. A non-transitory computer-readable storage apparatus having a plurality of instructions stored thereon, the instructions being executable by a processing apparatus to operate a robot, the instructions configured to, when executed by the processing apparatus, cause the processing apparatus to:
generate a map of an environment using data from one or more sensors;
determine a route on the map, the route including one or more route poses, each route pose including a footprint at least partly indicative of a pose and a shape of the robot along the route, and a plurality of points disposed within each route pose; and
compute repulsive forces from a point on an object in the environment onto the plurality of points of a first route pose of the one or more route poses.
2. The non-transitory computer-readable storage apparatus of claim 1, further comprising one or more instructions that, when executed by the processing apparatus, cause the processing apparatus to determine attractive forces applied by a point on another route pose of the one or more route poses onto the plurality of points of the first route pose.
3. The non-transitory computer-readable storage apparatus of claim 1, further comprising one or more instructions that, when executed by the processing apparatus, cause the processing apparatus to determine a torque applied by a point on another route pose of the one or more route poses onto the plurality of points of the first route pose.
4. A method for dynamic navigation of a robot in an environment, comprising:
generating a map of the environment using data from one or more sensors;
determining a route on the map, the route including one or more route poses, each route pose including a footprint at least partly indicative of a pose and a shape of the robot along the route, and a plurality of points disposed within each route pose;
computing repulsive forces from a point on an object in the environment onto the plurality of points of a first route pose of the one or more route poses;
repositioning the first route pose in response at least to the repulsive forces; and
performing interpolation between the repositioned first route pose and another route pose of the one or more route poses.
5. The method of claim 4, further comprising determining attractive forces applied by a point on another route pose of the one or more route poses onto the plurality of points of the first route pose.
6. The method of claim 4, further comprising:
detecting a plurality of objects in the environment using the one or more sensors, each of the plurality of objects having detected points; and
defining a force function that computes repulsive forces applied by each of the detected points of the plurality of objects onto the plurality of points of the first route pose, wherein each repulsive force comprises a vector.
7. The method of claim 6, wherein repositioning the first route pose comprises computing a minimum of the force function.
8. The method of claim 4, wherein the repositioning of the first route pose comprises translating and rotating the first route pose.
9. The method of claim 4, wherein the interpolation comprises:
generating an interpolated route pose having a footprint substantially similar to the shape of the robot; and
determining translation and rotation of the interpolated route pose based at least on a collision-free path between the translated and rotated first route pose and the other route pose of the one or more route poses.
10. The method of claim 4, further comprising computing a magnitude of the repulsive forces to be inversely proportional to a distance between the point on the object and each of the plurality of points of the first route pose where the point on the object lies outside the footprint of the first route pose.
11. The method of claim 4, further comprising computing a magnitude of the repulsive forces to be proportional to a distance between the point on the object and each of the plurality of points of the first route pose where the point on the object lies inside the footprint of the first route pose.
12. The method of claim 4, further comprising computing torques on the plurality of points of the first route pose due to the repulsive forces.
13. A robot, comprising:
one or more sensors configured to collect data about an environment, including data about detected points on one or more objects in the environment; and
a controller configured to:
generate a map of the environment based at least in part on the collected data;
determine a route in the map in which the robot will travel;
generate one or more route poses on the route, wherein each route pose includes a footprint indicative of a pose of the robot along the route, and a plurality of points are disposed within each route pose;
determine forces on each of the plurality of points of each route pose, the forces including repulsive forces from the one or more detected points on the one or more objects and attractive forces from one or more of the plurality of points on other route poses of the one or more route poses;
reposition the one or more route poses in response to the forces on each point of the one or more route poses; and
perform interpolation between the one or more route poses to generate a collision-free path between the one or more route poses for the robot to travel.
14. The robot of claim 13, wherein:
the one or more route poses form a sequence in which the robot travels along the route; and
the interpolation comprises a linear interpolation between consecutive route poses of the one or more route poses.
15. The robot of claim 13, wherein the interpolation generates one or more interpolated route poses having footprints substantially similar to the footprint of each route pose.
16. The robot of claim 13, wherein the determination of the forces on each point of the one or more route poses further comprises computing a force function that associates, at least in part, the forces on each point of each route pose with one or more characteristics of objects in the environment.
17. The robot of claim 16, wherein the one or more characteristics include one or more of distance, shape, material, and color.
18. The robot of claim 16, wherein the force function associates a zero repulsive force with a first detected point on a first object where a distance between the first detected point and a second point of the first route pose is above a predetermined distance threshold.
19. The robot of claim 13, wherein the footprint of each route pose has a size and shape substantially similar to the footprint of the robot.
20. The robot of claim 13, wherein the robot comprises a floor cleaner.
CN201780074759.6A 2016-11-02 2017-10-31 System and method for dynamic route planning in autonomous navigation Active CN110023866B (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US15/341,612 US10001780B2 (en) 2016-11-02 2016-11-02 Systems and methods for dynamic route planning in autonomous navigation
US15/341,612 2016-11-02
PCT/US2017/059379 WO2018085294A1 (en) 2016-11-02 2017-10-31 Systems and methods for dynamic route planning in autonomous navigation

Publications (2)

Publication Number Publication Date
CN110023866A true CN110023866A (en) 2019-07-16
CN110023866B CN110023866B (en) 2022-12-06

Family

ID=62021299

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201780074759.6A Active CN110023866B (en) 2016-11-02 2017-10-31 System and method for dynamic route planning in autonomous navigation

Country Status (7)

Country Link
US (3) US10001780B2 (en)
EP (1) EP3535630A4 (en)
JP (1) JP7061337B2 (en)
KR (1) KR102528869B1 (en)
CN (1) CN110023866B (en)
CA (1) CA3042532A1 (en)
WO (1) WO2018085294A1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112833899A (en) * 2020-12-31 2021-05-25 吉林大学 Full-coverage path planning method for unmanned sanitation vehicle
CN114431122A (en) * 2022-01-27 2022-05-06 山东交通学院 Road greening sprinkling intelligent control system and method

Families Citing this family (25)

Publication number Priority date Publication date Assignee Title
US10809071B2 (en) * 2017-10-17 2020-10-20 AI Incorporated Method for constructing a map while performing work
CN105760576A (en) * 2016-01-27 2016-07-13 首都师范大学 Formalized analyzing method and system for mechanical arm motion planning on basis of conformal geometric algebra
KR102526083B1 (en) * 2016-08-30 2023-04-27 엘지전자 주식회사 Mobile terminal and operating method thereof
US10429196B2 (en) * 2017-03-08 2019-10-01 Invensense, Inc. Method and apparatus for cart navigation
US10293485B2 (en) * 2017-03-30 2019-05-21 Brain Corporation Systems and methods for robotic path planning
FR3065853B1 (en) * 2017-04-27 2019-06-07 Peugeot Citroen Automobiles Sa METHOD AND DEVICE FOR CONTROLLING THE TRANSMISSION OF DATA FROM A VEHICLE TO A COMMUNICATION EQUIPMENT
KR102500634B1 (en) * 2018-01-05 2023-02-16 엘지전자 주식회사 Guide robot and operating method thereof
CN108742346A (en) * 2018-06-27 2018-11-06 杨扬 Method for traversing a working environment and establishing a grid map
CN112672856A (en) * 2018-07-16 2021-04-16 云海智行股份有限公司 System and method for optimizing route planning for sharp turns of a robotic device
US11092458B2 (en) * 2018-10-30 2021-08-17 Telenav, Inc. Navigation system with operation obstacle alert mechanism and method of operation thereof
US10809734B2 (en) 2019-03-13 2020-10-20 Mobile Industrial Robots A/S Route planning in an autonomous device
CN110101340A (en) * 2019-05-24 2019-08-09 北京小米移动软件有限公司 Cleaning device, cleaning operation execution method and apparatus, and storage medium
US11565411B2 (en) * 2019-05-29 2023-01-31 Lg Electronics Inc. Intelligent robot cleaner for setting travel route based on video learning and managing method thereof
US11958183B2 (en) 2019-09-19 2024-04-16 The Research Foundation For The State University Of New York Negotiation-based human-robot collaboration via augmented reality
US11592299B2 (en) 2020-03-19 2023-02-28 Mobile Industrial Robots A/S Using static scores to control vehicle operations
CN113741550B (en) * 2020-05-15 2024-02-02 北京机械设备研究所 Mobile robot following method and system
CN112015183B (en) * 2020-09-08 2022-02-08 安徽工程大学 Obstacle avoidance method for mobile robot in terrain with concave-convex features under constraint of energy consumption
US11927972B2 (en) * 2020-11-24 2024-03-12 Lawrence Livermore National Security, Llc Collision avoidance based on traffic management data
CN112595324B (en) * 2020-12-10 2022-03-29 安徽工程大学 Optimal node wheel type mobile robot path planning method under optimal energy consumption
WO2022140969A1 (en) * 2020-12-28 2022-07-07 深圳市优必选科技股份有限公司 Method for dynamically generating footprint set, storage medium, and biped robot
CN112971621A (en) * 2021-03-11 2021-06-18 河北工业大学 Indoor intelligent cleaning system and control method
US11940800B2 (en) * 2021-04-23 2024-03-26 Irobot Corporation Navigational control of autonomous cleaning robots
US20230071338A1 (en) * 2021-09-08 2023-03-09 Sea Machines Robotics, Inc. Navigation by mimic autonomy
CN114355925B (en) * 2021-12-29 2024-03-19 杭州海康机器人股份有限公司 Path planning method, device, equipment and computer readable storage medium
CN114947655A (en) * 2022-05-17 2022-08-30 安克创新科技股份有限公司 Robot control method, device, robot and computer readable storage medium

Citations (19)

Publication number Priority date Publication date Assignee Title
JPS63229503A (en) * 1987-03-19 1988-09-26 Fujitsu Ltd Posture control method for robot
DE19745656A1 (en) * 1997-10-16 1999-04-22 Daimler Chrysler Ag Impact absorber for a motor vehicle
EP1540564A1 (en) * 2002-03-22 2005-06-15 Ibrahim Nahla Vehicle navigation, collision avoidance and control system
US20070021915A1 (en) * 1997-10-22 2007-01-25 Intelligent Technologies International, Inc. Collision Avoidance Methods and Systems
US20080059015A1 (en) * 2006-06-09 2008-03-06 Whittaker William L Software architecture for high-speed traversal of prescribed routes
JP2010061442A (en) * 2008-09-04 2010-03-18 Murata Machinery Ltd Autonomous mobile device
EP2330471A2 (en) * 2009-11-10 2011-06-08 Vorwerk & Co. Interholding GmbH Method for controlling a robot
KR20120030263A (en) * 2010-09-20 2012-03-28 삼성전자주식회사 Robot and control method thereof
US20120109150A1 (en) * 2002-03-06 2012-05-03 Mako Surgical Corp. Haptic guidance system and method
US20130218467A1 (en) * 2010-07-27 2013-08-22 Masahiro Ogawa Driving assistance device
CN104029203A (en) * 2014-06-18 2014-09-10 大连大学 Path planning method for implementation of obstacle avoidance for space manipulators
CN104317291A (en) * 2014-09-16 2015-01-28 哈尔滨恒誉名翔科技有限公司 Artificial-potential-field-based robot collision prevention path planning method
US20150199458A1 (en) * 2014-01-14 2015-07-16 Energid Technologies Corporation Digital proxy simulation of robotic hardware
CN104875882A (en) * 2015-05-21 2015-09-02 合肥学院 Quadrotor
US20160107313A1 (en) * 2014-10-17 2016-04-21 GM Global Technology Operations LLC Dynamic obstacle avoidance in a robotic system
CN105549597A (en) * 2016-02-04 2016-05-04 同济大学 Unmanned vehicle dynamic path programming method based on environment uncertainty
CN105739507A (en) * 2016-04-29 2016-07-06 昆山工研院工业机器人研究所有限公司 Anti-collision optimal path planning method for robot
CN105955273A (en) * 2016-05-25 2016-09-21 速感科技(北京)有限公司 Indoor robot navigation system and method
US20160309973A1 (en) * 2015-04-24 2016-10-27 Avidbots Corp. Apparatus and methods for semi-autonomous cleaning of surfaces

Family Cites Families (174)

Publication number Priority date Publication date Assignee Title
US5280179A (en) 1979-04-30 1994-01-18 Sensor Adaptive Machines Incorporated Method and apparatus utilizing an orientation code for automatically guiding a robot
US4638445A (en) 1984-06-08 1987-01-20 Mattaboni Paul J Autonomous mobile robot
US5121497A (en) 1986-03-10 1992-06-09 International Business Machines Corporation Automatic generation of executable computer code which commands another program to perform a task and operator modification of the generated executable computer code
US4763276A (en) 1986-03-21 1988-08-09 Actel Partnership Methods for refining original robot command signals
US4852018A (en) 1987-01-07 1989-07-25 Trustees Of Boston University Massively parallel real-time network architectures for robots capable of self-calibrating their operating parameters through associative learning
EP0496785B1 (en) 1989-10-17 1997-03-26 The Perkin-Elmer Corporation Robotic interface
US5640323A (en) 1990-02-05 1997-06-17 Caterpillar Inc. System and method for operating an autonomous navigation system
US8352400B2 (en) 1991-12-23 2013-01-08 Hoffberg Steven M Adaptive pattern recognition based controller apparatus and method and human-factored interface therefore
US5673367A (en) 1992-10-01 1997-09-30 Buckley; Theresa M. Method for neural network control of motion using real-time environmental feedback
CA2081519C (en) 1992-10-27 2000-09-05 The University Of Toronto Parametric control device
KR0161031B1 (en) 1993-09-09 1998-12-15 김광호 Position error correction device of robot
US5602761A (en) 1993-12-30 1997-02-11 Caterpillar Inc. Machine performance monitoring and fault classification using an exponentially weighted moving average scheme
EP1418026A1 (en) 1995-09-11 2004-05-12 Kabushiki Kaisha Yaskawa Denki Control apparatus for robot
US6169981B1 (en) 1996-06-04 2001-01-02 Paul J. Werbos 3-brain architecture for an intelligent decision and control system
US6366293B1 (en) 1998-09-29 2002-04-02 Rockwell Software Inc. Method and apparatus for manipulating and displaying graphical objects in a computer display device
US6243622B1 (en) 1998-10-16 2001-06-05 Xerox Corporation Touchable user interface using self movable robotic modules
EP1037134A2 (en) 1999-03-16 2000-09-20 Matsushita Electric Industrial Co., Ltd. Virtual space control data receiving apparatus and method
US6124694A (en) 1999-03-18 2000-09-26 Bancroft; Allen J. Wide area navigation for a robot scrubber
KR20010053322A (en) 1999-04-30 2001-06-25 이데이 노부유끼 Electronic pet system, network system, robot, and storage medium
JP3537362B2 (en) 1999-10-12 2004-06-14 ファナック株式会社 Graphic display device for robot system
AU2001250802A1 (en) 2000-03-07 2001-09-17 Sarnoff Corporation Camera pose estimation
KR20020008848A (en) 2000-03-31 2002-01-31 이데이 노부유끼 Robot device, robot device action control method, external force detecting device and external force detecting method
US8543519B2 (en) 2000-08-07 2013-09-24 Health Discovery Corporation System and method for remote melanoma screening
JP4765155B2 (en) 2000-09-28 2011-09-07 ソニー株式会社 Authoring system, authoring method, and storage medium
US6678413B1 (en) 2000-11-24 2004-01-13 Yiqing Liang System and method for object identification and behavior characterization using video analysis
JP2002197437A (en) 2000-12-27 2002-07-12 Sony Corp Walking detection system, walking detector, device and walking detecting method
US6442451B1 (en) 2000-12-28 2002-08-27 Robotic Workspace Technologies, Inc. Versatile robot control system
JP2002239960A (en) 2001-02-21 2002-08-28 Sony Corp Action control method of robot device, program, recording medium, and robot device
US20020175894A1 (en) 2001-03-06 2002-11-28 Vince Grillo Hand-supported mouse for computer input
US6917925B2 (en) 2001-03-30 2005-07-12 Intelligent Inference Systems Corporation Convergent actor critic-based fuzzy reinforcement learning apparatus and method
JP2002301674A (en) 2001-04-03 2002-10-15 Sony Corp Leg type moving robot, its motion teaching method and storage medium
EP1254688B1 (en) 2001-04-30 2006-03-29 Sony France S.A. autonomous robot
US6584375B2 (en) 2001-05-04 2003-06-24 Intellibot, Llc System for a retail environment
US6636781B1 (en) 2001-05-22 2003-10-21 University Of Southern California Distributed control and coordination of autonomous agents in a dynamic, reconfigurable system
JP3760186B2 (en) 2001-06-07 2006-03-29 独立行政法人科学技術振興機構 Biped walking type moving device, walking control device thereof, and walking control method
JP4188607B2 (en) 2001-06-27 2008-11-26 本田技研工業株式会社 Method for estimating floor reaction force of bipedal mobile body and method for estimating joint moment of bipedal mobile body
WO2003007129A2 (en) 2001-07-13 2003-01-23 Broks Automation, Inc. Trajectory planning and motion control strategies for a planar three-degree-of-freedom robotic arm
US6710346B2 (en) 2001-08-02 2004-03-23 International Business Machines Corporation Active infrared presence sensor
AU2002331786A1 (en) 2001-08-31 2003-03-18 The Board Of Regents Of The University And Community College System, On Behalf Of The University Of Coordinated joint motion control system
US6812846B2 (en) 2001-09-28 2004-11-02 Koninklijke Philips Electronics N.V. Spill detector based on machine-imaging
US7243334B1 (en) 2002-01-16 2007-07-10 Prelude Systems, Inc. System and method for generating user interface code
JP3790816B2 (en) 2002-02-12 2006-06-28 国立大学法人 東京大学 Motion generation method for humanoid link system
WO2004003680A2 (en) 2002-04-22 2004-01-08 Neal Solomon System, method and apparatus for automated collective mobile robotic vehicles used in remote sensing surveillance
US7505604B2 (en) 2002-05-20 2009-03-17 Simmonds Precision Products, Inc. Method for detection and recognition of fog presence within an aircraft compartment using video images
AU2003262893A1 (en) 2002-08-21 2004-03-11 Neal Solomon Organizing groups of self-configurable mobile robotic agents
AU2003900861A0 (en) 2003-02-26 2003-03-13 Silverbrook Research Pty Ltd Methods,systems and apparatus (NPS042)
JP3950805B2 (en) 2003-02-27 2007-08-01 ファナック株式会社 Teaching position correction device
US7313279B2 (en) 2003-07-08 2007-12-25 Computer Associates Think, Inc. Hierarchical determination of feature relevancy
SE0301531L (en) 2003-05-22 2004-11-23 Abb Ab A Control method for a robot
US7212651B2 (en) 2003-06-17 2007-05-01 Mitsubishi Electric Research Laboratories, Inc. Detecting pedestrians using patterns of motion and appearance in videos
US7769487B2 (en) 2003-07-24 2010-08-03 Northeastern University Process and architecture of robotic system to mimic animal behavior in the natural environment
KR100520049B1 (en) * 2003-09-05 2005-10-10 학교법인 인하학원 Path planning method for the autonomous mobile robot
WO2005028166A1 (en) 2003-09-22 2005-03-31 Matsushita Electric Industrial Co., Ltd. Device and method for controlling elastic-body actuator
US7342589B2 (en) 2003-09-25 2008-03-11 Rockwell Automation Technologies, Inc. System and method for managing graphical data
JP4592276B2 (en) 2003-10-24 2010-12-01 ソニー株式会社 Motion editing apparatus, motion editing method, and computer program for robot apparatus
WO2005081082A1 (en) 2004-02-25 2005-09-01 The Ritsumeikan Trust Control system of floating mobile body
JP4661074B2 (en) 2004-04-07 2011-03-30 ソニー株式会社 Information processing system, information processing method, and robot apparatus
EP1622072B1 (en) 2004-07-27 2010-07-07 Sony France S.A. An automated action-selection system and method and application thereof for training prediction machines and for driving the development of self-developing devices
SE0402672D0 (en) 2004-11-02 2004-11-02 Viktor Kaznov Ball robot
US7211979B2 (en) 2005-04-13 2007-05-01 The Board Of Trustees Of The Leland Stanford Junior University Torque-position transformer for task control of position controlled robots
US7765029B2 (en) 2005-09-13 2010-07-27 Neurosciences Research Foundation, Inc. Hybrid control device
JP4876511B2 (en) 2005-09-29 2012-02-15 株式会社日立製作所 Logic extraction support device
JP5188977B2 (en) 2005-09-30 2013-04-24 アイロボット コーポレイション Companion robot for personal interaction
US7668605B2 (en) 2005-10-26 2010-02-23 Rockwell Automation Technologies, Inc. Wireless industrial control user interface
US7441298B2 (en) 2005-12-02 2008-10-28 Irobot Corporation Coverage robot mobility
US7741802B2 (en) 2005-12-20 2010-06-22 Intuitive Surgical Operations, Inc. Medical robotic system with programmably controlled constraints on error dynamics
US8224018B2 (en) 2006-01-23 2012-07-17 Digimarc Corporation Sensing data from physical objects
US7576639B2 (en) 2006-03-14 2009-08-18 Mobileye Technologies, Ltd. Systems and methods for detecting pedestrians in the vicinity of a powered industrial vehicle
US8924021B2 (en) 2006-04-27 2014-12-30 Honda Motor Co., Ltd. Control of robots from human motion descriptors
WO2007138987A1 (en) 2006-05-25 2007-12-06 Takehiro Ishizaki Work robot
KR100791382B1 (en) 2006-06-01 2008-01-07 삼성전자주식회사 Method for classifying and collecting of area features as robot's moving path and robot controlled as the area features, apparatus and method for composing user interface using area features
US8843244B2 (en) * 2006-10-06 2014-09-23 Irobot Corporation Autonomous behaviors for a remote vehicle
JP4699426B2 (en) 2006-08-08 2011-06-08 パナソニック株式会社 Obstacle avoidance method and obstacle avoidance moving device
US8174568B2 (en) * 2006-12-01 2012-05-08 Sri International Unified framework for precise vision-aided navigation
JP4267027B2 (en) 2006-12-07 2009-05-27 ファナック株式会社 Robot controller
EP2140316B1 (en) 2007-03-29 2011-12-28 iRobot Corporation Robot operator control unit configuration system and method
US8255092B2 (en) 2007-05-14 2012-08-28 Irobot Corporation Autonomous behaviors for a remote vehicle
JP5213023B2 (en) 2008-01-15 2013-06-19 本田技研工業株式会社 robot
JP4445038B2 (en) 2008-02-06 2010-04-07 パナソニック株式会社 ROBOT, ROBOT CONTROL DEVICE AND CONTROL METHOD, AND ROBOT CONTROL DEVICE CONTROL PROGRAM
JP5181704B2 (en) 2008-02-07 2013-04-10 日本電気株式会社 Data processing apparatus, posture estimation system, posture estimation method and program
US8175992B2 (en) 2008-03-17 2012-05-08 Intelliscience Corporation Methods and systems for compound feature creation, processing, and identification in conjunction with a data analysis and feature recognition system wherein hit weights are summed
CA2719494C (en) 2008-04-02 2015-12-01 Irobot Corporation Robotics systems
JP4715863B2 (en) 2008-05-01 2011-07-06 ソニー株式会社 Actuator control apparatus, actuator control method, actuator, robot apparatus, and computer program
EP2349120B1 (en) 2008-09-04 2017-03-22 Iwalk, Inc. Hybrid terrain-adaptive lower-extremity systems
US20110282169A1 (en) 2008-10-29 2011-11-17 The Regents Of The University Of Colorado, A Body Corporate Long Term Active Learning from Large Continually Changing Data Sets
US20100114372A1 (en) 2008-10-30 2010-05-06 Intellibot Robotics Llc Method of cleaning a surface using an automatic cleaning device
JP5242342B2 (en) 2008-10-31 2013-07-24 株式会社東芝 Robot controller
US8428781B2 (en) 2008-11-17 2013-04-23 Energid Technologies, Inc. Systems and methods of coordination control for robot manipulation
US8423182B2 (en) 2009-03-09 2013-04-16 Intuitive Surgical Operations, Inc. Adaptable integrated energy control system for electrosurgical tools in robotic surgical systems
US8120301B2 (en) 2009-03-09 2012-02-21 Intuitive Surgical Operations, Inc. Ergonomic surgeon control console in robotic surgical systems
US8364314B2 (en) 2009-04-30 2013-01-29 GM Global Technology Operations LLC Method and apparatus for automatic control of a humanoid robot
US8694449B2 (en) 2009-05-29 2014-04-08 Board Of Trustees Of Michigan State University Neuromorphic spatiotemporal where-what machines
JP4676544B2 (en) 2009-05-29 2011-04-27 ファナック株式会社 Robot control device for controlling a robot for supplying and taking out workpieces from a machine tool
US8706297B2 (en) 2009-06-18 2014-04-22 Michael Todd Letsky Method for establishing a desired area of confinement for an autonomous robot and autonomous robot implementing a control system for executing the same
CN102448683B (en) 2009-07-02 2014-08-27 松下电器产业株式会社 Robot, control device for robot arm, and control program for robot arm
EP2284769B1 (en) 2009-07-16 2013-01-02 European Space Agency Method and apparatus for analyzing time series data
US20110026770A1 (en) 2009-07-31 2011-02-03 Jonathan David Brookshire Person Following Using Histograms of Oriented Gradients
US8250901B2 (en) 2009-09-22 2012-08-28 GM Global Technology Operations LLC System and method for calibrating a rotary absolute position sensor
TW201113815A (en) 2009-10-09 2011-04-16 Primax Electronics Ltd QR code processing method and apparatus thereof
US8423225B2 (en) 2009-11-11 2013-04-16 Intellibot Robotics Llc Methods and systems for movement of robotic device using video signal
US8679260B2 (en) 2009-11-11 2014-03-25 Intellibot Robotics Llc Methods and systems for movement of an automatic cleaning device using video signal
JP5446765B2 (en) 2009-11-17 2014-03-19 トヨタ自動車株式会社 Route search system, route search method, route search program, and moving body
US8521328B2 (en) 2009-12-10 2013-08-27 The Boeing Company Control system for robotic vehicles
TW201123031A (en) 2009-12-24 2011-07-01 Univ Nat Taiwan Science Tech Robot and method for recognizing human faces and gestures thereof
JP5506618B2 (en) 2009-12-28 2014-05-28 本田技研工業株式会社 Robot control device
JP5506617B2 (en) 2009-12-28 2014-05-28 本田技研工業株式会社 Robot control device
EP2533678B1 (en) 2010-02-11 2020-03-25 Intuitive Surgical Operations, Inc. System for automatically maintaining an operator selected roll orientation at a distal tip of a robotic endoscope
KR101169674B1 (en) 2010-03-11 2012-08-06 한국과학기술연구원 Telepresence robot, telepresence system comprising the same and method for controlling the same
US8660355B2 (en) 2010-03-19 2014-02-25 Digimarc Corporation Methods and systems for determining image processing operations relevant to particular imagery
US9122994B2 (en) 2010-03-26 2015-09-01 Brain Corporation Apparatus and methods for temporally proximate object recognition
US9405975B2 (en) 2010-03-26 2016-08-02 Brain Corporation Apparatus and methods for pulse-code invariant object recognition
US9311593B2 (en) 2010-03-26 2016-04-12 Brain Corporation Apparatus and methods for polychronous encoding and multiplexing in neuronal prosthetic devices
US8336420B2 (en) 2010-06-02 2012-12-25 Disney Enterprises, Inc. Three-axis robotic joint using four-bar linkages to drive differential side gears
FR2963132A1 (en) 2010-07-23 2012-01-27 Aldebaran Robotics HUMANOID ROBOT HAVING A NATURAL DIALOGUE INTERFACE, METHOD OF USING AND PROGRAMMING THE SAME
US20120045068A1 (en) 2010-08-20 2012-02-23 Korea Institute Of Science And Technology Self-fault detection system and method for microphone array and audio-based device
US8594971B2 (en) 2010-09-22 2013-11-26 Invensense, Inc. Deduced reckoning navigation without a constraint relationship between orientation of a sensor platform and a direction of travel of an object
KR101233714B1 (en) * 2010-09-30 2013-02-18 아주대학교산학협력단 Autonomous mobile robot avoiding obstacle trap and controlling method for the same
KR20120035519A (en) 2010-10-05 2012-04-16 삼성전자주식회사 Debris inflow detecting unit and robot cleaning device having the same
US20120143495A1 (en) 2010-10-14 2012-06-07 The University Of North Texas Methods and systems for indoor navigation
US9015093B1 (en) 2010-10-26 2015-04-21 Michael Lamport Commons Intelligent control with hierarchical stacked neural networks
US8726095B2 (en) 2010-12-02 2014-05-13 Dell Products L.P. System and method for proactive management of an information handling system with in-situ measurement of end user actions
JP5185358B2 (en) 2010-12-13 2013-04-17 株式会社東芝 Action history search device
CN103038030B (en) 2010-12-17 2015-06-03 松下电器产业株式会社 Apparatus and method for controlling elastic actuator drive mechanism
US8639644B1 (en) 2011-05-06 2014-01-28 Google Inc. Shared robot knowledge base for use with cloud computing system
US8380652B1 (en) 2011-05-06 2013-02-19 Google Inc. Methods and systems for autonomous robotic decision making
US9566710B2 (en) 2011-06-02 2017-02-14 Brain Corporation Apparatus and methods for operating robotic devices using selective state space training
US9189891B2 (en) 2011-08-16 2015-11-17 Google Inc. Systems and methods for navigating a camera
US9015092B2 (en) 2012-06-04 2015-04-21 Brain Corporation Dynamically reconfigurable stochastic learning apparatus and methods
US20130096719A1 (en) 2011-10-13 2013-04-18 The U.S.A. As Represented By The Administrator Of The National Aeronautics And Space Administration Method for dynamic optimization of a robot control interface
JP6305673B2 (en) 2011-11-07 2018-04-04 セイコーエプソン株式会社 Robot control system, robot system and robot
JP5399593B2 (en) 2011-11-10 2014-01-29 パナソニック株式会社 ROBOT, ROBOT CONTROL DEVICE, CONTROL METHOD, AND CONTROL PROGRAM
EP2776216B1 (en) * 2011-11-11 2022-08-31 iRobot Corporation Robot apparatus and control method for resuming operation following a pause.
KR101133037B1 (en) * 2011-12-01 2012-04-04 국방과학연구소 Path updating method for collision avoidance of autonomous vehicle and the apparatus
KR101305819B1 (en) 2012-01-04 2013-09-06 현대자동차주식회사 Manipulating intention torque extracting method of wearable robot
US8958911B2 (en) 2012-02-29 2015-02-17 Irobot Corporation Mobile robot
JP5895628B2 (en) 2012-03-15 2016-03-30 株式会社ジェイテクト ROBOT CONTROL METHOD, ROBOT CONTROL DEVICE, AND ROBOT CONTROL SYSTEM
US9221177B2 (en) 2012-04-18 2015-12-29 Massachusetts Institute Of Technology Neuromuscular model-based sensing and control paradigm for a robotic leg
US9208432B2 (en) 2012-06-01 2015-12-08 Brain Corporation Neural network learning and collaboration apparatus and methods
US20130343640A1 (en) 2012-06-21 2013-12-26 Rethink Robotics, Inc. Vision-guided robots and methods of training them
US20130346347A1 (en) 2012-06-22 2013-12-26 Google Inc. Method to Predict a Communicative Action that is Most Likely to be Executed Given a Context
JP5645885B2 (en) 2012-06-29 2014-12-24 京セラドキュメントソリューションズ株式会社 Image forming apparatus
AU2012384518B2 (en) 2012-07-04 2019-01-24 Indra Sistemas, S.A. Infrared image based early detection of oil spills in water
US8977582B2 (en) 2012-07-12 2015-03-10 Brain Corporation Spiking neuron network sensory processing apparatus and methods
US9367798B2 (en) 2012-09-20 2016-06-14 Brain Corporation Spiking neuron network adaptive control apparatus and methods
US8793205B1 (en) 2012-09-20 2014-07-29 Brain Corporation Robotic learning and evolution apparatus
US8972061B2 (en) 2012-11-02 2015-03-03 Irobot Corporation Autonomous coverage robot
US20140187519A1 (en) 2012-12-27 2014-07-03 The Board Of Trustees Of The Leland Stanford Junior University Biomarkers for predicting major adverse events
EP2752726B1 (en) 2013-01-08 2015-05-27 Cleanfix Reinigungssysteme AG Floor treatment machine and method for treating floor surfaces
WO2014113091A1 (en) 2013-01-18 2014-07-24 Irobot Corporation Environmental management systems including mobile robots and methods using same
US8958937B2 (en) 2013-03-12 2015-02-17 Intellibot Robotics Llc Cleaning machine with collision prevention
US9764468B2 (en) 2013-03-15 2017-09-19 Brain Corporation Adaptive predictor apparatus and methods
US10561470B2 (en) 2013-03-15 2020-02-18 Intuitive Surgical Operations, Inc. Software configurable manipulator degrees of freedom
US9008840B1 (en) 2013-04-19 2015-04-14 Brain Corporation Apparatus and methods for reinforcement-guided supervised learning
US9292015B2 (en) 2013-05-23 2016-03-22 Fluor Technologies Corporation Universal construction robotics interface
US20140358828A1 (en) 2013-05-29 2014-12-04 Purepredictive, Inc. Machine learning generated action plan
US9242372B2 (en) 2013-05-31 2016-01-26 Brain Corporation Adaptive robotic interface apparatus and methods
WO2014196925A1 (en) 2013-06-03 2014-12-11 Ctrlworks Pte. Ltd. Method and apparatus for offboard navigation of a robotic device
US9792546B2 (en) 2013-06-14 2017-10-17 Brain Corporation Hierarchical robotic controller apparatus and methods
US9384443B2 (en) 2013-06-14 2016-07-05 Brain Corporation Robotic training apparatus and methods
US20150032258A1 (en) 2013-07-29 2015-01-29 Brain Corporation Apparatus and methods for controlling of robotic devices
SG2013071808A (en) 2013-09-24 2015-04-29 Ctrlworks Pte Ltd Offboard navigation apparatus capable of being coupled to a movable platform
US9296101B2 (en) 2013-09-27 2016-03-29 Brain Corporation Robotic control arbitration apparatus and methods
US9579789B2 (en) 2013-09-27 2017-02-28 Brain Corporation Apparatus and methods for training of robotic control arbitration
US9144907B2 (en) 2013-10-24 2015-09-29 Harris Corporation Control synchronization for high-latency teleoperation
US10612939B2 (en) 2014-01-02 2020-04-07 Microsoft Technology Licensing, Llc Ground truth estimation for autonomous navigation
US20150283703A1 (en) 2014-04-03 2015-10-08 Brain Corporation Apparatus and methods for remotely controlling robotic devices
US9346167B2 (en) 2014-04-29 2016-05-24 Brain Corporation Trainable convolutional network apparatus and methods for operating a robotic vehicle
US10255319B2 (en) 2014-05-02 2019-04-09 Google Llc Searchable index
US20150339589A1 (en) 2014-05-21 2015-11-26 Brain Corporation Apparatus and methods for training robots utilizing gaze-based saliency maps
GB2528953A (en) 2014-08-07 2016-02-10 Nokia Technologies Oy An apparatus, method, computer program and user device for enabling control of a vehicle
US9475195B2 (en) 2014-09-12 2016-10-25 Toyota Jidosha Kabushiki Kaisha Anticipatory robot navigation
US9628477B2 (en) 2014-12-23 2017-04-18 Intel Corporation User profile selection using contextual authentication
US20170329347A1 (en) * 2016-05-11 2017-11-16 Brain Corporation Systems and methods for training a robot to autonomously travel a route
US10241514B2 (en) * 2016-05-11 2019-03-26 Brain Corporation Systems and methods for initializing a robot to autonomously travel a trained route

Patent Citations (19)

Publication number Priority date Publication date Assignee Title
JPS63229503A (en) * 1987-03-19 1988-09-26 Fujitsu Ltd Posture control method for robot
DE19745656A1 (en) * 1997-10-16 1999-04-22 Daimler Chrysler Ag Impact absorber for a motor vehicle
US20070021915A1 (en) * 1997-10-22 2007-01-25 Intelligent Technologies International, Inc. Collision Avoidance Methods and Systems
US20120109150A1 (en) * 2002-03-06 2012-05-03 Mako Surgical Corp. Haptic guidance system and method
EP1540564A1 (en) * 2002-03-22 2005-06-15 Ibrahim Nahla Vehicle navigation, collision avoidance and control system
US20080059015A1 (en) * 2006-06-09 2008-03-06 Whittaker William L Software architecture for high-speed traversal of prescribed routes
JP2010061442A (en) * 2008-09-04 2010-03-18 Murata Machinery Ltd Autonomous mobile device
EP2330471A2 (en) * 2009-11-10 2011-06-08 Vorwerk & Co. Interholding GmbH Method for controlling a robot
US20130218467A1 (en) * 2010-07-27 2013-08-22 Masahiro Ogawa Driving assistance device
KR20120030263A (en) * 2010-09-20 2012-03-28 삼성전자주식회사 Robot and control method thereof
US20150199458A1 (en) * 2014-01-14 2015-07-16 Energid Technologies Corporation Digital proxy simulation of robotic hardware
CN104029203A (en) * 2014-06-18 2014-09-10 大连大学 Path planning method for implementation of obstacle avoidance for space manipulators
CN104317291A (en) * 2014-09-16 2015-01-28 哈尔滨恒誉名翔科技有限公司 Artificial-potential-field-based robot collision prevention path planning method
US20160107313A1 (en) * 2014-10-17 2016-04-21 GM Global Technology Operations LLC Dynamic obstacle avoidance in a robotic system
US20160309973A1 (en) * 2015-04-24 2016-10-27 Avidbots Corp. Apparatus and methods for semi-autonomous cleaning of surfaces
CN104875882A (en) * 2015-05-21 2015-09-02 合肥学院 Quadrotor
CN105549597A (en) * 2016-02-04 2016-05-04 同济大学 Unmanned vehicle dynamic path programming method based on environment uncertainty
CN105739507A (en) * 2016-04-29 2016-07-06 昆山工研院工业机器人研究所有限公司 Anti-collision optimal path planning method for robot
CN105955273A (en) * 2016-05-25 2016-09-21 速感科技(北京)有限公司 Indoor robot navigation system and method

Non-Patent Citations (2)

Title
XING YANG et al.: "A new method for robot path planning based artificial potential field", ResearchGate *
HU, Yuanhang: "Research on Obstacle Avoidance for Autonomous Mobile Robots in Unknown Environments", China Master's Theses Full-text Database, Information Science and Technology *

Cited By (3)

Publication number Priority date Publication date Assignee Title
CN112833899A (en) * 2020-12-31 2021-05-25 吉林大学 Full-coverage path planning method for unmanned sanitation vehicle
CN112833899B (en) * 2020-12-31 2022-02-15 吉林大学 Full-coverage path planning method for unmanned sanitation vehicle
CN114431122A (en) * 2022-01-27 2022-05-06 山东交通学院 Road greening sprinkling intelligent control system and method

Also Published As

Publication number Publication date
EP3535630A1 (en) 2019-09-11
US20180364724A1 (en) 2018-12-20
KR102528869B1 (en) 2023-05-04
US10379539B2 (en) 2019-08-13
US20200004253A1 (en) 2020-01-02
JP2020502630A (en) 2020-01-23
US10001780B2 (en) 2018-06-19
US20180120856A1 (en) 2018-05-03
EP3535630A4 (en) 2020-07-29
WO2018085294A1 (en) 2018-05-11
JP7061337B2 (en) 2022-04-28
KR20190077050A (en) 2019-07-02
CN110023866B (en) 2022-12-06
CA3042532A1 (en) 2018-05-11

Similar Documents

Publication Publication Date Title
CN110023866A (en) Systems and methods for dynamic route planning in autonomous navigation
EP3880413B1 (en) Method and system for trajectory optimization for vehicles with geometric constraints
US11701778B2 (en) Systems and methods for robotic path planning
JP6949107B2 (en) Systems and methods for training robots to drive autonomously on the route
US9789612B2 (en) Remotely operating a mobile robot
US11858148B2 (en) Robot and method for controlling the same
KR20210066791A (en) Systems and Methods for Optimizing Path Planning for Tight Turns in Robotic Devices
US20210031367A1 (en) Systems, apparatuses, and methods for rapid machine learning for floor segmentation for robotic devices
TW202102959A (en) Systems, and methods for merging disjointed map and route data with respect to a single origin for autonomous robots
KR20210110610A (en) Systems, apparatus, and methods for detecting escalators
US11618164B2 (en) Robot and method of controlling same
TW202032366A (en) Systems and methods for improved control of nonholonomic robotic systems
JP2022500763A (en) Systems and methods for detecting blind spots for robots

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
REG Reference to a national code

Ref country code: HK

Ref legal event code: DE

Ref document number: 40012306

Country of ref document: HK

GR01 Patent grant
GR01 Patent grant