WO2020015548A1 - Robot control method, robot, and storage medium (机器人控制方法、机器人及存储介质)
Robot control method, robot, and storage medium
- Publication number: WO2020015548A1 (PCT/CN2019/095146)
- Authority: WIPO (PCT)
- Prior art keywords: robot, hijacking, area, cleaning, task
- Prior art date
Classifications
- A47L11/4011—Regulation of the cleaning machine by electric means; Control systems and remote control systems therefor
- B25J9/1664—Programme controls characterised by programming, planning systems for manipulators, characterised by motion, path, trajectory planning
- A47L11/24—Floor-sweeping machines, motor-driven
- A47L11/4002—Installations of electric equipment
- A47L11/4061—Steering means; Means for avoiding obstacles; Details related to the place where the driver is accommodated
- A47L9/2805—Parameters or conditions being sensed
- A47L9/2847—Controlling suction cleaners by electric means, characterised by the parts which are controlled: surface treating elements
- A47L9/2852—Controlling suction cleaners by electric means, characterised by the parts which are controlled: elements for displacement of the vacuum cleaner or the accessories therefor, e.g. wheels, casters or nozzles
- G05D1/0246—Control of position or course in two dimensions specially adapted to land vehicles, using optical position detecting means, using a video camera in combination with image processing means
- G05D1/0274—Control of position or course in two dimensions specially adapted to land vehicles, using internal positioning means, using mapping information stored in a memory device
- A47L2201/04—Robotic cleaning machines: automatic control of the travelling movement; automatic obstacle detection
- A47L2201/06—Robotic cleaning machines: control of the cleaning action for autonomous devices; automatic detection of the surface condition before, during or after cleaning
Definitions
- The present application relates to the field of artificial intelligence, and in particular to a robot control method, a robot, and a storage medium.
- Robots are gradually entering people's daily life and bringing great convenience. For example, a robot with a floor-cleaning function can clean a room automatically, saving considerable labor and material costs.
- Autonomous positioning and navigation of a robot can be realized by means of simultaneous localization and mapping (SLAM) technology.
- However, the robot may sometimes be hijacked, for example moved, suspended, or dragged over a large distance.
- When this happens, an uncontrollable drift error occurs in the positioning, and the robot needs to perform relocation.
- Various aspects of the present application provide a robot control method, a robot, and a storage medium, so that the robot can perform tasks adapted to local conditions and meet user needs.
- An embodiment of the present application provides a robot control method, including:
- the robot determines, based on a relocation operation, the position where it escaped hijacking;
- the robot determines a task execution area according to the environmental information around that position; and
- the robot performs a task within the task execution area.
- An embodiment of the present application further provides a robot, including: a mechanical body, wherein the mechanical body is provided with one or more sensors, one or more processors, and one or more memories storing computer instructions;
- the one or more processors are configured to execute the computer instructions to: determine, based on a relocation operation, the position where the robot escaped hijacking; determine a task execution area according to the environmental information around that position; and control the robot to perform a task within the task execution area.
- An embodiment of the present application further provides a computer-readable storage medium storing computer instructions that, when executed by one or more processors, cause the one or more processors to perform the following actions:
- determining, based on the relocation operation, the position where the robot escaped hijacking; determining the task execution area based on the environmental information around that position; and then executing the task in the task execution area.
- In this way, the robot can flexibly determine the task execution area according to the environment where it escaped hijacking, without having to return to the position where it was hijacked to continue the task, thereby adapting to local conditions and meeting user needs as far as possible.
- FIG. 1 is a schematic flowchart of a robot control method according to an exemplary embodiment of the present application
- FIG. 2 is a schematic flowchart of a method for determining a position where a robot continues to perform a task according to an exemplary embodiment of the present application
- FIGS. 3a-3g are schematic diagrams of areas to be cleaned provided by an exemplary embodiment of the present application.
- FIG. 4a is a schematic diagram of an arc-shaped sweeping route provided by an exemplary embodiment of the present application.
- FIG. 4b is a schematic diagram of a bow-shaped sweeping route provided by an exemplary embodiment of the present application.
- FIG. 5a is a block diagram of a hardware structure of a robot according to an exemplary embodiment of the present application.
- FIG. 5b is a line drawing of a humanoid robot according to an exemplary embodiment of the present application.
- FIG. 5c is a line drawing of a non-humanoid robot according to an exemplary embodiment of the present application.
- The embodiment of this application provides a solution. The basic idea is that the robot determines, based on the relocation operation, the position where it escaped hijacking; determines the task execution area based on the environmental information around that position; and then executes the task in the task execution area. In this way, the robot can flexibly determine the task execution area according to the environment in which it escaped hijacking, thereby adapting to local conditions and meeting user needs as far as possible.
- FIG. 1 is a schematic flowchart of a robot control method according to an exemplary embodiment of the present application. As shown in FIG. 1, the method includes:
- Step 101: the robot determines the position where it escaped hijacking.
- Step 102: the robot determines a task execution area according to the environmental information around that position.
- Step 103: the robot executes the task in the task execution area. (An illustrative control-flow sketch follows this list.)
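- The flow of steps 101 to 103 can be illustrated with the following minimal Python sketch. It is not part of the patent; all class and method names (RobotController, relocate, determine_task_area, and so on) are hypothetical placeholders for the components the text describes.

```python
# Hypothetical sketch of the FIG. 1 control flow (steps 101-103).
# All interfaces here are assumptions, not the patent's implementation.

class RobotController:
    def __init__(self, localizer, perception, planner, executor):
        self.localizer = localizer    # relocation against the stored environment map
        self.perception = perception  # vision / laser sensors
        self.planner = planner        # task-execution-area determination
        self.executor = executor      # motion control and task execution

    def on_hijack_released(self):
        # Step 101: relocate and determine the position where the robot
        # escaped hijacking.
        escape_pose = self.localizer.relocate(self.perception.sense())

        # Step 102: determine the task execution area from the environmental
        # information around that position.
        env_info = self.perception.sense_around(escape_pose.position)
        area = self.planner.determine_task_area(escape_pose.position, env_info)

        # Step 103: execute the task inside the determined area.
        self.executor.run_task(area)
```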
- the method provided by this embodiment can be applied to a robot that can move autonomously, and mainly controls subsequent actions of the robot after repositioning.
- This embodiment is not limited to the shape of the robot, and may be, for example, a circle, an oval, a triangle, a convex polygon, a humanoid, or the like.
- The robot may implement the logic of the robot control method provided by this embodiment by installing software or an app, or by writing program code into a corresponding device.
- The robot can move autonomously and can complete certain job tasks on the basis of autonomous movement.
- For example, in shopping scenarios such as supermarkets and malls, a shopping cart robot needs to follow a customer to hold the products the customer purchases.
- As another example, in a warehouse sorting scenario, a sorting robot needs to follow the sorter to the rack picking area and then start sorting the ordered goods.
- As another example, in a home cleaning scenario, a cleaning robot needs to clean the living room, bedroom, kitchen, and other areas.
- In these application scenarios, the robot completes the corresponding tasks during autonomous movement.
- However, in practice the robot may run into difficulty while completing a task, for example being trapped, repeatedly detouring, or becoming entangled. In this case, the user generally moves or drags the robot to another position so that it can continue the task from there.
- the robot needs to be charged when the power is insufficient.
- the robot can autonomously move to the position of the charging pile for charging when its power is insufficient. Or the robot warns the user when its power is low, so that the user moves the robot to the position of the charging pile to charge it.
- When the robot is fully charged, the user may move or drag it to a location other than the charging pile, so that the robot can continue to perform tasks from that location.
- the robot may be hijacked, such as the above being moved, hung or dragged to another position.
- the robot will trigger a relocation operation due to lack or loss of previous position information, that is, to re-determine the pose of the robot, where the pose includes the position and orientation of the robot.
- Based on this, the robot can determine the position where it escaped hijacking based on the relocation operation, obtain the environmental information around that position, determine the task execution area from that information, and execute the task within the task execution area. In this way, the robot can flexibly determine the task execution area according to the environment where it escaped hijacking, without having to return to the position where it was hijacked, thereby adapting to local conditions and meeting user needs as far as possible.
- In the embodiments of the present application, the behaviors of the robot being moved, suspended, or dragged over a large range are collectively defined as the robot being hijacked.
- Correspondingly, the robot returning to the ground after being moved or suspended, or the dragging stopping after the robot has been dragged, is uniformly defined as the robot getting out of hijacking.
- A contact sensor, a reflective optocoupler, an inertial sensor, and the like (but not limited thereto) may be provided on the robot to detect whether the above hijacking occurs.
- For example, a contact sensor may be provided at the bottom of the robot's base or on its rollers to detect whether the robot has been moved or suspended, and also to determine whether the robot is back on the ground.
- For another example, a reflective optocoupler may be installed at the bottom of the robot; it emits a light beam that is reflected from the ground, and through this beam the robot being moved or suspended and then returned to the ground can be detected.
- For another example, an inertial sensor, such as an acceleration sensor, may be installed on the robot to detect whether the robot is being dragged over a large distance.
- When the inertial sensor detects that the robot is being dragged, this indicates that the robot is hijacked; when it detects that the dragging has stopped, this indicates that the robot is out of hijacking, which triggers the robot to start the relocation function and perform the relocation operation.
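- As a rough illustration of how these sensor signals could be combined into hijack / out-of-hijack events, consider the following sketch. The sensor interface and the drag threshold are assumptions for illustration; the patent does not specify them.

```python
# Hypothetical hijack-detection sketch combining a contact sensor, a
# reflective optocoupler, and an inertial (acceleration) sensor.

from typing import Optional

DRAG_ACCEL_THRESHOLD = 1.5  # m/s^2; assumed threshold for "being dragged"

class HijackDetector:
    def __init__(self):
        self.hijacked = False

    def update(self, wheels_on_ground: bool, ground_beam_reflected: bool,
               horizontal_accel: float) -> Optional[str]:
        lifted = not wheels_on_ground or not ground_beam_reflected
        dragged = horizontal_accel > DRAG_ACCEL_THRESHOLD

        if not self.hijacked and (lifted or dragged):
            self.hijacked = True
            return "hijacked"          # moved, suspended, or dragged
        if self.hijacked and not lifted and not dragged:
            self.hijacked = False
            # Back on the ground / dragging stopped: trigger relocation.
            return "out_of_hijacking"
        return None
```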
- In some cases, the robot can perform the relocation operation at the position where it escaped hijacking and accurately locate its pose in the stored environment map at that position.
- the pose here includes the position and orientation of the robot in the stored environment map.
- In these cases, an optional implementation of step 101 is: when the robot recognizes that it is out of hijacking, it collects the environmental information around its current position; locates its pose in the stored environment map according to that environmental information; and takes the position in that pose, i.e. its current position in the stored environment map, as the position where it escaped hijacking.
- In other cases, the robot cannot accurately locate its pose in the stored environment map at the position where it escaped hijacking. The robot can then move from that position to another position, continuously performing the relocation operation on newly collected environmental information during the movement, until its pose in the stored environment map is accurately located.
- For convenience of description, the position the robot moves to is defined as a second position; the second position is any position in the stored environment map different from the position where the robot escaped hijacking.
- In these cases, another optional implementation of step 101 is: when the robot recognizes that it is out of hijacking, it moves from its current position to the second position and locates its pose in the stored environment map during the movement; then, based on the position in that pose and the data obtained during the movement, it determines the position where it started to move as the position where it escaped hijacking.
- For example, a positioning period may be preset, and a timer or counter started to time it.
- During the movement, a relocation operation can be performed once each positioning period elapses, and relocation continues until the robot's pose in the stored environment map is located.
- From the total number of positioning periods that elapsed while the robot moved from the position where it escaped hijacking to the position located in the stored environment map, the elapsed time can be determined.
- In addition, a displacement sensor and an acceleration sensor may be provided on the robot, and the displacement and acceleration of the robot collected while it moves from the position where it escaped hijacking to the position located in the stored environment map.
- From these data, the distance traveled by the robot from the position where it escaped hijacking to the position located in the stored environment map can be determined; then, based on the traveled distance, the known navigation path, and the relocated position in the stored environment map, the position where the robot started to move, i.e. the position where it escaped hijacking, is determined.
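- The back-calculation described above can be sketched as follows: walk backwards along the known navigation path from the relocated position by the traveled distance. This is an editor's illustration under assumed data structures, not the patent's algorithm.

```python
import math

def backtrack_start_position(relocated_xy, path_points, traveled_distance):
    """Walk backwards along the navigation path (waypoints ordered oldest
    first) from the relocated position by traveled_distance; the point
    reached approximates the position where the robot escaped hijacking."""
    remaining = traveled_distance
    current = relocated_xy
    for prev in reversed(path_points):
        seg = math.dist(current, prev)
        if seg >= remaining:
            # Interpolate the start position inside this segment.
            t = remaining / seg if seg > 0 else 0.0
            return (current[0] + t * (prev[0] - current[0]),
                    current[1] + t * (prev[1] - current[1]))
        remaining -= seg
        current = prev
    return current  # path shorter than the distance: fall back to the oldest point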
- the environment map may include at least one of a visual map and a grid map.
- the visual map is constructed in advance based on the environmental images collected by the visual sensor.
- The visual map can describe, to a certain extent, the environment in which the robot is located. It mainly stores information about a number of environmental images of that environment, such as the robot pose corresponding to each environmental image, the feature points contained in the image, and the descriptors of those feature points.
- the grid map is constructed in advance based on the environmental data collected by the laser sensor.
- The grid map is a digital, rasterized description of the environment in which the robot is located. Each grid in the grid map corresponds to a small area in the environment and contains two kinds of basic information: its coordinates and whether it is occupied by an obstacle; the probability that a grid is occupied represents the environmental information of the corresponding area. The greater the number of grids, the more finely the grid map describes the environment, and correspondingly the higher the positioning accuracy based on the grid map.
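- A grid map of this kind can be represented with a very small data structure, for example as below. The layout is an assumption chosen to mirror the description (coordinates, occupancy, occupancy probability); the patent does not prescribe one.

```python
from dataclasses import dataclass

@dataclass
class GridCell:
    occupied: bool = False       # whether an obstacle occupies this cell
    occupancy_prob: float = 0.0  # probability that the corresponding area is occupied

class GridMap:
    def __init__(self, width: int, height: int, resolution_m: float):
        self.resolution_m = resolution_m  # side length of the area each cell covers
        self.cells = [[GridCell() for _ in range(width)] for _ in range(height)]

    def cell_at(self, x_m: float, y_m: float) -> GridCell:
        # Convert metric coordinates to grid indices.
        return self.cells[int(y_m / self.resolution_m)][int(x_m / self.resolution_m)]
```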
- In an example, a vision sensor collects an environmental image around the position where the robot escaped hijacking.
- Accordingly, the robot can determine the task execution area according to the environmental image around that position.
- In another example, a laser sensor collects environmental data around the position where the robot escaped hijacking.
- Accordingly, the robot can determine the task execution area according to the environmental data around that position.
- In yet another example, the vision sensor collects environmental images around the position where the robot escaped hijacking, and the laser sensor collects environmental data around that position.
- Accordingly, the robot can determine the task execution area based on both the environmental image and the environmental data around that position.
- For ease of description, the environmental images collected by the vision sensor and the environmental data collected by the laser sensor are collectively referred to as environmental information.
- Likewise, the image of the surroundings of a location collected by the vision sensor and the data about the surroundings of that location collected by the laser sensor are collectively referred to as the environmental information around the location where the robot was hijacked or escaped hijacking.
- Before step 102, it may be determined, according to the difference between the position where the robot escaped hijacking and the position where it was hijacked, whether the robot needs to perform the task at the position where it escaped hijacking. If so, step 102 and the subsequent operations are performed, so that the robot starts performing the task from the position where it escaped hijacking.
- In one scenario, the robot has completed the job task in the current environmental area, or the user wants it to perform the job task in another environmental area.
- In this scenario, the user generally moves or drags the robot to some position in another area, so that the robot performs the job task on that area starting from that position.
- Here, an environmental area refers to a region that is meaningful as an independent unit.
- The tasks of the robot in different environmental areas can be independent of each other.
- In different application scenarios, the division and definition of environmental areas will vary. For example, in a home environment, bedrooms, kitchens, living rooms, bathrooms, etc. may be considered relatively independent environmental areas, but the division is not limited thereto.
- Based on this, an optional implementation of determining whether the robot needs to perform the task at the position where it escaped hijacking is: judging whether the position where the robot escaped hijacking and the position where it was hijacked belong to the same environmental area; if they belong to different environmental areas, it is determined that the robot needs to perform the task at the position where it escaped hijacking.
- the position where the robot was hijacked may be determined according to the environmental information collected by the robot before the hijacking or within a recent period of time.
- In another scenario, the robot runs into difficulty, such as being trapped, repeatedly detouring, or becoming entangled, in the process of completing a task. For example, while a cleaning robot performs a cleaning task, its floor brush becomes entangled with hair at its current position. As another example, the cleaning robot encounters a step and cannot continue. In these cases, the user generally moves or drags the robot to another position so that it can continue the task from there. However, in some cases the difficulty may later be removed: for the cleaning robot above, the hair entangling its floor brush may be removed and the hair on the ground cleaned up, so the difficulty no longer exists when the robot resumes. In such a case, the user expects the robot to return to the position where it was hijacked and continue performing the task.
- Based on this, another optional implementation of determining whether the robot needs to perform the task at the position where it escaped hijacking is: judging whether the position where the robot was hijacked lies in an area where the robot runs with difficulty; if the position where it was hijacked lies in such a difficult-to-run area and the position where it escaped hijacking lies outside it, it is determined that the robot needs to perform the task at the position where it escaped hijacking. Correspondingly, if the position where the robot was hijacked lies outside any difficult-to-run area, it is determined that the robot needs to return to that position to continue performing the task.
- Foreseeable difficult-to-run areas, such as the environmental area where a step is located, may be marked in the stored environment map in advance; difficult-to-run areas may also be marked in real time according to the environmental information collected by the robot. Some difficult-to-run areas are unpredictable or highly changeable. For example, for a cleaning robot, an environmental area may contain a large amount of hair and debris, so entanglement may occur while cleaning there; but once the hair and debris have been cleaned up, the area is no longer difficult to run in. Therefore, the robot can determine whether an environmental area is a difficult-to-run area according to the collected environmental information.
- In yet another scenario, the robot can autonomously move to the position of the charging pile to charge when its power is insufficient, or it warns the user when its power is low so that the user moves it to the charging pile. When charging is finished, the user moves or drags the robot to a position other than the charging pile, so that the robot continues to perform tasks from that position.
- Based on this, yet another optional implementation of determining whether the robot needs to perform the task at the position where it escaped hijacking is: judging whether the position where the robot was hijacked is the charging pile position; if the position where it was hijacked is the charging pile position and the position where it escaped hijacking is not, it is determined that the robot needs to perform the task at the position where it escaped hijacking.
- The location of the charging pile may be pre-marked in the stored environment map. In this way, when the robot detects that an external power source is charging it, it can determine that the position where it was hijacked is the charging pile position.
- Optionally, while the robot is being charged, its host circuit is periodically woken up.
- When woken up, the robot can collect the environmental information around its current position and determine, from that information, that its current position is the charging pile position.
- The foregoing implementations for determining whether the robot needs to perform the task at the position where it escaped hijacking may each be used separately, or any two or three of them may be combined.
- The following takes the combination of two of these judgments, "whether the position where the robot escaped hijacking and the position where it was hijacked belong to the same environmental area" and "whether the position where the robot was hijacked lies in a difficult-to-run area", as an example; the corresponding flow is shown in FIG. 2.
- the method includes:
- Step 202: determine whether the position where the robot escaped hijacking and the position where it was hijacked belong to the same environmental area; if yes, go to step 203; if no, go to step 206.
- Step 205: determine whether the position where the robot escaped hijacking lies in a difficult-to-run area; if yes, the operation is terminated, i.e. step 207 is performed; if no, step 206 is performed. (A combined-decision sketch follows.)
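- The judgments described above (different environmental area, difficult-to-run area, charging pile) can be combined as in the following sketch. The map-query helpers are hypothetical; the patent defines the decision criteria, not an API.

```python
def should_start_from_escape_position(hijack_pos, escape_pos, env_map) -> bool:
    # Different environmental areas (e.g. moved from bedroom to kitchen):
    # perform the task at the position where the robot escaped hijacking.
    if env_map.area_of(hijack_pos) != env_map.area_of(escape_pos):
        return True
    # Hijacked inside a difficult-to-run area and released outside it:
    # do not return; start from the escape position.
    if env_map.is_difficult_area(hijack_pos) and not env_map.is_difficult_area(escape_pos):
        return True
    # Moved off the charging pile after charging: resume from the new position.
    if env_map.is_charging_pile(hijack_pos) and not env_map.is_charging_pile(escape_pos):
        return True
    # Otherwise return to the position where the robot was hijacked.
    return False
```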
- the robot control method provided in the embodiment of the present application is applicable to various types of robots, such as a cleaning robot, a sorting robot, a shopping guide robot, a shopping cart robot, and the like, but is not limited thereto.
- For different types of robots, the task execution area will differ.
- For example, for a cleaning robot, the determined task execution area is the area to be cleaned.
- Accordingly, an optional implementation of step 102 is: the cleaning robot determines the area to be cleaned according to the environmental information around the position where it escaped hijacking.
- As another example, for a sorting robot, the task execution area determined in step 102 is an area to be sorted.
- Accordingly, the sorting robot determines the area to be sorted according to the environmental information around the position where it escaped hijacking.
- The specific implementation of determining the task execution area and the manner of performing the task will differ across application scenarios.
- For example, the task performed by the cleaning robot is cleaning the floor;
- the task performed by the shopping cart robot or shopping guide robot is following the customer;
- and the task performed by the sorting robot is sorting goods or orders.
- In the following, a cleaning robot is taken as an example, and, with reference to some application scenarios, the specific implementations of determining the area to be cleaned and of performing the cleaning task in the determined area are described.
- In scenario 1, an optional implementation for the cleaning robot to determine the area to be cleaned is: determine, from the environmental information around the position where it escaped hijacking, at least one physical object with a space-defining role; take a ground area containing the at least one physical object as the area to be cleaned; and then perform the cleaning task in that area.
- The at least one physical object with a space-defining role may be a table, a chair, a wall, a cabinet, a pillar, a door frame, etc., but is not limited thereto.
- For example, the cleaning robot obtains the environmental information around the position where it escaped hijacking and determines, from it, the positions of the legs of all tables and chairs in the environmental area;
- the ground area containing these table and chair legs is then taken as the area to be cleaned.
- The shape of the area to be cleaned is not limited; it may be any regularly or irregularly shaped area that contains the positions of all the table and chair legs in the area.
- For example, the area to be cleaned may be a circular area containing the positions of all the table and chair legs, as shown in FIG. 3a, or a rectangular area containing those positions, as shown in FIG. 3b, but is not limited thereto; both constructions are sketched below.
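- The circular and rectangular areas of FIGS. 3a and 3b can be computed from the detected leg positions as in the following illustration. The margin parameter is an assumption, not part of the patent.

```python
import math

def circular_area(leg_points, margin=0.2):
    """Circle containing all leg positions (cf. FIG. 3a): centroid as center,
    distance to the farthest leg plus a margin as radius."""
    cx = sum(p[0] for p in leg_points) / len(leg_points)
    cy = sum(p[1] for p in leg_points) / len(leg_points)
    radius = max(math.dist((cx, cy), p) for p in leg_points) + margin
    return (cx, cy), radius

def rectangular_area(leg_points, margin=0.2):
    """Axis-aligned rectangle containing all leg positions (cf. FIG. 3b).
    Returns (xmin, ymin, xmax, ymax)."""
    xs = [p[0] for p in leg_points]
    ys = [p[1] for p in leg_points]
    return (min(xs) - margin, min(ys) - margin, max(xs) + margin, max(ys) + margin)
```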
- In scenario 2, the environmental area where the cleaning robot escaped hijacking contains facilities with a boundary-defining function, such as walls and cabinets.
- When the cleaning robot cleans these boundary positions, it needs to clean along the edge of the wall or cabinet, rather than detecting these boundaries as obstacles and bypassing them at a distance, which would result in missed areas.
- Based on this, an optional implementation for the cleaning robot to determine the area to be cleaned is: the cleaning robot determines, according to the environmental information around the position where it escaped hijacking, whether its current area contains a boundary, such as a wall or cabinet; if it does, as shown in FIG. 3c, a rectangular area is determined using the distance between the position where the robot escaped hijacking and the boundary as a half-side length, and this rectangular area is taken as the area to be cleaned; the cleaning task is then performed in this area.
- Here, "rectangular area" covers both square and oblong areas; half of the length of one side of the rectangle is defined as a half-side length.
- Another optional implementation is: the cleaning robot determines, according to the environmental information around the position where it escaped hijacking, whether its current area contains a boundary, such as a wall or cabinet; if it does, as shown in FIG. 3d, a circular area is determined using the distance between the position where the robot escaped hijacking and the boundary as the radius, and this circle is taken as the area to be cleaned.
- The distance between the position where the robot escaped hijacking and the boundary may be the distance between that position and any point on the boundary, and is preferably the perpendicular distance between that position and the boundary.
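- In other words, with d the (preferably perpendicular) distance from the escape position to the boundary, the two constructions of FIGS. 3c and 3d reduce to simple geometry, sketched here for illustration only:

```python
def square_area_near_boundary(escape_xy, d):
    """FIG. 3c: square centered on the escape position with half-side d,
    so one side just reaches the boundary. Returns (xmin, ymin, xmax, ymax)."""
    x, y = escape_xy
    return (x - d, y - d, x + d, y + d)

def circular_area_near_boundary(escape_xy, d):
    """FIG. 3d: circle centered on the escape position with radius d,
    so the circle is tangent to the boundary. Returns (center, radius)."""
    return escape_xy, d
```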
- In scenario 3, the environmental area where the cleaning robot escaped hijacking contains a narrow area such as a corner formed by walls.
- Based on this, an optional implementation for the cleaning robot to determine the area to be cleaned is: the cleaning robot determines, according to the environmental information around the position where it escaped hijacking, whether its current area contains a corner, such as a corner formed by two walls or by a wall and a cabinet; if so, an area associated with the corner is determined as the area to be cleaned.
- For example, a fan-shaped (sector) area can be determined with the vertex of the corner as the center and the distance between the position where the robot escaped hijacking and the vertex as the radius; the two sides of the corner form the other two boundaries of the sector, and the sector is taken as the area to be cleaned.
- Another optional implementation is: the cleaning robot determines, according to the environmental information around the position where it escaped hijacking, whether its current area contains a corner, such as a corner formed by two walls or by a wall and a cabinet; if so, as shown in FIG. 3f, a rectangular area is determined using the distance from the position where the robot escaped hijacking to either side of the corner as a half-side length, and this rectangular area is taken as the area to be cleaned.
- Here too, "rectangular area" covers both square and oblong areas. For a square area, preferably, the longer of the perpendicular distances from the position where the robot escaped hijacking to the two sides of the corner is taken as the half-side length.
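- The corner-associated areas can likewise be sketched as follows; the angle convention and helper names are assumptions for illustration.

```python
import math

def sector_area(corner_vertex, wall_angles, escape_xy):
    """Sector area at a corner: vertex as center, distance from the escape
    position to the vertex as radius; the corner's two sides bound the sector."""
    radius = math.dist(corner_vertex, escape_xy)
    start_angle, end_angle = wall_angles  # directions of the corner's two sides
    return corner_vertex, radius, start_angle, end_angle

def square_area_at_corner(escape_xy, dist_to_side_a, dist_to_side_b):
    """FIG. 3f variant: square whose half-side is the longer perpendicular
    distance from the escape position to the corner's sides."""
    half_side = max(dist_to_side_a, dist_to_side_b)
    x, y = escape_xy
    return (x - half_side, y - half_side, x + half_side, y + half_side)
```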
- In scenario 4, the environmental area where the cleaning robot escaped hijacking is empty, with no obstacles such as walls, tables, or chairs nearby.
- For example, the cleaning robot cleans an empty room without any furniture.
- In this scenario, the robot may determine an arbitrary area at its current position as the area to be cleaned. For example, as shown in FIG. 3g, the robot can take the position where it escaped hijacking as the center and the distance to the boundary of the environmental area as the radius, and determine a circular area as the area to be cleaned. As another example, the robot can take the boundary of the environmental area where it escaped hijacking as the boundary of the area to be cleaned, i.e. use the entire environmental area as the area to be cleaned.
- the cleaning robot executes the cleaning task in the area to be cleaned.
- the cleaning robot can clean the area to be cleaned in a random cleaning mode or a path planning cleaning mode.
- the path-planning cleaning mode is relative to the random cleaning mode. It refers to a cleaning mode that can accurately plan the cleaning route, realize planned cleaning, ensure regular cleaning paths, and try not to repeat the cleaning.
- the cleaning robot can support one or more different types of cleaning routes. For example, the cleaning robot can support bow-shaped cleaning routes, arc-shaped cleaning routes, “L” -shaped cleaning routes, mouth-shaped cleaning routes, spiral walking fixed-point cleaning routes, and so on. In the embodiment of the present application, the cleaning of the to-be-cleaned area with the bow-shaped sweeping path and the arc-shaped sweeping path of the cleaning robot will be described as an example.
- the cleaning robot can select an adapted cleaning mode to clean the area to be cleaned.
- For example, as shown in FIG. 4a, the cleaning robot uses the arc-shaped cleaning route to perform fixed-point cleaning of the area to be cleaned, centered on the position where it escaped hijacking.
- As another example, as shown in FIG. 4b, the cleaning robot starts from the position where it escaped hijacking and cleans the area to be cleaned along a bow-shaped cleaning route.
- It should be noted that the bow-shaped cleaning route shown in FIG. 4b does not reach the boundary of the area to be cleaned, but the cleaning method in this application is not limited to this; the route may also sweep up to the boundary.
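- A bow-shaped route over a rectangular area to be cleaned can be generated as a sequence of waypoints, as in this minimal sketch. The lane width and rectangle bounds are assumed inputs; whether the lanes stop short of or reach the boundary is a design choice, as noted above.

```python
def bow_route(xmin, ymin, xmax, ymax, lane_width):
    """Boustrophedon waypoints: parallel lanes one lane_width apart,
    alternating travel direction each lane (cf. FIG. 4b)."""
    waypoints = []
    y = ymin
    left_to_right = True
    while y <= ymax:
        if left_to_right:
            waypoints += [(xmin, y), (xmax, y)]
        else:
            waypoints += [(xmax, y), (xmin, y)]
        left_to_right = not left_to_right
        y += lane_width
    return waypoints

# Example: lanes 0.25 m apart over a 2 m x 1 m area.
route = bow_route(0.0, 0.0, 2.0, 1.0, 0.25)
```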
- the execution subject of each step of the method provided in the foregoing embodiment may be the same device, or the method may also use different devices as execution subjects.
- For example, the execution subject of steps 201 and 202 may be device A; or the execution subject of step 201 may be device A and that of step 202 device B; and so on.
- FIG. 5a is a block diagram of a hardware structure of a robot according to an exemplary embodiment of the present application.
- the robot 500 includes: a mechanical body 501; the mechanical body 501 is provided with one or more processors 502 and one or more memories 503 storing computer instructions.
- one or more sensors 504 are provided on the machine body 501.
- one or more processors 502, one or more memories 503, and one or more sensors 504 may be disposed inside the mechanical body 501, or may be disposed on the surface of the mechanical body 501.
- the mechanical body 501 is an execution mechanism of the robot 500 and can perform operations specified by one or more processors 502 in a determined environment.
- the mechanical body 501 reflects the appearance of the robot 500 to a certain extent.
- the appearance of the robot 500 is not limited.
- the robot 500 may be a humanoid robot as shown in FIG. 5b
- the mechanical body 501 may include, but is not limited to, a mechanical structure such as a robot's head, hand, wrist, arm, waist, and base.
- the robot 500 may also be a relatively simple non-humanoid robot as shown in FIG. 5c, and the mechanical body 501 mainly refers to the body of the robot 500.
- the mechanical body 501 is also provided with some basic components of the robot 500, such as a driving component, an odometer, a power supply component, an audio component, and the like.
- the driving assembly may include a driving wheel, a driving motor, a universal wheel, and the like.
- The one or more memories 503 are mainly used to store one or more computer instructions, which can be executed by the one or more processors 502 to cause the one or more processors 502 to control the robot 500 to implement corresponding functions and complete corresponding actions or tasks.
- The one or more memories 503 may also be configured to store various other data to support operation of the robot 500. Examples of such data include instructions for any application or method operated on the robot 500, and an environment map corresponding to the environment in which the robot 500 is located.
- The environment map may be one or more pre-stored maps corresponding to the entire environment, or may be a partial map still under construction.
- The one or more memories 503 may be implemented by any type of volatile or non-volatile storage device or a combination thereof, such as static random access memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic disk, or optical disk.
- The one or more processors 502 may be regarded as the control system of the robot 500 and may execute the computer instructions stored in the one or more memories 503 to control the robot 500 to implement corresponding functions and complete corresponding actions or tasks. It is worth noting that when the robot 500 is in different scenarios, the functions, actions, or tasks to be implemented differ; correspondingly, the computer instructions stored in the one or more memories 503 differ as well, and executing different computer instructions lets the one or more processors 502 control the robot 500 to implement different functions and complete different actions or tasks.
- one or more sensors 504 on the robot 500 can assist in completing navigation positioning, repositioning, and the like of the robot 500.
- the one or more sensors 504 may include a vision sensor, a laser sensor, a contact sensor, a reflective light coupler, an inertial sensor, and the like, but are not limited thereto.
- the vision sensor can be regarded as the “eyes” of the robot 500, and is mainly used to collect images of the surroundings of the robot 500, and these images can be referred to as environmental images.
- The vision sensor can be implemented by any device with an image acquisition function, such as a camera or video camera.
- the laser sensor is a radar system that collects environmental information around the robot 500 by emitting a laser beam.
- the environmental data collected by the laser sensor may include, but is not limited to, the distance, angle, etc. of objects around the robot 500.
- the laser sensor can be implemented by any device capable of emitting a laser beam, for example, a laser radar can be used.
- The robot 500 can move autonomously and complete certain job tasks on the basis of autonomous movement. For example, in shopping scenarios such as supermarkets and malls, a shopping cart robot needs to follow the customer to hold the products the customer purchases. As another example, in some companies' warehouse sorting scenarios, a sorting robot needs to follow the sorter to the rack picking area and then start sorting the ordered goods. As another example, in a home cleaning scenario, a cleaning robot needs to clean the living room, bedroom, kitchen, and other areas. In these application scenarios, the robot 500 completes the corresponding tasks during autonomous movement. In practice, however, the robot 500 may become trapped, repeatedly detour, or become entangled while completing a job task and thus run into difficulty. In this case, the user generally moves or drags the robot 500 to another position so that it can continue the job task from there.
- the robot 500 needs to be charged when the power is insufficient.
- the robot 500 can autonomously move to the position of the charging pile for charging when its power is insufficient. Or the robot 500 issues a warning to the user when its power is insufficient, so that the user moves the robot 500 to the position of the charging pile to charge it.
- When charging is finished, the user moves or drags the robot 500 to a position other than the charging pile, so that the robot 500 can continue to perform the job task from that position.
- the robot 500 may be hijacked, such as being moved, hung, or dragged to another position.
- the robot 500 may trigger a relocation operation due to lack or loss of previous position information, that is, re-determine the pose of the robot 500, where the pose includes the position and orientation of the robot.
- Based on this, the one or more processors 502 may determine, based on the relocation operation, the position where the robot escaped hijacking; collect the environmental information around that position through the one or more sensors 504; determine the task execution area from the obtained environmental information; and then control the robot 500 to execute the task in the task execution area.
- In this way, the task execution area can be flexibly determined according to the environment where the robot escaped hijacking, without returning the robot to the position where it was hijacked to continue the task, thereby adapting to local conditions and meeting user needs as far as possible.
- The one or more sensors 504 may include a contact sensor, a reflective optocoupler, an inertial sensor, and the like (but are not limited thereto), which are used to detect whether the above hijacking occurs.
- In some cases, the one or more processors 502 can perform the relocation operation at the position where the robot 500 escaped hijacking and accurately locate the pose of the robot 500 in the stored environment map at that position.
- the pose here includes the position and orientation of the robot in the stored environment map.
- In these cases, when determining the position where the robot 500 escaped hijacking, the one or more processors 502 are specifically configured to: when recognizing that the robot 500 is out of hijacking, collect the environmental information around the current position of the robot 500 through the one or more sensors 504; locate the pose of the robot 500 in the stored environment map according to that environmental information; and take the position in that pose, i.e. the current position of the robot 500 in the stored environment map, as the position where the robot 500 escaped hijacking.
- In other cases, the one or more processors 502 cannot accurately locate the pose of the robot 500 in the stored environment map at the position where it escaped hijacking. They then control the robot 500 to move from that position to another position, continuously performing the relocation operation on the latest environmental information collected by the one or more sensors 504 during the movement, until the pose of the robot 500 in the stored environment map is accurately located.
- In these cases, when determining the position where the robot 500 escaped hijacking, the one or more processors 502 are specifically configured to: when recognizing that the robot 500 is out of hijacking, control the robot 500 to move from its current position to the second position and locate the pose of the robot 500 in the stored environment map during the movement; then, based on the position in that pose and the data obtained during the movement of the robot 500, determine the position where the robot 500 started to move as the position where it escaped hijacking.
- For the specific implementation of determining the position where the robot 500 escaped hijacking, refer to the related description in the foregoing method embodiment; details are not repeated here.
- the environment map may include at least one of a visual map and a grid map.
- For specific descriptions of the visual map and the grid map, reference may be made to the related content in the foregoing method embodiments; details are not repeated here.
- In an example, the one or more sensors 504 include a vision sensor.
- The vision sensor collects an environmental image around the position where the robot 500 escaped hijacking.
- Accordingly, the one or more processors 502 can determine the task execution area according to the environmental image around that position.
- In another example, the one or more sensors 504 include a laser sensor. The laser sensor collects environmental data around the position where the robot 500 escaped hijacking. Accordingly, the one or more processors 502 may determine the task execution area according to the environmental data around that position.
- In yet another example, the one or more sensors 504 include a vision sensor and a laser sensor;
- the vision sensor collects an environmental image around the position where the robot 500 escaped hijacking,
- and the laser sensor collects environmental data around that position.
- Accordingly, the one or more processors 502 may determine the task execution area based on both the environmental image and the environmental data around the position where the robot 500 escaped hijacking.
- Before determining the task execution area, the one or more processors 502 may determine, based on the difference between the position where the robot 500 escaped hijacking and the position where it was hijacked, whether the robot 500 needs to perform the task at the position where it escaped hijacking. If so, the operation of determining the task execution area and the subsequent operations are performed, so as to control the robot 500 to execute the task from the position where it escaped hijacking.
- When determining whether the robot needs to perform the task at the position where it escaped hijacking, the one or more processors 502 are specifically configured to: determine whether the position where the robot 500 escaped hijacking and the position where it was hijacked belong to the same environmental area; if they belong to different environmental areas, determine that the robot 500 needs to perform the task at the position where it escaped hijacking.
- The position where the robot 500 was hijacked may be determined according to the environmental information collected by the one or more sensors 504 at the last moment before, or within a recent period before, the robot 500 was hijacked.
- Alternatively, when determining whether the robot needs to perform the task at the position where it escaped hijacking, the one or more processors 502 are specifically configured to: determine whether the position where the robot 500 was hijacked lies in a difficult-to-run area; if the position where it was hijacked lies in such an area and the position where it escaped hijacking lies outside it, determine that the robot 500 needs to perform the task at the position where it escaped hijacking. Correspondingly, if the position where the robot 500 was hijacked lies outside any difficult-to-run area, it is determined that the robot 500 needs to return to that position to continue performing the task.
- Alternatively, when determining whether the robot needs to perform the task at the position where it escaped hijacking, the one or more processors 502 are specifically configured to: determine whether the position where the robot 500 was hijacked is the charging pile position; if the position where it was hijacked is the charging pile position and the position where it escaped hijacking is not, determine that the robot 500 needs to perform the task at the position where it escaped hijacking.
- the robot provided in the embodiment of the present application may be various types of robots, such as a cleaning robot, a sorting robot, a shopping guide robot, a shopping cart robot, and the like, but is not limited thereto.
- For different types of robots, the task execution area will differ. For example, for a cleaning robot, the task execution area is the area to be cleaned. Accordingly, when the one or more processors 502 on the cleaning robot determine the task execution area, they are specifically configured to determine the area to be cleaned according to the environmental information around the position where the cleaning robot escaped hijacking.
- The specific implementation by which the one or more processors 502 determine the corresponding task execution area, and the manner in which the task is executed, will differ across application scenarios.
- For example, the task performed by the cleaning robot is cleaning the floor;
- the task performed by the shopping cart robot or shopping guide robot is following the customer;
- and the task performed by the sorting robot is sorting goods or orders.
- The following describes the specific implementation by which the one or more processors 502 determine the area to be cleaned and control the cleaning robot to perform the cleaning task in the determined area.
- In scenario 1, the one or more processors 502 are specifically configured to: determine, from the environmental information around the position where the cleaning robot escaped hijacking, at least one physical object with a space-defining role; take a ground area containing the at least one physical object as the area to be cleaned; and then control the cleaning robot to perform the cleaning task in that area.
- the at least one solid object that has a limiting effect on the area to be cleaned may be a table, a chair, a wall, a cabinet, a pillar, a door frame, etc., but is not limited thereto.
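As one possible concretization of this embodiment, the ground area containing the detected space-defining objects can be approximated by an axis-aligned bounding rectangle around the detected object positions (compare the circular and rectangular variants of FIG. 3a and FIG. 3b). The sketch below is illustrative only; detecting the objects themselves is assumed to happen elsewhere, and the margin parameter is an assumption.

```python
def area_from_objects(object_positions, margin=0.3):
    """Axis-aligned bounding rectangle around the positions of the
    space-defining objects (e.g. table and chair legs), padded by a
    margin in meters so the robot can clean around the outermost legs.

    object_positions: list of (x, y) points in map coordinates.
    Returns (x_min, y_min, x_max, y_max) of the area to be cleaned.
    """
    xs = [p[0] for p in object_positions]
    ys = [p[1] for p in object_positions]
    return (min(xs) - margin, min(ys) - margin,
            max(xs) + margin, max(ys) + margin)
```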
- The one or more processors 502 are specifically configured to: determine, according to the environmental information around the position where the cleaning robot escaped the hijacking, whether the robot's current area contains a boundary, such as a wall or cabinet; if boundary information is contained, determine a rectangular area with the distance between the robot's escape position and the boundary as any half side length, and take the rectangular area as the area to be cleaned; then control the cleaning robot to perform the cleaning task in the area to be cleaned.
- The one or more processors 502 are specifically configured to: determine, according to the environmental information around the position where the cleaning robot escaped the hijacking, whether its current area contains a boundary, such as a wall or cabinet; if boundary information is contained, determine a circle with the distance between the robot's escape position and the boundary as the radius, and take the circle as the area to be cleaned.
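Both boundary-based constructions reduce to elementary geometry once the perpendicular distance d between the escape position and the detected boundary is known. A minimal sketch, assuming d is supplied by the robot's sensors:

```python
def square_area_from_boundary(escape_pos, d):
    """Square area to be cleaned with half side length d, centered on the
    escape position (cf. FIG. 3c); d is the perpendicular distance from
    the escape position to the detected boundary."""
    x, y = escape_pos
    return (x - d, y - d, x + d, y + d)


def circular_area_from_boundary(escape_pos, d):
    """Circular area to be cleaned with radius d, centered on the escape
    position (cf. FIG. 3d)."""
    return {"center": escape_pos, "radius": d}
```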
- The one or more processors 502 are specifically configured to: determine whether the area in which the cleaning robot is currently located contains a corner, such as a corner formed by two walls or by a wall and a cabinet; if a corner is contained, determine the associated area of the corner as the area to be cleaned.
- The vertex of the corner may be taken as the center of a circle, and the distance between the robot's escape position and the vertex as the radius, to determine a sector area whose other two boundaries are the two edges of the corner; this sector is taken as the area to be cleaned.
- The one or more processors 502 are specifically configured to: determine, according to the environmental information around the position where the cleaning robot escaped the hijacking, whether its current area contains a corner, such as a corner formed by two walls or by a wall and a cabinet; if a corner is contained, determine a rectangular area with the distance from the robot's escape position to either edge of the corner as a half side length, and take the rectangular area as the area to be cleaned.
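The corner-based constructions follow the same pattern. The sketch below assumes the corner vertex and its two edge directions have already been extracted from the environmental information; the sector is represented by its center, radius, and the two bounding edges. This is an illustration, not the patent's implementation.

```python
import math


def sector_area_from_corner(vertex, escape_pos, edge_a, edge_b):
    """Sector area to be cleaned (cf. FIG. 3e): the corner vertex is the
    center, the distance from the escape position to the vertex is the
    radius, and the two corner edges bound the sector."""
    return {"center": vertex,
            "radius": math.dist(vertex, escape_pos),
            "edges": (edge_a, edge_b)}


def square_area_from_corner(escape_pos, dist_to_edge):
    """Square area to be cleaned (cf. FIG. 3f): the distance from the
    escape position to one edge of the corner is the half side length."""
    x, y = escape_pos
    h = dist_to_edge
    return (x - h, y - h, x + h, y + h)
```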
- In an open environment, any area at the current position of the cleaning robot may be determined as the area to be cleaned.
- For example, the position where the cleaning robot escaped the hijacking may be taken as the center of a circle, and its distance to the boundary of the environmental area as the radius, to determine a circular area as the area to be cleaned.
- Alternatively, the boundary of the environmental area where the cleaning robot escaped the hijacking may be taken as the boundary of the area to be cleaned, so that the entire environmental area serves as the area to be cleaned.
- The one or more processors 502 then control the cleaning robot to perform the cleaning task in the area to be cleaned.
- The cleaning robot may be controlled to clean the area in a random cleaning mode or a path-planning cleaning mode.
- The cleaning robot may support one or more different styles of cleaning routes.
- For example, the cleaning robot may support bow-shaped cleaning routes, arc-shaped cleaning routes, "L"-shaped cleaning routes, rectangular-loop cleaning routes, spiral fixed-point cleaning routes, and so on.
- Cleaning of the area to be cleaned with the cleaning robot's bow-shaped and arc-shaped cleaning routes is described below as an example.
- The one or more processors 502 may select an adapted cleaning mode to control the cleaning robot to clean the area to be cleaned.
- The processor 502 is specifically configured to: control the cleaning robot to take its position when it escaped the hijacking as the center of a circle, and perform fixed-point cleaning of the area to be cleaned along an arc-shaped cleaning route.
- When controlling the robot to perform a task in the task execution area, the one or more processors 502 are specifically configured to: control the cleaning robot to start from its position when it escaped the hijacking and perform bow-shaped cleaning of the area to be cleaned along a bow-shaped cleaning route.
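To make the two route styles concrete, the following sketch generates waypoints for an arc-shaped (spiral) fixed-point route around the escape position and for a bow-shaped route over a rectangular area. This is an illustrative construction under assumed spacing parameters, not the patent's implementation.

```python
import math


def spiral_route(center, max_radius, spacing=0.25, step_deg=10):
    """Waypoints of an Archimedean spiral around `center`, usable for
    fixed-point cleaning of a circular area (cf. FIG. 4a)."""
    cx, cy = center
    points, angle = [], 0.0
    while True:
        r = spacing * angle / (2 * math.pi)  # radius grows by `spacing` per turn
        if r > max_radius:
            break
        points.append((cx + r * math.cos(angle), cy + r * math.sin(angle)))
        angle += math.radians(step_deg)
    return points


def bow_route(x_min, y_min, x_max, y_max, lane_width=0.3):
    """Waypoints of a bow-shaped (boustrophedon) route over a rectangle,
    sweeping alternate lanes left-to-right and right-to-left (cf. FIG. 4b)."""
    points, y, forward = [], y_min, True
    while y <= y_max:
        lane = [(x_min, y), (x_max, y)]
        points.extend(lane if forward else lane[::-1])
        y += lane_width
        forward = not forward
    return points
```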
- An embodiment of the present application further provides a computer-readable storage medium storing computer instructions.
- When the computer instructions are executed by one or more processors, the one or more processors are caused to perform the following actions:
- determining, based on a relocalization operation, the position where the robot escaped the hijacking; determining the task execution area according to the environmental information around that position; and controlling the robot to perform a task in the task execution area.
- The above action of determining the position where the robot escaped the hijacking includes: when recognizing that the robot has escaped the hijacking, obtaining environmental information around the robot's current position; locating, according to this environmental information, the robot's pose in the stored environment map; and taking the position in the pose as the position where the robot escaped the hijacking.
- The action of determining the position where the robot escaped the hijacking further includes: when recognizing that the robot has escaped the hijacking, controlling the robot to move from its current position toward the second position and locating the robot's pose in the stored environment map during the movement; and determining, based on the position in the pose and the data obtained during the movement, the position where the robot started moving as the position where it escaped the hijacking.
- The actions performed by the one or more processors further include: determining, according to the difference between the position where the robot escaped the hijacking and the position where it was hijacked, that the robot needs to perform a task at the position where it escaped the hijacking.
- The above action of determining that the robot needs to perform a task at the position where it escaped the hijacking includes: if the position where the robot escaped the hijacking and the position where it was hijacked belong to different environmental areas, determining that the robot needs to perform the task at the position where it escaped the hijacking.
- The above action further includes: if the position where the robot was hijacked lies in an area in which the robot has difficulty operating, but the position where the robot escaped the hijacking lies outside that area, determining that the robot needs to perform the task at the position where it escaped the hijacking.
- The above action further includes: if the position where the robot was hijacked is a charging pile position, but the position where the robot escaped the hijacking is a non-charging-pile position, determining that the robot needs to perform the task at the position where it escaped the hijacking.
- The readable storage medium provided in this embodiment is applicable to various types of robots, such as cleaning robots, sorting robots, shopping guide robots, and shopping cart robots, but is not limited thereto.
- For different types of robots, the instructions stored in the readable storage medium for determining the task execution area and performing the task differ, so that the one or more processors perform different work tasks when executing these instructions.
- A cleaning robot is taken below as an illustrative example.
- The above action of determining the task execution area includes: determining the area to be cleaned according to the environmental information around the position where the cleaning robot escaped the hijacking.
- The above action of determining the area to be cleaned includes: determining at least one physical object with a space-defining role in the environmental information around the position where the cleaning robot escaped the hijacking; and taking the ground area containing the at least one solid object as the area to be cleaned.
- The above action of determining the area to be cleaned further includes: identifying whether the environmental information around the position where the cleaning robot escaped the hijacking contains a corner; if a corner is contained, determining the associated area of the corner as the area to be cleaned.
- The above action of determining the associated area of the corner as the area to be cleaned includes: determining a sector area as the area to be cleaned, with the vertex of the corner as the center and the distance from the cleaning robot's escape position to the vertex as the radius; or determining a rectangular area as the area to be cleaned, with the distance from the cleaning robot's escape position to either edge of the corner as a half side length.
- The actions of controlling the robot to perform the task in the task execution area include: controlling the cleaning robot to perform fixed-point cleaning of the area to be cleaned with its escape position as the center; or controlling the cleaning robot to perform bow-shaped cleaning of the area to be cleaned starting from its escape position.
- Those skilled in the art should understand that the embodiments of the present invention may be provided as a method, a system, or a computer program product. Therefore, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware aspects. Moreover, the present invention may take the form of a computer program product implemented on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, and optical storage) containing computer-usable program code.
- These computer program instructions may also be stored in a computer-readable memory capable of directing a computer or other programmable data processing device to operate in a particular manner, so that the instructions stored in the computer-readable memory produce an article of manufacture including an instruction apparatus that implements the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
- These computer program instructions may also be loaded onto a computer or other programmable data processing device, so that a series of operational steps are performed on the computer or other programmable device to produce computer-implemented processing, whereby the instructions executed on the computer or other programmable device provide steps for implementing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
- In a typical configuration, a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
- Memory may include non-persistent memory, random access memory (RAM), and/or non-volatile memory in computer-readable media, such as read-only memory (ROM) or flash memory (flash RAM). Memory is an example of a computer-readable medium.
- Computer-readable media include permanent and non-permanent, removable and non-removable media, and may store information by any method or technology. Information may be computer-readable instructions, data structures, program modules, or other data.
- Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technologies, compact disc read-only memory (CD-ROM), digital versatile disc (DVD) or other optical storage, magnetic cassettes, magnetic tape/disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information accessible by a computing device.
- As defined herein, computer-readable media do not include transitory media, such as modulated data signals and carrier waves.
Landscapes
- Engineering & Computer Science (AREA)
- Mechanical Engineering (AREA)
- Physics & Mathematics (AREA)
- Radar, Positioning & Navigation (AREA)
- Aviation & Aerospace Engineering (AREA)
- Remote Sensing (AREA)
- General Physics & Mathematics (AREA)
- Automation & Control Theory (AREA)
- Robotics (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Multimedia (AREA)
- Electromagnetism (AREA)
- Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
- Manipulator (AREA)
Abstract
Embodiments of the present application provide a robot control method, a robot, and a storage medium. In the embodiments of the present application, a robot determines, based on a relocalization operation, the position where it escaped a hijacking; determines a task execution area according to environmental information around that position; and then performs a task within the task execution area. In this way, the robot can flexibly determine the task execution area according to the environment it is in when it escapes the hijacking, without having to return to the position where it was hijacked to continue the task, thereby adapting to local conditions and satisfying user needs as far as possible.
Description
Cross-Reference
This application claims priority to Chinese Patent Application No. 201810797893.5, filed on July 19, 2018 and entitled "Robot Control Method, Robot and Storage Medium", which is incorporated herein by reference in its entirety.
The present application relates to the technical field of artificial intelligence, and in particular to a robot control method, a robot, and a storage medium.
With the development of robot technology, robots have gradually entered people's daily lives and brought great convenience. For example, robots with floor-cleaning functions can clean rooms automatically, saving considerable labor and material costs.
In existing robot technology, autonomous localization and navigation can be achieved by means of simultaneous localization and mapping (SLAM). During the SLAM process, however, the robot may sometimes be hijacked, for example moved, lifted off the ground, or dragged over a large range. When the robot returns to the ground, its localization suffers an uncontrollable drift error and the robot needs to relocalize.
After relocalization, a robot generally returns to the position where it was hijacked and continues the previous task. This approach is relatively simple, but it cannot adapt to local conditions and may fail to satisfy user needs.
Summary of the Invention
Various aspects of the present application provide a robot control method, a robot, and a storage medium, so that a robot can perform corresponding tasks in a manner adapted to local conditions, thereby satisfying user needs.
An embodiment of the present application provides a robot control method, comprising:
determining, by a robot based on a relocalization operation, the position of the robot when it escaped a hijacking;
determining, by the robot, a task execution area according to environmental information around the position where it escaped the hijacking; and
performing, by the robot, a task within the task execution area.
An embodiment of the present application further provides a robot, comprising: a mechanical body provided with one or more sensors, one or more processors, and one or more memories storing computer instructions;
the one or more processors being configured to execute the computer instructions so as to:
determine, based on a relocalization operation, the position where the robot escaped a hijacking;
collect, through the one or more sensors, environmental information around the position where the robot escaped the hijacking, and determine a task execution area according to that environmental information; and
control the robot to perform a task within the task execution area.
An embodiment of the present application further provides a computer-readable storage medium storing computer instructions which, when executed by one or more processors, cause the one or more processors to perform actions comprising:
determining, based on a relocalization operation, the position where a robot escaped a hijacking;
determining a task execution area according to environmental information around the position where the robot escaped the hijacking; and
controlling the robot to perform a task within the task execution area.
In the embodiments of the present application, a robot determines, based on a relocalization operation, the position where it escaped a hijacking; determines a task execution area according to environmental information around that position; and then performs a task within the task execution area. In this way, the robot can flexibly determine the task execution area according to the environment it is in when it escapes the hijacking, without returning to the position where it was hijacked to continue the task, thereby adapting to local conditions and satisfying user needs as far as possible.
The accompanying drawings described herein are provided for a further understanding of the present application and constitute a part thereof. The exemplary embodiments of the present application and their description are used to explain the present application and do not unduly limit it. In the drawings:
FIG. 1 is a schematic flowchart of a robot control method according to an exemplary embodiment of the present application;
FIG. 2 is a schematic flowchart of a method for determining the position at which a robot continues to perform a task according to an exemplary embodiment of the present application;
FIGS. 3a-3g are schematic diagrams of areas to be cleaned according to exemplary embodiments of the present application;
FIG. 4a is a schematic diagram of an arc-shaped cleaning route according to an exemplary embodiment of the present application;
FIG. 4b is a schematic diagram of a bow-shaped cleaning route according to an exemplary embodiment of the present application;
FIG. 5a is a block diagram of the hardware structure of a robot according to an exemplary embodiment of the present application;
FIG. 5b is a line drawing of a humanoid robot according to an exemplary embodiment of the present application;
FIG. 5c is a line drawing of a non-humanoid robot according to an exemplary embodiment of the present application.
To make the objectives, technical solutions, and advantages of the present application clearer, the technical solutions of the present application are described clearly and completely below with reference to specific embodiments and the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present application. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments herein without creative effort fall within the protection scope of the present application.
Aimed at the technical problem that an existing robot, after relocalization, continues its task in a way that cannot adapt to local conditions and may therefore fail to satisfy user needs, the embodiments of the present application provide a solution whose basic idea is: the robot determines, based on a relocalization operation, the position where it escaped the hijacking; determines a task execution area according to environmental information around that position; and then performs a task within the task execution area. In this way, the robot can flexibly determine the task execution area according to the environment it is in when it escapes the hijacking, thereby adapting to local conditions and satisfying user needs as far as possible.
The technical solutions provided by the embodiments of the present application are described in detail below with reference to the accompanying drawings.
FIG. 1 is a schematic flowchart of a robot control method according to an exemplary embodiment of the present application. As shown in FIG. 1, the method includes:
101. The robot determines, based on a relocalization operation, the position where it escaped the hijacking.
102. The robot determines a task execution area according to environmental information around the position where it escaped the hijacking.
103. The robot performs a task within the task execution area.
The method provided in this embodiment is applicable to autonomously movable robots and mainly controls the robot's subsequent behavior after relocalization. This embodiment does not limit the shape of the robot, which may for example be circular, elliptical, triangular, a convex polygon, or humanoid. The robot may implement the logic of the method provided in this embodiment by installing software or an app, or by writing program code into a corresponding device.
In this embodiment, the robot can move autonomously and can perform certain work tasks on that basis. For example, in shopping scenarios such as supermarkets and shopping malls, a shopping cart robot needs to follow a customer to hold the goods the customer selects. As another example, in the warehouse sorting scenarios of some companies, a sorting robot needs to follow a sorter to a shelf picking area and then begin sorting ordered goods. As yet another example, in a household cleaning scenario, a sweeping robot needs to clean the living room, bedrooms, kitchen, and other areas. In these application scenarios, the robot performs the corresponding work task while moving autonomously. In practical applications, however, the robot may run into operating difficulties while performing the task, such as being trapped, repeatedly detouring, or becoming entangled. In such cases, the user generally moves or drags the robot to another position so that it continues the work task from there.
In addition, in this embodiment, the robot needs to be charged when its battery is low. The robot may autonomously move to the position of a charging pile to charge when its battery is low, or it may warn the user so that the user moves it to the charging pile for charging. When charging is complete, the user moves or drags the robot to some position other than the charging pile so that it continues the work task from there.
From the above application scenarios it can be seen that, in practical use, the robot may be hijacked, for example moved, lifted, or dragged to another position as described above. In these cases, the robot triggers a relocalization operation because previous position information is missing or lost, i.e., it re-determines its pose, where the pose includes the robot's position and orientation.
In most cases, the robot is hijacked because the position where it was hijacked is no longer suitable for continuing the task, and the robot needs to perform the task at a new position. Based on this analysis, in this embodiment the robot may determine, based on the relocalization operation, the position where it escaped the hijacking; obtain environmental information around that position; determine a task execution area according to the obtained environmental information; and then perform a task within that area. In this way, the robot can flexibly determine the task execution area according to the environment it is in when it escapes the hijacking, without returning to the position where it was hijacked to continue the task, thereby adapting to local conditions and satisfying user needs as far as possible.
In the embodiments of the present application, for ease of description, behaviors such as the robot being moved, lifted, or dragged over a large range are uniformly defined as the robot being hijacked. Correspondingly, the moment the robot returns to the ground after being moved or lifted, and the moment dragging stops after the robot is dragged, are uniformly defined as the robot escaping the hijacking.
Optionally, contact sensors, reflective optocouplers, inertial sensors, and the like, but not limited thereto, may be provided on the robot to detect whether the above hijacking occurs. For example, contact sensors may be provided at the bottom of the robot's base or on its wheels to detect whether the robot is moved or lifted, and also to determine whether it has returned to the ground. Alternatively, a reflective optocoupler may be installed at the bottom of the robot; it emits a light beam that is reflected back from the ground, through which the operations of the robot being moved or lifted and subsequently placed back on the ground can be detected. When the contact sensor or the reflective optocoupler detects that the robot has left the ground, the robot is hijacked; when it detects that the robot has been placed back on the ground, the robot has escaped the hijacking, which triggers the robot to start its relocalization function and perform a relocalization operation. As another example, an inertial sensor, such as an acceleration sensor, may be installed on the robot to detect whether the robot is dragged over a large range. When the inertial sensor detects that the robot is being dragged, the robot is hijacked; when it detects that the dragging has stopped, the robot has escaped the hijacking, which triggers the robot to start its relocalization function and perform a relocalization operation.
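As a minimal illustration of how such sensor signals might be combined, the sketch below classifies lift and drag events from a ground-contact flag and an accelerometer reading. The interface and the threshold value are assumptions made for illustration; they are not taken from the patent.

```python
def hijack_state(on_ground, horizontal_accel, accel_threshold=1.5):
    """Classify the robot's hijack state from simple sensor readings.

    on_ground: bool from the contact sensor or reflective optocoupler.
    horizontal_accel: horizontal acceleration magnitude (m/s^2) reported
        by the inertial sensor while the drive wheels are idle.
    Returns "lifted", "dragged", or "normal"; a transition back to
    "normal" corresponds to escaping the hijacking and would trigger
    the relocalization operation.
    """
    if not on_ground:
        return "lifted"    # moved or suspended in the air
    if horizontal_accel > accel_threshold:
        return "dragged"   # dragged over a large range
    return "normal"
```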
In an optional embodiment, the robot may perform the relocalization operation at the position where it escaped the hijacking and can accurately locate its pose in the stored environment map at that position. The pose here includes the robot's position and orientation in the stored environment map. Based on this, an optional implementation of step 101 is: when the robot recognizes that it has escaped the hijacking, it collects environmental information around its current position; and, according to this environmental information, locates its pose in the stored environment map and takes the position in that pose as the position where it escaped the hijacking, that is, its current position in the stored environment map.
In another optional embodiment, the robot cannot accurately locate its pose in the stored environment map at the position where it escaped the hijacking. In that case, the robot may move from that position toward another position, continually performing relocalization during the movement according to the latest collected surrounding environmental information, until it accurately locates its pose in the stored environment map. For ease of description and differentiation, in the following embodiments of the present application, the other position toward which the robot moves is defined as a second position, which is any position in the stored environment map different from the position where the robot escaped the hijacking. Based on this, another optional implementation of step 101 is: when the robot recognizes that it has escaped the hijacking, it moves from its current position toward the second position and locates its pose in the stored environment map during the movement; then, according to the position in that pose and the data obtained during the movement, it determines the position where it started moving as the position where it escaped the hijacking.
Optionally, a localization period may be preset, and a timer or counter may be started to time this period. In this way, while the robot moves from the position where it escaped the hijacking toward the second position, it may perform one relocalization operation each time a localization period elapses, until its pose in the stored environment map is located. When the pose is located, the total number of localization periods the robot has gone through, from the position where it escaped the hijacking to the position where its pose was located, can then be used to determine the elapsed time.
Further, a displacement sensor and an acceleration sensor may be provided on the robot to collect its displacement and acceleration while it moves from the position where it escaped the hijacking to the position where its pose in the stored environment map is located. Combining the robot's displacement, acceleration, and elapsed time, the distance traveled over this movement can be determined; then, according to the distance traveled, the known navigation path, and the position where the pose was relocalized, the position where the robot started moving, i.e., the position where it escaped the hijacking, can be determined.
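One way to realize this back-calculation is to integrate the motion recorded during the relocalization walk and subtract it from the pose finally located in the map. The sketch below assumes the movement was logged as per-period displacement vectors that can be expressed in the map frame once the pose is known; it illustrates the idea rather than the patent's exact procedure.

```python
def escape_position(located_pos, displacement_log):
    """Back-calculate the position where the robot started moving, i.e. the
    position where it escaped the hijacking.

    located_pos: (x, y) position finally located in the stored map.
    displacement_log: per-localization-period displacement vectors (dx, dy),
        assumed to be transformed into the map frame.
    """
    x, y = located_pos
    for dx, dy in displacement_log:
        x -= dx   # walk the recorded motion backwards
        y -= dy
    return (x, y)
```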
Optionally, the environment map may include at least one of a visual map and a grid map. The visual map is constructed in advance based on environment images collected by a visual sensor; it can describe the regional environment of the robot to a certain extent and mainly stores information on a number of environment images related to the robot's environment, such as the robot pose corresponding to each environment image, the feature points contained in the image, and the descriptors of those feature points. The grid map is constructed in advance based on environment data collected by a laser sensor and is the product of digitally rasterizing the stored regional environment of the robot. Each grid cell in the grid map corresponds to a small area in the robot's environment; each cell contains two kinds of basic information, its coordinates and whether it is occupied by an obstacle, and the probability value of a cell being occupied represents the environmental information of the corresponding area. The more cells a grid map has, the more detailed its description of the robot's environment, and correspondingly, the higher the localization accuracy based on that grid map.
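For reference, a grid map of the kind described here can be represented minimally as a two-dimensional array of occupancy probabilities plus a resolution. A sketch under those assumptions:

```python
from dataclasses import dataclass, field
from typing import List


@dataclass
class GridMap:
    """Minimal occupancy grid: each cell stores the probability that the
    corresponding patch of the environment is occupied by an obstacle."""
    width: int          # number of cells along x
    height: int         # number of cells along y
    resolution: float   # side length of one cell, in meters
    cells: List[List[float]] = field(default=None)

    def __post_init__(self):
        if self.cells is None:
            # 0.5 = unknown; values near 1 mean occupied, near 0 mean free.
            self.cells = [[0.5] * self.width for _ in range(self.height)]

    def cell_at(self, x: float, y: float) -> float:
        """Occupancy probability of the cell containing map point (x, y)."""
        return self.cells[int(y / self.resolution)][int(x / self.resolution)]
```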
If the sensors on the robot include a visual sensor, then in step 102 the visual sensor collects environment images around the position where the robot escaped the hijacking. Correspondingly, the robot may determine the task execution area according to these environment images.
If the sensors on the robot include a laser sensor, then in step 102 the laser sensor collects environment data around the position where the robot escaped the hijacking. Correspondingly, the robot may determine the task execution area according to this environment data.
If the sensors on the robot include both a visual sensor and a laser sensor, then in step 102 the visual sensor collects environment images and the laser sensor collects environment data around the position where the robot escaped the hijacking. Correspondingly, the robot may determine the task execution area according to both the environment images and the environment data.
For ease of description, in the above and following embodiments, the environment images collected by the visual sensor and the environment data collected by the laser sensor are collectively referred to as environmental information. Correspondingly, the environment images and environment data collected around the robot's position when it is hijacked or when it escapes the hijacking are collectively referred to as the environmental information around the robot's position at the time of hijacking or at the time of escaping the hijacking.
Although in most cases the robot needs to perform the task from a new position after being hijacked, in some application scenarios the robot, after relocalizing its pose, may need to return to the position where it was hijacked to continue the task that was unfinished before the hijacking. That is, depending on how the robot was hijacked, its behavior after relocalization also differs. To control the robot's post-relocalization behavior more accurately, in some optional embodiments, before step 102 it may be determined, according to the difference between the position where the robot escaped the hijacking and the position where it was hijacked, whether the robot needs to perform the task at the position where it escaped the hijacking. If so, step 102 and subsequent operations are performed so that the robot performs the task starting from that position.
In application scenario 1, the robot has completed its work task in the current environmental area, or the user wants to move it to another environmental area to perform a task. The user then generally moves or drags the robot to some position in another area so that the robot performs the work task in that area from that position. For example, for a cleaning robot used in household cleaning, when bedroom A has been cleaned, the user carries it to bedroom B to continue the cleaning task. In the embodiments of the present application, an environmental area refers to an area with independent significance; generally, a robot's work tasks in different environmental areas can be independent of each other. The division and definition of environmental areas differ with the application scenario. For example, in a home environment, bedrooms, the kitchen, the living room, and the bathroom may be regarded as relatively independent environmental areas, but this is not limiting.
Based on application scenario 1, an optional implementation of determining whether the robot needs to perform the task at the position where it escaped the hijacking is: judging whether the position where the robot escaped the hijacking and the position where it was hijacked belong to the same environmental area; if they belong to different environmental areas, determining that the robot needs to perform the task at the position where it escaped the hijacking.
Optionally, the position where the robot was hijacked may be determined according to the environmental information collected the last time, or during the most recent period, before the robot was hijacked.
In application scenario 2, the robot may run into operating difficulties such as being trapped, repeatedly detouring, or becoming entangled while performing the task. For example, when a cleaning robot performs a cleaning task at its current position, its floor brush may become entangled with hair at that position. As another example, a cleaning robot may encounter a step and be unable to continue working. In such cases, the user generally moves or drags the robot to another position so that it continues the work task from there. In some situations, however, the difficulty may be removed; for example, when the cleaning robot's floor brush was entangled with hair and the robot was carried away, the hair may have been cleared by the time the robot returns to the ground. In that situation, the user expects the robot to return to the position where it was hijacked and continue the work task.
Based on application scenario 2, another optional implementation of determining whether the robot needs to perform the task at the position where it escaped the hijacking is: judging whether the position where the robot was hijacked lies in an area in which the robot has difficulty operating; if the position at the time of hijacking lies in such a difficult-to-operate area and the position where the robot escaped the hijacking lies outside it, determining that the robot needs to perform the task at the position where it escaped the hijacking. Correspondingly, if the position where the robot was hijacked lies outside the difficult-to-operate area, it is determined that the robot needs to return to the position where it was hijacked and continue the work task.
Optionally, foreseeable or fixed difficult-to-operate areas, such as the environmental area around a step, may be marked in advance in the stored environment map, or marked in real time according to the environmental information collected by the robot. For unforeseeable or highly variable difficult-to-operate areas, for example an environmental area containing a large amount of hair or debris in which a cleaning robot may become entangled while cleaning, the hair or debris may later be cleared, in which case the area is no longer a difficult-to-operate area. The robot may therefore determine, according to the collected environmental information, whether the environmental area is a difficult-to-operate area.
In application scenario 3, the robot may autonomously move to the position of a charging pile to charge when its battery is low, or warn the user so that the user moves it to the charging pile for charging. When charging is complete, the user moves or drags the robot to some position other than the charging pile so that it continues the work task from there.
Based on application scenario 3, yet another optional implementation of determining whether the robot needs to perform the task at the position where it escaped the hijacking is: judging whether the position where the robot was hijacked is the charging pile position; if the position at the time of hijacking is the charging pile position and the position where the robot escaped the hijacking is a non-charging-pile position, determining that the robot needs to perform the task at the position where it escaped the hijacking.
Optionally, the position of the charging pile may be marked in advance in the stored environment map. Then, when the robot detects that an external power supply is charging it, the position where the robot was hijacked can be determined to be the charging pile position.
Optionally, while the robot is charging, its host circuit is woken up periodically. When the host circuit is woken up, the robot may collect environmental information around its current position and determine from it that the current position is the charging pile position.
It is worth noting that, in the embodiments of the present application, the above implementations of determining whether the robot needs to perform the task at the position where it escaped the hijacking may be implemented individually, or any two or three of them may be combined for the judgment. Below, the two implementations "judging whether the position where the robot escaped the hijacking and the position where it was hijacked belong to the same environmental area" and "judging whether the position where the robot was hijacked lies in a difficult-to-operate area" are combined and described by way of example; FIG. 2 is the corresponding flowchart. As shown in FIG. 2, the method includes:
200. After escaping the hijacking, the robot performs a relocalization operation.
201. Based on the relocalization operation, the position where the robot escaped the hijacking is determined.
202. Judge whether the position where the robot escaped the hijacking and the position where it was hijacked belong to the same environmental area; if yes, perform step 203; if no, perform step 206.
203. Judge whether the position where the robot was hijacked lies in a difficult-to-operate area; if no, perform step 204; if yes, perform step 205.
204. Determine that the robot needs to return to the position where it was hijacked to perform the task.
205. Judge whether the position where the robot escaped the hijacking lies in a difficult-to-operate area; if yes, end the operation, i.e., perform step 207; if no, perform step 206.
206. Determine that the robot needs to perform the task at the position where it escaped the hijacking.
207. End the operation.
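The flow of FIG. 2 can be expressed compactly in code. The sketch below mirrors steps 202 to 207; the map predicates are the same hypothetical helpers used in the earlier sketch and are not part of the patent.

```python
def decide_task_position(hijack_pos, escape_pos, env_map):
    """Combined decision flow of FIG. 2 (steps 202-207).

    Returns "escape" to work at the escape position, "return" to go back
    to the hijack position, or None to end the operation.
    """
    # Step 202: do the two positions belong to the same environmental area?
    if not env_map.same_environmental_area(hijack_pos, escape_pos):
        return "escape"              # step 206
    # Step 203: was the robot hijacked inside a difficult-to-operate area?
    if not env_map.in_difficult_area(hijack_pos):
        return "return"              # step 204
    # Step 205: is the escape position also inside a difficult area?
    if env_map.in_difficult_area(escape_pos):
        return None                  # step 207: end the operation
    return "escape"                  # step 206
```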
The robot control method provided in the embodiments of the present application is applicable to various types of robots, such as cleaning robots, sorting robots, shopping guide robots, and shopping cart robots, but is not limited thereto.
Different types of robots have different task execution areas. For example, for a cleaning robot, the task execution area determined in step 102 is the area to be cleaned. Correspondingly, an optional implementation of step 102 is: the cleaning robot determines the area to be cleaned according to the environmental information around the position where it escaped the hijacking.
As another example, for a sorting robot in the warehouse sorting scenarios of some companies, the task execution area determined in step 102 is the area to be sorted. Correspondingly, another optional implementation of step 102 is: the sorting robot determines the area to be sorted according to the environmental information around the position where it escaped the hijacking.
Correspondingly, for different types of robots, the specific implementation of determining the task execution area and the manner of performing the task differ with the application scenario. For example, the task performed by a cleaning robot is cleaning the floor; the task performed by a shopping cart or shopping guide robot is following customers; the task performed by a sorting robot is sorting goods or orders. Taking a cleaning robot as an example, the specific implementation of determining the area to be cleaned and of performing the cleaning task within the determined area is described below in combination with some application scenarios.
In application scenario A, tables, chairs, cabinets, and the like are placed around the position where the cleaning robot escaped the hijacking. When cleaning the area where these tables and chairs are located, the robot needs to go around the positions of the table and chair legs and clean the area between them. Based on such scenarios, an optional implementation for the cleaning robot to determine the area to be cleaned is: determining at least one physical object with a space-defining role in the environmental information around the position where the cleaning robot escaped the hijacking; taking the ground area containing the at least one physical object as the area to be cleaned; and then performing the cleaning task in that area. For this scenario, the at least one space-defining physical object may be the legs of tables and chairs, a wall, a cabinet, a pillar, a door frame, or the like, but is not limited thereto.
To describe this optional implementation more clearly, it is explained by way of example with the above scenario in which tables, chairs, and cabinets are placed around the position where the cleaning robot escaped the hijacking. First, the cleaning robot obtains the environmental information around that position and, from it, determines the positions of all table and chair legs in the environmental area; then, according to these positions, it determines the ground area containing them as the area to be cleaned. This embodiment does not limit the shape of the area to be cleaned, which may be any regular or irregular area containing all the table and chair leg positions in the area. Optionally, the area to be cleaned may be a circular area containing all the table and chair leg positions, as shown in FIG. 3a, or a rectangular area containing them, as shown in FIG. 3b, but is not limited thereto.
In application scenario B, the environmental area of the position where the cleaning robot escaped the hijacking may contain boundary-defining structures such as walls and cabinets. When cleaning these boundary positions, the robot needs to clean along the wall or cabinet edge, to avoid detecting these boundary-defining obstacles, detouring far around them, and leaving spots unswept. Based on such scenarios, an optional implementation for the cleaning robot to determine the area to be cleaned is: according to the environmental information around the position where it escaped the hijacking, determining whether its current area contains a boundary, such as a wall or cabinet; if boundary information is contained, as shown in FIG. 3c, determining a rectangular area with the distance between the robot's escape position and the boundary as any half side length, taking the rectangular area as the area to be cleaned, and then performing the cleaning task in it. Optionally, rectangular areas include square and oblong areas. In the embodiments of the present application, half of the length of one side of a rectangle is defined as a half side length.
Optionally, based on application scenario B, another optional implementation for the cleaning robot to determine the area to be cleaned is: according to the environmental information around the position where it escaped the hijacking, determining whether its current area contains a boundary, such as a wall or cabinet; if boundary information is contained, as shown in FIG. 3d, determining a circular area with the distance between the robot's escape position and the boundary as the radius, and taking the circular area as the area to be cleaned.
It is worth noting that, in the above and following embodiments, the distance between the position where the robot escaped the hijacking and the boundary may be the distance from that position to any point on the boundary; preferably, it is the perpendicular distance from that position to the boundary.
In application scenario C, the environmental area of the position where the cleaning robot escaped the hijacking may contain narrow areas such as corners formed by two walls. Based on such scenarios, an optional implementation for the cleaning robot to determine the area to be cleaned is: according to the environmental information around the position where it escaped the hijacking, determining whether its current area contains a corner, such as a corner formed by two walls or by a wall and a cabinet; if a corner is contained, determining the associated area of the corner as the area to be cleaned.
Further, as shown in FIG. 3e, a sector area may be determined with the vertex of the corner as the center and the distance between the robot's escape position and the vertex as the radius, the two edges of the corner forming the other two boundaries of the sector, and the sector area is taken as the area to be cleaned.
Optionally, based on application scenario C, another optional implementation for the cleaning robot to determine the area to be cleaned is: according to the environmental information around the position where it escaped the hijacking, determining whether its current area contains a corner, such as a corner formed by two walls or by a wall and a cabinet; if a corner is contained, as shown in FIG. 3f, determining a rectangular area with the distance from the robot's escape position to either edge of the corner as a half side length, and taking the rectangular area as the area to be cleaned. Optionally, rectangular areas include square and oblong areas. For a square area, preferably, the longest of the perpendicular distances from the robot's escape position to the edges of the corner is taken as the half side length.
In application scenario D, the environmental area of the position where the cleaning robot escaped the hijacking may be an open environment without obstacles such as walls, tables, or chairs, for example an empty room without any furniture. Based on such scenarios, the robot determines any area at its current position as the area to be cleaned. For example, as shown in FIG. 3g, the robot may determine a circular area as the area to be cleaned, with the position where it escaped the hijacking as the center and its distance to the boundary of the environmental area as the radius. As another example, the robot may take the boundary of the environmental area where it escaped the hijacking as the boundary of the area to be cleaned, using the entire environmental area as the area to be cleaned.
In yet another optional embodiment, after the cleaning robot determines the area to be cleaned, it performs the cleaning task within it, and may clean the area in a random cleaning mode or a path-planning cleaning mode. The path-planning cleaning mode, as opposed to the random cleaning mode, refers to a mode in which the cleaning route can be accurately planned, achieving planned cleaning with an orderly path and as little repetition as possible. The cleaning robot may support one or more different styles of cleaning routes, for example bow-shaped cleaning routes, arc-shaped cleaning routes, "L"-shaped cleaning routes, rectangular-loop cleaning routes, spiral fixed-point cleaning routes, and so on. In the embodiments of the present application, fixed-point cleaning with the arc-shaped route and bow-shaped cleaning with the bow-shaped route are described by way of example.
For the areas to be cleaned determined in application scenarios A, B, C, and D, the cleaning robot may select an adapted cleaning mode. Preferably, as shown in FIG. 4a, for a circular or arc-shaped area, the cleaning robot performs fixed-point cleaning of the area to be cleaned along an arc-shaped cleaning route, with its position when it escaped the hijacking as the center.
Preferably, as shown in FIG. 4b, for a rectangular area, the cleaning robot performs bow-shaped cleaning of the area to be cleaned along a bow-shaped cleaning route, starting from the position where it escaped the hijacking. It should be noted that, to show the bow-shaped cleaning route more clearly, the bow-shaped cleaning shown in FIG. 4b does not reach the boundary of the area to be cleaned; this does not mean that the cleaning manner in the present application cannot clean the boundary.
It should be noted that the steps of the method provided in the above embodiments may all be performed by the same device, or the method may be performed by different devices. For example, steps 201 and 202 may both be performed by device A; or step 201 may be performed by device A and step 202 by device B; and so on.
In addition, some of the flows described in the above embodiments and drawings contain operations that appear in a specific order, but it should be clearly understood that these operations may be performed out of the order in which they appear herein or in parallel. Sequence numbers such as 201 and 202 are merely used to distinguish different operations and do not in themselves represent any execution order. These flows may also include more or fewer operations, which may be performed sequentially or in parallel. It should be noted that descriptions such as "first" and "second" herein are used to distinguish different messages, devices, modules, and the like; they do not represent an order, nor do they limit "first" and "second" to different types.
In addition to the above method embodiments, some exemplary embodiments of the present application further provide robots to which the above methods are applicable. These are described in detail below with reference to the drawings.
FIG. 5a is a block diagram of the hardware structure of a robot according to an exemplary embodiment of the present application. As shown in FIG. 5a, the robot 500 includes a mechanical body 501 provided with one or more processors 502 and one or more memories 503 storing computer instructions. In addition, the mechanical body 501 is provided with one or more sensors 504.
It is worth noting that the one or more processors 502, the one or more memories 503, and the one or more sensors 504 may be arranged inside the mechanical body 501 or on its surface.
The mechanical body 501 is the execution mechanism of the robot 500 and can perform, in a determined environment, operations specified by the one or more processors 502. The mechanical body 501 reflects, to some extent, the appearance of the robot 500. This embodiment does not limit the robot's appearance. For example, the robot 500 may be the humanoid robot shown in FIG. 5b, in which case the mechanical body 501 may include, but is not limited to, mechanical structures such as the robot's head, hands, wrists, arms, waist, and base. The robot 500 may also be the relatively simpler non-humanoid robot shown in FIG. 5c, in which case the mechanical body 501 mainly refers to the robot's body.
It is worth noting that the mechanical body 501 is also provided with some basic components of the robot 500, such as a drive assembly, an odometer, a power supply assembly, and an audio assembly. Optionally, the drive assembly may include drive wheels, a drive motor, universal wheels, and so on. The basic components included, and their composition, differ from robot to robot; the embodiments of the present application list only some examples.
The one or more memories 503 are mainly used to store one or more computer instructions that can be executed by the one or more processors 502, causing the one or more processors 502 to control the robot 500 to implement corresponding functions and complete corresponding actions or tasks. Besides storing computer instructions, the one or more memories 503 may also be configured to store various other data to support operations on the robot 500. Examples of such data include instructions for any application or method operating on the robot 500, and the environment map corresponding to the robot's environment. The environment map may be one or more maps corresponding to the entire environment stored in advance, or a partial map previously under construction.
The one or more memories 503 may be implemented by any type of volatile or non-volatile storage device or a combination thereof, such as static random access memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, or a magnetic or optical disk.
The one or more processors 502 may be regarded as the control system of the robot 500 and may execute the computer instructions stored in the one or more memories 503 to control the robot 500 to implement corresponding functions and complete corresponding actions or tasks. It is worth noting that, when the robot 500 is in different scenarios, the functions to be implemented and the actions or tasks to be completed differ; correspondingly, the computer instructions stored in the one or more memories 503 also differ, and by executing different computer instructions the one or more processors 502 can control the robot 500 to implement different functions and complete different actions or tasks.
In this embodiment, the one or more sensors 504 on the robot 500 can assist in completing navigation, localization, and relocalization of the robot 500. The one or more sensors 504 may include visual sensors, laser sensors, contact sensors, reflective optocouplers, inertial sensors, and the like, but are not limited thereto.
The visual sensor can be regarded as the "eyes" of the robot 500 and is mainly used to collect images of the robot's surroundings, which may be called environment images. The visual sensor may be implemented by any device with an image-collecting function, such as a camera.
The laser sensor is a radar system that collects environmental information around the robot 500 by emitting laser beams. The environment data collected by the laser sensor may include, but is not limited to, the distances and angles of objects around the robot 500. The laser sensor may be implemented by any device capable of emitting laser beams, such as a lidar.
In this embodiment, the robot 500 can move autonomously and can perform certain work tasks on that basis. For example, in shopping scenarios such as supermarkets and shopping malls, a shopping cart robot needs to follow a customer to hold the goods the customer selects. As another example, in the warehouse sorting scenarios of some companies, a sorting robot needs to follow a sorter to a shelf picking area and then begin sorting ordered goods. As yet another example, in a household cleaning scenario, a sweeping robot needs to clean the living room, bedrooms, kitchen, and other areas. In these application scenarios, the robot 500 performs the corresponding work task while moving autonomously. In practical applications, however, the robot 500 may run into operating difficulties while performing the task, such as being trapped, repeatedly detouring, or becoming entangled. In such cases, the user generally moves or drags the robot 500 to another position so that it continues the work task from there.
In addition, in this embodiment, the robot 500 needs to be charged when its battery is low. The robot 500 may autonomously move to the position of a charging pile to charge when its battery is low, or it may warn the user so that the user moves the robot 500 to the charging pile for charging. When charging is complete, the user moves or drags the robot 500 to some position other than the charging pile so that it continues the work task from there.
From the above application scenarios it can be seen that, in practical use, the robot 500 may be hijacked, for example moved, lifted, or dragged to another position as described above. In these cases, the robot 500 triggers a relocalization operation because previous position information is missing or lost, i.e., it re-determines the pose of the robot 500, where the pose includes the robot's position and orientation.
In most cases, the robot 500 is hijacked because the position where it was hijacked is no longer suitable for continuing the task, and the robot 500 needs to perform the task at a new position. Based on this analysis, in this embodiment the one or more processors 502 may determine, based on the relocalization operation, the position where the robot escaped the hijacking; collect, through the one or more sensors 504, environmental information around that position; determine a task execution area according to the obtained environmental information; and then control the robot 500 to perform a task within that area. In this way, the task execution area can be flexibly determined according to the environment the robot is in when it escapes the hijacking, without making the robot return to the position where it was hijacked to continue the task, thereby adapting to local conditions and satisfying user needs as far as possible.
Optionally, the one or more sensors 504 may include contact sensors, reflective optocouplers, inertial sensors, and the like, but not limited thereto, for detecting whether the above hijacking occurs. For the arrangement positions of the contact sensors, reflective optocouplers, and inertial sensors on the robot 500 and their specific working principles, reference may be made to the relevant description in the above method embodiments, which is not repeated here.
In an optional embodiment, the one or more processors 502 may perform the relocalization operation at the position where the robot 500 escaped the hijacking and can accurately locate the pose of the robot 500 in the stored environment map at that position. The pose here includes the robot's position and orientation in the stored environment map. Based on this, when determining the position where the robot 500 escaped the hijacking, the one or more processors 502 are specifically configured to: when recognizing that the robot 500 has escaped the hijacking, collect, through the one or more sensors 504, environmental information around the robot's current position; locate, according to this environmental information, the pose of the robot 500 in the stored environment map; and take the position in that pose as the position where the robot 500 escaped the hijacking, that is, the robot's current position in the stored environment map.
In another optional embodiment, the one or more processors 502 cannot accurately locate the pose of the robot 500 in the stored environment map at the position where it escaped the hijacking, and therefore control the robot 500 to move from that position toward another position, continually performing relocalization during the movement according to the latest surrounding environmental information collected by the one or more sensors 504, until the pose of the robot 500 in the stored environment map is accurately located. Based on this, when determining the position where the robot 500 escaped the hijacking, the one or more processors 502 are specifically configured to: when recognizing that the robot 500 has escaped the hijacking, control the robot 500 to move from its current position toward the second position and locate the robot's pose in the stored environment map during the movement; and determine, according to the position in that pose and the data obtained during the movement of the robot 500, the position where the robot 500 started moving as the position where it escaped the hijacking. For the specific implementation of determining the position where the robot 500 escaped the hijacking, reference may be made to the relevant description in the above method embodiments, which is not repeated here.
Optionally, the environment map may include at least one of a visual map and a grid map. For specific descriptions of the visual map and the grid map, reference may be made to the relevant content in the above method embodiments, which is not repeated here.
If the one or more sensors 504 include a visual sensor, the visual sensor collects environment images around the position where the robot 500 escaped the hijacking. Correspondingly, the task execution area may be determined according to these environment images.
If the one or more sensors 504 include a laser sensor, the laser sensor collects environment data around the position where the robot 500 escaped the hijacking. Correspondingly, the one or more processors 502 may determine the task execution area according to this environment data.
If the one or more sensors 504 include both a visual sensor and a laser sensor, the visual sensor collects environment images and the laser sensor collects environment data around the position where the robot 500 escaped the hijacking. Correspondingly, the one or more processors 502 may determine the task execution area according to both the environment images and the environment data.
Although in most cases the robot 500 needs to perform the task from a new position after being hijacked, in some application scenarios the robot 500, after relocalizing its pose, may need to return to the position where it was hijacked to continue the task that was unfinished before the hijacking. That is, depending on how the robot 500 was hijacked, its behavior after relocalizing its pose also differs. To control this behavior more accurately, in some optional embodiments the one or more processors 502 may determine, according to the difference between the position where the robot 500 escaped the hijacking and the position where it was hijacked, whether the robot 500 needs to perform the task at the position where it escaped the hijacking. If so, the operation of determining the task execution area of the robot 500 and subsequent operations are performed, so that the robot 500 is controlled to perform the task starting from the position where it escaped the hijacking.
Based on application scenario 1 above, when determining whether the robot needs to perform the task at the position where it escaped the hijacking, the one or more processors 502 are specifically configured to: judge whether the position where the robot 500 escaped the hijacking and the position where it was hijacked belong to the same environmental area; if they belong to different environmental areas, determine that the robot 500 needs to perform the task at the position where it escaped the hijacking.
Optionally, the position where the robot 500 was hijacked may be determined according to the environmental information collected by the one or more sensors 504 the last time, or during the most recent period, before the robot 500 was hijacked.
Based on application scenario 2, when determining whether the robot needs to perform the task at the position where it escaped the hijacking, the one or more processors 502 are specifically configured to: judge whether the position where the robot 500 was hijacked lies in a difficult-to-operate area; if that position lies in a difficult-to-operate area and the position where the robot 500 escaped the hijacking lies outside it, determine that the robot 500 needs to perform the task at the position where it escaped the hijacking. Correspondingly, if the position where the robot 500 was hijacked lies outside the difficult-to-operate area, it is determined that the robot 500 needs to return to the position where it was hijacked and continue the work task.
For determining the difficult-to-operate area, reference may be made to the relevant description in the above method embodiments, which is not repeated here.
Based on application scenario 3 above, when determining whether the robot needs to perform the task at the position where it escaped the hijacking, the one or more processors 502 are specifically configured to: judge whether the position where the robot 500 was hijacked is the charging pile position; if so, and the position where the robot 500 escaped the hijacking is a non-charging-pile position, determine that the robot 500 needs to perform the task at the position where it escaped the hijacking.
For judging whether the robot 500 is at the charging pile position, reference may be made to the relevant content in the above method embodiments, which is not repeated here.
The robot provided in the embodiments of the present application may be any of various types of robots, such as a cleaning robot, a sorting robot, a shopping guide robot, or a shopping cart robot, but is not limited thereto.
Different types of robots have different task execution areas. For example, for a cleaning robot, the task execution area is the area to be cleaned. Correspondingly, when determining the task execution area of the robot, the one or more processors 502 on the cleaning robot are specifically configured to determine the area to be cleaned according to the environmental information around the position where the cleaning robot escaped the hijacking.
Correspondingly, for different types of robots, the specific implementation by which the one or more processors 502 determine the corresponding task execution area and the manner of performing the task differ with the application scenario. For example, the task performed by a cleaning robot is cleaning the floor; the task performed by a shopping cart or shopping guide robot is following customers; the task performed by a sorting robot is sorting goods or orders. Taking a cleaning robot as an example, the specific implementation by which the one or more processors 502 determine the area to be cleaned and control the cleaning robot to perform the cleaning task within the determined area is described below in combination with some application scenarios.
Based on application scenario A above, when determining the area to be cleaned, the one or more processors 502 are specifically configured to: determine at least one physical object with a space-defining role in the environmental information around the position where the cleaning robot escaped the hijacking; take the ground area containing the at least one physical object as the area to be cleaned; and then control the cleaning robot to perform the cleaning task in that area. For this scenario, the at least one physical object that defines the area to be cleaned may be the legs of tables and chairs, a wall, a cabinet, a pillar, a door frame, or the like, but is not limited thereto.
Based on application scenario B above, when determining the area to be cleaned, the one or more processors 502 are specifically configured to: according to the environmental information around the position where the cleaning robot escaped the hijacking, determine whether the robot's current area contains a boundary, such as a wall or cabinet; if boundary information is contained, determine a rectangular area with the distance between the robot's escape position and the boundary as any half side length, take the rectangular area as the area to be cleaned, and then control the cleaning robot to perform the cleaning task in it.
Optionally, based on application scenario B above, when determining the area to be cleaned, the one or more processors 502 are specifically configured to: according to the environmental information around the position where the cleaning robot escaped the hijacking, determine whether the robot's current area contains a boundary, such as a wall or cabinet; if boundary information is contained, determine a circular area with the distance between the robot's escape position and the boundary as the radius, and take the circular area as the area to be cleaned.
Based on application scenario C above, when determining the area to be cleaned, the one or more processors 502 are specifically configured to: according to the environmental information around the position where the cleaning robot escaped the hijacking, determine whether the robot's current area contains a corner, such as a corner formed by two walls or by a wall and a cabinet; if a corner is contained, determine the associated area of the corner as the area to be cleaned.
Further, a sector area may be determined with the vertex of the corner as the center and the distance between the robot's escape position and the vertex as the radius, the two edges of the corner forming the other two boundaries of the sector, and the sector area is taken as the area to be cleaned.
Optionally, based on application scenario C above, when determining the area to be cleaned, the one or more processors 502 are specifically configured to: according to the environmental information around the position where the cleaning robot escaped the hijacking, determine whether the robot's current area contains a corner, such as a corner formed by two walls or by a wall and a cabinet; if a corner is contained, determine a rectangular area with the distance from the robot's escape position to either edge of the corner as a half side length, and take the rectangular area as the area to be cleaned.
Based on application scenario D above, any area at the cleaning robot's current position is determined as the area to be cleaned. For example, a circular area may be determined as the area to be cleaned, with the position where the cleaning robot escaped the hijacking as the center and its distance to the boundary of the environmental area as the radius. As another example, the boundary of the environmental area where the cleaning robot escaped the hijacking may be taken as the boundary of the area to be cleaned, using the entire environmental area as the area to be cleaned.
In yet another optional embodiment, after the area to be cleaned is determined, the one or more processors 502 control the cleaning robot to perform the cleaning task within it, and may control the cleaning robot to clean the area in a random cleaning mode or a path-planning cleaning mode. The cleaning robot may support one or more different styles of cleaning routes, for example bow-shaped cleaning routes, arc-shaped cleaning routes, "L"-shaped cleaning routes, rectangular-loop cleaning routes, spiral fixed-point cleaning routes, and so on. In the embodiments of the present application, fixed-point cleaning with the arc-shaped route and bow-shaped cleaning with the bow-shaped route are described by way of example.
For the areas to be cleaned determined in application scenarios A, B, C, and D, the one or more processors 502 may select an adapted cleaning mode to control the cleaning robot. Preferably, as shown in FIG. 4a, for a circular or arc-shaped area, when controlling the robot to perform the task within the task execution area, the one or more processors 502 are specifically configured to: control the cleaning robot to perform fixed-point cleaning of the area to be cleaned along an arc-shaped cleaning route, with its position when it escaped the hijacking as the center.
Preferably, as shown in FIG. 4b, for a rectangular area, when controlling the robot to perform the task within the task execution area, the one or more processors 502 are specifically configured to: control the cleaning robot to perform bow-shaped cleaning of the area to be cleaned along a bow-shaped cleaning route, starting from the position where it escaped the hijacking.
Correspondingly, an embodiment of the present application further provides a computer-readable storage medium storing computer instructions which, when executed by one or more processors, cause the one or more processors to perform actions comprising:
determining, based on a robot relocalization operation, the position where the robot escaped the hijacking; determining a task execution area according to the environmental information around that position; and controlling the robot to perform a task within the task execution area.
In an optional implementation, the action of determining the position where the robot escaped the hijacking includes: when recognizing that the robot has escaped the hijacking, obtaining environmental information around the robot's current position; locating, according to this environmental information, the robot's pose in the stored environment map; and taking the position in that pose as the position where the robot escaped the hijacking.
In an optional implementation, the action of determining the position where the robot escaped the hijacking further includes: when recognizing that the robot has escaped the hijacking, controlling the robot to move from its current position toward the second position and locating the robot's pose in the stored environment map during the movement; and determining, according to the position in that pose and the data obtained during the movement, the position where the robot started moving as the position where it escaped the hijacking.
In an optional embodiment, the actions performed by the one or more processors further include: determining, according to the difference between the position where the robot escaped the hijacking and the position where it was hijacked, that the robot needs to perform the task at the position where it escaped the hijacking.
In an optional embodiment, the action of determining that the robot needs to perform the task at the position where it escaped the hijacking includes: if the position where the robot escaped the hijacking and the position where it was hijacked belong to different environmental areas, determining that the robot needs to perform the task at the position where it escaped the hijacking.
In an optional embodiment, the action of determining that the robot needs to perform the task at the position where it escaped the hijacking further includes: if the position where the robot was hijacked lies in a difficult-to-operate area but the position where the robot escaped the hijacking lies outside it, determining that the robot needs to perform the task at the position where it escaped the hijacking.
In an optional embodiment, the action of determining that the robot needs to perform the task at the position where it escaped the hijacking further includes: if the position where the robot was hijacked is the charging pile position but the position where the robot escaped the hijacking is a non-charging-pile position, determining that the robot needs to perform the task at the position where it escaped the hijacking.
The readable storage medium provided in this embodiment is applicable to various types of robots, such as cleaning robots, sorting robots, shopping guide robots, and shopping cart robots, but is not limited thereto.
For different types of robots, the instructions stored in the readable storage medium for determining the task execution area and performing the task differ, so that the one or more processors perform different work tasks when executing these instructions. A cleaning robot is taken below as an illustrative example.
In an optional embodiment, for a cleaning robot, the action of determining the task execution area includes: determining the area to be cleaned according to the environmental information around the position where the cleaning robot escaped the hijacking.
In an optional embodiment, the action of determining the area to be cleaned includes: determining at least one physical object with a space-defining role in the environmental information around the position where the cleaning robot escaped the hijacking; and taking the ground area containing the at least one physical object as the area to be cleaned.
In an optional embodiment, the action of determining the area to be cleaned further includes: identifying whether the environmental information around the position where the cleaning robot escaped the hijacking contains a corner; if so, determining the associated area of the corner as the area to be cleaned.
In an optional embodiment, the action of determining the associated area of the corner as the area to be cleaned includes: determining a sector area as the area to be cleaned, with the vertex of the corner as the center and the distance from the cleaning robot's escape position to the vertex as the radius; or determining a rectangular area as the area to be cleaned, with the distance from the cleaning robot's escape position to either edge of the corner as a half side length.
In an optional embodiment, for a cleaning robot, the action of controlling the robot to perform the task within the task execution area includes: controlling the cleaning robot to perform fixed-point cleaning of the area to be cleaned with the position where it escaped the hijacking as the center; or controlling the cleaning robot to perform bow-shaped cleaning of the area to be cleaned starting from the position where it escaped the hijacking.
Those skilled in the art should understand that the embodiments of the present invention may be provided as a method, a system, or a computer program product. Therefore, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware aspects. Moreover, the present invention may take the form of a computer program product implemented on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, and optical storage) containing computer-usable program code.
The present invention is described with reference to flowcharts and/or block diagrams of methods, devices (systems), and computer program products according to the embodiments of the present invention. It should be understood that each flow and/or block in the flowcharts and/or block diagrams, and combinations thereof, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general-purpose computer, a special-purpose computer, an embedded processor, or another programmable data processing device to produce a machine, so that the instructions executed by the processor of the computer or other programmable data processing device produce an apparatus for implementing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
These computer program instructions may also be stored in a computer-readable memory capable of directing a computer or other programmable data processing device to operate in a particular manner, so that the instructions stored in the computer-readable memory produce an article of manufacture including an instruction apparatus that implements the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
These computer program instructions may also be loaded onto a computer or other programmable data processing device, so that a series of operational steps are performed on the computer or other programmable device to produce computer-implemented processing, whereby the instructions executed on the computer or other programmable device provide steps for implementing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
In a typical configuration, a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
The memory may include non-persistent memory, random access memory (RAM), and/or non-volatile memory in computer-readable media, such as read-only memory (ROM) or flash memory (flash RAM). Memory is an example of a computer-readable medium.
Computer-readable media include permanent and non-permanent, removable and non-removable media, and may store information by any method or technology. Information may be computer-readable instructions, data structures, program modules, or other data. Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technologies, compact disc read-only memory (CD-ROM), digital versatile disc (DVD) or other optical storage, magnetic cassettes, magnetic tape/disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information accessible by a computing device. As defined herein, computer-readable media do not include transitory media, such as modulated data signals and carrier waves.
It should also be noted that the terms "comprise", "include", or any other variants thereof are intended to cover non-exclusive inclusion, so that a process, method, article, or device that includes a series of elements includes not only those elements but also other elements not expressly listed, or elements inherent to such process, method, article, or device. Without further limitation, an element defined by the phrase "comprising a ..." does not exclude the existence of other identical elements in the process, method, article, or device that includes the element.
The above descriptions are merely embodiments of the present application and are not intended to limit it. Various modifications and changes may be made to the present application by those skilled in the art. Any modification, equivalent replacement, improvement, and the like made within the spirit and principles of the present application shall be included within the scope of the claims of the present application.
Claims (21)
- A robot control method, characterized by comprising: determining, by a robot based on a relocalization operation, the position of the robot when it escaped a hijacking; determining, by the robot, a task execution area according to environmental information around the position where it escaped the hijacking; and performing, by the robot, a task within the task execution area.
- The method according to claim 1, characterized in that the determining, by the robot based on a relocalization operation, the position of the robot when it escaped the hijacking comprises: collecting, when the robot recognizes that it has escaped the hijacking, environmental information around its current position; and locating, according to the environmental information around the current position, the pose of the robot in a stored environment map, and taking the position in the pose as the position where the robot escaped the hijacking.
- The method according to claim 1, characterized in that the determining, by the robot based on a relocalization operation, the position of the robot when it escaped the hijacking comprises: moving, when the robot recognizes that it has escaped the hijacking, from its current position toward a second position, and locating the pose of the robot in a stored environment map during the movement; and determining, according to the position in the pose and the data obtained during the movement, the position where the robot started moving as the position where the robot escaped the hijacking.
- The method according to claim 1, characterized by further comprising, before the robot determines the task execution area according to the environmental information around the position where it escaped the hijacking: determining, according to the difference between the position where the robot escaped the hijacking and the position where it was hijacked, that the robot needs to perform a task at the position where it escaped the hijacking.
- The method according to claim 4, characterized in that the determining, according to the difference between the position where the robot escaped the hijacking and the position where it was hijacked, that the robot needs to perform a task at the position where it escaped the hijacking comprises at least one of the following cases: if the position where the robot escaped the hijacking and the position where it was hijacked belong to different environmental areas, determining that the robot needs to perform a task at the position where it escaped the hijacking; if the position where the robot was hijacked lies in an area in which the robot has difficulty operating, but the position where the robot escaped the hijacking lies outside that area, determining that the robot needs to perform a task at the position where it escaped the hijacking; and if the position where the robot was hijacked is a charging pile position, but the position where the robot escaped the hijacking is a non-charging-pile position, determining that the robot needs to perform a task at the position where it escaped the hijacking.
- The method according to any one of claims 1-5, characterized in that the robot is a cleaning robot, and the determining, by the robot, a task execution area according to the environmental information around the position where it escaped the hijacking comprises: determining, by the cleaning robot, an area to be cleaned according to the environmental information around the position where it escaped the hijacking.
- The method according to claim 6, characterized in that the determining, by the cleaning robot, the area to be cleaned according to the environmental information around the position where it escaped the hijacking comprises: determining at least one physical object with a space-defining role in the environmental information around the position where the cleaning robot escaped the hijacking; and taking a ground area containing the at least one physical object as the area to be cleaned.
- The method according to claim 6, characterized in that the determining, by the cleaning robot, the area to be cleaned according to the environmental information around the position where it escaped the hijacking further comprises: identifying whether the environmental information around the position where the cleaning robot escaped the hijacking contains a corner; and if so, determining an associated area of the corner as the area to be cleaned.
- The method according to claim 8, characterized in that the determining the associated area of the corner as the area to be cleaned comprises: determining a sector area as the area to be cleaned, with the vertex of the corner as the center and the distance from the position where the cleaning robot escaped the hijacking to the vertex as the radius; or determining a rectangular area as the area to be cleaned, with the distance from the position where the cleaning robot escaped the hijacking to either edge of the corner as a half side length.
- The method according to claim 6, characterized in that the performing, by the robot, a task within the task execution area comprises: performing, by the cleaning robot, fixed-point cleaning of the area to be cleaned with the position where it escaped the hijacking as the center; or performing, by the cleaning robot, bow-shaped cleaning of the area to be cleaned starting from the position where it escaped the hijacking.
- A robot, characterized by comprising: a mechanical body provided with one or more sensors, one or more processors, and one or more memories storing computer instructions; the one or more processors being configured to execute the computer instructions so as to: determine, based on a relocalization operation, the position where the robot escaped a hijacking; collect, through the one or more sensors, environmental information around the position where the robot escaped the hijacking; determine a task execution area according to the environmental information around the position where the robot escaped the hijacking; and control the robot to perform a task within the task execution area.
- The robot according to claim 11, characterized in that, when determining the position where the robot escaped the hijacking, the one or more processors are specifically configured to: collect, through the one or more sensors when recognizing that the robot has escaped the hijacking, environmental information around the robot's current position; and locate, according to the environmental information around the current position, the pose of the robot in a stored environment map, and take the position in the pose as the position where the robot escaped the hijacking.
- The robot according to claim 11, characterized in that, when determining the position where the robot escaped the hijacking, the one or more processors are specifically configured to: control, when recognizing that the robot has escaped the hijacking, the robot to move from its current position toward a second position, and locate the pose of the robot in a stored environment map during the movement; and determine, according to the position in the pose and the data obtained during the movement, the position where the robot started moving as the position where the robot escaped the hijacking.
- The robot according to claim 11, characterized in that, when determining the task execution area, the one or more processors are specifically configured to: determine, according to the difference between the position where the robot escaped the hijacking and the position where it was hijacked, that the robot needs to perform a task at the position where it escaped the hijacking.
- The robot according to claim 14, characterized in that, when determining that the robot needs to perform a task at the position where it escaped the hijacking, the one or more processors are specifically configured to perform one of the following operations: if the position where the robot escaped the hijacking and the position where it was hijacked belong to different environmental areas, determine that the robot needs to perform a task at the position where it escaped the hijacking; if the position where the robot was hijacked lies in an area in which the robot has difficulty operating, but the position where the robot escaped the hijacking lies outside that area, determine that the robot needs to perform a task at the position where it escaped the hijacking; and if the position where the robot was hijacked is a charging pile position, but the position where the robot escaped the hijacking is a non-charging-pile position, determine that the robot needs to perform a task at the position where it escaped the hijacking.
- The robot according to any one of claims 11-15, characterized in that the robot is a cleaning robot, and when determining the task execution area the one or more processors are specifically configured to: determine an area to be cleaned according to the environmental information around the position where the cleaning robot escaped the hijacking.
- The robot according to claim 16, characterized in that, when determining the area to be cleaned, the one or more processors are specifically configured to: determine at least one physical object with a space-defining role in the environmental information around the position where the cleaning robot escaped the hijacking; and take a ground area containing the at least one physical object as the area to be cleaned.
- The robot according to claim 16, characterized in that, when determining the area to be cleaned, the one or more processors are specifically configured to: identify whether the environmental information around the position where the cleaning robot escaped the hijacking contains a corner; and if so, determine an associated area of the corner as the area to be cleaned.
- The robot according to claim 18, characterized in that, when determining the associated area of the corner as the area to be cleaned, the one or more processors are specifically configured to: determine a sector area as the area to be cleaned, with the vertex of the corner as the center and the distance from the position where the cleaning robot escaped the hijacking to the vertex as the radius; or determine a rectangular area as the area to be cleaned, with the distance from the position where the cleaning robot escaped the hijacking to either edge of the corner as a half side length.
- The robot according to claim 16, characterized in that, when controlling the robot to perform a task within the task execution area, the one or more processors are specifically configured to: control the cleaning robot to perform fixed-point cleaning of the area to be cleaned with the position where it escaped the hijacking as the center; or control the cleaning robot to perform bow-shaped cleaning of the area to be cleaned starting from the position where it escaped the hijacking.
- A computer-readable storage medium storing computer instructions, characterized in that, when the computer instructions are executed by one or more processors, the one or more processors are caused to perform actions comprising: determining, based on a robot relocalization operation, the position where the robot escaped a hijacking; determining a task execution area according to the environmental information around the position where the robot escaped the hijacking; and controlling the robot to perform a task within the task execution area.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
ES19836995T ES2966385T3 (es) | 2018-07-19 | 2019-07-08 | Procedimiento de control de robot, robot y medio de almacenamiento |
EP23189753.9A EP4252973A3 (en) | 2018-07-19 | 2019-07-08 | Robot control method, robot and storage medium |
EP19836995.1A EP3825070B1 (en) | 2018-07-19 | 2019-07-08 | Robot control method, robot and storage medium |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201810797893.5 | 2018-07-19 | ||
CN201810797893.5A CN110733033B (zh) | 2018-07-19 | 2018-07-19 | 机器人控制方法、机器人及存储介质 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2020015548A1 true WO2020015548A1 (zh) | 2020-01-23 |
Family
ID=69162326
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2019/095146 WO2020015548A1 (zh) | 2018-07-19 | 2019-07-08 | 机器人控制方法、机器人及存储介质 |
Country Status (5)
Country | Link |
---|---|
US (3) | US11534916B2 (zh) |
EP (2) | EP3825070B1 (zh) |
CN (2) | CN110733033B (zh) |
ES (1) | ES2966385T3 (zh) |
WO (1) | WO2020015548A1 (zh) |
Families Citing this family (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20210099217A (ko) * | 2019-01-03 | 2021-08-12 | 엘지전자 주식회사 | 로봇 시스템의 제어 방법 |
CN111168685B (zh) * | 2020-02-17 | 2021-06-18 | 上海高仙自动化科技发展有限公司 | 机器人控制方法、机器人和可读存储介质 |
CN114443264B (zh) * | 2020-11-05 | 2023-06-09 | 珠海一微半导体股份有限公司 | 一种基于硬件加速的激光重定位系统及芯片 |
CN113359769B (zh) * | 2021-07-06 | 2022-08-09 | 广东省科学院智能制造研究所 | 室内自主移动机器人复合导航方法及装置 |
CN113985882B (zh) * | 2021-10-29 | 2024-02-27 | 珠海格力电器股份有限公司 | 作业路径规划方法、装置、电子设备和存储介质 |
CN114833827B (zh) * | 2022-04-20 | 2023-12-26 | 深圳模德宝科技有限公司 | 零件加工处理的方法及相关装置 |
CN115446834B (zh) * | 2022-09-01 | 2024-05-28 | 西南交通大学 | 一种基于占据栅格配准的车底巡检机器人单轴重定位方法 |
CN118356120B (zh) * | 2024-06-19 | 2024-09-24 | 追觅创新科技(苏州)有限公司 | 清洁机器人的控制方法 |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20050065655A1 (en) * | 2003-09-16 | 2005-03-24 | Samsung Electronics Co., Ltd. | Apparatus and method for estimating a position and an orientation of a mobile robot |
CN101089556A (zh) * | 2006-06-16 | 2007-12-19 | 三星电子株式会社 | 移动设备、用于补偿该移动设备的位置的方法和介质 |
WO2013071190A1 (en) * | 2011-11-11 | 2013-05-16 | Evolution Robotics, Inc. | Scaling vector field slam to large environments |
CN104180799A (zh) * | 2014-07-15 | 2014-12-03 | 东北大学 | 一种基于自适应蒙特卡罗定位的机器人定位方法 |
CN107969995A (zh) * | 2017-11-27 | 2018-05-01 | 深圳市沃特沃德股份有限公司 | 视觉扫地机器人及其重定位方法 |
CN108072370A (zh) * | 2016-11-18 | 2018-05-25 | 中国科学院电子学研究所 | 基于全局地图的机器人导航方法及用该方法导航的机器人 |
Family Cites Families (22)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5942869A (en) * | 1997-02-13 | 1999-08-24 | Honda Giken Kogyo Kabushiki Kaisha | Mobile robot control device |
JP2005211365A (ja) * | 2004-01-30 | 2005-08-11 | Funai Electric Co Ltd | 自律走行ロボットクリーナー |
KR100560966B1 (ko) * | 2004-10-12 | 2006-03-15 | 삼성광주전자 주식회사 | 로봇 청소기의 자이로 센서 보정방법 |
KR101484940B1 (ko) | 2009-05-14 | 2015-01-22 | 삼성전자 주식회사 | 로봇청소기 및 그 제어방법 |
US8706297B2 (en) * | 2009-06-18 | 2014-04-22 | Michael Todd Letsky | Method for establishing a desired area of confinement for an autonomous robot and autonomous robot implementing a control system for executing the same |
KR20120044768A (ko) | 2010-10-28 | 2012-05-08 | 엘지전자 주식회사 | 로봇 청소기 및 이의 제어 방법 |
KR101731968B1 (ko) * | 2010-11-01 | 2017-05-04 | 삼성전자주식회사 | 로봇의 리로케이션 장치 및 방법 |
JP6638988B2 (ja) * | 2013-12-19 | 2020-02-05 | アクチエボラゲット エレクトロルックス | サイドブラシを有し、渦巻きパターンで動くロボットバキュームクリーナ |
KR102328252B1 (ko) | 2015-02-13 | 2021-11-19 | 삼성전자주식회사 | 청소 로봇 및 그 제어방법 |
KR102393921B1 (ko) | 2015-05-12 | 2022-05-04 | 삼성전자주식회사 | 로봇 및 그의 제어 방법 |
DE102015114883A1 (de) * | 2015-09-04 | 2017-03-09 | RobArt GmbH | Identifizierung und Lokalisierung einer Basisstation eines autonomen mobilen Roboters |
DE102015119865B4 (de) * | 2015-11-17 | 2023-12-21 | RobArt GmbH | Robotergestützte Bearbeitung einer Oberfläche mittels eines Roboters |
CN107037806B (zh) | 2016-02-04 | 2020-11-27 | 科沃斯机器人股份有限公司 | 自移动机器人重新定位方法及采用该方法的自移动机器人 |
DE102016102644A1 (de) * | 2016-02-15 | 2017-08-17 | RobArt GmbH | Verfahren zur Steuerung eines autonomen mobilen Roboters |
WO2017200343A1 (ko) * | 2016-05-20 | 2017-11-23 | 엘지전자 주식회사 | 로봇 청소기 |
EP3494446B1 (de) * | 2016-08-05 | 2023-09-13 | Robart GmbH | Verfahren und vorrichtung zur steuerung eines autonomen mobilen roboters |
JP6831213B2 (ja) | 2016-11-09 | 2021-02-17 | 東芝ライフスタイル株式会社 | 電気掃除機 |
CN106679647A (zh) | 2016-12-02 | 2017-05-17 | 北京贝虎机器人技术有限公司 | 用于初始化自主移动式设备位姿的方法及装置 |
CN106708037A (zh) * | 2016-12-05 | 2017-05-24 | 北京贝虎机器人技术有限公司 | 自主移动式设备定位的方法、装置及自主移动式设备 |
EP3974934A1 (de) | 2017-03-02 | 2022-03-30 | Robart GmbH | Verfahren zur steuerung eines autonomen, mobilen roboters |
CN108052101B (zh) * | 2017-12-06 | 2021-12-21 | 北京奇虎科技有限公司 | 机器人的重定位方法及装置 |
US10575699B2 (en) * | 2018-01-05 | 2020-03-03 | Irobot Corporation | System for spot cleaning by a mobile robot |
-
2018
- 2018-07-19 CN CN201810797893.5A patent/CN110733033B/zh active Active
- 2018-07-19 CN CN202310244057.5A patent/CN116509280A/zh active Pending
-
2019
- 2019-07-08 EP EP19836995.1A patent/EP3825070B1/en active Active
- 2019-07-08 WO PCT/CN2019/095146 patent/WO2020015548A1/zh active Application Filing
- 2019-07-08 EP EP23189753.9A patent/EP4252973A3/en active Pending
- 2019-07-08 ES ES19836995T patent/ES2966385T3/es active Active
- 2019-07-16 US US16/513,341 patent/US11534916B2/en active Active
-
2022
- 2022-11-04 US US17/981,226 patent/US11850753B2/en active Active
-
2023
- 2023-11-08 US US18/388,105 patent/US20240066697A1/en active Pending
Patent Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20050065655A1 (en) * | 2003-09-16 | 2005-03-24 | Samsung Electronics Co., Ltd. | Apparatus and method for estimating a position and an orientation of a mobile robot |
CN101089556A (zh) * | 2006-06-16 | 2007-12-19 | 三星电子株式会社 | 移动设备、用于补偿该移动设备的位置的方法和介质 |
WO2013071190A1 (en) * | 2011-11-11 | 2013-05-16 | Evolution Robotics, Inc. | Scaling vector field slam to large environments |
CN104180799A (zh) * | 2014-07-15 | 2014-12-03 | 东北大学 | 一种基于自适应蒙特卡罗定位的机器人定位方法 |
CN108072370A (zh) * | 2016-11-18 | 2018-05-25 | 中国科学院电子学研究所 | 基于全局地图的机器人导航方法及用该方法导航的机器人 |
CN107969995A (zh) * | 2017-11-27 | 2018-05-01 | 深圳市沃特沃德股份有限公司 | 视觉扫地机器人及其重定位方法 |
Also Published As
Publication number | Publication date |
---|---|
EP4252973A3 (en) | 2023-11-22 |
US20200023517A1 (en) | 2020-01-23 |
EP3825070A1 (en) | 2021-05-26 |
EP3825070A4 (en) | 2021-11-24 |
CN116509280A (zh) | 2023-08-01 |
US11850753B2 (en) | 2023-12-26 |
US20240066697A1 (en) | 2024-02-29 |
US20230056758A1 (en) | 2023-02-23 |
ES2966385T3 (es) | 2024-04-22 |
US11534916B2 (en) | 2022-12-27 |
CN110733033B (zh) | 2023-03-24 |
CN110733033A (zh) | 2020-01-31 |
EP4252973A2 (en) | 2023-10-04 |
EP3825070B1 (en) | 2023-09-20 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2020015548A1 (zh) | 机器人控制方法、机器人及存储介质 | |
WO2019237990A1 (zh) | 机器人定位方法、机器人及存储介质 | |
EP3729226B1 (en) | Semantic obstacle recognition for path planning | |
JP6979961B2 (ja) | 自律移動ロボットを制御するための方法 | |
US20200319640A1 (en) | Method for navigation of a robot | |
JP2020532018A (ja) | 自律移動ロボットの移動計画 | |
CN110709790A (zh) | 用于控制自主移动机器人的方法 | |
CN110088704A (zh) | 控制清洁设备的方法 | |
WO2021143543A1 (zh) | 机器人及其控制方法 | |
US20170371341A1 (en) | Method and apparatus for controlling a robotic cleaning device for intensive cleaning | |
CN113126632B (zh) | 虚拟墙划定和作业方法、设备及存储介质 | |
CN111721280B (zh) | 一种区域识别方法、自移动设备及存储介质 | |
WO2020215945A1 (zh) | 清洁方法、擦窗机器人及存储介质 | |
CN107450557A (zh) | 一种基于云端记忆的扫地机器人寻路方法 | |
CN109947094B (zh) | 行进方法、自移动设备及存储介质 | |
CN111714033B (zh) | 一种机器人控制方法、机器人及存储介质 | |
CN118550305B (zh) | 一种机器人沿边清洁的控制方法、机器人及存储介质 | |
WO2024145776A1 (zh) | 扫地机器人的控制方法、装置、扫地机器人、系统及存储介质 | |
CN117835884A (zh) | 机器人的控制方法、控制机器人回基座的方法、装置及机器人 |
Legal Events
Date | Code | Title | Description
---|---|---|---
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 19836995; Country of ref document: EP; Kind code of ref document: A1
| NENP | Non-entry into the national phase | Ref country code: DE
| WWE | Wipo information: entry into national phase | Ref document number: 2019836995; Country of ref document: EP