US20200097017A1 - Auto-recharging of robot - Google Patents

Auto-recharging of robot Download PDF

Info

Publication number
US20200097017A1
US20200097017A1 (application US16/396,613)
Authority
US
United States
Prior art keywords
robot
path
charging pile
real time
docking
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
US16/396,613
Other versions
US10585437B1 (en)
Inventor
Ji Zhou
Xinpeng Feng
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
NextVPU Shanghai Co Ltd
Original Assignee
NextVPU Shanghai Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by NextVPU Shanghai Co Ltd filed Critical NextVPU Shanghai Co Ltd
Assigned to NextVPU (Shanghai) Co., Ltd. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: FENG, Xinpeng; ZHOU, Ji
Application granted granted Critical
Publication of US10585437B1 publication Critical patent/US10585437B1/en
Publication of US20200097017A1 publication Critical patent/US20200097017A1/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Images

Classifications

    • G05D1/0225 Control of position or course in two dimensions specially adapted to land vehicles, with means for defining a desired trajectory, involving docking at a fixed facility, e.g. base station or loading bay
    • G05D1/0212 Control of position or course in two dimensions specially adapted to land vehicles, with means for defining a desired trajectory
    • G05D1/0221 Control of position or course in two dimensions specially adapted to land vehicles, with means for defining a desired trajectory, involving a learning process
    • G05D1/0223 Control of position or course in two dimensions specially adapted to land vehicles, with means for defining a desired trajectory, involving speed control of the vehicle
    • G05D1/0246 Control of position or course in two dimensions specially adapted to land vehicles, using optical position detecting means, using a video camera in combination with image processing means
    • A47L11/4011 Regulation of the cleaning machine by electric means; Control systems and remote control systems therefor
    • A47L9/2873 Docking units or charging stations (arrangements for power supply of vacuum cleaners or the accessories thereof)
    • A47L2201/00 Robotic cleaning machines, i.e. with automatic control of the travelling movement or the cleaning operation
    • A47L2201/022 Docking stations; Docking operations: Recharging of batteries
    • A47L2201/04 Automatic control of the travelling movement; Automatic obstacle detection
    • B25J9/0003 Home robots, i.e. small robots for domestic use
    • B25J9/1661 Programme controls characterised by programming, planning systems for manipulators, characterised by task planning, object-oriented languages
    • B25J9/1664 Programme controls characterised by programming, planning systems for manipulators, characterised by motion, path, trajectory planning
    • B25J9/1697 Vision controlled systems
    • B60L2200/40 Working vehicles

Definitions

  • the present disclosure relates to the field of robots, and more particularly to a robot, an auto-recharging method for a robot and a storage medium.
  • a sweeping robot may automatically return to a charging pile for charging when the power is not sufficient to continue cleaning.
  • Conventional auto-recharging approaches of sweeping robots include: 1) a charging base emits an infrared signal; after entering the coverage of the infrared signal during movement, the robot receives the infrared signal via an infrared receiver at its front end and repeatedly adjusts its direction of motion until it contacts a metal electrode sheet on the charging base; 2) a navigation technology is adopted: the charging base projects two beacon light spots onto the ceiling, a four-quadrant infrared receiving window is arranged at the upper end of the robot, and the current coordinates and pose of the robot may be computed by converting the projected area of the light spots on a sensor into an electrical signal.
  • the disclosure provides a robot, an auto-recharging method therefor and a storage medium, and auto-recharging of the robot can be achieved without guidance of active light source, thereby reducing the cost of the robot.
  • the method comprises: moving a robot from an initial position to a docking position, wherein the docking position faces a charging interface of a charging pile, and the docking position is determined based on a position of the charging pile identified by means of images captured by the robot in real time; and moving the robot from the docking position to a charging position along a first path such that the robot is docked to the charging pile at the charging position.
  • the first path is a straight-line or approximately straight-line path.
  • the robot maintains a docking pose and the charging pile is identifiable in the images captured by the robot in real time.
  • the robot comprises a sensor at least configured to capture images surrounding the robot in real time; a motor configured to drive the robot; and a processor configured to:
  • cause the robot to move from an initial position to a docking position, wherein the docking position faces a charging interface of the charging pile, and the docking position is determined based on a position of the charging pile identified by images captured by the robot in real time;
  • cause the robot to travel along a first path from the docking position to a charging position so as to be docked with the charging pile at the charging position, wherein the first path is a straight-line or approximately straight-line path.
  • the robot maintains a docking pose and the charging pile is identifiable in the images captured by the robot in real time.
  • FIG. 1 shows a flowchart of an auto-recharging method for a robot according to an embodiment of the present disclosure
  • FIG. 2 shows a schematic diagram of auto-recharging for a robot according to an auto-recharging system for a robot of an embodiment of the present disclosure
  • FIGS. 3-7 show schematic diagrams of auto-recharging for a robot according to an exemplary embodiment of the present disclosure
  • FIG. 8 shows a modular diagram of a robot according to an embodiment of the present disclosure
  • FIG. 9 schematically shows a computer readable storage medium in an exemplary embodiment of the present disclosure.
  • FIG. 10 schematically shows an electronic equipment.
  • an infrared emitter and a sensing device for receiving infrared signals must additionally be provided on the charging base and the robot, respectively, or a device for projecting beacon light spots must additionally be provided on the charging base; consequently, the cost of the equipment increases.
  • the emitter of the charging base needs to stay on; besides, the infrared emitter itself has a relatively high energy consumption, which incurs a high usage cost of the robot.
  • a long-wavelength light beam is adopted; however, such a beam penetrates obstacles poorly.
  • the active light source cannot penetrate the obstacle and thus cannot be received by a light beam sensing device on the robot; as a result, automatic return to the pile is impeded, environmental adaptability is poor, and the equipment is easily damaged and thus has a short service life.
  • a matching light emitting device and a matching light sensing and receiving device are needed; if either one is damaged, both must be replaced as a matching pair, which imposes a great limitation and poor flexibility.
  • the present disclosure provides a robot, an auto-recharging method for a robot, an auto-recharging system for a robot, an electronic equipment, and a storage medium, which can achieve auto-recharging of the robot without guidance of active light source, thereby reducing the cost of the robot and meanwhile offering a high flexibility to the equipment.
  • FIG. 1 shows a flowchart of an auto-recharging method for a robot according to an embodiment of the present disclosure.
  • FIG. 1 shows two steps:
  • Step S110: the robot moving from an initial position to a docking position, wherein the docking position faces a charging interface of a charging pile;
  • Step S120: the robot traveling from the docking position to a charging position along a first path such that the robot is docked to the charging pile at the charging position, wherein the first path is a straight-line or approximately straight-line path, the robot maintains a docking pose during the course of traveling along the first path, and the charging pile is identified in the images captured by the robot in real time.
  • the auto-recharging method for a robot provided by the present disclosure has the following advantages:
  • the auto-recharging for a robot can be achieved by virtue of the images captured by the robot in real time, which omits guide devices such as an active light source emitter and an active light source receiver, thereby reducing the manufacturing cost, and solving the problems such as high energy consumption of the emitting device and high usage cost of the robot;
  • the auto-recharging for a robot can be achieved by virtue of the images captured by the robot in real time, which places low requirements on the equipment of the charging pile and the robot, because a universal charging pile may enable the robot to go back to the pile for charging, thereby offering a high flexibility to the equipment;
  • FIG. 2 shows a schematic diagram of auto-recharging for a robot according to an auto-recharging system for a robot of an embodiment of the present disclosure.
  • the robot 202 detects a low battery at an initial position 213 and needs to go to a charging position 211 so as to be docked with a charging pile 201 for charging.
  • To move to the charging position 211, the robot 202 first determines the position of its initial position 213 in an environment map.
  • the environment map may be trained and built when the robot 202 is in use.
  • the initial position 213 of the robot 202 is determined based on a motion trajectory of the robot 202 in the environment map before recharging.
  • the robot 202 may obtain its motion trajectory and then determine the position of its initial position 213 in an environment map based on the motion trajectory.
  • the robot 202 may directly capture the charging pile 201 before recharging, so as to determine the position of the initial position 213 in the environment map by capturing the charging pile 201 in the images.
  • the robot 202 may just be turned on or moved by a person before recharging, such that the robot 202 cannot get its actual motion trajectory and thus cannot determine the position of its initial position 213 in the environment map based on the actual motion trajectory.
  • the initial position 213 of the robot 202 is determined based on images captured by the robot in real time.
  • a plurality of identification features may be set in the environment map where the robot 202 is located (e.g., in the form of outlines of objects such as a chair, a desk, a sofa, or two-dimensional codes such that the coordinates of the identification features may be read); when an identification feature appears in the images captured by the robot 202 in real time, the position of the initial position 213 of the robot 202 in the environment map may be determined based on the coordinates (position in the environment map) of the identification feature.
  • an identification feature may also be provided on the charging pile. If the robot 202 identifies the charging pile in the images captured at the initial position 213 , the position of the initial position 213 in the environment map may be determined by the identification feature on the charging pile. What has been mentioned above only schematically depicts embodiments of the present disclosure, and the present disclosure is not limited thereto.
  • After the robot 202 determines the position of its initial position 213 in the environment map, the robot 202 needs to determine a position on the boundary of a transit area 241 preset in the environment map, to which the robot 202 will move.
  • the transit area 241 may be preset based on an area where the charging pile 201 may be identified in the images captured when building the environment map.
  • an arbitrary position at the boundary of the transit area 241 may be selected.
  • the boundary position of the selected transit area 241 is located in a shortest path for avoiding an obstacle from the initial position 213 to the boundary of the transit area 241 ; if there is no obstacle, the shortest path is a shortest straight-line path from the initial position 213 to the boundary of the transit area 241 .
  • the boundary position of the selected transit area 241 is located on a connecting line between the initial position 213 and the charging position 211 , such that the transit position 212 may be uniquely determined and the shortest path planning between the positions may be implemented.
  • the present disclosure may implement more variations, which will not be detailed here.
  • After the robot 202 determines the initial position 213 and the boundary position of the transit area 241 preset in the environment map to which it will move, the robot 202 plans a second path 221 from the initial position 213 to the boundary of the transit area 241.
  • the second path 221 is preferably a shortest path that may avoid an obstacle, and preferably a straight-line path. Then, the robot 202 travels along the second path 221 from the initial position 213 to the boundary of the transit area 241 .
  • When the robot 202 can capture the image of the charging pile at the initial position 213, i.e., when the initial position 213 is already located in the transit area 241, the robot 202 may skip planning the second path 221.
  • the image of the charging pile 201 may be prestored in the robot 202 , such that an image feature of the charging pile may be determined based on the image of the charging pile 201 .
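  • One common realisation of such matching (assumed here for illustration; the disclosure does not prescribe a specific algorithm, and OpenCV is used only as an example library) is to compare local features of the prestored pile image with each real-time frame:

```python
# Sketch: decide whether the charging pile appears in a frame by ORB feature matching
# against a prestored pile image; the match-count threshold is an arbitrary assumption.
import cv2

def pile_in_frame(pile_image, frame, min_matches=25):
    orb = cv2.ORB_create()
    kp1, des1 = orb.detectAndCompute(pile_image, None)
    kp2, des2 = orb.detectAndCompute(frame, None)
    if des1 is None or des2 is None:
        return False
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = matcher.match(des1, des2)
    return len(matches) >= min_matches

# Hypothetical usage with grayscale images loaded elsewhere:
#   pile_in_frame(cv2.imread("pile.png", cv2.IMREAD_GRAYSCALE),
#                 cv2.imread("frame.png", cv2.IMREAD_GRAYSCALE))
```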
  • if the charging pile 201 is identified in the images captured by the robot 202 in real time, the current position of the robot 202 serves as the transit position 212 and the robot 202 travels along a third path from the transit position 212 to a docking position 214.
  • the initial position 213 may serve as the transit position 212 so as to carry out subsequent steps.
  • if the charging pile 201 is not identified at that position, the robot 202 acts according to a predetermined mode or an adaptive mode (e.g., acts within a predetermined scope according to the predetermined rotation or displacement) till the charging pile 201 is identified in the images captured by the robot 202 in real time, and the current position of the robot 202 then serves as the transit position 212.
  • an alert indicating a failure to find the charging pile 201 is generated.
  • the alert is used for indicating that the charging pile 201 is blocked or the charging pile 201 is displaced.
  • the robot 202 needs to be retrained, and the position of the charging pile 201 needs to be relabeled based on the existing environment map.
  • By acting in the predetermined mode or adaptive mode within a preset scope, a transit position 212 from which the image of the charging pile 201 can be captured may thus be determined, settling the above problem.
  • the robot 202 determines, at the transit position 212 , a docking position 214 in the transit area 241 , wherein the docking position 214 faces a charging interface of the charging pile 201 .
  • the robot 202 already has, at the docking position 214 , a pose for being docked with the charging interface of the charging pile 201 .
  • the docking position 214 is a position that has been preset on an environment map.
  • the robot 202 identifies, at the transit position 212, the position of the charging pile 201 through the images captured in real time, and determines the docking position 214 in the transit area 241 based on the position of the charging pile 201. For example, supposing that, on a horizontal plane, the direction of the charging interface of the charging pile 201 is the Y axis and the direction perpendicular to the charging interface is the X axis, the coordinates of the charging pile 201 are (x1, y1).
  • the robot 202 travels along the third path from the transit position 212 to the docking position 214 , wherein the third path is calculated on basis of the determined coordinates of the transit position in the environment map based on the images of the charging pile captured by the robot 202 at the transit position.
  • the calculated third path may ensure that, at the instant of reaching the docking position from the transit position, the robot is already in the pose for being docked with the charging pile for charging; the third path is not limited to a straight line.
  • the robot 202 travels from the docking position 214 to the charging position 211 along a first path, wherein the first path is a straight line or an approximately straight line, and during the course of traveling along the first path, the charging pile 201 is identified in the images captured by the robot 202 in real time (i.e., on the first path, fine adjustment only occurs in the direction of the X axis, such that the charging pile 201 remains identified in the images captured by the robot 202 in real time).
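  • As a hedged illustration of that X-axis fine adjustment, a proportional correction can be derived from how far the detected pile sits from the image centre; the gain, step limit, and pixel values below are assumptions, not values from the disclosure.

```python
# Sketch of the lateral (X-axis) fine adjustment along the first path.
def lateral_correction(pile_pixel_x, image_width, gain=0.002, max_step=0.05):
    offset = pile_pixel_x - image_width / 2        # pixel offset of the pile from the image centre
    step = gain * offset                           # proportional sideways correction in metres
    return max(-max_step, min(max_step, step))     # clamp so the path stays approximately straight

print(lateral_correction(pile_pixel_x=400, image_width=640))   # pile right of centre -> correct right
print(lateral_correction(pile_pixel_x=320, image_width=640))   # pile centred -> no correction
```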
  • the robot 202 maintains a docking pose (for example, the charging socket of the robot 202 faces the charging interface of the charging pile 201 ); for example, the robot 202 and the charging pile 201 both maintain a docking state (e.g., keeping the cover of the charging interface in an opened state, or a state for docking among the states of a telescopic charging interface).
  • the robot 202 may, for example, adjust the first path based on an auxiliary pattern (e.g., a specific pattern or a two-dimensional code (QR code)) identified in the images captured by the robot 202 in real time, wherein the auxiliary pattern is provided on the charging pile. Further, the robot 202 may, for example, further adjust the first path via an openable auxiliary robot arm disposed on the charging pile 201 .
  • FIGS. 3-7 show schematic diagrams of auto-recharging of a robot according to an exemplary embodiment of the present disclosure.
  • a sweeping robot 202 is depicted as an example.
  • the sweeping robot 202 performs a cleaning work according to a predetermined working path 229 in an environment map 250 comprising rooms 251 , 252 , and 253 .
  • the position of the charging pile 201 and the transit area 241 are labeled in the environment map 250 .
  • the charging position 211 where the sweeping robot 202 needs to reach may also be labeled on the environment map.
  • the sweeping robot 202 moves and cleans along the working path 229 in the room 251 in the environment map 250 .
  • When the sweeping robot 202 detects that its battery power is lower than a predetermined threshold, such that the sweeping robot 202 cannot continue working, the sweeping robot 202 marks its current position as the initial position 213.
  • the position of the initial position 213 in the environment map may be determined based on the working path 229 of the sweeping robot 202 .
  • After charging is complete, the sweeping robot 202 returns to the initial position 213 to continue the unfinished cleaning work along the working path 229.
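  • A minimal sketch of this trigger follows; the threshold value and return format are hypothetical and only illustrate marking the current point on the working path as the initial position 213.

```python
# Mark the current point on the working path as the initial position when power is low.
def check_recharge_trigger(battery_level, current_position, threshold=0.15):
    if battery_level < threshold:
        return {"recharge": True, "initial_position": current_position}
    return {"recharge": False, "initial_position": None}

print(check_recharge_trigger(battery_level=0.10, current_position=(3.2, 1.7)))
```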
  • the sweeping robot 202 plans a second path 221 based on the initial position 213 and the boundary of the transit area 241 , wherein the second path 221 refers to a shortest path for avoiding an obstacle from the initial position 213 to the boundary of the transit area 241 ; if there is no obstacle, the second path 221 refers to a straight-line path from the initial position 213 to the boundary of the transit area 241 .
  • the sweeping robot 202 moves along the second path 221 from the initial position 213 to the boundary of the transit area 241 .
  • if the charging pile 201 is identified in the images captured in real time at that boundary position, the current position of the robot 202 serves as a transit position 212.
  • if the charging pile 201 is not identified in the real-time images, the robot 202 acts according to a predetermined mode or an adaptive mode (e.g., acts within a predetermined scope according to predetermined rotation or displacement) till the charging pile 201 is identified in the images captured by the robot 202 in real time, and the current position of the robot 202 then serves as the transit position 212. If the charging pile 201 is identified in the images captured in real time by the sweeping robot 202 at the initial position 213, then the initial position 213 serves as the transit position 212.
  • the sweeping robot 202 travels along a third path 223 from the transit position 212 to the docking position 214 (pre-labeled on the environment map).
  • the sweeping robot 202 may identify, at the transit position 212 , the position of the charging pile 201 through the images captured in real time, and determine the docking position 214 in the transit area 241 based on the position of the charging pile 201 , or determine the docking position 214 based on the position of the charging pile 201 labeled on the environment map.
  • the sweeping robot 202 may determine a horizontal plane based on the images captured in real time during the course of traveling, and determine whether an obstacle 260 exists in the traveling direction based on whether the horizontal plane is blocked or not. If the sweeping robot 202 identifies the obstacle 260 on the third path 223 , the sweeping robot 202 may, for example, adjust the third path 223 based on a positional relationship between the charging pile 201 and the obstacle 260 in the real-time captured image.
  • For example, when the charging pile 201 appears on one side of the obstacle 260 in the image, the sweeping robot 202 turns to that side so as to avoid the obstacle 260 and meanwhile re-plans the third path 223, causing the sweeping robot 202 to move towards the docking position 214.
  • the third path 223 is calculated on basis of the determined coordinates of the transit position 212 on the environment map based on the image of the charging pile captured by the robot 202 at the transit position 212 .
  • the calculated third path 223 may ensure that, at the instant of reaching the docking position 214 from the transit position 212, the robot 202 is already in the pose for being docked with the charging pile 201 for charging.
  • the third path 223 may be the shortest path for avoiding the obstacle from the transit position 212 to the docking position 214 ; if there is no obstacle, the third path 223 is a straight-line path from the transit position 212 to the docking position 214 .
  • the third path 223 is not limited to a straight-line path.
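  • A possible, purely assumed pixel-based rule for that adjustment of the third path 223 is to detour on whichever side keeps the charging pile in view, judged from the horizontal image positions of the pile and the obstacle:

```python
# Sketch: choose a detour side from the positional relationship of pile and obstacle in the image.
def detour_side(pile_pixel_x, obstacle_pixel_x):
    # If the obstacle appears to the left of the pile, pass on the right, and vice versa.
    return "right" if obstacle_pixel_x < pile_pixel_x else "left"

print(detour_side(pile_pixel_x=420, obstacle_pixel_x=250))   # obstacle left of the pile -> pass right
```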
  • the sweeping robot 202 travels to the charging position 211 along the first path 222 from the docking position 214 to the charging position 211 and carries out subsequent docking and charging.
  • both the robot 202 and the charging pile 201 maintain a docking state.
  • the charging interface 261 of the sweeping robot 202 for plugging into the charging pile 201 and a sensor 262 of the sweeping robot 202 are located on the same side of the sweeping robot.
  • a straight-line path may therefore be planned during the real-time planning of the first path 222, which eliminates the step of rotating the sweeping robot 202 in place, after arriving at the charging position 211, in order to dock its charging interface 261 with the charging pile 201.
  • the sweeping robot 202 may first move to the docking position 214 facing the charging interface of the charging pile 201, so that at the docking position 214 the charging interface 261 of the sweeping robot 202 already faces the charging interface of the charging pile 201; the sweeping robot 202 then only needs to adjust the path in real time so that, along the first path 222, the charging interface of the charging pile 201 remains at the center of the images captured in real time by the sweeping robot 202.
  • the above technical solution is adjusted based on the positions of the sensor 262 and the charging interface 261 provided on the sweeping robot 202 .
  • the sensor 262 and the charging interface 261 are arranged at an angle on different sides of the sweeping robot 202; based on the angle between the sensor 262 and the charging interface 261, the orientation of the charging interface 261 may be determined in the image captured by the sensor 262, and the first path 222 of the sweeping robot 202 is then finely adjusted based on whether the charging interface of the charging pile 201 is aligned with the charging interface 261.
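  • The following sketch illustrates, under assumed field-of-view and tolerance values, how such a fixed sensor-to-interface angle could be used to check whether the charging interface 261 currently points at the pile.

```python
# Sketch: alignment check when the sensor and the charging interface sit at a fixed angle.
def interface_aligned(pile_pixel_x, image_width, fov_deg, sensor_to_interface_deg, tol_deg=3.0):
    # Bearing of the pile relative to the camera axis, from its horizontal pixel position.
    bearing = (pile_pixel_x - image_width / 2) / image_width * fov_deg
    # The charging interface points at the pile when that bearing matches the fixed offset angle.
    return abs(bearing - sensor_to_interface_deg) <= tol_deg

print(interface_aligned(pile_pixel_x=560, image_width=640, fov_deg=90, sensor_to_interface_deg=33.75))
```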
  • the present disclosure is not limited thereto.
  • the sensor 262 of the sweeping robot 202 may, for example, be a camera with a fixed viewing angle. In some other embodiments, the sensor 262 of the sweeping robot 202 may, for example, be a panorama camera which may rotate 360°.
  • the present disclosure may implement more variations, which will not be detailed here.
  • segmented path planning is performed on the path of returning to the pile, and different obstacle avoidance methods are adopted in different path segments according to their different likelihoods of obstacles. This better solves the problem of failing to return to the pile for charging due to obstacles between the robot and the charging pile, achieves intelligent obstacle avoidance, reduces the number of collisions of the robot, increases the service life of the robot, and improves the user experience and the desire to purchase.
  • the present disclosure further provides a robot.
  • FIG. 8 shows a modular diagram of a robot according to an embodiment of the present disclosure.
  • the robot 300 comprises a sensor 310 , a motor 320 , and a processor 330 .
  • the sensor 310 is at least configured to capture images surrounding the robot in real time.
  • the motor 320 is configured to drive the robot to move.
  • the processor 330 is configured to cause the robot to move from an initial position to a docking position, wherein the docking position faces a charging interface of the charging pile; the processor 330 is further configured to adjust the robot to travel along a first path from the docking position to a charging position so as to be docked with the charging pile at the charging position, wherein the first path is a straight-line or approximately straight-line path, the robot maintains a docking pose during the course of traveling along the first path, and the charging pile is identified in the images captured by the robot in real time.
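  • A minimal, hypothetical wiring of these modules is sketched below; the placeholder bodies are not the patented logic and only show how the sensor, motor, and processor of FIG. 8 could be composed.

```python
# Sketch of the sensor / motor / processor split of FIG. 8 with placeholder behaviour.
class Sensor:
    def capture_image(self):
        return "frame"                                # stand-in for a real camera frame

class Motor:
    def drive(self, target):
        print(f"driving towards {target}")

class Processor:
    def __init__(self, sensor, motor):
        self.sensor, self.motor = sensor, motor

    def recharge(self, docking_position, charging_position):
        self.sensor.capture_image()                   # real-time images used for identification
        self.motor.drive(docking_position)            # move to the docking position
        self.motor.drive(charging_position)           # straight first path to the charging position

Processor(Sensor(), Motor()).recharge((1.0, 0.0), (0.1, 0.0))
```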
  • the robot may be a sweeping robot or a mopping robot.
  • FIG. 8 only schematically shows a modular diagram of the robot according to the present disclosure. Without departing from the idea of the present disclosure, splitting, merging, and adding of the modules all fall within the protection scope of the present disclosure.
  • the present disclosure provides an auto-recharging system for a robot. Please refer to FIG. 2 .
  • the auto-recharging system for a robot comprises the robot 300 shown in FIG. 8 (denoted 202 in FIG. 2) and the charging pile 201.
  • the robot moves from an initial position to a docking position, wherein the docking position faces a charging interface of the charging pile; the robot travels from the docking position to a charging position along a first path such that the robot is docked to the charging pile at the charging position, wherein the first path is a straight-line or approximately straight-line path; the robot maintains a docking pose during the course of traveling along the first path, and the charging pile is identified in the images captured by the robot in real time.
  • a computer readable storage medium on which a computer program is stored, wherein the program, when executed by a processor, may implement the steps of the auto-recharging method for a robot in any embodiment above.
  • various aspects of the present disclosure may be further implemented in the form of a program product, comprising program codes; when the program product is executed on a terminal equipment, the program codes are configured to cause the terminal equipment to execute the steps according to various exemplary embodiments of the present disclosure described in the auto-recharging method for a robot in the description.
  • the program product 900 may adopt a portable compact disk read-only memory (CD-ROM) and comprise program codes, and may be run on a terminal equipment, for example, a personal computer.
  • the program product of the present disclosure is not limited thereto.
  • the readable storage medium may be any tangible medium containing or storing the program that may be used by an instruction executing system, device, or member or combination thereof.
  • the program product may adopt any combination of one or more readable mediums.
  • the readable medium may be a readable signal medium or a readable storage medium.
  • the readable storage medium for example, may be, but is not limited to, an electrical, magnetic, optical, electromagnetic, infrared, or semiconductor system, device, or member, or any combination thereof.
  • the readable storage medium may include an electrical connection having one or more wires, a portable disk, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disk read-only memory (CD-ROM), an optical memory member, a magnetic memory member, or any appropriate combination thereof.
  • the computer-readable signal medium may include a data signal propagated in a baseband or as part of a carrier wave, in which readable program codes are carried.
  • a data signal propagated in such a way may adopt a plurality of forms, including, but not limited to, an electromagnetic signal, an optical signal, or any appropriate combination thereof.
  • the readable signal medium may also be any readable medium other than the readable storage medium, which readable medium may send, propagate or transmit the programs used by the instruction executing system, device, member, or combination thereof.
  • the program codes included in the readable medium may be transmitted using any appropriate medium, including, but not limited to: wireless, wired, cable, RF, etc., or any appropriate combination thereof.
  • Program codes for carrying out operations of the present disclosure may be compiled in any combination of one or more programming languages including object-oriented programming languages such as Java, C++ or the like, as well as conventional procedural programming languages, such as the “C” language or similar programming languages.
  • the program codes may be executed entirely on a tenant's computing equipment, partially on the tenant's equipment, executed as a stand-alone software package, partially on the tenant's computing equipment and partially executed on a remote computing equipment, or entirely executed on the remote computing equipment or server.
  • the remote computing equipment may be connected to the tenant's computing equipment through any type of network, including a local area network (LAN) or a wide area network (WAN), or connected to an external computing equipment (for example, connected through the Internet using an Internet Service Provider).
  • an electronic equipment is further provided, which may comprise a processor (which may, for example, be used to implement the aforementioned processor 330) and a memory for storing executable instructions of the processor.
  • the processor is configured to execute the steps of the auto-recharging method for a robot in any one of above embodiments by executing the executable instruction.
  • Referring to FIG. 10, an electronic equipment 1000 according to such an embodiment of the present disclosure will be described.
  • the electronic equipment 1000 shown in FIG. 10 is only an example, which should not constitute any limitation to the function and use scope of the embodiments of the present disclosure.
  • the electronic equipment 1000 is represented in the form of a general computing equipment.
  • Components of the electronic equipment 1000 may comprise, but are not limited to: at least one processing unit 1010 (for example, for implementing the above-mentioned processor 330), at least one memory unit 1020, and a bus 1030 connecting different system components (including the memory unit 1020 and the processing unit 1010), etc.
  • the memory unit stores program codes which may be executed by the processing unit 1010 , causing the processing unit 1010 to execute the steps according to various exemplary embodiments of the present disclosure described in the auto-recharging method for a robot in the description.
  • the processing unit 1010 may execute the steps as shown in FIG. 1 .
  • the memory unit 1020 may comprise a readable medium in the form of a volatile memory unit, e.g. a random-access memory unit (RAM) 10201 and/or a cache memory unit 10202 , and may further comprise a read-only memory unit (ROM) 10203 .
  • the memory unit 1020 may further comprise a program/utility tool 10204 having a set (at least one) of program modules 10205.
  • such program modules 10205 include, but are not limited to: an operating system, one or more application programs, other program modules, and program data; each of these examples, or a certain combination thereof, may include an implementation of a network environment.
  • the bus 1030 may represent one or more of several bus structures, including a memory unit bus or memory unit controller, a peripheral bus, a graphics acceleration port, a processing unit, or a local bus using any of a plurality of bus structures.
  • the electronic equipment 1000 may also communicate with one or more external equipments 1100 (e.g., a keyboard, a pointing device, a Bluetooth device, etc.), with one or more equipments enabling the tenant to interact with the electronic equipment 1000, and/or with any equipment (e.g., a router, a modem, etc.) enabling the electronic equipment 1000 to communicate with one or more other computing equipments. Such communication may be carried out via an input/output (I/O) interface 1050. Moreover, the electronic equipment 1000 may further communicate with one or more networks (e.g., a local area network (LAN), a wide area network (WAN), and/or a public network, e.g., the Internet) via a network adapter 1060.
  • the network adapter 1060 may communicate with other modules of the electronic equipment 1000 via the bus 1030 . It should be understood that although not shown in the figure, other hardware and/or software modules may be used in conjunction with the electronic equipment 1000 , including, but not limited to, microcode, an equipment driver, a redundancy processing unit, an external disk driving array, a RAID system, a tape driver, and a data backup memory system, etc.
  • the exemplary embodiments described here may be implemented via software or via a combination of software and necessary hardware. Therefore, the technical solution according to the embodiments of the present disclosure may be embodied in the form of a software product.
  • the software product may be stored in a non-volatile storage medium (which may be a CD-ROM, a USB flash drive, a removable hard disk, etc.) or in a network, and includes a plurality of instructions to cause a computing equipment (which may be a personal computer, a server, or a network equipment, etc.) to execute the auto-recharging method for a robot according to the embodiments of the present disclosure.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Electromagnetism (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
  • Manipulator (AREA)

Abstract

The present disclosure provides a robot, an auto-recharging method therefor and a storage medium. The auto-recharging method for a robot comprises: the robot moving from an initial position to a docking position, wherein the docking position faces a charging interface of a charging pile; the robot traveling from the docking position to a charging position along a first path such that the robot is docked to the charging pile at the charging position, wherein the first path is a straight-line or approximately straight-line path, and the robot maintains a docking pose during the course of traveling along the first path and the charging pile is identified in the images captured by the robot in real time. The present disclosure may achieve auto-recharging of the robot without guidance of active light source, thereby reducing the cost of the robot and meanwhile offering a high flexibility to the equipment.

Description

    CROSS-REFERENCE TO RELATED APPLICATIONS
  • This application is a continuation of International Application No. PCT/CN2019/082317, filed on Apr. 11, 2019, which claims priority to Chinese Patent Application No. 201811118208.8, filed on Sep. 25, 2018. The disclosures of the aforementioned applications are herein incorporated by reference in their entireties.
  • FIELD
  • The present disclosure relates to the field of robots, and more particularly to a robot, an auto-recharging method for a robot and a storage medium.
  • BACKGROUND
  • Currently, mobile robots (e.g., sweeping robots) have been accepted and actually used by more and more families. A sweeping robot may automatically return to a charging pile for charging when the power is not sufficient to continue cleaning. Conventional auto-recharging approaches of sweeping robots include: 1) a charging base emits an infrared signal; after entering the coverage of the infrared signal during movement, the robot receives the infrared signal via an infrared receiver at its front end and repeatedly adjusts its direction of motion until it contacts a metal electrode sheet on the charging base; 2) a navigation technology is adopted: the charging base projects two beacon light spots onto the ceiling, a four-quadrant infrared receiving window is arranged at the upper end of the robot, and the current coordinates and pose of the robot may be computed by converting the projected area of the light spots on a sensor into an electrical signal.
  • SUMMARY
  • The disclosure provides a robot, an auto-recharging method therefor and a storage medium, and auto-recharging of the robot can be achieved without guidance of active light source, thereby reducing the cost of the robot.
  • According to an aspect of the present disclosure, there is provided a computer-implemented method.
  • The method comprises: moving a robot from an initial position to a docking position, wherein the docking position faces a charging interface of a charging pile, and the docking position is determined based on a position of the charging pile identified by means of images captured by the robot in real time; and moving the robot from the docking position to a charging position along a first path such that the robot is docked to the charging pile at the charging position. The first path is a straight-line or approximately straight-line path. During the course of traveling along the first path, the robot maintains a docking pose and the charging pile is identifiable in the images captured by the robot in real time.
  • According to another aspect of the present disclosure, there is also provided a robot.
  • The robot comprises a sensor at least configured to capture images surrounding the robot in real time; a motor configured to drive the robot; and a processor configured to:
  • cause the robot to move from an initial position to a docking position, wherein the docking position faces a charging interface of the charging pile, and the docking position is determined based on a position of the charging pile identified by images captured by the robot in real time; and
  • cause the robot to travel along a first path from the docking position to a charging position so as to be docked with the charging pile at the charging position, wherein the first path is a straight-line or approximately straight-line path. During the course of traveling along the first path, the robot maintains a docking pose and the charging pile is identifiable in the images captured by the robot in real time.
  • According to yet another aspect of the present disclosure, there is also provided a storage medium on which a computer program is stored, wherein, when executed by a processor, the computer program performs the above-mentioned steps.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • The above and other features and advantages of the present disclosure will become more apparent through the detailed depictions of the exemplary embodiments with reference to the accompanying drawings.
  • FIG. 1 shows a flowchart of an auto-recharging method for a robot according to an embodiment of the present disclosure;
  • FIG. 2 shows a schematic diagram of auto-recharging for a robot according to an auto-recharging system for a robot of an embodiment of the present disclosure;
  • FIGS. 3-7 show schematic diagrams of auto-recharging for a robot according to an exemplary embodiment of the present disclosure;
  • FIG. 8 shows a modular diagram of a robot according to an embodiment of the present disclosure;
  • FIG. 9 schematically shows a computer readable storage medium in an exemplary embodiment of the present disclosure;
  • FIG. 10 schematically shows an electronic equipment.
  • DETAILED DESCRIPTION OF EMBODIMENTS
  • Now, exemplary embodiments will be described more comprehensively with reference to the accompanying drawings. However, the exemplary embodiments may be implemented in a plurality of forms, and should not be construed as being limited to the examples illustrated herein; on the contrary, provision of these embodiments makes the present disclosure more comprehensive and complete, and the ideas of the exemplary embodiments can be comprehensively conveyed to those skilled in the art. The described features, structures or properties may be combined in one or more embodiments in any appropriate manner.
  • Besides, the accompanying drawings are only schematic illustrations of the present disclosure, not necessarily drawn proportionally. The same reference numbers in the drawings represent same or similar portions, and thereby repeated depictions thereof will be omitted. Some blocks illustrated in the drawings are functional entities, which do not necessarily correspond to physically or logically independent entities. These functional entities may be implemented by software, or implemented in one or more hardware modules or integrated circuits, or implemented in various networks and/or processor devices and/or microcontroller devices.
  • In both of the above-mentioned auto-recharging methods, extra equipment needs to be added to the robot and the charging base: an infrared emitter and a sensing device for receiving infrared signals must additionally be provided on the charging base and the robot, respectively, or a device for projecting beacon light spots must additionally be provided on the charging base; consequently, the cost of the equipment increases. Meanwhile, for an auto-recharging solution using infrared technology, the emitter of the charging base needs to stay on; besides, the infrared emitter itself has a relatively high energy consumption, which incurs a high usage cost of the robot. Technical solutions guided by an active light source, such as an auto-recharging solution using infrared technology or an auto-recharging solution guided by an active light, adopt a long-wavelength light beam; however, such a beam penetrates obstacles poorly. When an obstacle exists between the charging pile and the sweeping robot, the active light source cannot penetrate the obstacle and thus cannot be received by a light beam sensing device on the robot; as a result, automatic return to the pile is impeded, environmental adaptability is poor, and the equipment is easily damaged and thus has a short service life. Besides, if auto-recharging of the robot is guided by active light, a matching light emitting device and a matching light sensing and receiving device are needed; if either one is damaged, both must be replaced as a matching pair, which imposes a great limitation and poor flexibility.
  • To overcome the drawbacks of the relevant technology, the present disclosure provides a robot, an auto-recharging method for a robot, an auto-recharging system for a robot, an electronic equipment, and a storage medium, which can achieve auto-recharging of the robot without guidance of active light source, thereby reducing the cost of the robot and meanwhile offering a high flexibility to the equipment.
  • First, reference is made to FIG. 1, which shows a flowchart of an auto-recharging method for a robot according to an embodiment of the present disclosure.
  • FIG. 1 shows two steps:
  • Step S110: the robot moving from an initial position to a docking position, wherein the docking position faces a charging interface of a charging pile;
  • Step S120: the robot traveling from the docking position to a charging position along a first path such that the robot is docked to the charging pile at the charging position, wherein the first path is a straight-line or approximately straight-line path, and the robot maintains a docking pose during the course of traveling along the first path, and the charging pile is identified in the images captured by the robot in real time.
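  • As an illustration only, the following sketch captures the geometry of these two steps under assumed values: `docking_position` places the robot a hypothetical standoff distance in front of the charging interface (step S110), and `first_path` generates the straight-line first path to the charging position (step S120). The function names, standoff distance, and step size are not from the disclosure.

```python
# Geometric sketch of steps S110/S120; standoff distance and step size are assumptions.
import math

def docking_position(pile_xy, interface_heading, standoff=0.5):
    """Point `standoff` metres in front of the charging interface of the pile."""
    px, py = pile_xy
    return (px + standoff * math.cos(interface_heading),
            py + standoff * math.sin(interface_heading))

def first_path(dock_xy, charge_xy, step=0.05):
    """Waypoints of the straight (or approximately straight) first path."""
    dx, dy = charge_xy[0] - dock_xy[0], charge_xy[1] - dock_xy[1]
    n = max(1, int(math.hypot(dx, dy) / step))
    return [(dock_xy[0] + dx * i / n, dock_xy[1] + dy * i / n) for i in range(n + 1)]

if __name__ == "__main__":
    pile = (0.0, 0.0)
    heading = 0.0                                  # charging interface faces +X
    dock = docking_position(pile, heading)         # step S110 target
    charge = (0.05, 0.0)                           # contact point just in front of the pile
    print(dock, first_path(dock, charge)[:3])      # first few step S120 waypoints
```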
  • Compared with relevant technology, the auto-recharging method for a robot provided by the present disclosure has the following advantages:
  • 1) The auto-recharging for a robot can be achieved by virtue of the images captured by the robot in real time, which omits guide devices such as an active light source emitter and an active light source receiver, thereby reducing the manufacturing cost, and solving the problems such as high energy consumption of the emitting device and high usage cost of the robot;
  • 2) The auto-recharging for a robot can be achieved by virtue of the images captured by the robot in real time, which places low requirements on the equipment of the charging pile and the robot, because a universal charging pile may enable the robot to go back to the pile for charging, thereby offering a high flexibility to the equipment;
  • 3) Different traveling ways of the robot or different path planning/adjusting ways of the robot can be used in different path segments by segmented path planning of the path of returning to the pile, which help solve the problem of the robot going back to the pile in a highly efficient manner.
  • Hereinafter, the auto-recharging method for a robot provided by the present disclosure will be further described with reference to FIG. 2. FIG. 2 shows a schematic diagram of auto-recharging for a robot according to an auto-recharging system for a robot of an embodiment of the present disclosure.
  • As shown in FIG. 2, the robot 202 detects a low battery at an initial position 213 and needs to go to a charging position 211 so as to be docked with a charging pile 201 for charging.
  • To move to the charging position 211, the robot 202 first determines a position of its initial position 213 in an environment map. The environment map may be trained and built when the robot 202 is in use. In some embodiments, the initial position 213 of the robot 202 is determined based on a motion trajectory of the robot 202 in the environment map before recharging. In other words, in this embodiment, the robot 202 may obtain its motion trajectory and then determine the position of its initial position 213 in an environment map based on the motion trajectory. In another embodiment, the robot 202 may directly capture the charging pile 201 before recharging, so as to determine the position of the initial position 213 in the environment map by capturing the charging pile 201 in the images. In some other embodiments, the robot 202 may just be turned on or moved by a person before recharging, such that the robot 202 cannot get its actual motion trajectory and thus cannot determine the position of its initial position 213 in the environment map based on the actual motion trajectory. In such an embodiment, the initial position 213 of the robot 202 is determined based on images captured by the robot in real time. For example, a plurality of identification features may be set in the environment map where the robot 202 is located (e.g., in the form of outlines of objects such as a chair, a desk, a sofa, or two-dimensional codes such that the coordinates of the identification features may be read); when an identification feature appears in the images captured by the robot 202 in real time, the position of the initial position 213 of the robot 202 in the environment map may be determined based on the coordinates (position in the environment map) of the identification feature. In some other embodiments, an identification feature may also be provided on the charging pile. If the robot 202 identifies the charging pile in the images captured at the initial position 213, the position of the initial position 213 in the environment map may be determined by the identification feature on the charging pile. What has been mentioned above only schematically depicts embodiments of the present disclosure, and the present disclosure is not limited thereto.
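  • The paragraph above describes recovering the initial position 213 from an identification feature whose map coordinates are known. A minimal sketch of that idea follows, assuming a range-and-bearing measurement of the feature; the measurement model and values are illustrative and not part of the disclosure.

```python
# Assumed range/bearing observation model for localising the robot from one landmark.
import math

def initial_position(feature_map_xy, measured_range, measured_bearing, robot_heading):
    """Robot position in the environment map given one observed identification feature."""
    fx, fy = feature_map_xy
    world_angle = robot_heading + measured_bearing     # bearing is measured in the camera frame
    return (fx - measured_range * math.cos(world_angle),
            fy - measured_range * math.sin(world_angle))

# Toy usage: a QR-code landmark at (2.0, 1.0) seen 1.5 m straight ahead.
print(initial_position((2.0, 1.0), measured_range=1.5, measured_bearing=0.0, robot_heading=math.pi))
```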
  • After the robot 202 determines the position of its initial position 213 in the environment map, the robot 202 needs to determine a boundary of a transit area 241, preset in the environment map, to which the robot 202 is to move. When presetting the transit area 241 in the environment map, the transit area 241 may be preset as the area from which the charging pile 201 can be identified in the images captured while building the environment map.
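  • The transit area can thus be understood as the set of map positions from which the charging pile was visible while the map was being built. The minimal bookkeeping sketch below illustrates one way this could be recorded on a grid map; the cell size and function names are assumptions for illustration only.

```python
# Mark grid cells of the environment map as belonging to the transit area whenever
# the charging pile was identified in an image captured from that cell.

transit_area = set()          # set of (ix, iy) grid cells
CELL_SIZE = 0.25              # metres per grid cell (assumed)

def to_cell(position_xy):
    return (int(position_xy[0] // CELL_SIZE), int(position_xy[1] // CELL_SIZE))

def update_transit_area(robot_position_xy, pile_visible_in_image):
    """Call once per captured frame while the environment map is being built."""
    if pile_visible_in_image:
        transit_area.add(to_cell(robot_position_xy))

def in_transit_area(position_xy):
    return to_cell(position_xy) in transit_area
```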
  • In some embodiments, an arbitrary position on the boundary of the transit area 241 may be selected. In some other embodiments, the selected boundary position of the transit area 241 lies on a shortest obstacle-avoiding path from the initial position 213 to the boundary of the transit area 241; if there is no obstacle, the shortest path is the shortest straight-line path from the initial position 213 to the boundary of the transit area 241. In some other exemplary embodiments, the selected boundary position of the transit area 241 lies on the connecting line between the initial position 213 and the charging position 211, such that the transit position 212 may be uniquely determined and the shortest path planning between the positions may be implemented. The present disclosure may implement more variations, which will not be detailed here.
  • After the robot 202 determines the initial position 213 and the boundary of the transit area 241 preset in the environment map to which the robot 202 is to move, the robot 202 plans a second path 221 from the initial position 213 to the boundary of the transit area 241. The second path 221 is preferably a shortest obstacle-avoiding path and, where possible, a straight-line path. The robot 202 then travels along the second path 221 from the initial position 213 to the boundary of the transit area 241.
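  • Since the second path 221 is simply a shortest obstacle-avoiding route on the environment map, a standard grid search suffices to plan it. The breadth-first-search sketch below is an illustrative stand-in for such a planner, assuming an occupancy-grid representation of the map; it is not the specific planner of the disclosure.

```python
from collections import deque

def plan_second_path(grid, start, goal):
    """Breadth-first search on an occupancy grid.

    grid  -- 2D list; 0 = free cell, 1 = obstacle
    start -- (row, col) of the initial position
    goal  -- (row, col) of the chosen point on the transit-area boundary
    Returns the list of cells from start to goal, or None if unreachable.
    """
    rows, cols = len(grid), len(grid[0])
    parents = {start: None}
    queue = deque([start])
    while queue:
        cell = queue.popleft()
        if cell == goal:
            path = []
            while cell is not None:
                path.append(cell)
                cell = parents[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0 \
                    and (nr, nc) not in parents:
                parents[(nr, nc)] = cell
                queue.append((nr, nc))
    return None

grid = [[0, 0, 0],
        [1, 1, 0],
        [0, 0, 0]]
print(plan_second_path(grid, (0, 0), (2, 0)))  # routes around the obstacle row
```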
  • In some embodiments, when the robot 202 may capture the image of the charging pile at the initial position 213, i.e., when the initial position 213 is located in the transit area 241, the robot 202 may not plan the second path 221.
  • When the robot 202 moves to the boundary of the transit area 241, it is determined whether the charging pile 201 is identified in the images captured by the robot 202 in real time. In some embodiments, the image of the charging pile 201 may be prestored in the robot 202, such that an image feature of the charging pile may be determined based on the image of the charging pile 201. By matching the image feature of the charging pile 201 against the real-time captured images, it may be determined whether the charging pile 201 appears in the images captured by the robot in real time. The present disclosure is not limited thereto. If the charging pile 201 is identified in the images captured by the robot 202 in real time, the current position of the robot 202 serves as the transit position 212 and the robot 202 travels along a third path from the transit position 212 to a docking position 214. In some embodiments, if the image of the charging pile can be captured at the initial position 213, the initial position 213 may serve as the transit position 212 so as to carry out the subsequent steps. If the charging pile 201 is not identified in the images captured by the robot 202 in real time, the robot 202 acts according to a predetermined mode or an adaptive mode (e.g., acts within a predetermined scope according to a predetermined rotation or displacement) until the charging pile 201 is identified in the images captured by the robot 202 in real time, and then the current position of the robot 202 serves as the transit position 212.
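  • One conventional way to check whether the charging pile 201 appears in a real-time frame is to match the prestored image of the pile against the frame. The OpenCV-based sketch below illustrates that idea with simple template matching; the score threshold and file names are arbitrary assumptions, and feature-based matching would work equally well.

```python
import cv2

def pile_visible(frame_gray, pile_template_gray, threshold=0.7):
    """Return (found, location) by matching the prestored pile image to the frame.

    frame_gray          -- current camera frame, single-channel
    pile_template_gray  -- prestored image of the charging pile, single-channel
    threshold           -- match score above which the pile is considered identified
    """
    result = cv2.matchTemplate(frame_gray, pile_template_gray, cv2.TM_CCOEFF_NORMED)
    _, max_val, _, max_loc = cv2.minMaxLoc(result)
    return max_val >= threshold, max_loc

# Usage (file paths are placeholders):
# frame = cv2.imread("frame.png", cv2.IMREAD_GRAYSCALE)
# template = cv2.imread("pile.png", cv2.IMREAD_GRAYSCALE)
# found, top_left = pile_visible(frame, template)
```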
  • Optionally, if the charging pile 201 still fails to be identified in the images captured by the robot 202 in real time after the robot acts according to a predetermined mode or an adaptive mode, an alert indicating a failure to find the charging pile 201 is generated. The alert is used for indicating that the charging pile 201 is blocked or the charging pile 201 is displaced. In such an embodiment, the robot 202 needs to be retrained, and the position of the charging pile 201 needs to be relabeled based on the existing environment map.
  • In the embodiment above, the current environment may differ from the environment in which the map was generated, so the image of the charging pile 201 may not be captured at the boundary of the transit area 241; to settle this problem, a transit position 212 from which the image of the charging pile 201 can be captured is determined within a preset scope through the action in the predetermined mode or adaptive mode. In this embodiment, the robot 202 determines, at the transit position 212, a docking position 214 in the transit area 241, wherein the docking position 214 faces a charging interface of the charging pile 201. In other words, the robot 202 already has, at the docking position 214, a pose for being docked with the charging interface of the charging pile 201. According to some embodiments, the docking position 214 is a position that has been preset on the environment map. According to some embodiments, the robot 202 identifies, at the transit position 212, the position of the charging pile 201 from the images captured in real time, and determines the docking position 214 in the transit area 241 based on the position of the charging pile 201. For example, supposing that, on a horizontal plane, the direction of the charging interface of the charging pile 201 is the Y axis and the direction perpendicular to the charging interface is the X axis, the coordinates of the charging pile 201 are (x1, y1). Based on the coordinates (x1, y1) of the charging pile 201 and a preset spacing, the coordinates of the docking position 214 are determined to be (x2, y2), where x2=x1 and y2=y1+n, n being a preset spacing between the docking position 214 and the charging position 211; the spacing between the docking position 214 (i.e., the central point of the docking position 214 at which the robot 202 arrives) and the charging pile 201 (i.e., the central point of the charging pile 201) is greater than or equal to twice the maximum diameter of the robot 202 but less than or equal to three times the maximum diameter of the robot 202.
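  • The coordinate computation just described reduces to an offset of the preset spacing n along the axis of the charging interface, with the spacing bounded by the robot's maximum diameter. The snippet below restates it as a worked example; the numeric values are illustrative only.

```python
def docking_position(pile_xy, spacing, robot_diameter):
    """Compute the docking position in pile-aligned coordinates.

    pile_xy        -- (x1, y1) of the charging pile; Y is the charging-interface axis
    spacing        -- preset spacing n between the docking position and the charging pile
    robot_diameter -- maximum diameter of the robot
    """
    # The disclosure keeps the spacing between two and three robot diameters.
    assert 2 * robot_diameter <= spacing <= 3 * robot_diameter, "spacing out of range"
    x1, y1 = pile_xy
    return (x1, y1 + spacing)      # (x2, y2): directly in front of the interface

print(docking_position((1.0, 0.0), spacing=0.8, robot_diameter=0.35))  # (1.0, 0.8)
```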
  • The robot 202 travels along the third path from the transit position 212 to the docking position 214, wherein the third path is calculated on the basis of the coordinates of the transit position in the environment map, which are determined from the images of the charging pile captured by the robot 202 at the transit position. The calculated third path ensures that the robot is already in a pose for being docked with the charging pile for charging at the instant of reaching the docking position from the transit position; the third path is not limited to a straight line. The robot 202 travels from the docking position 214 to the charging position 211 along a first path, wherein the first path is a straight line or an approximately straight line, and during the course of traveling along the first path, the charging pile 201 is identified in the images captured by the robot 202 in real time (i.e., on the first path, fine adjustment only occurs in the direction of the X axis, such that the charging pile 201 remains identified in the images captured by the robot 202 in real time). During the course of traveling along the first path, the robot 202 maintains a docking pose (for example, the charging socket of the robot 202 faces the charging interface of the charging pile 201); for example, the robot 202 and the charging pile 201 both maintain a docking state (e.g., keeping the cover of the charging interface in an opened state, or keeping a telescopic charging interface in the state for docking). Further, during the course of the robot traveling along the first path from the docking position 214 to the charging position 211, the robot 202 may, for example, adjust the first path based on an auxiliary pattern (e.g., a specific pattern or a two-dimensional code (QR code)) identified in the images captured by the robot 202 in real time, wherein the auxiliary pattern is provided on the charging pile. Further, the robot 202 may, for example, further adjust the first path via an openable auxiliary robot arm disposed on the charging pile 201. However, the present disclosure is not limited thereto. The two above manners may be used in combination, which is not detailed here. Finally, the robot 202 moves to the charging position 211 to dock with the charging pile 201 for charging.
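  • During the straight-line approach along the first path, the only correction applied is a small lateral (X-axis) adjustment that keeps the charging pile near the center of the real-time image. The proportional-control sketch below illustrates that idea; the gain and clamp values are made-up assumptions, not parameters from the disclosure.

```python
def first_path_correction(pile_center_x_px, image_width_px, gain=0.002, max_correction=0.05):
    """Lateral correction (metres, signed) that nudges the robot so the charging
    pile stays centered in the image while the heading is kept unchanged.

    pile_center_x_px -- horizontal pixel position of the identified charging pile
    image_width_px   -- width of the real-time image in pixels
    gain             -- illustrative proportional gain (metres per pixel of error)
    max_correction   -- clamp so only fine adjustment occurs on the first path
    """
    error_px = pile_center_x_px - image_width_px / 2.0
    correction = gain * error_px
    return max(-max_correction, min(max_correction, correction))

# Pile appears 40 px right of center in a 640 px wide frame: small rightward nudge,
# clamped to the 0.05 m maximum.
print(first_path_correction(360, 640))
```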
  • A plurality of exemplary embodiments of the present disclosure will be described with reference to FIGS. 3-7. FIGS. 3-7 show schematic diagrams of auto-recharging of a robot according to exemplary embodiments of the present disclosure. In these embodiments, a sweeping robot 202 is depicted as an example. The sweeping robot 202 performs cleaning work according to a predetermined working path 229 in an environment map 250 comprising rooms 251, 252, and 253. The position of the charging pile 201 and the transit area 241 are labeled in the environment map 250. Correspondingly, for the purpose of charging, the charging position 211 that the sweeping robot 202 needs to reach may also be labeled on the environment map.
  • First, referring to FIG. 3, the sweeping robot 202 moves and cleans along the working path 229 in the room 251 in the environment map 250. When the sweeping robot 202 detects that its battery power is lower than a predetermined threshold, such that it cannot continue working, the sweeping robot 202 marks its current position as the initial position 213. The position of the initial position 213 in the environment map may be determined based on the working path 229 of the sweeping robot 202. After charging is complete, the sweeping robot 202 returns to the initial position 213 to continue the unfinished cleaning work along the working path 229.
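  • The trigger logic of this step can be summarized as: when the battery falls below the threshold, remember the interruption point, recharge, and resume. The sketch below records that bookkeeping; the threshold value and class/method names are assumptions for illustration.

```python
BATTERY_THRESHOLD = 0.2   # assumed fraction of full charge

class SweepingRun:
    def __init__(self, working_path):
        self.working_path = working_path   # list of waypoints of the cleaning path
        self.resume_index = 0
        self.initial_position = None       # position 213 in FIG. 3, once set

    def check_battery(self, battery_level, current_position, current_index):
        """If the battery is low, remember where to resume and start recharging."""
        if battery_level < BATTERY_THRESHOLD:
            self.initial_position = current_position
            self.resume_index = current_index
            return "go_recharge"
        return "keep_cleaning"

    def after_charging(self):
        """Return the remaining part of the working path, starting at the initial position."""
        return self.working_path[self.resume_index:]
```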
  • Then, referring to FIG. 4, in this embodiment, the sweeping robot 202 plans a second path 221 based on the initial position 213 and the boundary of the transit area 241, wherein the second path 221 is a shortest obstacle-avoiding path from the initial position 213 to the boundary of the transit area 241; if there is no obstacle, the second path 221 is a straight-line path from the initial position 213 to the boundary of the transit area 241. The sweeping robot 202 moves along the second path 221 from the initial position 213 to the boundary of the transit area 241. At the boundary of the transit area 241, if the charging pile 201 is identified in the images captured by the robot 202 in real time, the current position of the robot 202 serves as a transit position 212. If the charging pile 201 is not identified in the images captured by the robot 202 in real time, the robot 202 acts according to a predetermined mode or an adaptive mode (e.g., acts within a predetermined scope according to a predetermined rotation or displacement) until the charging pile 201 is identified in the images captured by the robot 202 in real time, and the current position of the robot 202 then serves as the transit position 212. If the charging pile 201 is identified in the images captured in real time by the sweeping robot 202 at the initial position 213, the initial position 213 serves as the transit position 212.
  • Hereinafter, referring to FIG. 5, in an exemplary embodiment, the sweeping robot 202 travels along a third path 223 from the transit position 212 to the docking position 214 (pre-labeled on the environment map). The sweeping robot 202 may identify, at the transit position 212, the position of the charging pile 201 through the images captured in real time, and determine the docking position 214 in the transit area 241 based on the position of the charging pile 201, or determine the docking position 214 based on the position of the charging pile 201 labeled on the environment map.
  • In some embodiments, during the course of traveling along the third path 223 from the transit position 212 to the docking position 214, the sweeping robot 202 might encounter an obstacle 260. Therefore, the sweeping robot 202 may determine a horizontal plane based on the images captured in real time during traveling, and determine whether an obstacle 260 exists in the traveling direction based on whether the horizontal plane is blocked. If the sweeping robot 202 identifies the obstacle 260 on the third path 223, the sweeping robot 202 may, for example, adjust the third path 223 based on the positional relationship between the charging pile 201 and the obstacle 260 in the real-time captured images. For example, if the charging pile 201 is located on one side of the central line of the images captured by the sweeping robot 202 in real time, the sweeping robot 202 turns to that side so as to avoid the obstacle 260 and meanwhile re-plans the third path 223, causing the sweeping robot 202 to move towards the docking position 214. In some embodiments, the third path 223 is calculated on the basis of the coordinates of the transit position 212 on the environment map, which are determined from the image of the charging pile captured by the robot 202 at the transit position 212. The calculated third path 223 ensures that the robot 202 is already in a pose for being docked with the charging pile 201 for charging at the instant of reaching the docking position 214 from the transit position 212. The third path 223 may be the shortest obstacle-avoiding path from the transit position 212 to the docking position 214; if there is no obstacle, the third path 223 is a straight-line path from the transit position 212 to the docking position 214. The third path 223 is not limited to a straight-line path.
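  • The obstacle-avoidance rule on the third path 223 amounts to steering toward whichever side of the image center the charging pile occupies when the horizontal plane ahead is blocked. The following sketch captures that rule; the dead-band width and the returned action labels are illustrative assumptions.

```python
def third_path_steering(obstacle_ahead, pile_center_x_px, image_width_px, dead_band_px=20):
    """Decide the steering action while traveling toward the docking position.

    obstacle_ahead   -- True if the horizontal plane in front of the robot is blocked
    pile_center_x_px -- horizontal pixel position of the charging pile in the frame
    image_width_px   -- width of the real-time image in pixels
    """
    if not obstacle_ahead:
        return "go_straight"
    # Turn toward the side of the image center on which the pile appears,
    # which skirts the obstacle while still closing in on the docking position.
    offset = pile_center_x_px - image_width_px / 2.0
    if offset > dead_band_px:
        return "turn_right"
    if offset < -dead_band_px:
        return "turn_left"
    return "sidestep"   # pile dead ahead behind the obstacle: pick either side

print(third_path_steering(True, 420, 640))   # 'turn_right'
```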
  • Then, the sweeping robot 202 travels to the charging position 211 along the first path 222 from the docking position 214 to the charging position 211 and carries out the subsequent docking and charging. During the course of traveling along the first path 222, both the robot 202 and the charging pile 201 maintain a docking state. Hereinafter, referring to FIGS. 6 and 7, in another exemplary embodiment, the charging interface 261 of the sweeping robot 202 for plugging into the charging pile 201 and a sensor 262 of the sweeping robot 202 are located on the same side of the sweeping robot. To cause the charging interface 261 of the sweeping robot 202 to be exactly docked with the charging pile 201 when the sweeping robot 202 arrives at the charging position 211, a straight-line path may, for example, be planned during the real-time planning of the first path 222, so as to eliminate a step of rotating the sweeping robot 202 in situ after arriving at the charging position 211 to bring its charging interface 261 into docking with the charging pile 201. For example, the sweeping robot 202 may first move to the docking position 214 facing the charging interface of the charging pile 201, so that the charging interface 261 of the sweeping robot 202 at the docking position 214 already faces the charging interface of the charging pile 201; the sweeping robot 202 then only needs to adjust the path in real time so that, along the first path 222, the charging interface of the charging pile 201 is located at the center of the images captured in real time by the sweeping robot 202. In a variation of this embodiment, the above technical solution is adjusted based on the positions of the sensor 262 and the charging interface 261 provided on the sweeping robot 202. For example, the sensor 262 and the charging interface 261 are arranged at an angle on different sides of the sweeping robot 202; based on the angle between the sensor 262 and the charging interface 261, the orientation of the charging interface 261 may be determined in the images captured by the sensor 262, and the first path 222 of the sweeping robot 202 is then finely adjusted based on whether the charging interface of the charging pile 201 is aligned with the charging interface 261. The present disclosure is not limited thereto.
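  • When the sensor 262 and the charging interface 261 are mounted at an angle to each other, the image-centering rule has to be shifted by that angle: the pile should be held at an off-center pixel column rather than at the exact image center. The sketch below computes that target column under a simple pinhole-camera assumption; the mounting angle, field of view, and image width are example values only.

```python
import math

def target_pixel_for_interface(mount_angle_rad, horizontal_fov_rad, image_width_px):
    """Pixel column at which the charging pile should sit so that the robot's
    charging interface, mounted at mount_angle_rad from the camera axis,
    points at the pile (pinhole-camera approximation).
    """
    # Focal length in pixels for a pinhole camera: (w/2) / tan(fov/2),
    # so the pixel offset for an angle theta is (w/2) * tan(theta) / tan(fov/2).
    fraction = math.tan(mount_angle_rad) / math.tan(horizontal_fov_rad / 2.0)
    return image_width_px / 2.0 * (1.0 + fraction)

# Camera 15 degrees off the charging-interface axis, 90-degree FOV, 640 px frame:
# keep the pile to one side of the image center rather than exactly centered.
print(target_pixel_for_interface(math.radians(15), math.radians(90), 640))
```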
  • In the various embodiments above, the sensor 262 of the sweeping robot 202 may, for example, be a camera with a fixed viewing angle. In some other embodiments, the sensor 262 of the sweeping robot 202 may, for example, be a panoramic camera that can rotate 360°. The present disclosure may implement more variations, which will not be detailed here.
  • What has been described above only schematically depicts a plurality of embodiments of the present disclosure; the present disclosure is not limited thereto.
  • According to one or more embodiments described above, segmented path planning is performed on the path of returning to the pile, and different obstacle avoidance methods are adopted in different path segments with different likelihoods of obstacles. This better solves the problem of failing to return to the pile for charging due to obstacles between the robot and the charging pile, achieves intelligent obstacle avoidance, reduces the number of collisions of the robot, increases the service life of the robot, and improves the user experience and purchasing desire.
  • The present disclosure further provides a robot. Reference is now made to FIG. 8, which shows a modular diagram of a robot according to an embodiment of the present disclosure. The robot 300 comprises a sensor 310, a motor 320, and a processor 330.
  • The sensor 310 is at least configured to capture images surrounding the robot in real time.
  • The motor 320 is configured to drive the robot to move.
  • The processor 330 is configured to cause the robot to move from an initial position to a docking position, wherein the docking position faces a charging interface of the charging pile; the processor 330 is further configured to adjust the robot to travel along a first path from the docking position to a charging position so as to be docked with the charging pile at the charging position, wherein the first path is a straight-line or approximately straight-line path, the robot maintains a docking pose during the course of traveling along the first path, and the charging pile is identified in the images captured by the robot in real time.
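  • As an informal illustration of how the three modules of FIG. 8 could cooperate in software, consider the stub below; the class and method names are invented for the sketch and are not part of the disclosure.

```python
class Sensor:
    def capture(self):
        """Return the current image of the robot's surroundings (stubbed)."""
        return None

class Motor:
    def drive(self, linear, angular):
        """Drive the robot with the given linear/angular commands (stubbed)."""
        pass

class Processor:
    def __init__(self, sensor, motor):
        self.sensor, self.motor = sensor, motor

    def recharge(self):
        """High-level flow mirroring FIG. 8: reach the docking position, then
        follow the first path to the charging position while keeping the docking pose."""
        self.move_to_docking_position()
        self.follow_first_path_to_charging_position()

    def move_to_docking_position(self):
        pass   # second path + third path, as in the method described above

    def follow_first_path_to_charging_position(self):
        pass   # straight-line approach with fine X-axis corrections

robot = Processor(Sensor(), Motor())
```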
  • In some embodiments of the present disclosure, the robot may be a sweeping robot or a mopping robot.
  • FIG. 8 only schematically shows a modular diagram of the robot according to the present disclosure. Without departing from the idea of the present disclosure, splitting, merging, and adding of the modules all fall within the protection scope of the present disclosure.
  • The present disclosure provides an auto-recharging system for a robot. Please refer to FIG. 2. The auto-recharging system for a robot comprises the robot 300 (reference number 202 in FIG. 2) and the charging pile 201 shown in FIG. 8. In the auto-recharging system for a robot, the robot moves from an initial position to a docking position, wherein the docking position faces a charging interface of the charging pile; the robot travels from the docking position to a charging position along a first path such that the robot is docked to the charging pile at the charging position, wherein the first path is a straight-line or approximately straight-line path; the robot maintains a docking pose during the course of traveling along the first path, and the charging pile is identified in the images captured by the robot in real time.
  • In an exemplary embodiment of the present disclosure, there is further provided a computer readable storage medium on which a computer program is stored, wherein the program, when executed by a processor, may implement the steps of the auto-recharging method for a robot in any embodiment above. In some possible embodiments, various aspects of the present disclosure may be further implemented in the form of a program product comprising program codes; when the program product is run on a terminal equipment, the program codes are configured to cause the terminal equipment to execute the steps, described in the auto-recharging method for a robot in the description, according to various exemplary embodiments of the present disclosure.
  • Referring to FIG. 9, a program product 900 for implementing the method above according to the embodiments of the present disclosure is described. The program product 900 may adopt a portable compact disk read-only memory (CD-ROM), comprise program codes, and be run on a terminal equipment, for example, a personal computer. However, the program product of the present disclosure is not limited thereto. In the present disclosure, the readable storage medium may be any tangible medium containing or storing the program that may be used by an instruction executing system, device, or member, or a combination thereof.
  • The program product may adopt any combination of one or more readable mediums. The readable medium may be a readable signal medium or a readable storage medium. The readable storage medium, for example, may be, but is not limited to, an electrical, magnetic, optical, electromagnetic, infrared, or semiconductor system, device, or member, or any combination thereof. More specific examples (a non-exhaustive list) of the readable storage medium may include an electrical connection having one or more wires, a portable disk, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disk read-only memory (CD-ROM), an optical memory member, a magnetic memory member, or any appropriate combination thereof.
  • The readable signal medium may include a data signal propagated in a baseband or as part of a carrier wave, in which readable program codes are carried. A data signal propagated in such a way may adopt a plurality of forms, including, but not limited to, an electromagnetic signal, an optical signal, or any appropriate combination thereof. The readable signal medium may also be any readable medium other than the readable storage medium, which readable medium may send, propagate, or transmit the programs used by the instruction executing system, device, or member, or a combination thereof. The program codes included in the readable medium may be transmitted using any appropriate medium, including, but not limited to: wireless, wired, cable, RF, etc., or any appropriate combination thereof.
  • Program codes for carrying out operations of the present disclosure may be written in any combination of one or more programming languages, including object-oriented programming languages such as Java, C++ or the like, as well as conventional procedural programming languages, such as the “C” language or similar programming languages. The program codes may be executed entirely on the user's computing equipment, partially on the user's equipment, as a stand-alone software package, partially on the user's computing equipment and partially on a remote computing equipment, or entirely on the remote computing equipment or server. In a scenario involving a remote computing equipment, the remote computing equipment may be connected to the user's computing equipment through any type of network, including a local area network (LAN) or a wide area network (WAN), or connected to an external computing equipment (for example, through the Internet using an Internet Service Provider).
  • In an exemplary embodiment of the present disclosure, there is further provided an electronic equipment, which may comprise a processor (which may, for example, be used to implement the aforementioned processor 330) and a memory for storing executable instructions of the processor, wherein the processor is configured to execute the steps of the auto-recharging method for a robot in any one of the above embodiments by executing the executable instructions.
  • Those skilled in the art may understand that various aspects of the present disclosure may be implemented as a system, a method or a program product. Therefore, various aspects of the present disclosure may be specifically implemented in the following forms: complete hardware, complete software (including firmware and microcode, etc.), or a combination of hardware and software, which may be generally referred to as “a circuit,” “a module,” or “a system.”
  • Hereinafter, referring to FIG. 10, an electronic equipment 1000 according to such an embodiment of the present disclosure will be described. The electronic equipment 1000 shown in FIG. 10 is only an example, which should not constitute any limitation to the function and use scope of the embodiments of the present disclosure.
  • As shown in FIG. 10, the electronic equipment 1000 is represented in the form of a general computing equipment. Components of the electronic equipment 1000 may comprise, but are not limited to: at least one processing unit 1010 (for example, for implementing the above-mentioned processor 330), at least one memory unit 1020, and a bus 1030 connecting different system components (including the memory unit 1020 and the processing unit 1010), etc.
  • The memory unit stores program codes which may be executed by the processing unit 1010, causing the processing unit 1010 to execute the steps, described in the auto-recharging method for a robot in the description, according to various exemplary embodiments of the present disclosure. For example, the processing unit 1010 may execute the steps shown in FIG. 1.
  • The memory unit 1020 may comprise a readable medium in the form of a volatile memory unit, e.g. a random-access memory unit (RAM) 10201 and/or a cache memory unit 10202, and may further comprise a read-only memory unit (ROM) 10203.
  • The memory unit 1020 may further comprise a program/utility 10204 having a set (at least one) of program modules 10205. Such program modules 10205 include, but are not limited to: an operating system, one or more application programs, other program modules, and program data, wherein each of these examples, or a certain combination thereof, may include an implementation of a network environment.
  • The bus 1030 may represent one or more of several bus structures, including a memory unit bus or a memory unit controller, a peripheral bus, a graphics acceleration port, a processing unit, or a local bus using any of a plurality of bus structures.
  • The electronic equipment 1000 may also communicate with one or more external equipments 1100 (e.g., a keyboard, a pointing device, a Bluetooth device, etc.), with one or more equipments enabling the user to interact with the electronic equipment 1000, and/or with any equipment (e.g., a router, a modem, etc.) enabling the electronic equipment 1000 to communicate with one or more other computing equipments. Such communication may be carried out via an input/output (I/O) interface 1050. Moreover, the electronic equipment 1000 may further communicate with one or more networks (e.g., a local area network (LAN), a wide area network (WAN), and/or a public network, e.g., the Internet) via a network adapter 1060. The network adapter 1060 may communicate with other modules of the electronic equipment 1000 via the bus 1030. It should be understood that, although not shown in the figure, other hardware and/or software modules may be used in conjunction with the electronic equipment 1000, including, but not limited to, microcode, a device driver, a redundancy processing unit, an external disk drive array, a RAID system, a tape drive, and a data backup memory system, etc.
  • Through the descriptions of the embodiments above, those skilled in the art should easily understand that the exemplary embodiments described here may be implemented via software or via a combination of software and necessary hardware. Therefore, the technical solution according to the embodiments of the present disclosure may be embodied in the form of a software product. The software product may be stored in a non-volatile storage medium (which may be a CD-ROM, a USB flash disk, or a removable hard disk, etc.) or in a network, and includes a plurality of instructions to cause a computing equipment (which may be a personal computer, a server, or a network equipment, etc.) to execute the auto-recharging method for a robot according to the embodiments of the present disclosure.
  • After considering the specification and practicing the disclosure, those skilled in the art will easily envisage other embodiments of the present disclosure. The present application intends to cover any transformation, use, or adaptive variation of the present disclosure, and such transformations, uses, or adaptive variations follow the general principle of the present disclosure and include the common knowledge or customary technical means in the technical field that are not disclosed in the present disclosure. The specification and the embodiments are only regarded as exemplary, and the actual scope and spirit of the present disclosure are pointed out by the appended claims.

Claims (20)

1. A computer-implemented method, comprising:
moving a robot from an initial position to a docking position in an environment map that is generated by the robot, wherein the robot captures images of surroundings of the robot when generating the environment map, wherein the robot at the docking position faces a charging interface of a charging pile, and the docking position is determined based on a position of the charging pile identified by means of images captured by the robot in real time; and
moving the robot from the docking position to a charging position along a first path such that the robot is docked to the charging pile at the charging position, wherein the first path is a straight-line or approximately straight-line path, and wherein, during the course of traveling along the first path, the robot maintains a docking pose and the charging pile is identifiable in the images captured by the robot in real time, and
wherein the operation of moving the robot from the initial position to the docking position further comprises:
moving the robot from the initial position to a boundary of a transit area along a second path, wherein the second path is planned by the robot at the initial position, wherein the transit area is preset in the environment map when the environment map is generated, and the transit area is defined as an area in which each image captured by the robot when generating the environment map has the charging pile; and
moving the robot from the boundary of the transit area to the docking position along a third path, wherein the third path is adjusted in real time based on the images captured by the robot in real time.
2. (canceled)
3. The computer-implemented method according to claim 1, wherein the operation of moving the robot from the boundary of the transit area to the docking position along the third path further comprises:
determining whether the charging pile is identified in the images captured by the robot in real time when the robot moves to the transit area;
when positive, taking the current position of the robot as a transit position, and moving the robot from the transit position to the docking position along the third path;
when negative, moving the robot according to a predetermined mode or an adaptive mode until the charging pile is identified in the images captured by the robot in real time, and taking the position of the robot at which the charging pile is identified in the images captured by the robot as the transit position.
4. The computer-implemented method according to claim 3, wherein when the charging pile still fails to be identified in the images captured by the robot in real time after moving the robot according to the predetermined mode or the adaptive mode, an alert indicating a failure to find the charging pile is generated.
5. (canceled)
6. The computer-implemented method according to claim 3, wherein the docking position is located in the transit area, and the docking position is determined at the transit position on the basis of the position of the charging pile identified by means of images captured by the robot in real time at the transit position.
7. The computer-implemented method according to claim 1, wherein the operation of moving the robot from the docking position to the charging position along the first path comprises:
adjusting, by the robot, the first path based on an auxiliary pattern identified in images captured by the robot in real time, wherein the auxiliary pattern is provided on the charging pile.
8. (canceled)
9. The computer-implemented method according to claim 1, wherein the initial position of the robot is determined based on a motion trajectory of the robot in the environment map before recharging, and/or a position, in the environment map, of an identification feature in the images captured by the robot in real time, wherein the identification feature comprises a feature of the charging pile, or a feature of an object or a label in the environment map, wherein the object or the label has a fixed positional relationship with respect to the charging pile.
10. A robot, characterized in that the robot comprises:
a sensor at least configured to capture images of surroundings of the robot in real time;
a motor configured to drive the robot;
a processor configured to:
cause the robot to move from an initial position to a docking position in an environment map that is generated by the robot, wherein the robot captures images of surroundings of the robot when generating the environment map, wherein the robot at the docking position faces a charging interface of a charging pile, and the docking position is determined based on a position of the charging pile identified by means of images captured by the robot in real time; and
cause the robot to travel along a first path from the docking position to a charging position so as to be docked with the charging pile at the charging position, wherein the first path is a straight-line or approximately straight-line path, and wherein, during the course of traveling along the first path, the robot maintains a docking pose and the charging pile is identifiable in the images captured by the robot in real time, and
wherein the processor is configured to:
cause the robot to move from the initial position to a boundary of a transit area along a second path, wherein the second path is planned by the robot at the initial position, wherein the transit area is preset in the environment map when the environment map is generated, and the transit area is defined as an area in which each image captured by the robot when generating the environment map has the charging pile; and
cause the robot to move from the boundary of the transit area to the docking position along a third path, wherein the third path is adjusted in real time based on the images captured by the robot in real time.
11. (canceled)
12. The robot according to claim 10, wherein the processor is configured to:
determine whether the charging pile is identified in the images captured by the robot in real time when the robot moves to the boundary of the transit area;
when positive, take the current position of the robot as a transit position, and cause the robot to move from the transit position to the docking position along the third path;
when negative, cause the robot to move according to a predetermined mode or an adaptive mode until the charging pile is identified in the images captured by the robot in real time, and take the position of the robot at which the charging pile is identified in the images captured by the robot as the transit position.
13. The robot according to claim 12, wherein the processor is configured to generate an alert indicating a failure to find the charging pile, when the charging pile still fails to be identified in the images captured by the robot in real time after moving the robot according to the predetermined mode or the adaptive mode.
14. (canceled)
15. The robot according to claim 12, wherein the docking position is located in the transit area, and the docking position is determined at the transit position on the basis of the position of the charging pile identified by means of images captured by the robot in real time at the transit position.
16. The robot according to claim 10, wherein the processor is configured to:
adjust the first path based on an auxiliary pattern identified in images captured by the robot in real time, wherein the auxiliary pattern is provided on the charging pile.
17. (canceled)
18. The robot according to claim 10, wherein the initial position of the robot is determined based on a motion trajectory of the robot in the environment map before recharging, and/or a position, in the environment map, of an identification feature in the images captured by the robot in real time, wherein the identification feature comprises a feature of the charging pile, or a feature of an object or a label in the environment map, wherein the object or the label has a fixed positional relationship with respect to the charging pile.
19. A non-transitory storage medium comprising a computer program which, when executed by a processor, causes the processor to perform the operations as follows:
moving a robot from an initial position to a docking position in an environment map that is generated by the robot, wherein the robot captures images of surroundings of the robot when generating the environment map, wherein the robot at the docking position faces a charging interface of a charging pile, and the docking position is determined based on a position of the charging pile identified by means of images captured by the robot in real time; and
moving the robot from the docking position to a charging position along a first path so as to be docked with the charging pile at the charging position, wherein the first path is a straight-line or approximately straight-line path, and wherein, during the course of traveling along the first path, the robot maintains a docking pose and the charging pile is identifiable in the images captured by the robot in real time, and
wherein the operation of moving the robot from the initial position to the docking position further comprises:
moving the robot from the initial position to a boundary of a transit area along a second path, wherein the second path is planned by the robot at the initial position, wherein the transit area is preset in the environment map when the environment map is generated, and the transit area is defined as an area in which each image captured by the robot when generating the environment map has the charging pile; and
moving the robot from the boundary of the transit area to the docking position along a third path, wherein the third path is adjusted in real time based on the images captured by the robot in real time.
20. (canceled)
US16/396,613 2018-09-25 2019-04-26 Auto-recharging of robot Active US10585437B1 (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
CN201811118208.8A CN109683605B (en) 2018-09-25 2018-09-25 Robot and automatic recharging method and system thereof, electronic equipment and storage medium
CN201811118208.8 2018-09-25
CN201811118208 2018-09-25
PCT/CN2019/082317 WO2020062835A1 (en) 2018-09-25 2019-04-11 Robot and automatic recharging method and system therefor, electronic device and storage medium

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2019/082317 Continuation WO2020062835A1 (en) 2018-09-25 2019-04-11 Robot and automatic recharging method and system therefor, electronic device and storage medium

Publications (2)

Publication Number Publication Date
US10585437B1 US10585437B1 (en) 2020-03-10
US20200097017A1 true US20200097017A1 (en) 2020-03-26

Family

ID=66184526

Family Applications (1)

Application Number Title Priority Date Filing Date
US16/396,613 Active US10585437B1 (en) 2018-09-25 2019-04-26 Auto-recharging of robot

Country Status (6)

Country Link
US (1) US10585437B1 (en)
EP (1) EP3629120B8 (en)
JP (1) JP6631823B1 (en)
KR (1) KR20210003243A (en)
CN (1) CN109683605B (en)
WO (1) WO2020062835A1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110712204A (en) * 2019-09-27 2020-01-21 深圳乐动机器人有限公司 Robot working method and robot
CN111427361A (en) * 2020-04-21 2020-07-17 浙江欣奕华智能科技有限公司 Recharging method, recharging device and robot

Families Citing this family (24)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110597265A (en) * 2019-09-25 2019-12-20 深圳巴诺机器人有限公司 Recharging method and device for sweeping robot
CN110647047B (en) * 2019-09-30 2023-03-28 青岛海尔科技有限公司 Control method and device for equipment in smart home operating system and storage medium
CN111474928B (en) * 2020-04-02 2023-08-01 上海高仙自动化科技发展有限公司 Robot control method, robot, electronic device, and readable storage medium
CN111546333A (en) * 2020-04-24 2020-08-18 深圳市优必选科技股份有限公司 Robot and automatic control method and device thereof
CN113778068A (en) * 2020-05-22 2021-12-10 苏州科瓴精密机械科技有限公司 Self-moving device and charging station docking method and device, self-moving device, system and readable storage medium
CN111753695B (en) * 2020-06-17 2023-10-13 上海宜硕网络科技有限公司 Method and device for simulating robot charging return route and electronic equipment
CN111897319B (en) * 2020-06-17 2024-09-10 北京旷视机器人技术有限公司 Charging control method, device, electronic equipment and readable storage medium
CN111805553A (en) * 2020-06-23 2020-10-23 丰疆智能(深圳)有限公司 Pushing robot, pushing system and pushing management method
CN112022025A (en) * 2020-08-14 2020-12-04 深圳市大象机器人科技有限公司 Automatic robot back flushing method and system based on visual positioning
CN112183524A (en) * 2020-08-31 2021-01-05 深圳市优必选科技股份有限公司 Robot wired network docking method, system, terminal device and storage medium
CN112180989B (en) * 2020-09-30 2021-12-07 苏州盈科电子有限公司 Robot charging method and device
CN112674655B (en) * 2021-01-14 2022-06-10 深圳市云鼠科技开发有限公司 Wall-following-based refilling method and device, computer equipment and storage
CN113370816B (en) * 2021-02-25 2022-11-18 德鲁动力科技(成都)有限公司 Quadruped robot charging pile and fine positioning method thereof
CN112578799B (en) * 2021-02-25 2022-02-11 德鲁动力科技(成都)有限公司 Autonomous charging method for four-foot robot and autonomous charging four-foot robot
CN113138596B (en) * 2021-03-31 2024-09-17 深圳市优必选科技股份有限公司 Robot automatic charging method, system, terminal equipment and storage medium
CN112977136B (en) * 2021-04-08 2022-07-26 国网新疆电力有限公司乌鲁木齐供电公司 Sliding rail mobile type wall-mounted automobile charging pile
CN113706452B (en) * 2021-07-09 2024-10-18 武汉思恒达科技有限公司 Automatic charging pile detection system and detection method based on image recognition
CN113858268B (en) * 2021-08-02 2023-09-05 深兰机器人产业发展(河南)有限公司 Charging method of robot chassis and related device
CN113459852A (en) * 2021-09-01 2021-10-01 北京智行者科技有限公司 Path planning method and device and mobile tool
CN114285114A (en) * 2021-12-06 2022-04-05 北京云迹科技股份有限公司 Charging control method and device, electronic equipment and storage medium
CN114355911B (en) * 2021-12-24 2024-03-29 深圳甲壳虫智能有限公司 Charging method and device for robot, robot and storage medium
CN114815858B (en) * 2022-06-29 2022-11-08 季华实验室 Robot automatic charging method and device, electronic equipment and storage medium
CN115268432A (en) * 2022-07-11 2022-11-01 深圳市优必选科技股份有限公司 Robot, automatic recharging method and device thereof and storage medium
JP2024141341A (en) * 2023-03-29 2024-10-10 日野自動車株式会社 Initial Position Setting Device

Family Cites Families (24)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
AU767561B2 (en) * 2001-04-18 2003-11-13 Samsung Kwangju Electronics Co., Ltd. Robot cleaner, system employing the same and method for reconnecting to external recharging device
JP2004237075A (en) * 2003-02-06 2004-08-26 Samsung Kwangju Electronics Co Ltd Robot cleaner system provided with external charger and connection method for robot cleaner to external charger
KR100595923B1 (en) * 2005-02-25 2006-07-05 삼성광주전자 주식회사 Automatic cleaning apparatus and a method for controlling the same
KR20060110483A (en) * 2005-04-20 2006-10-25 엘지전자 주식회사 Cleaning robot having function of returning charging equipment and method for thereof
KR100766434B1 (en) * 2005-07-22 2007-10-15 엘지전자 주식회사 Robot having function of recognizing image and leading method for thereof
KR101553654B1 (en) * 2009-02-13 2015-10-01 삼성전자 주식회사 Mobile robot and method for moving of mobile robot
KR101672787B1 (en) * 2009-06-19 2016-11-17 삼성전자주식회사 Robot cleaner and docking station and robot cleaner system having the same and control method thereof
KR101752190B1 (en) * 2010-11-24 2017-06-30 삼성전자주식회사 Robot cleaner and method for controlling the same
US8515580B2 (en) * 2011-06-17 2013-08-20 Microsoft Corporation Docking process for recharging an autonomous mobile device
KR101295959B1 (en) * 2011-10-20 2013-08-13 충북대학교 산학협력단 Position estimation apparatus of mobile robot using indoor ceiling image
CN102662400A (en) * 2012-05-10 2012-09-12 慈溪思达电子科技有限公司 Path planning algorithm of mowing robot
EP2903787B1 (en) 2012-10-05 2019-05-15 iRobot Corporation Robot management systems for determining docking station pose including mobile robots and methods using same
GB2513912B (en) 2013-05-10 2018-01-24 Dyson Technology Ltd Apparatus for guiding an autonomous vehicle towards a docking station
US9840003B2 (en) * 2015-06-24 2017-12-12 Brain Corporation Apparatus and methods for safe navigation of robotic devices
KR101660703B1 (en) 2015-06-26 2016-09-28 주식회사 유진로봇 Visual homing system and method using stereo camera and active logo
DE102015114883A1 (en) 2015-09-04 2017-03-09 RobArt GmbH Identification and localization of a base station of an autonomous mobile robot
CN106125724A (en) * 2016-06-13 2016-11-16 华讯方舟科技有限公司 A kind of method and system of robot autonomous charging
CN106097341A (en) * 2016-06-13 2016-11-09 华讯方舟科技有限公司 A kind of method and system of robot autonomous charging
CN107637255B (en) * 2016-07-22 2020-10-20 苏州宝时得电动工具有限公司 Walking path control method and automatic working system of intelligent mower
CN106647747B (en) * 2016-11-30 2019-08-23 北京儒博科技有限公司 A kind of robot charging method and device
CN107291084B (en) * 2017-08-08 2023-08-15 小狗电器互联网科技(北京)股份有限公司 Sweeping robot charging system, sweeping robot and charging seat
US10243379B1 (en) * 2017-09-22 2019-03-26 Locus Robotics Corp. Robot charging station protective member
US10761539B2 (en) * 2017-11-22 2020-09-01 Locus Robotics Corp. Robot charger docking control
CN107945233B (en) * 2017-12-04 2020-11-24 深圳市无限动力发展有限公司 Visual floor sweeping robot and refilling method thereof


Also Published As

Publication number Publication date
US10585437B1 (en) 2020-03-10
EP3629120B8 (en) 2021-03-10
JP6631823B1 (en) 2020-01-15
KR20210003243A (en) 2021-01-11
EP3629120A1 (en) 2020-04-01
JP2020053007A (en) 2020-04-02
EP3629120B1 (en) 2020-12-23
CN109683605A (en) 2019-04-26
CN109683605B (en) 2020-11-24
WO2020062835A1 (en) 2020-04-02

Similar Documents

Publication Publication Date Title
US10585437B1 (en) Auto-recharging of robot
US20240019869A1 (en) Cleaning robot and control method therefor
EP3460614B1 (en) Combined robot and cruising path generation method therefor
CN106877454B (en) Robot charging method and device
US10775803B2 (en) Docking system and method for charging a mobile robot
US20210001480A1 (en) Autonomous Mobile Robot And Method For Controlling An Autonomous Mobile Robot
US20130218342A1 (en) Control method for cleaning robots
US20170083023A1 (en) Apparatus for localizing cleaning robot, cleaning robot, and controlling method of cleaning robot
TWI529507B (en) Charging station and charging system
TWI424296B (en) Guidance device and operation system utilizing the same
KR20190088115A (en) Moving apparatus for cleaning, and system and method for cooperative cleaning thereof
JP2021504793A (en) Robot charger docking control
JP2013168150A (en) Charging station and charging system
WO2023025028A1 (en) Charging method, charging apparatus, and robot
CN109986561A (en) A kind of robot long-distance control method, device and storage medium
CN102087529B (en) Movable device and control method thereof
US11004317B2 (en) Moving devices and controlling methods, remote controlling systems and computer products thereof
KR102436960B1 (en) Method for providing charging system for home robot
KR20180060339A (en) Robot cleaner system for controlling recharge of mobile robot and method thereof
CN116700262B (en) Automatic recharging control method, device, equipment and medium for mobile robot
WO2022252849A1 (en) Self-moving device
WO2021190514A1 (en) Automatic working system, self-moving device and control method therefor
Yang et al. Development, Planning and Control of an Autonomous Mobile Manipulator for Power Substation Live-Maintaining
EP4148525A1 (en) Method and apparatus for docking self-moving device to charging station, and self-moving device and readable storage medium
CN117128975B (en) Navigation method, system, medium and equipment for switch cabinet inspection operation robot

Legal Events

Date Code Title Description
AS Assignment

Owner name: NEXTVPU (SHANGHAI) CO., LTD., CHINA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:ZHOU, JI;FENG, XINPENG;REEL/FRAME:049015/0012

Effective date: 20190411

FEPP Fee payment procedure

Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: SMALL ENTITY

FEPP Fee payment procedure

Free format text: ENTITY STATUS SET TO SMALL (ORIGINAL EVENT CODE: SMAL); ENTITY STATUS OF PATENT OWNER: SMALL ENTITY

STCF Information on status: patent grant

Free format text: PATENTED CASE

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YR, SMALL ENTITY (ORIGINAL EVENT CODE: M2551); ENTITY STATUS OF PATENT OWNER: SMALL ENTITY

Year of fee payment: 4