CN110477825B - Cleaning robot, autonomous charging method, system, and readable storage medium - Google Patents


Info

Publication number
CN110477825B
CN110477825B (application CN201910811775.XA)
Authority
CN
China
Prior art keywords
cleaning robot
image
charging
area
charging seat
Prior art date
Legal status: Active (status is an assumption, not a legal conclusion)
Application number
CN201910811775.XA
Other languages
Chinese (zh)
Other versions
CN110477825A (en
Inventor
刘煜熙 (Liu Yuxi)
李浩明 (Li Haoming)
Current Assignee (listing may be inaccurate; not verified by legal analysis)
Shanghai Flyco Electrical Appliance Co Ltd
Original Assignee
Shenzhen Feike Robot Co ltd
Priority date (an assumption, not a legal conclusion)
Filing date
Publication date
Application filed by Shenzhen Feike Robot Co ltd filed Critical Shenzhen Feike Robot Co ltd
Priority to CN201910811775.XA
Publication of CN110477825A
Application granted
Publication of CN110477825B

Classifications

    • A: Human necessities
    • A47: Furniture; domestic articles or appliances; coffee mills; spice mills; suction cleaners in general
    • A47L: Domestic washing or cleaning; suction cleaners in general
    • A47L11/00: Machines for cleaning floors, carpets, furniture, walls, or wall coverings
    • A47L11/24: Floor-sweeping machines, motor-driven
    • A47L11/40: Parts or details of machines not provided for in groups A47L11/02-A47L11/38, or not restricted to one of these groups, e.g. handles, arrangements of switches, skirts, buffers, levers
    • A47L11/4011: Regulation of the cleaning machine by electric means; control systems and remote control systems therefor
    • A47L11/4091: Storing or parking devices, arrangements therefor; means allowing transport of the machine when it is not being used

Landscapes

  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
  • Electric Vacuum Cleaner (AREA)

Abstract

The invention discloses an autonomous charging method applied to a cleaning robot, wherein the cleaning robot comprises an image acquisition unit for capturing an image of the current environment. The autonomous charging method includes the steps of: controlling the cleaning robot to move along a preset direction when it is determined that the cleaning robot is within a first area range; determining position information of a charging dock based on images captured within a second area range when it is determined that the cleaning robot is within the second area range; and controlling the cleaning robot to move to the charging dock to charge according to the position information, wherein the second area range is closer to the charging dock than the first area range. The invention also provides an autonomous charging system, a cleaning robot, and a readable storage medium. The invention reduces the amount of computation the cleaning robot performs during recharging, reduces image-sensor noise interference, and improves positioning accuracy.

Description

Cleaning robot, autonomous charging method, system, and readable storage medium
Technical Field
The invention relates to the technical field of robot control, in particular to a cleaning robot, an autonomous charging method, an autonomous charging system and a readable storage medium.
Background
With the development of technology, cleaning robots are increasingly widely used, and automatic positioning and automatic recharging are among their important features. The existing automatic recharging method works as follows: when it is determined that a charging dock exists around the cleaning robot, the position of the dock is determined, and the robot is guided in real time toward the dock according to that position so as to charge it. However, this approach requires the robot to perform positioning and navigation continuously, which consumes considerable battery power.
Disclosure of Invention
The embodiments of the invention disclose a cleaning robot, an autonomous charging method, an autonomous charging system, and a readable storage medium, aiming to solve the above problem of high power consumption caused by continuous positioning and navigation during recharging.
In order to solve the above technical problem, a first aspect of the present invention provides a cleaning robot comprising:
the image acquisition unit is used for acquiring an image of the current environment;
a processing unit, configured to control the cleaning robot to move along a preset direction when the cleaning robot is determined to be within a first area range; and,
when the cleaning robot is determined to be in a second area range, determining position information of a charging seat based on an image collected in the second area range, and controlling the cleaning robot to move to the charging seat for charging according to the position information; wherein the second area range is closer to the charging dock than the first area range.
In a second aspect, the present invention provides an autonomous charging method applied to a cleaning robot, the autonomous charging method including the steps of:
acquiring a current environment image;
controlling the cleaning robot to move along a preset direction when the cleaning robot is determined to be in a first area range;
determining position information of a charging stand based on an image collected within a second area range when it is determined that the cleaning robot is within the second area range; and
controlling the cleaning robot to move to the charging seat to charge according to the position information; wherein the second area range is closer to the charging dock than the first area range.
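For illustration only (this sketch is not part of the claimed method), the two-area strategy above can be expressed as a decision function. The distance thresholds and action names below are assumptions introduced for the example:

```python
# Illustrative sketch of the two-area recharging strategy.  The distance
# thresholds and action names are assumptions, not taken from the patent.
FIRST_AREA = (1.0, 3.0)    # metres from the charging dock (hypothetical)
SECOND_AREA = (0.0, 1.0)   # closer to the dock than FIRST_AREA

def recharge_action(distance_m: float) -> str:
    """Choose the control action from the robot's distance to the dock.

    In the farther first area the robot simply moves along the preset
    direction (no per-frame localization); only in the nearer second
    area does it localize the dock from camera images and approach it.
    """
    if SECOND_AREA[0] <= distance_m < SECOND_AREA[1]:
        return "localize_dock_and_approach"
    if FIRST_AREA[0] <= distance_m < FIRST_AREA[1]:
        return "move_along_preset_direction"
    return "search_for_dock"
```

Because image-based localization runs only in the second area, the per-frame computation in the first area stays minimal, which is the power saving the method claims.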
In a third aspect, the present invention provides an autonomous charging system including a charging stand, the autonomous charging system further including the cleaning robot described in the first aspect.
In a fourth aspect, the present invention provides a readable storage medium having stored thereon a corresponding program of an autonomous charging method, the corresponding program of the autonomous charging method, when executed, implementing the autonomous charging method described in the second aspect.
In a fifth aspect, the present invention also provides a cleaning robot comprising at least one module that can be used to implement the method of the second aspect.
In a sixth aspect, embodiments of the present application provide a computer program product comprising program instructions which, when executed by a cleaning robot, cause the cleaning robot to perform the method of the second aspect described above. The computer program product may be a software installation package which, when the method of the second aspect is to be used, can be downloaded and executed on a cleaning robot to carry out that method.
According to the cleaning robot, the autonomous charging method, the autonomous charging system, and the readable storage medium provided herein, when the cleaning robot is determined to be within a first area range, it is controlled to move along a preset direction; when it is determined to be within a second area range, the position of the charging dock is determined from images captured in that range, and the robot is controlled to move to the dock to charge. Because the second area range is closer to the dock than the first, and in contrast to the real-time positioning and navigation of existing recharging techniques, the dock's position is determined only once the robot is within the nearer second area range. This reduces the amount of computation, avoids sensor noise interference, and lowers the energy the robot consumes on continuous self-localization and navigation during recharging.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present invention, the drawings needed to be used in the embodiments will be briefly described below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and it is obvious for those skilled in the art that other drawings can be obtained according to these drawings without creative efforts.
Fig. 1 is a block diagram of a cleaning robot according to an embodiment of the present invention.
Fig. 2A is a top schematic view of a cleaning robot according to an embodiment of the invention.
Fig. 2B is a bottom schematic view of the cleaning robot in an embodiment of the invention.
Fig. 3 is a schematic diagram of an autonomous charging system according to an embodiment of the invention.
Fig. 4 is a block diagram of a main body seat according to an embodiment of the present invention.
Fig. 5 is a schematic view of a search area of the cleaning robot according to an embodiment of the present invention.
Fig. 6 is a schematic view illustrating a distance between the cleaning robot and the charging stand according to an embodiment of the present invention.
Fig. 7 is a flowchart illustrating steps of an autonomous charging method according to an embodiment of the invention.
Fig. 8 is a flowchart illustrating steps of an autonomous charging method according to another embodiment of the present invention.
Fig. 9 is a flowchart illustrating steps of an autonomous charging method according to still another embodiment of the present invention.
Fig. 10 is a block diagram showing a cleaning robot according to another embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
It is to be understood that the terminology used in the embodiments of the present application is for the purpose of describing particular embodiments only, and is not intended to be limiting of the application. As used in the examples of this application and the appended claims, the singular forms "a", "an", and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It should also be understood that the term "and/or" as used herein refers to and encompasses any and all possible combinations of one or more of the associated listed items.
Referring to fig. 1, one embodiment of a cleaning robot provided by the present application includes: image acquisition unit 110, battery unit 120, drive unit 130, left wheel 131, right wheel 132, guide wheel 133, cleaning unit 140, processing unit 150, storage unit 160, obstacle detection unit 170.
The image capturing unit 110 is used to capture an image of the cleaning robot's current environment. The image capturing unit 110 includes one or more cameras, each of which may be a two-dimensional or a three-dimensional camera. For example, a two-dimensional camera may be placed on the upper surface of the cleaning robot to capture images above it, i.e., images of the ceiling of the space to be cleaned.
As another example, a three-dimensional camera may be placed at the front of the cleaning robot to capture a three-dimensional image of the scene ahead, as shown in fig. 2A. The three-dimensional image includes, in addition to a two-dimensional image of the object to be captured, information about the distance to that object. A stereo camera module or a depth sensor module may serve as the three-dimensional camera.
The image acquisition unit 110 may include one or more of a depth sensor 111, an RGB image sensor 112, or a structured light image sensor 113.
The depth sensor 111 includes: a two-dimensional camera that captures an image of the object; and an infrared sensor that irradiates the object with infrared light and detects the intensity of the light reflected back, thereby measuring the distance to the object in the two-dimensional image. The depth sensor outputs the image captured by the two-dimensional camera together with the distance information obtained by the infrared sensor.
The RGB image sensor 112 captures RGB images, also called color images. For example, the charging dock may be photographed with the RGB image sensor to obtain an RGB image containing the dock.
The structured light image sensor 113 includes an infrared transceiver module. For example, the infrared transceiver module measures the distance from the cleaning robot to the charging dock, and a three-dimensional image of the dock is generated from that distance.
The stereo camera module includes a plurality of two-dimensional cameras and determines distance information about the object from the differences between the images those cameras capture. The stereo camera module outputs one of the captured images together with the distance information about the object.
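The distance determination from the image differences can be illustrated with the standard stereo disparity relation Z = f·B/d; this is the generic textbook formula, not the patent's specific implementation:

```python
def stereo_depth_m(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Depth of a point from a rectified stereo pair: Z = f * B / d.

    Generic illustration (not the patent's algorithm):
    focal_px     -- focal length of the cameras, in pixels
    baseline_m   -- separation of the two camera centres, in metres
    disparity_px -- horizontal shift of the point between the two images
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    return focal_px * baseline_m / disparity_px
```

A nearer object shifts more between the two views, so a larger disparity yields a smaller depth.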
The image acquisition unit 110 may further include a graphics processing unit that processes the acquired images as needed. Such as changing the size or resolution of the image captured by the camera.
The battery unit 120 includes a rechargeable battery, a charging circuit connected to the rechargeable battery, and the electrodes of the rechargeable battery. One or more rechargeable batteries provide the electric energy required for the cleaning robot's operation. The electrodes may be provided on the side or at the bottom of the robot's body. The battery unit 120 may also include a battery parameter detection component for detecting parameters such as voltage, current, and battery temperature. When the working mode of the cleaning robot switches to the recharging mode, the robot starts searching for the charging dock and uses it to charge itself.
The driving unit 130 includes motors for applying driving force. The driving unit 130 is connected to the cleaning unit 140, the left wheel 131, the right wheel 132, and the guide wheel 133, and may drive them under the control of the processing unit 150. Alternatively, the driving unit 130 includes a cleaning driving sub-unit connected to the cleaning unit 140, a left wheel driving sub-unit connected to the left wheel 131, a right wheel driving sub-unit connected to the right wheel 132, and a guide wheel driving sub-unit connected to the guide wheel 133.
The left wheel 131 and the right wheel 132 (also called travel wheels or driving wheels) are disposed symmetrically on opposite sides of the bottom of the robot body. During cleaning they perform the movement operations, including moving forward, moving backward, and rotating. The guide wheel 133 may be provided at the front or rear of the body.
As shown in fig. 2B, the cleaning unit 140 includes a main brush 141 and one or more side brushes 142. The main brush 141 is installed at the bottom of the body of the cleaning robot; optionally, it is a drum-shaped brush that rotates, roller-like, against the contact surface. The side brushes 142 are mounted at the left and right edge portions of the front end of the bottom surface, i.e., roughly in front of the travel wheels. The side brushes 142 clean areas that the main brush 141 cannot reach; moreover, a side brush may not only rotate in place but also be installed to protrude beyond the outline of the cleaning robot, enlarging the area the robot can sweep.
The obstacle detecting unit 170 is used to sense the cleaning robot's surroundings and thereby find obstacles, walls, steps, and environmental objects such as the charging dock used to charge the robot. It also provides the control module with various position and motion-state information about the cleaning robot. The obstacle detection unit 170 may include a cliff sensor, an ultrasonic sensor, an infrared sensor, a magnetometer, a three-axis accelerometer, a gyroscope, an odometer, a laser distance sensor (LDS), a camera, a Hall sensor, and the like. The number and positions of obstacle detection units 170 are not limited in this embodiment.
The processing unit 150 is disposed on a circuit board in the body of the cleaning robot and can draw an instant map of the robot's environment from the surrounding-object information fed back by the obstacle detecting unit 170 together with a preset positioning algorithm. The processing unit 150 can further determine the robot's current working state from the distance and speed information fed back by devices such as the cliff sensor, ultrasonic sensor, infrared sensor, magnetometer, accelerometer, gyroscope, and odometer. The processing unit 150 may be implemented by one or more application-specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field-programmable gate arrays (FPGAs), controllers, microprocessors, or other electronic components for performing the autonomous charging method in the embodiments of the present disclosure.
The storage unit 160 is used to store instructions and data, including but not limited to map data and temporary data generated while controlling the operation of the cleaning robot, such as its position and speed data. The processing unit 150 reads the instructions stored in the storage unit 160 to execute the corresponding functions. The storage unit 160 may include random access memory (RAM) and non-volatile memory (NVM). The non-volatile memory may include a hard disk drive (HDD), a solid state drive (SSD), a silicon disk drive (SDD), read-only memory (ROM), compact disc read-only memory (CD-ROM), magnetic tape, a floppy disk, an optical data storage device, and the like.
It is understood that in one or more embodiments, the cleaning robot may further include an input-output unit, a position measurement unit, a wireless communication unit, a display unit, and the like.
Fig. 2A and 2B are schematic views of the cleaning robot 10 from two different viewing angles. As shown in fig. 2A, an image capturing unit 110 is provided at a side of the cleaning robot 10 to capture images of the environment ahead. As shown in fig. 2B, the bottom of the cleaning robot 10 carries the left wheel 131, the right wheel 132, the guide wheel 133, the cleaning unit 140, and the battery unit 120. The cleaning unit 140 includes a main brush 141 and a side brush 142. The rechargeable battery in the battery unit 120 is enclosed inside the cleaning robot 10 by a cover to prevent it from falling out. One of the electrodes 121 and 122 of the rechargeable battery is the positive electrode and the other the negative electrode.
It should be noted that the connection relationship between the units or components in the cleaning robot is not limited to the connection relationship shown in fig. 1. For example, the processing unit 150 may be connected to other units or components via a bus.
It should be noted that the cleaning robot may further include other units or components, or only include some of the units or components, which is not limited in this embodiment, and only the cleaning robot is described as an example.
Please refer to fig. 3, which is a schematic diagram illustrating an architecture of an autonomous charging system 900 according to an embodiment of the present invention. As shown in fig. 3, the autonomous charging system 900 includes the cleaning robot 10 and a charging stand 800. The charging stand 800 is used to charge the cleaning robot 10, that is, when the cleaning robot 10 moves onto the charging stand 800, the cleaning robot 10 can be electrically connected to the charging stand 800, so that the charging stand 800 can charge the cleaning robot 10. In the present embodiment, the cleaning robot 10 has a substantially disc shape. In other embodiments, the cleaning robot 10 may also have other shapes, such as a square shape, and is not limited herein.
Specifically, the charging stand 800 includes a base 810 and a main body seat 820 connected to the base. In this embodiment, the base 810 is vertically connected to the main body seat 820; specifically, the base 810 is a structure extending perpendicularly from the bottom of the main body seat 820. In this embodiment the base 810 and the main body seat 820 are integrally formed; in other embodiments the main body seat 820 may be connected to the base 810 in other ways, for example by welding or bonding, which is not limited here. Alternatively, in other embodiments, the base 810 may be connected to the main body seat 820 at an incline.
In some embodiments, the base 810 is a plate-shaped structure, and is a bottom plate extending perpendicularly from the bottom of the main body seat 820. That is, when the charging stand 800 is placed on the ground, the extension direction of the base 810 is parallel to the ground.
Two charging electrode pads 811 are arranged on the base 810 at an interval; one is the positive charging electrode pad and the other the negative charging electrode pad. When the cleaning robot 10 moves onto the base 810, its two electrodes 121/122 come into electrical contact with the two charging electrode pads 811, respectively, electrically connecting the cleaning robot 10 to the charging stand 800. Here, the bottom of the cleaning robot 10 refers to the portion that faces the floor when the robot is placed on the floor for cleaning. It is understood that, in other embodiments, the charging electrode pads 811 may instead be disposed on the main body seat 820, which is not limited here.
As shown in fig. 3, in some embodiments the two charging electrode pads 811 protrude from the base 810, two recesses are formed at intervals in the bottom of the cleaning robot 10, and the robot's two electrodes 121/122 are disposed in the corresponding recesses; when the cleaning robot 10 moves onto the base 810, the two charging electrode pads 811 fit into the two recesses. The cleaning robot 10 and the base 810 thus remain relatively fixed, improving the stability of the charging connection.
In other embodiments, the cleaning robot 10 may be provided without the recesses, and the two charging electrode pads 811 of the charging stand 800 may retract into the base 810 under external force and elastically return to protrude from it when the force is removed. When the cleaning robot 10 moves onto the base 810, the pressure of the robot pushes the two charging electrode pads 811 into the base, and the elastic restoring force keeps them pressed tightly against, and electrically connected to, the robot's two electrodes 121/122. This improves the stability of the electrical contact.
Springs may be disposed below the two charging electrode pads 811 so that the pads retract into the base 810 under external force and elastically return to protrude from it when the force is removed. Each spring may be a coil spring, a leaf spring, or the like, and may be wrapped in insulating material so as not to affect the electrical properties of the two charging electrode pads 811, improving charging safety.
The main body seat 820 is provided with a charging dock identifier 821. In some embodiments, the identifier 821 is a device bearing a recognizable pattern, where the pattern comprises one or more combinations of dots, lines, faces, and colors. Preferably, to improve the detection of the identifier 821, the pattern may be a matrix of alternating black and white squares. In some embodiments, the identifier 821 may instead be a pattern with specific content, such as a lightning symbol.
Referring to fig. 4, in some embodiments a charging circuit 822 is disposed in the main body seat 820 and electrically connected to the charging electrode pads 811. The charging circuit 822 outputs a charging voltage and current to the pads so that, when the cleaning robot 10 moves onto the base 810 and its two electrodes 121/122 contact the two charging electrode pads 811, the robot is charged through the pads.
In some embodiments, a charging interface 823 may be further disposed on the main body seat 820, the charging circuit 822 is further electrically connected to the charging interface 823, and when the charging interface 823 is electrically connected to a mains power supply through a power line, the charging circuit 822 converts the electric energy of the mains power supply into a suitable charging voltage and charging current.
In some embodiments, a battery 824 is further disposed in the main body seat 820 and connected to the charging circuit 822. When the charging interface 823 is electrically connected to mains power, the charging circuit 822 converts the mains energy into a suitable charging voltage and current to charge both the battery 824 and the cleaning robot 10. When the charging interface 823 is not connected to mains power and the cleaning robot 10 moves onto the base 810 to be charged, the charging circuit 822 instead converts the energy of the battery 824 into a suitable charging voltage and current to charge the cleaning robot 10.
Thus, since the battery 824 is disposed in the charging stand 800, the charging stand 800 can be placed at any position as a portable power source to charge the cleaning robot 10, thereby improving flexibility and convenience.
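As an illustration only, the dock's source-selection behaviour described above can be modelled as follows; the function and load names are hypothetical, not from the patent:

```python
def dock_charging_loads(mains_connected: bool, robot_docked: bool) -> list:
    """Model of what the dock's charging circuit 822 powers.

    With mains power the circuit charges the internal battery 824 and,
    if the robot is docked, the robot as well; without mains power the
    internal battery alone charges a docked robot (portable mode).
    Purely illustrative; names are assumptions.
    """
    loads = []
    if mains_connected:
        loads.append("internal_battery")
        if robot_docked:
            loads.append("robot_from_mains")
    elif robot_docked:
        loads.append("robot_from_battery")
    return loads
```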
Specifically, the processing unit 150 is configured to control the cleaning robot 10 to move along a preset direction when the cleaning robot 10 is determined to be in the first area range; when it is determined that the cleaning robot 10 is in the second area range, the position information of the charging stand 800 is determined based on the image collected in the second area range, and the cleaning robot 10 is controlled to move to the charging stand 800 to be charged according to the position information. Wherein the second area range is closer to the charging dock than the first area range.
In the cleaning robot 10 provided by the embodiment of the invention, when the robot is determined to be within a first area range, it is controlled to move along a preset direction; when it is determined to be within a second area range, the position of the charging dock 800 is determined from images captured in that range, and the robot is controlled to move to the dock 800 to charge. Compared with the real-time positioning and navigation of existing recharging techniques, the scheme here determines the dock's position only once the cleaning robot 10 is within the second area range, which is closer to the dock. This reduces the amount of computation, avoids sensor noise interference, lowers the energy the robot consumes on continuous self-localization and navigation during recharging, extends the robot's operating time, and also prolongs the service life of the battery.
In this embodiment, the predetermined direction is a direction toward the charging stand 800, so that the time required for the recharging process can be reduced.
In some embodiments, the image acquisition unit 110 is configured to acquire an image of the current environment. The processing unit 150 is also used to determine whether there is a charging stand 800 around from the image captured by the image capturing unit 110 before determining that the cleaning robot 10 is in the first area range or the second area range.
In some embodiments, the processing unit 150 determining whether the charging dock 800 is present around according to the image captured by the image capturing unit 110 includes: the processing unit 150 controls the cleaning robot 10 to rotate in place and captures an image of the current environment through the image capturing unit 110 during the rotation. When the processing unit 150 recognizes the charging stand 800 from the captured image, it determines that the charging stand 800 exists around the cleaning robot 10. The charging dock 800 can be recognized either from the image of the charging dock 800 itself or from the image of the charging seat identifier 821. Since the relative positional relationship between the charging dock 800 and the charging seat identifier 821 is fixed, the position of the charging dock 800 can be deduced once the charging seat identifier 821 is recognized.
In some embodiments, in order to improve the accuracy of detection and reduce the time required for rotation, the processing unit 150 controls the cleaning robot 10 to rotate in place and captures an image of the current environment through the image capturing unit 110 during rotation, including: after the processing unit 150 controls the cleaning robot 10 to rotate at the first angular speed by the preset angle, the cleaning robot 10 is controlled to continue to rotate at the second angular speed for the preset time, and the image of the current environment is acquired by the image acquisition unit 110 within the preset time of the rotation at the second angular speed. Wherein the second angular velocity is less than the first angular velocity.
For example, when the image capturing unit 110 is a camera with an angle of view θ and 2θ/3 is taken as the preset angle, the cleaning robot 10 may be controlled to rotate by 2θ/3 (the preset angle) at the first angular speed, then rotate at the second angular speed for a preset time, and capture an image of the current environment during that preset time. Since the image capturing unit 110 (e.g., a camera) needs a certain amount of time to expose an image, rotating too fast blurs the image, while rotating too slowly increases the time required to search for the charging stand 800. Therefore, images are captured at the lower angular velocity (the second angular velocity) to reduce motion blur and improve detection accuracy, while the remaining rotation is performed at the higher angular velocity (the first angular velocity) to raise the average angular velocity of the whole rotation and improve detection efficiency.
In this embodiment, when the preset time of rotation at the second angular velocity ends and the processing unit 150 determines that no charging seat 800 appears in the image captured by the image capturing unit 110, the cleaning robot 10 continues rotating in the manner described above: after rotating by 2θ/3 (the preset angle) at the first angular velocity at the current position, it again rotates at the second angular velocity for the preset time, until the images captured by the image capturing unit 110 cover a full 360 degrees around the current position of the cleaning robot 10. If no charging seat 800 is found, the cleaning robot 10 is controlled to move to the next position to continue the search and detection.
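The rotate-and-scan routine described above can be sketched as follows. This is a minimal sketch, not the patented implementation: the function names, the 2θ/3 default step, and the `detect_fn` callback standing in for image capture plus recognition are all illustrative assumptions.

```python
def scan_for_dock(fov_deg, detect_fn, fast_step_deg=None):
    """Sweep up to 360 degrees in place, alternating a fast rotation step
    with a slow, low-blur capture phase.

    detect_fn(heading_deg) -> bool is a stand-in for "capture an image at
    this heading and run charging-stand recognition on it".
    Returns the heading (degrees) at which the stand was seen, or None.
    """
    # Assumed step: 2/3 of the camera's field of view, so successive
    # capture windows overlap and no sector is skipped.
    step = fast_step_deg if fast_step_deg is not None else 2.0 * fov_deg / 3.0
    heading = 0.0
    while heading < 360.0:
        heading += step                  # fast phase: rotate without capturing
        if detect_fn(heading % 360.0):   # slow phase: capture and recognize
            return heading % 360.0
    return None  # full sweep done, no stand found: move to the next position
```

With a 90-degree field of view the robot advances 60 degrees per fast step, so each slow capture overlaps the previous one by 30 degrees.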
Please refer to fig. 5, which is a schematic view of the search area of the cleaning robot 10. In some embodiments, in order to improve the search efficiency for the charging stand 800, the processing unit 150 controls the cleaning robot 10 to move from the currently searched area to the next area to continue the search when it is determined that the charging stand 800 does not exist around the cleaning robot 10, wherein the next area at least partially does not overlap with the currently searched area. Specifically, controlling the cleaning robot 10 to move from the currently searched area to the next area to continue the search includes: determining the position of the cleaning robot 10 in the map based on the image collected by the image collecting unit 110, and marking the area within a preset range around the cleaning robot 10 as a searched area in the map. The processing unit 150 also determines an unsearched area adjacent to the current area as the next area, and controls the cleaning robot 10 to move to the next area to continue the search. In this embodiment, since the processing unit 150 marks the searched areas, repeated searching of the same area can be effectively avoided, improving the search efficiency for the charging stand 800.
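The searched-area bookkeeping can be sketched on a grid map. The grid representation, the Chebyshev-radius marking, and the 4-neighbour choice of the next area are assumptions made for illustration; the patent only requires that searched areas be marked and an adjacent unsearched area be chosen next.

```python
def mark_searched(grid, pos, radius):
    """Mark all cells within `radius` (Chebyshev distance) of pos as searched."""
    x, y = pos
    for i in range(x - radius, x + radius + 1):
        for j in range(y - radius, y + radius + 1):
            if 0 <= i < len(grid) and 0 <= j < len(grid[0]):
                grid[i][j] = True

def next_unsearched(grid, pos):
    """Return a 4-connected neighbouring cell that is not yet searched, or None."""
    x, y = pos
    for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
        i, j = x + dx, y + dy
        if 0 <= i < len(grid) and 0 <= j < len(grid[0]) and not grid[i][j]:
            return (i, j)
    return None
```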
The map is a SLAM (Simultaneous Localization and Mapping, real-time self-positioning and self-building map) system map, and the map may be stored in the storage unit 30 or may be stored in the cloud server.
In some embodiments, the processing unit 150 for identifying whether there is a charging dock 800 around according to the image captured by the image capturing unit 110 may include: the processing unit 150 acquires the image of the surrounding environment acquired by the image acquisition unit 110, and recognizes the charging dock 800 by performing recognition processing on the acquired image by using a preset recognition algorithm. The preset recognition algorithm includes, but is not limited to, a recognition method based on image matching, a recognition method based on deep learning, and the like.
The processing unit 150 extracts the contour and shape from the image acquired by the image acquisition unit 110, compares the extracted contour with the preset contour of the charging dock 800, and determines that the charging dock 800 is identified when the matching succeeds. For example, the processing unit 150 may acquire a plurality of sample images of the charging dock 800 in advance through the image acquisition unit 110, train a feature model of the charging dock 800 on the acquired samples, and then detect whether the charging dock 800 exists nearby based on the parameters of the trained feature model. It is understood that, in the present embodiment, the trained feature model of the cradle may be stored in the storage unit 30 in advance.
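A toy version of the contour-matching idea is shown below. The patent does not specify a similarity metric, so intersection-over-union of binary silhouettes is used here purely as an illustrative stand-in; the 0.9 threshold mirrors the 90% similarity example given later in the text.

```python
def silhouette_similarity(a, b):
    """Intersection-over-union of two equal-sized binary masks (lists of lists)."""
    inter = union = 0
    for row_a, row_b in zip(a, b):
        for pa, pb in zip(row_a, row_b):
            inter += 1 if (pa and pb) else 0
            union += 1 if (pa or pb) else 0
    return inter / union if union else 0.0

def is_dock(mask, template, threshold=0.9):
    # 0.9 mirrors the 90% similarity threshold given as an example in the text
    return silhouette_similarity(mask, template) >= threshold
```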
In other embodiments, the processing unit 150, based on the image captured by the image capturing unit 110, identifies whether there is a charging dock 800 around, which may include: the processing unit 150 is further configured to compare the image acquired by the image acquisition unit 110 with a pre-stored pattern content of a charging seat identifier 821 set on the charging seat 800, and when it is determined that the acquired image includes a portion matching with the charging seat identifier 821, it is determined that the charging seat 800 is identified.
The pattern content of the charging dock id 821 can be stored in the storage unit 30 in advance, and when the processing unit 150 determines that the collected image includes a portion matching with the stored charging dock id 821, it can be determined that the charging dock 800 is identified.
Wherein, when the processing unit 150 determines that the similarity between the contour shape in the currently captured image and the contour shape of the charging dock 800 is higher than a threshold, for example 90%, the matching succeeds. Likewise, a captured image "including a portion matching the charging dock identifier 821" may refer to a portion of the captured image whose similarity with the pre-stored pattern content of the charging dock identifier 821 is higher than the threshold.
It should be noted that the present application does not limit the manner of identifying the charging dock 800; the above embodiments are merely examples.
Please refer to fig. 6, which is a schematic diagram illustrating a distance between the cleaning robot 10 and the charging stand 800. As shown in fig. 6, the processing unit 150 is further configured to acquire the distance between the charging stand 800 and the cleaning robot 10. The distance between the cleaning robot 10 and the charging stand 800 may be acquired by infrared or laser, which is not limited herein. When the relative distance between the cleaning robot 10 and the charging stand 800 is less than or equal to the first preset distance L1, it is determined that the cleaning robot 10 is within a second area range, and at this time, the cleaning robot 10 may calculate the position information of the charging stand 800; when the relative distance between the cleaning robot 10 and the charging stand 800 is greater than the first preset distance L1, it is determined that the cleaning robot 10 is in a first area range. In the present embodiment, when the relative distance between the cleaning robot 10 and the charging stand 800 is greater than the first preset distance L1 and is less than or equal to a detection distance threshold L2, the cleaning robot 10 is in a first area range, and at this time, the cleaning robot 10 may detect that the charging stand 800 exists around but does not calculate the position information of the charging stand 800; when the distance between the cleaning robot 10 and the charging stand 800 is greater than the detection distance threshold L2, the cleaning robot 10 will not be able to detect the presence of the charging stand 800.
According to the principle of image formation, when the relative distance between the cleaning robot 10 and the charging stand 800 changes, the area of the image of the charging stand 800 or the charging stand identifier 821 acquired by the image acquisition unit 110 changes accordingly: the area increases as the relative distance between the cleaning robot 10 and the charging stand 800 decreases. Therefore, in some embodiments, the processing unit 150 is further configured to acquire the area of the charging-stand image in the captured image, determine that the cleaning robot 10 is in the first area range when that area is smaller than a preset area, and determine that the cleaning robot 10 is in the second area range when that area is greater than or equal to the preset area. In this way, no additional component is needed to detect the distance between the charging stand 800 and the cleaning robot 10, which saves production cost.
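The area-based classification can be sketched as follows; the quadratic pinhole-camera scaling law is standard optics, while the function names and the threshold policy are illustrative assumptions.

```python
def expected_area(area_at_ref, ref_dist, dist):
    """Pinhole-camera scaling: projected area falls off as (ref_dist / dist)**2."""
    return area_at_ref * (ref_dist / dist) ** 2

def classify_region(dock_pixel_area, preset_area):
    """'first' (far) range: image area below the preset threshold, so only
    move toward the stand; 'second' (near) range: area at or above it, so
    the stand's position can be computed reliably."""
    return 'second' if dock_pixel_area >= preset_area else 'first'
```

For instance, a stand whose image covers 100 pixels at 1 m covers only a quarter of that at 2 m, which is why a small image area signals the far (first) region.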
In this embodiment, the processing unit 150 is further configured to determine that the distance between the cleaning robot 10 and the charging stand 800 is the first preset distance L1 when the area of the target image captured by the image capturing unit 110 is a preset area.
Wherein, when it is determined that the area of the charging-stand image in the captured image is smaller than the preset area, it is determined that the distance between the cleaning robot 10 and the charging stand 800 is greater than the first preset distance L1; when it is determined that the area of the charging-stand image in the collected image is greater than or equal to the preset area, it is determined that the distance between the cleaning robot 10 and the charging stand 800 is less than or equal to the first preset distance L1. When the area of the charging-stand image is greater than or equal to the preset area, the positions of the characteristic pixels in the image are accurate, so the position information of the charging stand 800 calculated from the charging-stand image is accurate; when the area of the target image is smaller than the preset area, the pixel distances between image features are relatively small and the pixel variation caused by noise is relatively large, resulting in a relatively large error in the calculated position information of the charging stand 800.
In some embodiments, when the processing unit 150 determines that the cleaning robot 10 is in the first area range, the processing unit 150 controls the cleaning robot 10 to move in a preset direction, including: the processing unit 150 controls the cleaning robot 10 to rotate so that the charging-stand image is positioned at the center of the image captured by the image capturing unit 110, and controls the cleaning robot 10 to move toward the charging stand 800, and keeps the charging-stand image positioned at the center of the image captured by the image capturing unit 110 during the movement. Therefore, the cleaning robot 10 can be ensured to move to the charging seat 800 in the approaching process, the charging seat 800 is always positioned right in front of the cleaning robot 10, and the charging alignment precision can be improved and the movement time can be reduced.
In some embodiments, the image captured by the image capturing unit 110 is a depth image or an RGB image. The processing unit 150 determines the location information of the cradle 800, including: the processing unit 150 determines the position information of the charging dock according to the depth information or RGB information of the charging dock image acquired by the image acquisition unit 110. In the present embodiment, since the position information of the charging dock 800 is determined by image acquisition, the cost can be reduced and the positioning efficiency can be improved compared to other positioning technologies (e.g., infrared positioning navigation).
Further, the determining, by the processing unit 150, the position information of the charging dock according to the depth information or RGB information of the charging dock image acquired by the image acquisition unit 110 includes: the processing unit 150 determines the relative position information of the charging stand 800 and the cleaning robot 10 according to the depth information or RGB information of the charging stand image acquired by the image acquisition unit 110, and the processing unit 150 determines the position information of the charging stand 800 according to the current position of the cleaning robot 10 and the relative position information of the charging stand 800 and the cleaning robot 10. Wherein the relative position information includes a direction and a distance of the charging stand 800 with respect to the cleaning robot 10. Since the position of the cleaning robot 10 is known, the position information of the charging stand 800 can be quickly determined according to the relative position information in the image, thereby improving the positioning efficiency.
The processing unit 150 determines the current coordinate position of the cleaning robot 10 in real time according to the SLAM system map, determines the relative position information including the relative distance and the relative direction between the charging stand 800 and the cleaning robot 10 according to the depth information or RGB information of the acquired image, and then determines the coordinate position of the charging stand 800 in the SLAM system map according to the current coordinate position and the relative position information of the cleaning robot 10 in the SLAM system map, thereby obtaining the position of the charging stand 800.
Thus, the processing unit 150 can derive the absolute coordinates of the charging stand 800 in the SLAM system map from the absolute coordinates of the cleaning robot 10 in the SLAM system map and the relative position of the charging stand 800 with respect to the cleaning robot 10. The absolute coordinate of the charging stand 800 in the map of the SLAM system is the coordinate position of the central point position of the charging stand 800 in the SLAM system.
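The coordinate composition described above is a standard frame transform; a minimal sketch follows, with the function name and the (distance, bearing) parameterization of the relative observation assumed for illustration.

```python
import math

def dock_world_pose(robot_xy, robot_heading, rel_dist, rel_bearing):
    """Combine the robot's pose in the SLAM map with the (distance, bearing)
    of the charging stand observed in the robot frame to get the stand's
    map-frame coordinates."""
    x, y = robot_xy
    ang = robot_heading + rel_bearing  # bearing is measured from the robot's heading
    return (x + rel_dist * math.cos(ang), y + rel_dist * math.sin(ang))
```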
Further, the position information of the charging dock 800 includes the position coordinate data of the charging dock 800 and the posture of the charging dock 800. Through the position coordinate data of the charging stand 800, the cleaning robot 10 can accurately move to the vicinity of the charging stand 800, and through the guidance of the posture of the charging stand 800, the cleaning robot 10 can quickly move to the front of the charging stand 800, thereby realizing the quick alignment with the charging stand 800.
In some embodiments, the controlling the cleaning robot 10 to move to the charging stand 800 for charging according to the position information includes: the processing unit 150 determines the charging surface of the charging dock 800 according to the position information of the charging dock 800, and determines the position of the center perpendicular line of the charging surface. The position of the center perpendicular line may be the position through which the center perpendicular line extends outward from the charging surface. The processing unit 150 further judges whether the distance between the cleaning robot 10 and the center perpendicular line is less than a second preset distance; when the distance is determined to be less than the second preset distance, the cleaning robot 10 is controlled to continue moving toward the charging stand so as to reach a preset charging position of the charging stand 800 for charging. In this embodiment, the charging surface is a surface of the main body seat 820 adjacent to and connected with the base 810. As described above, in other embodiments, charging electrodes may also be provided on the charging surface, which is not limited herein.
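The perpendicular-distance test above reduces to a 2-D cross product when the charging surface's outward normal is known. A sketch under that assumption (the function name and the unit-normal parameterization are not from the patent):

```python
def distance_to_center_line(robot_xy, dock_xy, dock_normal):
    """Perpendicular distance from the robot to the line through the stand's
    center along the charging surface's outward normal (the 'center
    perpendicular line' of the text). dock_normal must be a unit vector."""
    dx = robot_xy[0] - dock_xy[0]
    dy = robot_xy[1] - dock_xy[1]
    nx, ny = dock_normal
    # magnitude of the offset component perpendicular to the normal
    return abs(dx * ny - dy * nx)
```

Comparing this distance against the second preset distance decides whether the robot may drive straight in or must first be guided closer to the line.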
Wherein the processing unit 150 controls the cleaning robot 10 to move toward the charging stand 800 according to a PID (proportional-integral-derivative) control technique. Specifically, the processing unit 150 first sets the initial angular-velocity and linear-velocity feedback adjustments to zero, assigns a preset feedback value as the current feedback value, and applies the current angular-velocity and linear-velocity feedback adjustments to the angular-velocity and linear-velocity control quantities of the cleaning robot 10, thereby adjusting the current body attitude and computing the center perpendicular line of the charging stand 800 image observed in that attitude. When the body is near the center perpendicular line, the angle between the current body orientation and the center perpendicular line is taken as the error angle, and the perpendicular distance from the current body position to the center perpendicular line is taken as the error distance. The error angle and error distance are weighted in a certain proportion to obtain the angular-acceleration and linear-acceleration feedback adjustments for the next moment; the current angular acceleration, the angular-acceleration feedback adjustment, the error angle, and the error distance are weighted to obtain the angular-velocity feedback adjustment for the next moment; and the current linear acceleration, the linear-acceleration feedback adjustment, the error angle, and the error distance are weighted to obtain the linear-velocity feedback adjustment for the next moment. The feedback adjustments for the next moment take effect in the next control cycle.
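One cycle of the error-weighting step can be sketched as below. This is a simplified proportional-style illustration, not the full cascaded scheme the patent describes (the acceleration terms are omitted), and the gain values are invented for the example.

```python
def velocity_feedback(err_angle, err_dist, k_ang=(1.2, 0.4), k_lin=(0.8, 0.3)):
    """One control cycle: weight the heading error (angle to the center
    perpendicular line) and the offset error (perpendicular distance to it)
    into angular- and linear-velocity feedback adjustments for the next
    period. Gains are illustrative, not from the patent."""
    w_adj = k_ang[0] * err_angle + k_ang[1] * err_dist
    # slow the robot down the further it is off course
    v_adj = -(k_lin[0] * abs(err_angle) + k_lin[1] * abs(err_dist))
    return w_adj, v_adj
```

Each cycle the returned adjustments are added to the current angular and linear velocity commands, steering the body onto the center perpendicular line while reducing speed whenever the errors grow.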
During the movement of the body, whether the charging seat 800 is docked is continuously detected through the electrodes 121/122, and the body is controlled to stop moving once the charging seat 800 is docked.
In some embodiments, when the processing unit 150 determines that the distance from the perpendicular center line of the cleaning robot 10 is greater than or equal to the second preset distance, the cleaning robot 10 is controlled to move to a position where the distance from the perpendicular center line is less than the second preset distance.
It should be noted that when the distance between the cleaning robot 10 and the center perpendicular line is large, the cleaning robot 10 is not yet facing the charging surface; it therefore needs to be guided to a position close to the center perpendicular line so that it faces the charging surface. Once the distance between the cleaning robot 10 and the center perpendicular line is small, the movement direction and movement distance of the cleaning robot 10 are adjusted by the PID control technique, so that the cleaning robot 10 is finally moved to the base 810, and the charging stand 800 is electrically connected to the cleaning robot 10 to charge it.
In some embodiments, the processing unit 150 further determines whether the map contains a history-marked location of the charging stand 800; if so, it controls the cleaning robot 10 to move to that location and, when the charging stand 800 is indeed there, charges the cleaning robot through it. This saves the time spent searching for the charging stand 800 and improves recharging efficiency.
It is understood that if the charging stand 800 is not at the history-marked location, the search can proceed according to the detection method described above until the charging stand 800 is found.
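The fallback flow in the last two paragraphs amounts to a two-branch decision; a minimal sketch follows, with the function names and callback signatures assumed for illustration.

```python
def recharge_target(marked_pos, dock_present_at, search_fn):
    """Prefer the history-marked stand location; fall back to searching.

    marked_pos: last known stand position in the map, or None.
    dock_present_at(pos) -> bool checks the stand is still at that position.
    search_fn() -> pos runs the rotate-and-scan search as a fallback.
    """
    if marked_pos is not None and dock_present_at(marked_pos):
        return marked_pos   # go straight to the remembered stand
    return search_fn()      # never marked, or stand moved: search anew
```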
Referring to fig. 7 again, fig. 7 is a flowchart illustrating steps of an autonomous charging method according to an embodiment of the invention. The autonomous charging method is applied to a cleaning robot 10, and the cleaning robot 10 includes an image capturing unit 110 for capturing a current environment. As shown in fig. 7, the autonomous charging method includes the steps of:
in step S71, an image of the current environment is acquired.
And step S72, controlling the cleaning robot 10 to move in a preset direction when it is determined that the cleaning robot 10 is in the first area range.
Wherein the preset direction is a direction moving toward the charging stand 800.
In some embodiments, the controlling the cleaning robot 10 to move along a preset direction includes: the cleaning robot 10 is controlled to rotate so that the charging-stand image is positioned at the center of the image collected by the image collection unit 110, and the cleaning robot 10 is controlled to move toward the charging stand 800 and maintain the charging-stand image positioned at the center of the image collected by the image collection unit 110 during the movement. Therefore, the cleaning robot 10 can be ensured to move to the charging seat 800 in the approaching process, the charging seat 800 is always positioned right in front of the cleaning robot 10, and the charging alignment precision can be improved and the movement time can be reduced.
In step S73, when it is determined that the cleaning robot is in the second area range, the position information of the charging stand 800 is determined based on the image captured within the second area range.
In some embodiments, the image captured by the image capturing unit 110 is a depth image or an RGB image; the determining the position information of the charging stand 800 includes: the position information of the charging dock 800 is determined according to the depth information or RGB information of the charging dock image acquired by the image acquisition unit 110. In the present embodiment, since the position information of the charging dock 800 is determined by image acquisition, the cost can be reduced and the positioning efficiency can be improved compared to other positioning technologies (e.g., infrared positioning navigation).
In some embodiments, the determining the position information of the charging dock 800 according to the depth information or RGB information of the charging dock image acquired by the image acquisition unit 110 includes: the relative position information of the charging stand 800 and the cleaning robot 10 is determined according to the depth information or RGB information of the charging stand image acquired by the image acquisition unit 110, and the position information of the charging stand 800 is determined according to the current position of the cleaning robot and the relative position information of the charging stand 800 and the cleaning robot 10. Since the position of the cleaning robot 10 is known, the position information of the charging stand 800 can be quickly determined according to the relative position information in the image, thereby improving the positioning efficiency.
Specifically, determining whether the cleaning robot 10 is in the first area range or the second area range includes: acquiring a distance between the charging stand and the cleaning robot 10; when the distance between the charging stand 800 and the cleaning robot 10 is greater than a first preset distance, determining that the cleaning robot 10 is in a first area range; when the distance between the charging stand 800 and the cleaning robot 10 is less than or equal to the first preset distance, determining that the cleaning robot is in a second area range; or
Acquiring the area of a charging seat image in the acquired image, and determining that the cleaning robot 10 is in a first area range when the area of the charging seat image is smaller than a preset area; when the area of the charging-stand image is greater than or equal to a preset area, it is determined that the cleaning robot 10 is in a second area range.
In other embodiments, when it is determined that the area of the charging-stand image captured by the image capturing unit 110 is a preset area, it is determined that the distance between the cleaning robot 10 and the charging stand 800 is the first preset distance. When it is determined that the charging-stand image area acquired by the image acquisition unit 110 is smaller than the preset area, it is determined that the distance between the cleaning robot 10 and the charging stand 800 is greater than the first preset distance; when it is determined that the charging-stand image area acquired by the image acquisition unit 110 is greater than or equal to the preset area, it is determined that the distance between the cleaning robot 10 and the charging stand 800 is less than or equal to the first preset distance.
Step S74, controlling the cleaning robot 10 to move to the charging stand 800 for charging according to the position information; wherein the second area range is closer to the charging dock than the first area range.
In some embodiments, the controlling the cleaning robot 10 to move to the charging stand 800 for charging according to the position information includes: determining a charging surface of the charging seat 800 according to the position information of the charging seat 800, and determining the position of a central perpendicular line of the charging surface; judging whether the distance between the cleaning robot 10 and the center perpendicular line is smaller than a second preset distance; when it is determined that the distance between the cleaning robot 10 and the perpendicular center line is less than the second preset distance, the cleaning robot 10 is controlled to continue moving toward the charging stand 800 to move to a preset charging position of the charging stand 800 for charging.
Wherein the processing unit 150 controls the cleaning robot 10 to move toward the charging stand 800 according to a PID control technique. Specifically, the processing unit 150 first sets the initial angular-velocity and linear-velocity feedback adjustments to zero, assigns a preset feedback value as the current feedback value, and applies the current angular-velocity and linear-velocity feedback adjustments to the angular-velocity and linear-velocity control quantities of the cleaning robot 10, thereby adjusting the current body attitude and computing the center perpendicular line of the charging stand 800 image observed in that attitude. When the body is near the center perpendicular line, the angle between the current body orientation and the center perpendicular line is taken as the error angle, and the perpendicular distance from the current body position to the center perpendicular line is taken as the error distance. The error angle and error distance are weighted in a certain proportion to obtain the angular-acceleration and linear-acceleration feedback adjustments for the next moment; the current angular acceleration, the angular-acceleration feedback adjustment, the error angle, and the error distance are weighted to obtain the angular-velocity feedback adjustment for the next moment; and the current linear acceleration, the linear-acceleration feedback adjustment, the error angle, and the error distance are weighted to obtain the linear-velocity feedback adjustment for the next moment. The feedback adjustments for the next moment take effect in the next control cycle.
During the movement of the body, whether the charging seat 800 is docked is continuously detected through the electrodes 121/122, and the body is controlled to stop moving once the charging seat 800 is docked.
In some embodiments, when it is determined that the distance from the perpendicular center line of the cleaning robot 10 is greater than or equal to the second preset distance, the cleaning robot 10 is controlled to move to a position where the distance from the perpendicular center line is less than the second preset distance.
It should be noted that when the distance between the cleaning robot 10 and the center perpendicular line is large, the cleaning robot 10 is not yet facing the charging surface; it therefore needs to be guided to the vicinity of the center perpendicular line so that it faces the charging surface. Then the movement direction and movement distance of the cleaning robot 10 are adjusted by the PID control technique, so that the cleaning robot 10 is finally moved to the base 810, and the charging stand 800 is electrically connected to the cleaning robot 10 to charge it.
In the autonomous charging method provided by the embodiment of the present invention, when it is determined that the cleaning robot is located in the first area range, the cleaning robot is controlled to move along the preset direction; when it is determined that the cleaning robot 10 is located in the second area range, the position information of the charging stand 800 is determined based on the image collected in the second area range, and the cleaning robot 10 is controlled to move to the charging stand 800 for charging according to the position information, the second area range being closer to the charging stand than the first area range. Compared with the real-time localization and navigation used in existing recharging techniques, the scheme of the present application determines the position information of the charging stand only when the cleaning robot 10 is confirmed to be in the second area range, which is closer to the charging stand. This reduces the amount of computation and avoids interference from sensor noise, thereby reducing the energy the cleaning robot 10 consumes on real-time self-localization and navigation during recharging, extending the operating time of the cleaning robot 10, and also prolonging the service life of the battery.
Referring to fig. 8 again, fig. 8 is a flowchart illustrating steps of an autonomous charging method according to another embodiment of the present invention. In comparison with the autonomous charging method in fig. 7, the autonomous charging method in this embodiment further includes the following steps:
In step S81, it is determined whether a charging dock 800 exists around the cleaning robot 10 according to the image captured by the image acquisition unit 110. If yes, step S72 is executed; if not, step S82 is executed.
In some embodiments, the determining whether there is a charging dock 800 around according to the image captured by the image capturing unit 110 includes: controlling the cleaning robot 10 to rotate in place, and acquiring an image of a current environment through the image acquisition unit 110 during the rotation; when the charging stand 800 is recognized from the captured image, it is determined that the charging stand 800 exists around the cleaning robot 10.
The controlling the cleaning robot 10 to rotate in place and capture an image of the current environment through the image capturing unit 110 during the rotation includes: after controlling the cleaning robot 10 to rotate at a first angular speed by a preset angle, controlling the cleaning robot 10 to continue rotating at a second angular speed for a preset time, and acquiring an image of the current environment through the image acquisition unit 110 within the preset time of rotating at the second angular speed; wherein the second angular velocity is less than the first angular velocity.
For example, when the image capturing unit 110 is a camera with an angle of view θ, and 2θ/3 is set as the preset angle, the cleaning robot 10 may be controlled to rotate at the first angular speed by 2θ/3 (the preset angle), then rotate at the second angular speed for the preset time, and capture an image of the current environment during that time. Since the image capturing unit 110 (e.g., the camera) requires a certain amount of time to expose an image, rotating too fast blurs the image, while rotating slowly throughout increases the time required to search for the charging stand 800. Therefore, images are captured at the lower second angular velocity to reduce blur and improve detection accuracy, while rotation between captures is performed at the higher first angular velocity to raise the average rotation speed and improve detection efficiency.
In this embodiment, when the preset time of rotation at the second angular velocity ends and it is determined that no charging stand 800 appears in the image captured by the image capturing unit 110, the cleaning robot 10 continues rotating in the manner described above: after rotating at the first angular velocity by 2θ/3 (the preset angle) at the current position, it again rotates at the second angular velocity for the preset time. This repeats until the images captured by the image capturing unit 110 cover 360 degrees around the current position of the cleaning robot 10. If no charging stand 800 is found, the cleaning robot 10 is controlled to move to the next position to continue searching.
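The rotate-and-capture loop above can be sketched as follows. The callbacks (`rotate`, `capture`, `detect_dock`) and the angular velocities are illustrative assumptions; the 2θ/3 step size follows the example in the text:

```python
import math

def scan_for_dock(rotate, capture, detect_dock, fov_rad,
                  fast_w=1.5, slow_w=0.3, capture_time_s=0.5):
    """Rotate in place and look for the charging stand.

    rotate(angle, w): turn by `angle` radians at angular velocity `w`
    capture():        grab an image while turning slowly
    detect_dock(img): True if the charging stand appears in the image

    Fast rotation repositions quickly; slow rotation during exposure
    limits motion blur. Returns True once the stand is detected, or
    False after a full revolution without a detection.
    """
    step = 2.0 * fov_rad / 3.0          # preset angle: 2θ/3
    turned = 0.0
    while turned < 2.0 * math.pi:
        rotate(step, fast_w)                     # quick reposition
        rotate(slow_w * capture_time_s, slow_w)  # slow turn while exposing
        turned += step + slow_w * capture_time_s
        if detect_dock(capture()):
            return True
    return False
```

If the scan returns False, the robot would move to the next position and repeat the loop there.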
Step S82, controlling the cleaning robot 10 to move from the currently searched area to the next area to continue searching; wherein the next region is at least partially non-overlapping with the searched region.
The controlling the cleaning robot 10 to move from the currently searched area to the next area to continue searching includes: determining the position of the cleaning robot in a map according to the image acquired by the image acquisition unit 110, determining an area within a preset range around the cleaning robot 10 as the currently searched area, and marking the searched area in the map; and determining an unsearched area adjacent to the currently searched area as the next area, and controlling the cleaning robot 10 to move to the next area to continue searching. In this embodiment, since searched areas are marked, repeated searching of the same area can be effectively avoided, improving the efficiency of searching for the charging stand 800.
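The marking scheme above can be sketched on a simple grid map; the dictionary-based grid and the breadth-first frontier search are illustrative assumptions, not the patent's data structures:

```python
from collections import deque

def mark_searched(grid, cx, cy, radius):
    """Mark all cells within `radius` (in cells) of (cx, cy) as searched.

    `grid` is a dict mapping (x, y) cells to True once searched; this
    representation is an illustrative assumption.
    """
    for x in range(cx - radius, cx + radius + 1):
        for y in range(cy - radius, cy + radius + 1):
            grid[(x, y)] = True


def next_unsearched(grid, cx, cy):
    """Return an unsearched cell adjacent to the searched region,
    found by breadth-first search outward from the current cell."""
    seen, queue = {(cx, cy)}, deque([(cx, cy)])
    while queue:
        x, y = queue.popleft()
        if (x, y) not in grid:          # never marked => unsearched
            return (x, y)
        for nx, ny in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if (nx, ny) not in seen:
                seen.add((nx, ny))
                queue.append((nx, ny))
    return None
```

Because every searched cell is recorded, the frontier search never revisits an already-covered area, matching the efficiency claim in the text.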
Referring to fig. 9 again, fig. 9 is a flowchart illustrating steps of an autonomous charging method according to another embodiment of the present invention. Compared with the autonomous charging method in fig. 8, the autonomous charging method in this embodiment further includes the following steps:
In step S91, it is determined whether a history-marked position of the charging stand 800 exists in the map. If yes, step S92 is executed; if not, step S71 is executed.
Step S92: when it is determined that a history-marked position of the charging stand 800 exists in the map, the cleaning robot 10 is controlled to move to that position, and if the charging stand 800 is still present there, the cleaning robot 10 is charged by the charging stand 800. This saves the time otherwise spent searching for the charging stand 800 and improves recharging efficiency.
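The history-first strategy of steps S91–S92 can be sketched as follows; the callback names and the map-history dictionary are illustrative assumptions:

```python
def go_to_dock(map_history, navigate_to, dock_present_at, start_search):
    """Try the history-marked charging stand position first; fall back
    to a fresh search if no mark exists or the stand has moved.

    map_history:     dict that may hold a "charging_stand" position
    navigate_to(p):  drive the robot to position p
    dock_present_at(p): True if the stand is actually detected at p
    start_search():  the full search routine (steps S81/S82)
    """
    pos = map_history.get("charging_stand")
    if pos is not None:
        navigate_to(pos)
        if dock_present_at(pos):
            return pos              # stand found at the remembered spot
    return start_search()           # no mark, or mark is stale
```

Only when the remembered position fails does the robot pay the cost of the rotating search, which is exactly the time saving the text describes.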
Referring to fig. 10 again, fig. 10 is a block diagram of a cleaning robot 10 according to another embodiment of the present invention. As shown in fig. 10, the cleaning robot 10 includes an image acquisition module 106, an area determination module 101, a control module 102, and a position determination module 103. The image acquiring module 106 is configured to acquire an image acquired by the image acquiring unit 110. The zone determination module 101 is configured to determine whether the cleaning robot 10 is in a first zone area or a second zone area. When the area determination module 101 determines that the cleaning robot 10 is in the first area range, the control module 102 may be configured to implement the method shown in step S72 in the above method embodiment; when the area determination module 101 determines that the cleaning robot 10 is in the second area range, the position determination module 103 may be configured to implement the method shown in step S73 in the above-described method embodiment. The control module 102 may also be configured to implement the method shown in step S74 in the above method embodiment. Wherein the second area range is closer to the charging dock than the first area range.
In some embodiments, the cleaning robot 10 further includes an identification module 104 and a search module 105. The identification module 104 may be configured to implement the method shown in step S81 in the above method embodiment. The search module may be configured to implement the method shown in step S82 in the above method embodiment.
In some embodiments, the position determination module 103 may also be configured to implement the method shown in step S91 in the above method embodiment. The control module 102 may also be configured to implement the method shown in step S92 in the above method embodiment.
It should be noted that, for specific details of the functions performed by the units of the cleaning robot 10 illustrated in fig. 10, reference may be made to the embodiments of the autonomous charging method described above, and details are not repeated here.
It should be noted that all or part of the steps in the methods of the above embodiments may be implemented by a program instructing related hardware, and the program may be stored in a computer-readable storage medium. The storage medium includes a Read-Only Memory (ROM), a Random Access Memory (RAM), a Programmable Read-Only Memory (PROM), an Erasable Programmable Read-Only Memory (EPROM), a One-Time Programmable Read-Only Memory (OTPROM), an Electrically Erasable Programmable Read-Only Memory (EEPROM), a Compact Disc Read-Only Memory (CD-ROM) or other optical disc storage, magnetic disk storage, magnetic tape storage, or any other computer-readable medium that can be used to carry or store data.
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and reference may be made to related descriptions of other embodiments for parts that are not described in detail in a certain embodiment.
In the several embodiments provided in the present application, it should be understood that the disclosed system, apparatus and method may be implemented in other manners. For example, the above-described apparatus embodiments are merely illustrative, and for example, the division of the modules is merely a logical division, and in actual implementation, there may be other divisions, for example, multiple modules or components may be combined or integrated into another system, or some features may be omitted, or not implemented. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or modules, and may be in an electrical, mechanical or other form.
The modules described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of the present embodiment.
In addition, functional modules in the embodiments of the present application may be integrated into one processing unit, or each module may exist alone physically, or two or more modules are integrated into one module. The integrated module can be realized in a hardware mode, and can also be realized in a software functional module mode.
The integrated module, if implemented in the form of a software functional module and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on such understanding, the part of the technical solution of the present application that in essence contributes over the prior art, or all or part of the technical solution, may be embodied in the form of a software product. The software product is stored in a storage medium and includes instructions for causing a device (which may be a personal computer, a server, a network device, a robot, a single-chip microcomputer, a chip, etc.) to execute all or part of the steps of the methods according to the embodiments of the present application. The aforementioned storage medium includes: a USB flash drive, a removable hard disk, a read-only memory, a random access memory, a magnetic disk, an optical disc, or any other medium that can store program code.
The foregoing detailed description of the embodiments of the present application has been presented to illustrate the principles and implementations of the present application, and the above description of the embodiments is only provided to help understand the method and the core concept of the present application; meanwhile, for a person skilled in the art, according to the idea of the present application, there may be variations in the specific embodiments and the application scope, and in summary, the content of the present specification should not be construed as a limitation to the present application.

Claims (14)

1. A cleaning robot, characterized in that the cleaning robot comprises:
the image acquisition unit is used for acquiring the image of the current environment and the image of the charging seat;
a processing unit for acquiring a distance between a charging stand and the cleaning robot; when the distance between the charging seat and the cleaning robot is larger than a first preset distance, determining that the cleaning robot is in a first area range; when the distance between the charging seat and the cleaning robot is smaller than or equal to the first preset distance, determining that the cleaning robot is in a second area range; or
The processing unit is further used for acquiring the area of a charging seat image in the acquired image, and when the area of the charging seat image is smaller than a preset area, the cleaning robot is determined to be in the first area range; when the area of the charging seat image is larger than or equal to a preset area, determining that the cleaning robot is in a second area range;
the processing unit is further used for controlling the cleaning robot to move along a preset direction when the cleaning robot is determined to be in the first area range; the processing unit controls the cleaning robot to move along a preset direction, including: the processing unit controls the cleaning robot to rotate so that the charging seat image is positioned in the center of the image acquired by the image acquisition unit, controls the cleaning robot to move towards the charging seat, and keeps the charging seat image positioned in the center of the image acquired by the image acquisition unit in the moving process;
when the cleaning robot is determined to be in a second area range, determining position information of a charging seat based on an image collected in the second area range, and controlling the cleaning robot to move to the charging seat for charging according to the position information; the determining the position information of the charging seat based on the image collected in the second area range specifically includes: and determining the position information of the charging seat according to the depth information or RGB information of the charging seat image acquired by the image acquisition unit, wherein the position information of the charging seat comprises the coordinate data of the position of the charging seat and the posture of the charging seat.
2. The cleaning robot of claim 1, wherein the processing unit is further configured to determine whether a charging dock is present around based on the image captured by the image capturing unit before determining that the cleaning robot is in the first area range or the second area range;
wherein, the processing unit confirms whether there is a charging seat around according to the image that the image acquisition unit gathered, includes: the processing unit controls the cleaning robot to rotate in situ and acquires an image of the current environment through the image acquisition unit in the rotating process; when the processing unit recognizes the charging stand from the captured image, it is determined that the charging stand exists around the cleaning robot.
3. The cleaning robot as claimed in claim 2, wherein the processing unit controls the cleaning robot to rotate in situ and captures an image of a current environment through the image capturing unit during the rotation, including: after the processing unit controls the cleaning robot to rotate at a first angular speed for a preset angle, the processing unit controls the cleaning robot to continue rotating at a second angular speed for a preset time, and the image acquisition unit acquires the image of the current environment within the preset time of rotating at the second angular speed; wherein the second angular velocity is less than the first angular velocity.
4. The cleaning robot according to claim 2, wherein when the processing unit determines that the charging stand does not exist around the cleaning robot, the cleaning robot is controlled to move from a currently searched area to a next area to continue searching; wherein the next region is at least partially non-overlapping with the currently searched region.
5. The cleaning robot according to claim 4, wherein the controlling the cleaning robot to move from a currently searched area to a next area to continue searching comprises: determining the position of the cleaning robot in a map according to the image acquired by the image acquisition unit, determining an area within a preset range around the cleaning robot's current position as the currently searched area, and marking the searched area in the map; the processing unit further determines an unsearched area adjacent to the currently searched area as the next area, and controls the cleaning robot to move to the next area to continue searching.
6. The cleaning robot of claim 1, wherein the processing unit is further configured to determine that the distance between the cleaning robot and the charging dock is the first predetermined distance when an area of a charging dock image in the captured image is a predetermined area;
the processing unit determines that the distance between the cleaning robot and the charging seat is greater than the first preset distance when the area of the charging seat image in the acquired image is determined to be smaller than the preset area;
and the processing unit determines that the distance between the cleaning robot and the charging seat is less than or equal to the first preset distance when the area of the charging seat image in the acquired image is determined to be greater than or equal to the preset area.
7. An autonomous charging method applied to a cleaning robot; the method is characterized by comprising the following steps:
acquiring an image of the current environment and an image of a charging seat through an image acquisition unit;
determining whether the cleaning robot is in a first area range or a second area range includes:
acquiring the distance between a charging seat and the cleaning robot; when the distance between the charging seat and the cleaning robot is larger than a first preset distance, determining that the cleaning robot is in the first area range; when the distance between the charging seat and the cleaning robot is smaller than or equal to the first preset distance, determining that the cleaning robot is in the second area range; or
Acquiring the area of a charging seat image in the acquired image, and determining that the cleaning robot is in the first area range when the area of the charging seat image is smaller than a preset area; when the area of the charging seat image is larger than or equal to a preset area, determining that the cleaning robot is in the second area range;
when the cleaning robot is determined to be in the first area range, controlling the cleaning robot to move along a preset direction; the controlling the cleaning robot to move in a preset direction includes: controlling the cleaning robot to rotate so that the charging seat image is positioned in the center of the image acquired by the image acquisition unit, controlling the cleaning robot to move towards the charging seat, and keeping the charging seat image positioned in the center of the image acquired by the image acquisition unit in the moving process;
determining position information of the charging stand based on an image acquired within the second area range when it is determined that the cleaning robot is within the second area range, the determining the position information of the charging stand based on the image acquired within the second area range specifically includes: determining the position information of the charging seat according to the depth information or RGB information of the charging seat image acquired by the image acquisition unit, wherein the position information of the charging seat comprises coordinate data of the position of the charging seat and the posture of the charging seat; and
controlling the cleaning robot to move to the charging seat to charge according to the position information; wherein the second area range is closer to the charging dock than the first area range.
8. The autonomous charging method of claim 7, further comprising, prior to determining that the cleaning robot is in the first zone range or the second zone range:
determining whether a charging seat exists around according to the image acquired by the image acquisition unit;
wherein the determining, according to the image captured by the image acquisition unit, whether a charging seat exists around includes:
controlling the cleaning robot to rotate in situ, and acquiring an image of the current environment through the image acquisition unit in the rotating process;
determining that the charging stand exists around the cleaning robot when the charging stand is recognized from the captured image.
9. The autonomous charging method of claim 8 wherein said controlling said cleaning robot to rotate in situ and to capture an image of the current environment through said image capturing unit during rotation comprises:
after controlling the cleaning robot to rotate at a first angular speed for a preset angle, controlling the cleaning robot to continue rotating at a second angular speed for a preset time, and acquiring an image of the current environment through the image acquisition unit within the preset time of rotating at the second angular speed; wherein the second angular velocity is less than the first angular velocity.
10. The autonomous charging method of claim 8, further comprising:
when the charging seat is determined not to exist around the cleaning robot, controlling the cleaning robot to move from the currently searched area to the next area to continue searching; wherein the next region is at least partially non-overlapping with the searched region.
11. The autonomous charging method of claim 10 wherein said controlling the cleaning robot to move from a currently searched area to a next area to continue searching comprises:
determining the position of the cleaning robot in a map according to the image acquired by the image acquisition unit, determining an area in a preset range around the cleaning robot at present as a currently searched area, and marking the searched area in the map;
and determining an unsearched area adjacent to the currently searched area as the next area, and controlling the cleaning robot to move to the next area to continue searching.
12. The autonomous charging method of claim 7, wherein when it is determined that an area of a charging-stand image in the captured image is a preset area, it is determined that the distance between the cleaning robot and the charging stand is the first preset distance;
when the area of a charging seat image in the acquired image is determined to be smaller than the preset area, determining that the distance between the cleaning robot and the charging seat is larger than the first preset distance;
and when the area of the charging seat image in the acquired image is determined to be larger than or equal to the preset area, determining that the distance between the cleaning robot and the charging seat is smaller than or equal to the first preset distance.
13. An autonomous charging system comprising a charging stand, characterized in that the autonomous charging system further comprises a cleaning robot as claimed in any one of claims 1 to 6.
14. A readable storage medium, characterized in that the readable storage medium has stored thereon a corresponding program of an autonomous charging method, which when executed, implements the autonomous charging method according to any one of claims 7 to 12.
CN201910811775.XA 2019-08-30 2019-08-30 Cleaning robot, autonomous charging method, system, and readable storage medium Active CN110477825B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910811775.XA CN110477825B (en) 2019-08-30 2019-08-30 Cleaning robot, autonomous charging method, system, and readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910811775.XA CN110477825B (en) 2019-08-30 2019-08-30 Cleaning robot, autonomous charging method, system, and readable storage medium

Publications (2)

Publication Number Publication Date
CN110477825A CN110477825A (en) 2019-11-22
CN110477825B true CN110477825B (en) 2021-10-26

Family

ID=68555310

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910811775.XA Active CN110477825B (en) 2019-08-30 2019-08-30 Cleaning robot, autonomous charging method, system, and readable storage medium

Country Status (1)

Country Link
CN (1) CN110477825B (en)

Families Citing this family (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110989688A (en) * 2019-12-09 2020-04-10 台州学院 Automatic following system and method based on AprilTag code recognition
CN113116238B (en) * 2019-12-31 2022-11-15 佛山市云米电器科技有限公司 Cleaning robot maintenance method, cleaning robot, cleaning system, and storage medium
CN113077413A (en) * 2020-01-06 2021-07-06 苏州宝时得电动工具有限公司 Self-moving equipment and control method thereof
CN113641172A (en) * 2020-04-27 2021-11-12 科沃斯机器人股份有限公司 Autonomous mobile device, refilling method, and storage medium
CN111753695B (en) * 2020-06-17 2023-10-13 上海宜硕网络科技有限公司 Method and device for simulating robot charging return route and electronic equipment
CN112904846A (en) * 2021-01-15 2021-06-04 厦门攸信信息技术有限公司 Mobile robot homing charging method and system
CN112971616B (en) * 2021-02-07 2022-12-30 美智纵横科技有限责任公司 Charging seat avoiding method and device, sweeping robot and storage medium
CN113017492A (en) * 2021-02-23 2021-06-25 江苏柯林博特智能科技有限公司 Object recognition intelligent control system based on cleaning robot
CN113139949B (en) * 2021-04-30 2023-04-07 逻腾(杭州)科技有限公司 Robot image ambiguity detection method
CN115399677B (en) * 2021-05-28 2024-01-09 速感科技(北京)有限公司 Recharging method of cleaning robot and cleaning robot system
CN113768417A (en) * 2021-08-20 2021-12-10 深圳市踩点智慧科技有限公司 Sweeper system with control circuit
CN114451835A (en) * 2022-02-14 2022-05-10 深圳市优必选科技股份有限公司 Robot motion control method and device, readable storage medium and robot

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105209997A (en) * 2013-05-10 2015-12-30 戴森技术有限公司 Apparatus for guiding an autonomous vehicle towards a docking station
CN105242670A (en) * 2015-10-26 2016-01-13 深圳拓邦股份有限公司 Robot having function of automatic return charging, system and corresponding method
CN106455888A (en) * 2014-05-29 2017-02-22 夏普株式会社 Self-propelled electronic device

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20100010962A (en) * 2008-07-24 2010-02-03 주식회사 한울로보틱스 Apparatus for and method of guiding robot cleaner to charging station by ir sensors
CN106980320B (en) * 2017-05-18 2020-06-19 上海思岚科技有限公司 Robot charging method and device
CN109217925B (en) * 2018-01-17 2021-04-09 北京石头世纪科技股份有限公司 Near field distance adjusting method and system of charging pile, storage medium and electronic equipment
CN108303984A (en) * 2018-02-27 2018-07-20 弗徕威智能机器人科技(上海)有限公司 A kind of autonomous recharging method of mobile robot
CN109730590B (en) * 2019-01-30 2023-08-25 深圳银星智能集团股份有限公司 Cleaning robot and method for automatically returning and charging same

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105209997A (en) * 2013-05-10 2015-12-30 戴森技术有限公司 Apparatus for guiding an autonomous vehicle towards a docking station
CN106455888A (en) * 2014-05-29 2017-02-22 夏普株式会社 Self-propelled electronic device
CN105242670A (en) * 2015-10-26 2016-01-13 深圳拓邦股份有限公司 Robot having function of automatic return charging, system and corresponding method

Also Published As

Publication number Publication date
CN110477825A (en) 2019-11-22

Similar Documents

Publication Publication Date Title
CN110477825B (en) Cleaning robot, autonomous charging method, system, and readable storage medium
CN111035327B (en) Cleaning robot, carpet detection method, and computer-readable storage medium
CN109730590B (en) Cleaning robot and method for automatically returning and charging same
CN110522359B (en) Cleaning robot and control method of cleaning robot
EP3185096B1 (en) A charging pile, method and device for recognizing the charging pile, and an autonomous cleaning device
CN111166248A (en) Cleaning robot, autonomous charging method and autonomous charging system
US11960304B2 (en) Localization and mapping using physical features
JP7484015B2 (en) Obstacle detection method and device, self-propelled robot, and storage medium
CN110091326B (en) Mobile robot and control method for mobile robot
AU2018100726A4 (en) Automatic cleaning device and cleaning method
CN112183133B (en) Aruco code guidance-based mobile robot autonomous charging method
TWI706763B (en) Moving robot and controlling method
EP3505037A1 (en) Cleaning robot and control method therefor
EP2592989B1 (en) Robot cleaner and controlling method of the same
CN108481327B (en) Positioning device, positioning method and robot for enhancing vision
CN211933898U (en) Cleaning robot
KR20170057840A (en) Pose estimation apparatus and vacuum cleaner system
US20190184569A1 (en) Robot based on artificial intelligence, and control method thereof
KR101951414B1 (en) Robot cleaner and controlling method of the same
CN113675923B (en) Charging method, charging device and robot
WO2018233493A1 (en) Autonomous robot and control method, apparatus and system therefor, and computer readable storage medium
KR20130042389A (en) Mobile robot and controlling method of the same
WO2022121392A1 (en) Parking control method, control system, mobile robot, and storage medium
CN109512340B (en) Control method of cleaning robot and related equipment
KR101965739B1 (en) Mobile robot and method for controlling the same

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20220808

Address after: No.555, Guangfulin East Road, Songjiang District, Shanghai, 201613

Patentee after: SHANGHAI FLYCO ELECTRICAL APPLIANCE Co.,Ltd.

Address before: 518109 area 401f, building D, gangzhilong Science Park, 6 Qinglong Road, Qinghua community, Longhua street, Longhua District, Shenzhen City, Guangdong Province

Patentee before: SHENZHEN FEIKE ROBOT Co.,Ltd.