CN112971616B - Charging seat avoiding method and device, sweeping robot and storage medium - Google Patents


Info

Publication number
CN112971616B
CN112971616B
Authority
CN
China
Prior art keywords
charging seat
sweeping robot
target
image
sweeping
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110168904.5A
Other languages
Chinese (zh)
Other versions
CN112971616A (en)
Inventor
何博
范泽宣
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Midea Robozone Technology Co Ltd
Original Assignee
Midea Robozone Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Midea Robozone Technology Co Ltd
Priority to CN202110168904.5A
Publication of CN112971616A
Application granted
Publication of CN112971616B

Classifications

    • A HUMAN NECESSITIES
    • A47 FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
    • A47L DOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
    • A47L11/00 Machines for cleaning floors, carpets, furniture, walls, or wall coverings
    • A47L11/24 Floor-sweeping machines, motor-driven
    • A HUMAN NECESSITIES
    • A47 FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
    • A47L DOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
    • A47L11/00 Machines for cleaning floors, carpets, furniture, walls, or wall coverings
    • A47L11/40 Parts or details of machines not provided for in groups A47L11/02 - A47L11/38, or not restricted to one of these groups, e.g. handles, arrangements of switches, skirts, buffers, levers
    • A47L11/4061 Steering means; Means for avoiding obstacles; Details related to the place where the driver is accommodated
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02 Control of position or course in two dimensions
    • G05D1/021 Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0231 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D1/0246 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means
    • A HUMAN NECESSITIES
    • A47 FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
    • A47L DOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
    • A47L2201/00 Robotic cleaning machines, i.e. with automatic control of the travelling movement or the cleaning operation
    • A47L2201/04 Automatic control of the travelling movement; Automatic obstacle detection

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Electromagnetism (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The application discloses a charging seat avoiding method, a charging seat avoiding device, a sweeping robot and a storage medium, which are applied to the sweeping robot, wherein the charging seat avoiding method comprises the following steps: controlling a camera module to collect a target image of an area to be cleaned; detecting a target image by using an image detection model, and if the target image is detected to contain a charging seat of the sweeping robot, acquiring a detection result of the charging seat; the detection result comprises orientation angle information and image information of the charging seat; determining a target distance between the sweeping robot and the charging seat based on the orientation angle information and the image information; and controlling the sweeping robot to move and avoid the charging seat according to the target distance. Therefore, the image detection model is used for detecting the acquired target image, and if the sweeping robot is detected to be positioned near the charging seat, the direction information and the distance information of the sweeping robot relative to the charging seat are determined according to the orientation angle information and the image information of the charging seat in the detection result, so that the aim of accurately avoiding the charging seat by the sweeping robot is fulfilled.

Description

Charging seat avoiding method and device, sweeping robot and storage medium
Technical Field
The present disclosure relates to computer technologies, and in particular, to a charging stand evasion method, a charging stand evasion device, a sweeping robot, and a storage medium.
Background
With the growing popularity of the smart-home concept, sweeping robots have become increasingly favored by consumers. Automatic recharging is a hallmark function of the sweeping robot: when its battery is low, the robot automatically searches for the charging seat to complete charging. However, when the sweeping robot sweeps along edges, it may collide with the charging seat, with the result that the sweeping robot later fails to find the charging seat and cannot complete the charging operation.
At present, one method for avoiding the charging seat works as follows: an infrared receiving head on the robot body detects the omnidirectional signal emitted by the charging seat, and when the robot body approaches the charging seat, the robot moves around it along an arc-shaped track, thereby avoiding the charging seat.
However, the omnidirectional signal has blind areas. When the cleaning robot is inside a blind area, its approach to the charging seat is not detected, so the robot may collide with the charging seat; the charging seat then cannot be found later when the battery is low, that is, charging fails.
Disclosure of Invention
In order to solve the technical problems, the present application provides a charging-stand evasion method, a charging-stand evasion device, a sweeping robot, and a storage medium.
The technical scheme of the application is realized as follows:
in a first aspect, a charging seat avoiding method is provided, and is applied to a sweeping robot, wherein the sweeping robot comprises a camera module; the method comprises the following steps:
controlling the camera module to collect a target image of an area to be cleaned;
detecting the target image by using an image detection model, and acquiring a detection result of a charging seat of the sweeping robot if the target image is detected to contain the charging seat; wherein the detection result comprises orientation angle information of the charging seat and image information of the charging seat;
determining a target distance between the sweeping robot and the charging stand based on the orientation angle information and the image information;
and controlling the sweeping robot to move and avoid the charging seat according to the target distance.
In the above solution, the image information is pixel information occupied by the charging seat in the target image; the determining a target distance between the sweeping robot and the charging dock based on the orientation angle information and the image information includes: determining a target corresponding relation from at least one preset corresponding relation based on the orientation angle information; the corresponding relation is the corresponding relation between the distance and the pixel information occupied by the charging seat; and determining the target distance based on the target corresponding relation and the pixel information occupied by the charging seat.
In the above scheme, the controlling the sweeping robot to move and avoid the charging seat according to the target distance includes: if the target distance is smaller than or equal to a preset distance threshold, determining an avoidance route of the sweeping robot; and controlling the sweeping robot to move along the avoidance route to carry out sweeping work.
In the above scheme, the method further comprises: and if the target distance is greater than the preset distance threshold value, controlling the sweeping robot to continue sweeping according to the current sweeping track.
In the above scheme, the method further comprises: performing region division on the target image according to a preset partition rule to obtain a region division result; determining a target area of the charging seat in the image to be detected based on the area division result and the image information; and determining an avoidance route of the sweeping robot based on the target area.
In the above scheme, the region division result includes a left region, a middle region, and a right region; the determining an avoidance route of the sweeping robot based on the target area comprises: when the target area is the left area, the avoidance route of the sweeping robot is to sweep around a semicircular arc track towards the right front; when the target area is the right area, the avoidance route is to sweep around a semicircular arc track towards the left front; and when the target area is the middle area, the avoidance route is to turn around and continue sweeping.
In the above solution, the detecting the target image by using the image detection model, and if it is detected that the charging seat of the sweeping robot is included in the target image, includes: detecting the target image by using the image detection model to obtain at least one detection result of at least one detection object; sequentially judging whether the confidence value in each detection result is greater than or equal to a preset confidence threshold value; and taking the detection object with the confidence value larger than or equal to the preset confidence threshold value as the charging seat, and taking the corresponding detection result as the detection result of the charging seat.
In a second aspect, a charging seat avoidance device is provided, and is applied to a sweeping robot, wherein the sweeping robot comprises a camera module; the device comprises:
the acquisition unit is used for controlling the camera module to acquire a target image of an area to be cleaned;
the acquisition unit is used for detecting the target image by using an image detection model, and acquiring a detection result of a charging seat of the sweeping robot if the target image is detected to contain the charging seat; the detection result comprises orientation angle information of the charging seat and image information of the charging seat;
a determination unit configured to determine a target distance between the sweeping robot and the charging stand based on the orientation angle information and the image information;
and the control unit is used for controlling the sweeping robot to move and avoid the charging seat according to the target distance.
In a third aspect, a sweeping robot is provided, comprising: a processor and a memory configured to store a computer program operable on the processor, wherein the processor is configured to perform the steps of the aforementioned method when executing the computer program.
In a fourth aspect, a computer-readable storage medium is provided, having a computer program stored thereon, wherein the computer program, when executed by a processor, implements the steps of the aforementioned method.
The application provides a charging seat avoiding method which is applied to a sweeping robot, wherein the sweeping robot comprises a camera module; the method comprises the following steps: controlling a camera module to collect a target image of an area to be cleaned; detecting a target image by using an image detection model, and acquiring a detection result of a charging seat if the target image is detected to contain the charging seat of the sweeping robot; the detection result comprises orientation angle information of the charging seat and image information of the charging seat; determining a target distance between the sweeping robot and the charging stand based on the orientation angle information and the image information; and controlling the sweeping robot to move and avoid the charging seat according to the target distance. Therefore, the collected target image is detected by using the image detection model, and if the sweeping robot is detected to be positioned near the charging seat, the direction information and the distance information of the sweeping robot relative to the charging seat are determined according to the orientation angle information and the image information of the charging seat in the detection result, so that the aim of accurately avoiding the charging seat by the sweeping robot is fulfilled.
Drawings
FIG. 1 is a schematic diagram of a first flow of charging seat avoidance in an embodiment of the present application;
FIG. 2 is a schematic diagram of a second flow of charging seat avoidance in an embodiment of the present application;
FIG. 3 is a schematic diagram of a third flow of charging seat avoidance in an embodiment of the present application;
FIG. 4 is a schematic structural diagram of a charging dock avoidance apparatus in an embodiment of the present application;
fig. 5 is a schematic structural diagram of a sweeping robot in the embodiment of the present application.
Detailed Description
So that the manner in which the features and elements of the present embodiments can be understood in detail, a more particular description of the embodiments, briefly summarized above, may be had by reference to embodiments, some of which are illustrated in the appended drawings.
The present application aims to solve the problem that the existing omnidirectional signal has blind areas, so that the sweeping robot may still collide with the charging seat while trying to avoid it; for this reason, a charging seat avoidance method is provided. How the specific position of the charging seat relative to the sweeping robot is determined is explained in detail in the following embodiments.
Example one
An embodiment of the application provides a charging-seat avoidance method, fig. 1 is a schematic diagram of a first process of the charging-seat avoidance in the embodiment of the application, and as shown in fig. 1, the charging-seat avoidance method is applied to a sweeping robot, and the sweeping robot comprises a camera module; the charging-stand avoiding method specifically includes:
step 101: controlling the camera module to collect a target image of an area to be cleaned;
here, the camera module may include a wide-angle camera. The wide-angle camera has a short focal length and a wide visual angle, and is suitable for shooting images in a large space.
In practical application, when the sweeping robot is started remotely through a terminal (such as a mobile phone or a tablet) or started directly by pressing its power key, the camera module installed on the sweeping robot is turned on automatically, the sweeping robot cleans according to a preset cleaning mode, and the camera module collects target images of the area to be cleaned in real time or at a preset time period. Here, the cleaning mode may be a zigzag cleaning mode.
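For illustration only, the capture step just described can be sketched as a simple polling loop. The sketch below assumes OpenCV is available on the robot's controller and that a hypothetical `handle_target_image` callback passes each frame to the detection stage; the device index and capture period are placeholder values, not values from this application.

```python
import time

import cv2  # OpenCV, assumed to be available on the robot's controller

CAPTURE_PERIOD_S = 1.0  # placeholder for the preset collection period


def capture_loop(handle_target_image, camera_index=0):
    """Collect target images of the area to be cleaned at a preset period."""
    cam = cv2.VideoCapture(camera_index)  # wide-angle camera module
    try:
        while True:
            ok, frame = cam.read()
            if ok:
                handle_target_image(frame)  # hand the target image to detection
            time.sleep(CAPTURE_PERIOD_S)
    finally:
        cam.release()
```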
Step 102: detecting the target image by using an image detection model, and acquiring a detection result of a charging seat of the sweeping robot if the target image is detected to contain the charging seat;
it should be noted that the image detection model is used to detect whether a charging seat of the sweeping robot exists in the target image, and specifically, whether the charging seat exists in the target image is determined according to the detection result.
Here, the image detection model may be a model integrating a bounding box algorithm for identifying the position of the charging stand in the image.
Here, the image detection model may be integrated in the main control module of the robot, or may be integrated in the cloud, that is, before the image detection model is used to detect the target image, the target image needs to be transmitted to the main control module or the cloud.
In some embodiments, the detecting the target image by using the image detection model, if it is detected that the target image includes a charging seat of the sweeping robot, includes: detecting the target image by using the image detection model to obtain at least one detection result of at least one detection object; sequentially judging whether the confidence value in each detection result is greater than or equal to a preset confidence threshold value; and taking the detection object with the confidence value larger than or equal to the preset confidence threshold value as the charging seat, and taking the corresponding detection result as the detection result of the charging seat.
It should be noted that the confidence value is one of the detection results of the image detection model. The confidence value can be understood as the probability value of the detected object being a charging dock in the target image.
It should be noted that the preset confidence threshold is a minimum confidence value that the detection object in the target image corresponds to a charging seat, and is used to determine whether the detection object in the target image has the charging seat. Here, the preset confidence level threshold is generally obtained experimentally or empirically by an experimenter.
It should be noted that, in general, the target image includes at least one detection object, and the image detection model detects each detection object respectively to obtain at least one detection result. Whether a confidence value greater than or equal to the preset confidence threshold exists is determined from the confidence value corresponding to each detection result; if such a confidence value exists, the detection object with that confidence value is taken as the charging seat, and the detection result corresponding to that detection object is taken as the detection result of the charging seat; if not, no charging seat exists in the target image, and the sweeping robot is controlled to continue sweeping according to the current sweeping mode and sweeping direction.
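As a minimal sketch of the confidence screening described above, assuming the image detection model returns a list of detection results as dictionaries (label, confidence, bounding box, orientation information), a filter of this kind could look as follows; the threshold value and field names are illustrative assumptions.

```python
CONFIDENCE_THRESHOLD = 0.6  # placeholder for the preset confidence threshold


def find_charging_seat(detections):
    """Return the detection result treated as the charging seat, or None.

    `detections` is assumed to be a list of dicts produced by the image
    detection model, e.g.
    {"label": "charging_seat", "confidence": 0.83,
     "bbox": (x, y, w, h), "orientation": "front", "angle": 45}.
    """
    for result in detections:
        if result.get("label") != "charging_seat":
            continue
        # A detection object whose confidence value is greater than or equal
        # to the threshold is taken as the charging seat, and its result is
        # taken as the detection result of the charging seat.
        if result.get("confidence", 0.0) >= CONFIDENCE_THRESHOLD:
            return result
    # No charging seat detected: keep sweeping along the current track.
    return None
```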
Step 103: determining a target distance between the sweeping robot and the charging stand based on the orientation angle information and the image information;
the detection result of the charging dock further includes orientation angle information, and the orientation angle information may include a front side, a left side, and a right side. In other words, the sweeping robot is located on the front, left side or right side of the charging seat. In addition, the orientation angle information may specifically include angle information, for example, the sweeping robot is located in the 45-degree direction of the front of the charging seat.
It should be noted that the detection result of the charging dock further includes image information, and the image information is pixel information occupied by the charging dock in the target image. Here, when the distance between the cleaning robot and the charging stand is different, the pixel information of the target image occupied by the charging stand is different. Specifically, when the distance is long, the number of pixels in the pixel information is small or the area of the pixels is relatively small; when the distance is short, the number of pixels in the pixel information is large or the area of the pixels is relatively large.
In addition, because the orientation angle information of the charging seat is different, the number of pixels or the area of the pixels of the charging seat in the pixel information of the target image is different, and the orientation angle information and the image information of the charging seat are combined to determine the target distance between the sweeping robot and the charging seat.
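A minimal sketch of how orientation angle information and pixel information might be combined is given below; the correspondence tables and their numbers are placeholder assumptions standing in for the preset correspondences described above.

```python
# Placeholder preset correspondences: for each orientation of the charging
# seat, pairs of (occupied pixel area, distance in metres). The numbers are
# illustrative only; the application does not specify concrete values.
CORRESPONDENCES = {
    "front": [(12000, 0.3), (6000, 0.6), (3000, 1.0), (1500, 1.5)],
    "left":  [(9000, 0.3), (4500, 0.6), (2200, 1.0), (1100, 1.5)],
    "right": [(9000, 0.3), (4500, 0.6), (2200, 1.0), (1100, 1.5)],
}


def estimate_target_distance(orientation, pixel_area):
    """Pick the target correspondence for the seat's orientation and return
    the distance whose stored pixel area is closest to the observed one."""
    table = CORRESPONDENCES[orientation]
    _, distance = min(table, key=lambda entry: abs(entry[0] - pixel_area))
    return distance
```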
And 104, controlling the sweeping robot to move and avoid the charging seat according to the target distance.
It should be noted that when the target distance between the sweeping robot and the charging base is different, the sweeping route of the sweeping robot is different.
In some embodiments, the step specifically includes: if the target distance is smaller than or equal to a preset distance threshold, determining an avoidance route of the sweeping robot; and controlling the sweeping robot to move along the avoidance route to carry out sweeping work.
It should be noted that the preset distance threshold is the maximum distance between the sweeping robot and the charging base when the sweeping robot starts the avoidance route. And the preset distance threshold is used for judging whether the sweeping robot starts the avoidance route.
Specifically, if the target distance is smaller than or equal to the preset distance threshold, an avoidance route of the sweeping robot is determined, the sweeping robot is controlled to carry out sweeping work on the determined avoidance route, and the purpose of avoiding a charging seat is achieved. And if the target distance is greater than the preset distance threshold value, controlling the sweeping robot to continue sweeping according to the current sweeping track.
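The threshold decision just described reduces to a simple comparison; the sketch below assumes a placeholder threshold value.

```python
DISTANCE_THRESHOLD_M = 0.5  # placeholder for the preset distance threshold


def should_start_avoidance(target_distance):
    """Start the avoidance route only when the charging seat is close enough;
    otherwise keep sweeping along the current track."""
    return target_distance <= DISTANCE_THRESHOLD_M
```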
In some embodiments, the method further comprises: performing region division on the target image according to a preset partition rule to obtain a region division result; determining a target area of the charging seat in the image to be detected based on the area division result and the image information; and determining an avoidance route of the sweeping robot based on the target area.
It should be noted that when the orientation angle information of the charging stand is different, the position information of the charging stand in the target image is different, and the avoidance route of the sweeping robot is also different.
In order to accurately avoid the charging seat, the target image can be subjected to area division according to a preset partition rule, so that the position information of the charging seat in the target image is determined, and then an avoidance route of the sweeping robot is determined according to the position information, so that the charging seat is accurately avoided.
It should be noted that the preset partition rule may be divided according to three areas, i.e. the left area, the middle area and the right area, or may be divided more finely, for example: four zones, five zones, etc.
In some embodiments, the region division result includes a left region, a middle region, and a right region; the determining an avoidance route of the sweeping robot based on the target area comprises: when the target area is the left area, the avoidance route of the sweeping robot is to sweep around a semicircular arc track towards the right front; when the target area is the right area, the avoidance route is to sweep around a semicircular arc track towards the left front; and when the target area is the middle area, the avoidance route is to turn around and continue sweeping.
Here, after the preset partition rule is divided into three regions of the left side, the middle side and the right side, the region division results in a left region, a middle region and a right region.
Specifically, if the charging seat is determined to be located in the left area of the target image, the sweeping robot is controlled to sweep around a semicircular arc track towards the right front, or to turn around and sweep, or to turn right and sweep straight ahead, and the like. If the charging seat is determined to be located in the right area, the sweeping robot is controlled to sweep around a semicircular arc track towards the left front, or to turn around and sweep, or to turn left and sweep straight ahead, and the like. If the charging seat is determined to be located in the middle area, the sweeping robot is controlled to turn right and sweep straight ahead, or to turn left and sweep straight ahead, or to turn around and sweep, and the like.
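A minimal sketch of the region division and route selection described above, assuming an equal split of the image width into left, middle, and right thirds (one possible preset partition rule) and using the centre of the seat's bounding box to select the region:

```python
def divide_and_choose_route(image_width, seat_bbox):
    """Divide the target image into left / middle / right thirds and choose
    an avoidance route from the region the charging seat falls into."""
    x, _, w, _ = seat_bbox                      # bounding box of the seat
    seat_centre_x = x + w / 2.0
    if seat_centre_x < image_width / 3.0:
        region = "left"
    elif seat_centre_x < 2.0 * image_width / 3.0:
        region = "middle"
    else:
        region = "right"

    # Seat on the left of the image -> arc towards the right front, and vice
    # versa; seat in the middle -> turn around and continue sweeping.
    routes = {
        "left": "arc_semicircle_to_front_right",
        "right": "arc_semicircle_to_front_left",
        "middle": "turn_around_and_sweep",
    }
    return region, routes[region]
```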
Here, the execution subject of steps 101 to 104 may be a processor of the cradle avoidance apparatus.
The application provides a charging seat avoiding method which is applied to a sweeping robot, wherein the sweeping robot comprises a camera module; the method comprises the following steps: controlling a camera module to collect a target image of an area to be cleaned; detecting a target image by using an image detection model, and acquiring a detection result of a charging seat if the target image is detected to contain the charging seat of the sweeping robot; the detection result comprises orientation angle information of the charging seat and image information of the charging seat; determining a target distance between the sweeping robot and the charging seat based on the orientation angle information and the image information; and controlling the sweeping robot to move and avoid the charging seat according to the target distance. Therefore, the collected target image is detected by using the image detection model, and if the sweeping robot is detected to be positioned near the charging seat, the direction information and the distance information of the sweeping robot relative to the charging seat are determined according to the orientation angle information and the image information of the charging seat in the detection result, so that the aim of accurately avoiding the charging seat by the sweeping robot is fulfilled.
Example two
Fig. 2 is a schematic diagram of a second process of avoiding the charging stand in the embodiment of the present application, and as shown in fig. 2, the charging stand avoiding method is applied to a sweeping robot, where the sweeping robot includes a camera module; the charging-stand avoiding method specifically comprises the following steps:
step 201: controlling the camera module to collect a target image of an area to be cleaned;
step 202: detecting the target image by using an image detection model, and acquiring a detection result of a charging seat of the sweeping robot if the target image is detected to contain the charging seat;
in some embodiments, the detecting the target image by using the image detection model, if it is detected that the target image includes a charging seat of the sweeping robot, includes: detecting the target image by using the image detection model to obtain at least one detection result of at least one detection object; sequentially judging whether the confidence value in each detection result is greater than or equal to a preset confidence threshold value; and taking the detection object with the confidence value larger than or equal to the preset confidence threshold value as the charging seat, and taking the corresponding detection result as the detection result of the charging seat.
Step 203: determining a target corresponding relation from at least one preset corresponding relation based on the orientation angle information; the corresponding relation is the corresponding relation between the distance and the pixel information occupied by the charging seat;
It should be noted that, because the number of pixels or the pixel area that the charging seat occupies in the target image differs with the orientation angle information, and the distance between the sweeping robot and the charging seat differs accordingly, the pixel information corresponding to different distances needs to be set in advance for each orientation angle, so that the target correspondence between the distance and the pixel information occupied by the charging seat can later be determined from the orientation angle information of the charging seat.
It should be noted that the orientation angle information may include a front side, a left side, and a right side, and on this basis, the orientation angle information may further include specific angle information. The more accurate the orientation angle information is, the more accurate the selected corresponding relation is, and thus, the more accurate the distance between the sweeping robot and the charging stand is calculated.
Exemplarily, the sweeping robot is located at the front position of the charging seat, and when the sweeping robot is located at 0-60 degrees, a first corresponding relation is set; the sweeping robot is positioned at the front position of the charging seat, and a second corresponding relation is set when the sweeping robot is positioned between 60 and 120 degrees; the sweeping robot is positioned at the front position of the charging seat, and a third corresponding relation is set when the sweeping robot is positioned between 120 degrees and 180 degrees. The sweeping robot is positioned on the left side surface of the charging seat, and a fourth corresponding relation is set when the sweeping robot is positioned between 0 and 60 degrees; the sweeping robot is positioned on the left side surface of the charging seat, and a fifth corresponding relation is set when the sweeping robot is positioned between 60 and 120 degrees; the sweeping robot is positioned on the left side surface of the charging seat, and a sixth corresponding relation is set when the sweeping robot is positioned between 120 degrees and 180 degrees. The sweeping robot is positioned on the right side surface of the charging seat, and a seventh corresponding relation is set when the sweeping robot is positioned between 0 and 60 degrees; the sweeping robot is positioned on the right side surface of the charging seat, and when the sweeping robot is positioned between 60 degrees and 120 degrees, an eighth corresponding relation is set; the sweeping robot is located at the right side position of the charging seat, and a ninth corresponding relation is set when the sweeping robot is located at 120-180 degrees.
In addition, more detailed division can be performed, such as: the sweeping robot is positioned on the front face of the charging seat, and different corresponding relations are set for different angle ranges between 0 degree and 30 degrees, or between 30 degrees and 60 degrees, or between 60 degrees and 90 degrees, or between 90 degrees and 120 degrees, between 120 degrees and 150 degrees, and between 150 degrees and 180 degrees. The sweeping robot is positioned on the left side surface of the charging seat, and different corresponding relations are set for different angle ranges between 0 degree and 30 degrees, or between 30 degrees and 60 degrees, or between 60 degrees and 90 degrees, or between 90 degrees and 120 degrees, between 120 degrees and 150 degrees, and between 150 degrees and 180 degrees. The sweeping robot is positioned at the right side position of the charging seat, and different corresponding relations are set for different angle ranges between 0 degree and 30 degrees, or between 30 degrees and 60 degrees, or between 60 degrees and 90 degrees, or between 90 degrees and 120 degrees, between 120 degrees and 150 degrees, and between 150 degrees and 180 degrees.
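If the finer angle ranges described above are used, the correspondence can be selected by mapping the detected side and angle to an angle bucket; the sketch below assumes 60-degree buckets as in the example and shows only one possible keying scheme.

```python
ANGLE_BUCKET_DEG = 60  # the example above uses 0-60, 60-120 and 120-180 degree buckets


def correspondence_key(side, angle_deg):
    """Map (side, angle) to the key of one of the nine preset correspondences,
    e.g. ("front", 45) -> ("front", 0) for the 0-60 degree bucket."""
    bucket_start = int(angle_deg // ANGLE_BUCKET_DEG) * ANGLE_BUCKET_DEG
    # Clamp so that exactly 180 degrees falls into the last bucket.
    return side, min(bucket_start, 180 - ANGLE_BUCKET_DEG)
```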
It should be noted that, if the distance between the sweeping robot and the charging seat needs to be determined more accurately, the angle range is further refined, and will not be described in detail here.
Step 204: determining the target distance based on the target corresponding relation and the pixel information occupied by the charging seat;
it should be noted that the target distance is a real distance between the sweeping robot and the charging base.
Here, the target correspondence relationship is a correspondence relationship between a preset distance and pixel information, and the target distance corresponding to the pixel information occupied by the charging stand can be determined based on the target correspondence relationship.
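One plausible way to determine the target distance from the target correspondence and the occupied pixel information is linear interpolation between the two nearest table entries; the sketch below is an assumption for illustration, since the application only requires that the distance be determined from the correspondence.

```python
def interpolate_distance(table, pixel_area):
    """Linearly interpolate the target distance for the observed pixel area.

    `table` is a target correspondence given as (pixel_area, distance) pairs;
    larger occupied areas correspond to shorter distances.
    """
    table = sorted(table, key=lambda entry: entry[0], reverse=True)
    if pixel_area >= table[0][0]:
        return table[0][1]          # very close: clamp to the nearest entry
    if pixel_area <= table[-1][0]:
        return table[-1][1]         # very far: clamp to the farthest entry
    for (area_hi, dist_hi), (area_lo, dist_lo) in zip(table, table[1:]):
        if area_lo <= pixel_area <= area_hi:
            t = (area_hi - pixel_area) / (area_hi - area_lo)
            return dist_hi + t * (dist_lo - dist_hi)
```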
Step 205: and controlling the sweeping robot to move and avoid the charging seat according to the target distance.
It should be noted that when the target distance between the sweeping robot and the charging base is different, the sweeping route of the sweeping robot is different.
In some embodiments, the step specifically includes: if the target distance is smaller than or equal to a preset distance threshold, determining an avoidance route of the sweeping robot; and controlling the sweeping robot to move along the avoidance route to carry out sweeping work.
It should be noted that the preset distance threshold is the maximum distance between the sweeping robot and the charging stand when the sweeping robot starts the avoidance route. And the preset distance threshold is used for judging whether the sweeping robot starts the avoidance route.
Specifically, if the target distance is smaller than or equal to the preset distance threshold, an avoidance route of the sweeping robot is determined, the sweeping robot is controlled to carry out sweeping work on the determined avoidance route, and the purpose of avoiding a charging seat is achieved. And if the target distance is greater than the preset distance threshold, controlling the sweeping robot to continue sweeping according to the current sweeping track.
In some embodiments, the method further comprises: performing region division on the target image according to a preset partition rule to obtain a region division result; determining a target area of the charging seat in the image to be detected based on the area division result and the image information; and determining an avoidance route of the sweeping robot based on the target area.
It should be noted that, when the orientation angle information of the charging seat is different, the position information of the charging seat in the target image is different, and the evasive route of the sweeping robot is also different.
In order to accurately avoid the charging seat, the target image can be subjected to area division according to a preset partition rule, so that the position information of the charging seat in the target image is determined, and then an avoidance route of the sweeping robot is determined according to the position information, so that the charging seat is accurately avoided.
It should be noted that the preset partition rule may be divided according to three areas, i.e. the left area, the middle area and the right area, or may be divided more finely, for example: four zones, five zones, etc.
In some embodiments, the region division result includes a left region, a middle region, and a right region; the determining an avoidance route of the sweeping robot based on the target area comprises: when the target area is the left area, the avoidance route of the sweeping robot is to sweep around a semicircular arc track towards the right front; when the target area is the right area, the avoidance route is to sweep around a semicircular arc track towards the left front; and when the target area is the middle area, the avoidance route is to turn around and continue sweeping.
Here, after the preset partition rule is divided into three regions of the left side, the middle side and the right side, the region division results in a left region, a middle region and a right region.
Specifically, if the charging seat is determined to be located in the left area of the target image, the sweeping robot is controlled to sweep around a semicircular arc track towards the right front, or to turn around and sweep, or to turn right and sweep straight ahead, and the like. If the charging seat is determined to be located in the right area, the sweeping robot is controlled to sweep around a semicircular arc track towards the left front, or to turn around and sweep, or to turn left and sweep straight ahead, and the like. If the charging seat is determined to be located in the middle area, the sweeping robot is controlled to turn right and sweep straight ahead, or to turn left and sweep straight ahead, or to turn around and sweep, and the like.
Therefore, the collected target image is detected by using the image detection model, and if the sweeping robot is detected to be positioned near the charging seat, the direction information and the distance information of the sweeping robot relative to the charging seat are determined according to the orientation angle information and the image information of the charging seat in the detection result, so that the aim of accurately avoiding the charging seat by the sweeping robot is fulfilled.
EXAMPLE III
Based on the above embodiment, fig. 3 is a schematic diagram of a third flow for avoiding by a charging stand in the embodiment of the present application, and as shown in fig. 3, the charging stand avoiding method is applied to a sweeping robot, where the sweeping robot includes a camera module; the charging-stand avoiding method specifically includes:
step 301: recognizing a pixel frame of a charging seat;
before the step, a camera module on the cleaning robot needs to acquire a target image of an area to be cleaned in real time or according to a preset time period, the target image is transmitted to the main control module, an image detection model in the main control module is used for detecting the target image, or the target image is transmitted to the cloud, an image detection model in the cloud is used for detecting the target image, that is, pixel frame recognition is performed on at least one detection object in the target image, and whether a charging seat exists is determined based on a frame recognition result (that is, a detection result).
Here, when determining whether a charging seat exists based on the frame recognition result (i.e., the detection result), specifically, the confidence value corresponding to each detection result is obtained and it is determined whether a confidence value greater than or equal to the preset confidence threshold exists; if such a confidence value exists, the detection object with that confidence value is taken as the charging seat, and the detection result corresponding to that detection object is the detection result of the charging seat; if not, no charging seat exists in the target image, and the cleaning robot is controlled to continue cleaning according to the current cleaning mode and cleaning direction.
The preset confidence threshold is the minimum confidence value for a detection object in the target image to be regarded as the charging seat, and is used to judge whether a charging seat exists among the detection objects in the target image. Here, the preset confidence threshold is generally obtained experimentally or empirically.
The above-mentioned image detection model may be a model integrated by a bounding box algorithm.
Step 302: acquiring orientation angle information of a charging seat, and determining a target corresponding relation according to the orientation angle information;
here, the target correspondence relationship is a correspondence relationship between a preset distance and pixel information.
It should be noted that, because the orientation angle information of the charging seat is different, the number of pixels or the area of pixels in the pixel information of the target image occupied by the charging seat is different, and the distance between the sweeping robot and the charging seat is also different, the pixel information corresponding to different distances needs to be set in advance according to different orientation angle information for different orientation angle information, so that the target corresponding relationship between the distance and the pixel information of the target image occupied by the charging seat is determined according to the orientation angle information of the charging seat in the later stage.
Here, the orientation angle information includes a front face, a left side face, and a right side face. In other words, the sweeping robot is located on the front, left side or right side of the charging seat.
Here, if the orientation angle information of the charging stand is the front information, the corresponding relationship corresponding to the orientation angle information as the front information needs to be queried from the database storing the corresponding relationship; if the orientation angle information of the charging seat is the left side information, the corresponding relation of the orientation angle information corresponding to the left side information needs to be inquired from a database storing the corresponding relation; if the orientation angle information of the charging stand is the right side information, the corresponding relationship corresponding to the orientation angle information as the right side information needs to be inquired from the database storing the corresponding relationship.
Step 303: acquiring image information of a charging seat, and predicting a target distance by combining a target corresponding relation;
here, the image information is pixel information of the charging stand in the target image. When the distance between the sweeping robot and the charging seat is different, the pixel information of the charging seat occupying the target image is different, specifically, when the distance is longer, the pixel number or the pixel area in the pixel information is relatively smaller; when the distance is short, the number of pixels or the area of pixels in the pixel information is relatively large.
After the pixel information of the charging seat is determined, the target distance between the sweeping robot and the charging seat can be determined by combining the target corresponding relation.
Step 304: if the target distance is smaller than or equal to the preset distance threshold, starting a charging seat winding program;
here, the preset distance threshold is the maximum distance between the sweeping robot and the charging stand when the sweeping robot starts the avoidance route. The preset distance threshold is used for judging whether the sweeping robot starts a charging seat winding program (starts an evasive route).
If the target distance is larger than the preset distance threshold value, the sweeping robot is controlled to continue sweeping according to the current sweeping track.
Step 305: performing area division on the target image, and determining a cleaning track of the sweeping robot according to the area of the charging seat in the target image;
here, the target image is divided into three regions of the left side, the middle side, and the right side, resulting in a left region, a middle region, and a right region.
Specifically, if the charging seat is determined to be located in the left area of the target image, the sweeping robot is controlled to sweep around a semicircular arc track towards the right front, or to turn around and sweep, or to turn right and sweep straight ahead, and the like. If the charging seat is determined to be located in the right area, the sweeping robot is controlled to sweep around a semicircular arc track towards the left front, or to turn around and sweep, or to turn left and sweep straight ahead, and the like. If the charging seat is determined to be located in the middle area, the sweeping robot is controlled to turn right and sweep straight ahead, or to turn left and sweep straight ahead, or to turn around and sweep, and the like.
Step 306: and cleaning according to the previous operation mode after the charging seat is avoided.
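Putting steps 301 to 306 together, a single pass of the flow above might look like the following sketch, reusing the illustrative helpers from the earlier embodiments (detection filtering, correspondence lookup, distance interpolation, region-based route choice); all function and field names, and the use of bounding-box area as the occupied pixel information, are assumptions.

```python
def avoidance_cycle(frame, detector, image_width):
    """One pass of the flow above; returns the action for the motion layer."""
    seat = find_charging_seat(detector(frame))              # step 301
    if seat is None:
        return "continue_current_track"

    table = CORRESPONDENCES[seat["orientation"]]            # step 302
    _, _, w, h = seat["bbox"]
    distance = interpolate_distance(table, w * h)           # step 303

    if distance > DISTANCE_THRESHOLD_M:                     # step 304
        return "continue_current_track"

    _, route = divide_and_choose_route(image_width, seat["bbox"])  # step 305
    return route  # step 306: resume the previous mode once the route is done
```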
Therefore, the collected target image is detected by using the image detection model, and if the sweeping robot is detected to be positioned near the charging seat, the direction information and the distance information of the sweeping robot relative to the charging seat are determined according to the orientation angle information and the image information of the charging seat in the detection result, so that the aim of accurately avoiding the charging seat by the sweeping robot is fulfilled.
Example four
In the embodiment of the present application, a charging-stand avoidance device is provided, fig. 4 is a schematic structural diagram of the charging-stand avoidance device in the embodiment of the present application, and as shown in fig. 4, the charging-stand avoidance device is applied to a sweeping robot, and the sweeping robot includes a camera module; this charging seat circumvention device includes:
the acquisition unit 401 is used for controlling the camera module to acquire a target image of an area to be cleaned;
an obtaining unit 402, configured to detect the target image by using an image detection model, and if it is detected that the target image includes a charging seat of the sweeping robot, obtain a detection result of the charging seat; the detection result comprises orientation angle information of the charging seat and image information of the charging seat;
a determination unit 403, configured to determine a target distance between the sweeping robot and the charging stand based on the orientation angle information and the image information;
and a control unit 404, configured to control the sweeping robot to move and avoid the charging seat according to the target distance.
In some embodiments, the image information is pixel information occupied by the charging seat in the target image; the device comprises: a determining unit 403, specifically configured to determine a target corresponding relationship from at least one preset corresponding relationship based on the orientation angle information; wherein, the corresponding relation is the corresponding relation between the distance and the pixel information occupied by the charging seat; and determining the target distance based on the target corresponding relation and the pixel information occupied by the charging seat.
In some embodiments, the apparatus comprises: the control unit 404 is specifically configured to determine an avoidance route of the sweeping robot if the target distance is less than or equal to a preset distance threshold; and controlling the sweeping robot to move along the evaded route to carry out sweeping work.
In some embodiments, the method further comprises: and if the target distance is greater than the preset distance threshold value, controlling the sweeping robot to continue sweeping according to the current sweeping track.
In some embodiments, the method further comprises: performing region division on the target image according to a preset partition rule to obtain a region division result; determining a target area of the charging seat in the image to be detected based on the area division result and the image information; and determining an avoidance route of the sweeping robot based on the target area.
In some embodiments, the region division result includes a left region, a middle region, and a right region; the determining an avoidance route of the sweeping robot based on the target area comprises: when the target area is a left area, the evading route of the sweeping robot is to sweep around a semicircular arc track to the right front; when the target area is the right area, the avoidance mode of the sweeping robot is an avoidance mode of sweeping around a semicircular arc track to the left front; and when the target area is the middle area, the avoiding mode of the sweeping robot is the avoiding mode of turning around for sweeping.
In some embodiments, the detecting the target image by using an image detection model includes, if it is detected that the target image includes a charging seat of the cleaning robot, the detecting step includes: detecting the target image by using the image detection model to obtain at least one detection result of at least one detection object; sequentially judging whether the confidence value in each detection result is greater than or equal to a preset confidence threshold value; and taking the detection object with the confidence value larger than or equal to the preset confidence threshold value as the charging seat, and taking the corresponding detection result as the detection result of the charging seat.
The application provides a charging seat avoiding method which is applied to a sweeping robot, wherein the sweeping robot comprises a camera module; the method comprises the following steps: controlling a camera module to collect a target image of an area to be cleaned; detecting a target image by using an image detection model, and acquiring a detection result of a charging seat if the target image is detected to contain the charging seat of the sweeping robot; the detection result comprises orientation angle information of the charging seat and image information of the charging seat; determining a target distance between the sweeping robot and the charging seat based on the orientation angle information and the image information; and controlling the sweeping robot to move and avoid the charging seat according to the target distance. Therefore, the collected target image is detected by using the image detection model, and if the sweeping robot is detected to be positioned near the charging seat, the direction information and the distance information of the sweeping robot relative to the charging seat are determined according to the orientation angle information and the image information of the charging seat in the detection result, so that the aim of accurately avoiding the charging seat by the sweeping robot is fulfilled.
The embodiment of the present application provides a sweeping robot. Fig. 5 is a schematic structural diagram of the sweeping robot in the embodiment of the present application; as shown in fig. 5, the sweeping robot includes: a processor 501 and a memory 502 configured to store a computer program capable of running on the processor;
wherein the processor 501 is configured to execute the steps of the charging-stand avoiding method in the foregoing embodiments when running the computer program.
In practice, of course, as shown in fig. 5, the various components of the sweeping robot are coupled together by a bus system 503. It will be appreciated that the bus system 503 is used to enable communications among the components. The bus system 503 includes a power bus, a control bus, and a status signal bus in addition to the data bus. For clarity of illustration, however, the various buses are labeled as bus system 503 in FIG. 5.
In practical applications, the processor may be at least one of an Application Specific Integrated Circuit (ASIC), a Digital Signal Processing Device (DSPD), a Programmable Logic Device (PLD), a Field Programmable Gate Array (FPGA), a controller, a microcontroller, and a microprocessor. It is understood that the electronic devices for implementing the above processor functions may be other devices, and the embodiments of the present application are not limited in particular.
The Memory may be a volatile Memory (volatile Memory), such as a Random-Access Memory (RAM); or a non-volatile Memory (non-volatile Memory), such as a Read-Only Memory (ROM), a flash Memory (flash Memory), a Hard Disk (HDD), or a Solid-State Drive (SSD); or a combination of the above types of memories and provides instructions and data to the processor.
The embodiment of the application also provides a computer readable storage medium for storing the computer program.
Optionally, the computer-readable storage medium may be applied to any one of the methods in the embodiments of the present application, and the computer program enables a computer to execute a corresponding process implemented by a processor in each method in the embodiments of the present application, which is not described herein again for brevity.
In the several embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other manners. The above-described device embodiments are merely illustrative, for example, the division of the unit is only one logical function division, and there may be other division ways in actual implementation, such as: multiple units or components may be combined, or may be integrated into another system, or some features may be omitted, or not implemented. In addition, the coupling, direct coupling or communication connection between the components shown or discussed may be through some interfaces, and the indirect coupling or communication connection between the devices or units may be electrical, mechanical or other forms.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, that is, may be located in one place, or may be distributed on a plurality of network units; some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, all the functional units in the embodiments of the present invention may be integrated into one processing module, or each unit may be separately used as one unit, or two or more units may be integrated into one unit; the integrated unit may be implemented in the form of hardware, or in the form of hardware plus a software functional unit. Those of ordinary skill in the art will understand that: all or part of the steps of implementing the method embodiments may be implemented by hardware related to program instructions, and the program may be stored in a computer-readable storage medium, and when executed, executes the steps including the method embodiments; and the aforementioned storage medium includes: a mobile storage device, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and other various media capable of storing program codes.
The methods disclosed in the several method embodiments provided in the present application may be combined arbitrarily without conflict to obtain new method embodiments.
Features disclosed in several of the product embodiments provided in the present application may be combined in any combination to yield new product embodiments without conflict.
The features disclosed in the several method or apparatus embodiments provided herein may be combined in any combination to arrive at a new method or apparatus embodiment without conflict.
The above description is only for the specific embodiments of the present invention, but the scope of the present invention is not limited thereto, and any person skilled in the art can easily conceive of the changes or substitutions within the technical scope of the present invention, and all the changes or substitutions should be covered within the scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the appended claims.

Claims (9)

1. A charging seat avoiding method, characterized in that the method is applied to a sweeping robot, the sweeping robot comprising a camera module; the method comprises the following steps:
controlling the camera module to collect a target image of an area to be cleaned;
detecting the target image by using an image detection model, and, if it is detected that the target image contains a charging seat of the sweeping robot, acquiring a detection result of the charging seat; wherein the detection result comprises orientation angle information of the charging seat and image information of the charging seat;
determining, based on the orientation angle information, a target correspondence matching the orientation angle information from at least one preset correspondence; wherein each correspondence is a correspondence between a distance and pixel information occupied by the charging seat;
determining a target distance based on the target correspondence and the pixel information occupied by the charging seat;
and controlling the sweeping robot to move and avoid the charging seat according to the target distance.
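As an informal illustration of the distance-estimation step in claim 1, a minimal sketch follows. It is not part of the claims: the angle buckets, the calibration pairs, and the linear interpolation between them are assumptions introduced purely to show how a preset correspondence between pixel information and distance could be looked up.

# Illustrative sketch only: estimate the robot-to-charging-seat distance from the
# detected orientation angle and the pixel width occupied by the charging seat.
from bisect import bisect_left

# Hypothetical preset correspondences: for each orientation-angle bucket (degrees),
# (pixel_width, distance_in_meters) calibration pairs sorted by pixel width.
CORRESPONDENCES = {
    0:  [(40, 2.0), (80, 1.0), (160, 0.5)],
    45: [(30, 2.0), (60, 1.0), (120, 0.5)],
    90: [(20, 2.0), (40, 1.0), (80, 0.5)],
}

def select_correspondence(orientation_deg):
    # Pick the preset correspondence whose angle bucket is closest to the detected angle.
    nearest = min(CORRESPONDENCES, key=lambda angle: abs(angle - orientation_deg))
    return CORRESPONDENCES[nearest]

def estimate_distance(orientation_deg, pixel_width):
    # Interpolate the target distance from the matched correspondence.
    table = select_correspondence(orientation_deg)
    widths = [w for w, _ in table]
    i = bisect_left(widths, pixel_width)
    if i == 0:
        return table[0][1]
    if i == len(table):
        return table[-1][1]
    (w0, d0), (w1, d1) = table[i - 1], table[i]
    return d0 + (pixel_width - w0) / (w1 - w0) * (d1 - d0)

print(estimate_distance(orientation_deg=50, pixel_width=90))  # about 0.75 m with this example table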
2. The method according to claim 1, wherein the controlling the sweeping robot to move and avoid the charging seat according to the target distance comprises:
if the target distance is less than or equal to a preset distance threshold, determining an avoidance route of the sweeping robot;
and controlling the sweeping robot to move along the avoidance route to perform sweeping work.
3. The method of claim 2, further comprising:
if the target distance is greater than the preset distance threshold, controlling the sweeping robot to continue sweeping along the current sweeping track.
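Claims 2 and 3 together amount to a single threshold decision on the estimated distance. The short sketch below assumes a 0.3 m threshold and the function name decide_motion; both are illustrative, not values or names taken from the patent.

# Illustrative sketch only: the threshold decision of claims 2 and 3.
DISTANCE_THRESHOLD_M = 0.3  # hypothetical preset distance threshold

def decide_motion(target_distance_m):
    if target_distance_m <= DISTANCE_THRESHOLD_M:
        return "follow_avoidance_route"    # claim 2: determine and follow an avoidance route
    return "continue_current_track"        # claim 3: keep sweeping along the current track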
4. The method of claim 2, further comprising:
performing region division on the target image according to a preset partition rule to obtain a region division result;
determining a target area of the charging seat in the image to be detected based on the region division result and the image information;
and determining an avoidance route of the sweeping robot based on the target area.
5. The method of claim 4, wherein
the region division result comprises a left region, a middle region, and a right region; and
the determining an avoidance route of the sweeping robot based on the target area comprises:
when the target area is the left region, the avoidance route of the sweeping robot is to sweep along a semicircular arc track toward the right front;
when the target area is the right region, the avoidance manner of the sweeping robot is to sweep along a semicircular arc track toward the left front;
and when the target area is the middle region, the avoidance manner of the sweeping robot is to turn around and continue sweeping.
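Claims 4 and 5 partition the image and choose a route per region. The sketch below assumes an equal-thirds split of the image width and uses the horizontal centre of the charging seat's bounding box; both choices are illustrative, since the claims only require some preset partition rule.

# Illustrative sketch only: region division (claim 4) and route selection (claim 5).
def locate_region(image_width, bbox_x_min, bbox_x_max):
    # Assign the charging seat to the left, middle, or right third of the image.
    centre = (bbox_x_min + bbox_x_max) / 2
    if centre < image_width / 3:
        return "left"
    if centre < 2 * image_width / 3:
        return "middle"
    return "right"

AVOIDANCE_ROUTES = {
    "left":   "sweep along a semicircular arc toward the right front",
    "right":  "sweep along a semicircular arc toward the left front",
    "middle": "turn around and continue sweeping",
}

region = locate_region(image_width=640, bbox_x_min=40, bbox_x_max=180)
print(region, "->", AVOIDANCE_ROUTES[region])  # left -> sweep along a semicircular arc toward the right front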
6. The method of claim 1, wherein the detecting the target image by using the image detection model and, if it is detected that the target image contains the charging seat of the sweeping robot, acquiring the detection result of the charging seat comprises:
detecting the target image by using the image detection model to obtain at least one detection result of at least one detection object;
sequentially judging whether a confidence value in each detection result is greater than or equal to a preset confidence threshold;
and taking the detection object whose confidence value is greater than or equal to the preset confidence threshold as the charging seat, and taking the corresponding detection result as the detection result of the charging seat.
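Claim 6 filters the raw detections by confidence before treating one as the charging seat. The sketch below assumes a 0.8 threshold and a generic list-of-dictionaries detector output; the actual detector interface is not specified in the claims.

# Illustrative sketch only: confidence filtering of detection results (claim 6).
CONFIDENCE_THRESHOLD = 0.8  # hypothetical preset confidence threshold

def pick_charging_seat(detections):
    # Return the first detection whose confidence reaches the threshold, else None.
    for det in detections:  # each det: {"confidence": ..., "bbox": ..., "angle": ...}
        if det["confidence"] >= CONFIDENCE_THRESHOLD:
            return det
    return None

result = pick_charging_seat([
    {"confidence": 0.65, "bbox": (40, 120, 180, 260), "angle": 30},
    {"confidence": 0.91, "bbox": (44, 118, 184, 262), "angle": 32},
])
print(result["confidence"] if result else "no charging seat detected")  # 0.91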
7. A charging seat avoiding device, characterized in that the device is applied to a sweeping robot, the sweeping robot comprising a camera module; the device comprises:
a collection unit, configured to control the camera module to collect a target image of an area to be cleaned;
an acquisition unit, configured to detect the target image by using an image detection model and, if it is detected that the target image contains a charging seat of the sweeping robot, acquire a detection result of the charging seat; wherein the detection result comprises orientation angle information of the charging seat and image information of the charging seat;
a determining unit, configured to determine, based on the orientation angle information, a target correspondence matching the orientation angle information from at least one preset correspondence, wherein each correspondence is a correspondence between a distance and pixel information occupied by the charging seat; and to determine a target distance based on the target correspondence and the pixel information occupied by the charging seat;
and a control unit, configured to control the sweeping robot to move and avoid the charging seat according to the target distance.
8. A sweeping robot, comprising: a processor and a memory configured to store a computer program capable of running on the processor,
wherein the processor is configured to perform the steps of the method of any one of claims 1 to 6 when running the computer program.
9. A computer-readable storage medium, on which a computer program is stored which, when being executed by a processor, carries out the steps of the method according to any one of claims 1 to 6.
CN202110168904.5A 2021-02-07 2021-02-07 Charging seat avoiding method and device, sweeping robot and storage medium Active CN112971616B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110168904.5A CN112971616B (en) 2021-02-07 2021-02-07 Charging seat avoiding method and device, sweeping robot and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110168904.5A CN112971616B (en) 2021-02-07 2021-02-07 Charging seat avoiding method and device, sweeping robot and storage medium

Publications (2)

Publication Number Publication Date
CN112971616A CN112971616A (en) 2021-06-18
CN112971616B true CN112971616B (en) 2022-12-30

Family

ID=76348875

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110168904.5A Active CN112971616B (en) 2021-02-07 2021-02-07 Charging seat avoiding method and device, sweeping robot and storage medium

Country Status (1)

Country Link
CN (1) CN112971616B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114355889A (en) * 2021-12-08 2022-04-15 上海擎朗智能科技有限公司 Control method, robot charging stand, and computer-readable storage medium
CN114532918A (en) * 2022-01-26 2022-05-27 深圳市杉川机器人有限公司 Cleaning robot, target detection method and device thereof, and storage medium

Family Cites Families (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20100010962A (en) * 2008-07-24 2010-02-03 주식회사 한울로보틱스 Apparatus for and method of guiding robot cleaner to charging station by ir sensors
KR20170053351A (en) * 2015-11-06 2017-05-16 삼성전자주식회사 Cleaning robot and controlling method thereof
WO2018038552A1 (en) * 2016-08-25 2018-03-01 엘지전자 주식회사 Mobile robot and control method therefor
US10878294B2 (en) * 2018-01-05 2020-12-29 Irobot Corporation Mobile cleaning robot artificial intelligence for situational awareness
JP7030007B2 (en) * 2018-04-13 2022-03-04 東芝ライフスタイル株式会社 Autonomous vacuum cleaner
CN109077671A (en) * 2018-09-11 2018-12-25 张家港市鸿嘉数字科技有限公司 A kind of sweeping robot motion control method and device
CN111103872A (en) * 2018-10-10 2020-05-05 北京奇虎科技有限公司 Method and device for controlling robot to avoid charging device and computing equipment
JP6838027B2 (en) * 2018-10-31 2021-03-03 ファナック株式会社 Robot system
KR102683536B1 (en) * 2019-03-28 2024-07-11 삼성전자주식회사 Cleaning robot and controlling method thereof, cleaning robot charging system
CN110477825B (en) * 2019-08-30 2021-10-26 深圳飞科机器人有限公司 Cleaning robot, autonomous charging method, system, and readable storage medium
CN111067439B (en) * 2019-12-31 2022-03-01 深圳飞科机器人有限公司 Obstacle processing method and cleaning robot
CN111700553B (en) * 2020-06-05 2021-12-24 深圳瑞科时尚电子有限公司 Obstacle avoidance method, device, robot and storage medium
CN111539400A (en) * 2020-07-13 2020-08-14 追创科技(苏州)有限公司 Control method and device of self-moving equipment, storage medium and self-moving equipment

Also Published As

Publication number Publication date
CN112971616A (en) 2021-06-18

Similar Documents

Publication Publication Date Title
CN112971616B (en) Charging seat avoiding method and device, sweeping robot and storage medium
CN107305627B (en) Vehicle video monitoring method, server and system
EP3641298B1 (en) Method and device for capturing target object and video monitoring device
CN108985225B (en) Focus following method, device, electronic equipment and storage medium
EP3193320A1 (en) Method for collecting evidence of vehicle parking violation, and apparatus therefor
KR101687530B1 (en) Control method in image capture system, control apparatus and a computer-readable storage medium
CN111814752B (en) Indoor positioning realization method, server, intelligent mobile device and storage medium
CN110442120B (en) Method for controlling robot to move in different scenes, robot and terminal equipment
CN110561416B (en) Laser radar repositioning method and robot
US11126875B2 (en) Method and device of multi-focal sensing of an obstacle and non-volatile computer-readable storage medium
CN110491060B (en) Robot, safety monitoring method and device thereof, and storage medium
CN102833478A (en) Fault tolerant background modeling
CN110866544B (en) Sensor data fusion method and device and storage medium
CN112824182A (en) Automatic parking method, automatic parking device, computer equipment and storage medium
CN114495299A (en) Vehicle parking automatic payment method, system and readable storage medium
CN111784730A (en) Object tracking method and device, electronic equipment and storage medium
KR101333459B1 (en) Lane detecting method and apparatus thereof
CN112540604A (en) Robot charging system and method and terminal equipment
CN112241660A (en) Anti-theft monitoring method and device based on vision
CN110430361B (en) Window cleaning method and device
CN113326715A (en) Target association method and device
CN113051978A (en) Face recognition method, electronic device and readable medium
CN111409584A (en) Pedestrian protection method, device, computer equipment and storage medium
CN115206130B (en) Parking space detection method, system, terminal and storage medium
CN115191866B (en) Recharging method and device, cleaning robot and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant