CN113467448A - Fixed-point working method, self-moving robot and storage medium - Google Patents

Info

Publication number: CN113467448A
Application number: CN202110706064.3A
Authority: CN (China)
Prior art keywords: fixed point, guide object, self-moving, sweeping
Legal status: Pending (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Other languages: Chinese (zh)
Inventors: 鲍亮, 王孟昊, 汤进举
Current assignee: Ecovacs Robotics Suzhou Co Ltd (the listed assignees may be inaccurate; Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list)
Original assignee: Ecovacs Robotics Suzhou Co Ltd
Application filed by Ecovacs Robotics Suzhou Co Ltd
Priority to CN202110706064.3A
Published as CN113467448A

Classifications

    • G: PHYSICS
    • G05: CONTROLLING; REGULATING
    • G05D: SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00: Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/02: Control of position or course in two dimensions
    • G05D1/021: Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0231: Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D1/0246: Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means
    • G05D1/0253: Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means extracting relative motion information from a plurality of images taken successively, e.g. visual odometry, optical flow
    • G: PHYSICS
    • G05: CONTROLLING; REGULATING
    • G05D: SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00: Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/02: Control of position or course in two dimensions
    • G05D1/021: Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0259: Control of position or course in two dimensions specially adapted to land vehicles using magnetic or electromagnetic means
    • G05D1/0263: Control of position or course in two dimensions specially adapted to land vehicles using magnetic or electromagnetic means using magnetic strips
    • G: PHYSICS
    • G05: CONTROLLING; REGULATING
    • G05D: SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00: Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/02: Control of position or course in two dimensions
    • G05D1/021: Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0276: Control of position or course in two dimensions specially adapted to land vehicles using signals provided by a source external to the vehicle
    • G05D1/0285: Control of position or course in two dimensions specially adapted to land vehicles using signals provided by a source external to the vehicle using signals transmitted via a public communication network, e.g. GSM network

Abstract

Embodiments of the application provide a fixed-point working method, a self-moving robot, and a storage medium. In these embodiments, in the fixed-point working mode, the self-moving robot can detect a guide object based on its vision sensor and then move along with it. While following, the robot recognizes whether it has reached the fixed-point working area. Once it arrives, it executes the fixed-point working task in that area according to the fixed-point working parameters. Automatic fixed-point work is thereby realized: the user no longer needs to move the self-moving robot to the fixed-point working area manually, human effort is saved, and the robot's degree of automation and intelligence is improved.

Description

Fixed-point working method, self-moving robot and storage medium
This invention is a divisional application of Chinese invention patent application No. 201810582948.0, filed June 7, 2018, entitled "Fixed point cleaning method, sweeping robot and storage medium".
Technical Field
The application relates to the technical field of artificial intelligence, and in particular to a fixed-point cleaning method, a sweeping robot, and a storage medium.
Background
With the development of artificial-intelligence technology, home robots are becoming increasingly intelligent. A sweeping robot can complete floor-cleaning tasks autonomously with a degree of artificial intelligence, freeing users from cleaning work, and has therefore quickly become a common household appliance in modern families.
When the floor to be cleaned contains only a small amount of garbage, or the garbage is concentrated in a small area, the sweeping robot can use a fixed-point cleaning mode to clean locally within the small area where the garbage is relatively concentrated, saving robot resources and improving cleaning efficiency.
However, to perform a fixed-point cleaning task, the user currently has to move the sweeping robot to the fixed-point cleaning area. This consumes a certain amount of human effort and fails to fully reflect the sweeping robot's advantages of high automation and high intelligence.
Disclosure of Invention
Aspects of the application provide a fixed-point cleaning method, a sweeping robot, and a storage medium that realize automatic fixed-point cleaning and improve the sweeping robot's degree of automation and intelligence.
The embodiment of the application provides a fixed-point cleaning method, which comprises the following steps:
detecting a movable guide object based on a vision sensor in a fixed-point cleaning mode;
after the guide object is detected, moving to a fixed-point cleaning area along with the guide object;
and executing a fixed-point cleaning task in the fixed-point cleaning area according to the fixed-point cleaning parameters.
An embodiment of the application further provides a sweeping robot, comprising: a machine body provided with a vision sensor, one or more processors, and one or more memories storing computer instructions;
the one or more processors to execute the computer instructions to:
detecting a movable guide object based on the vision sensor in a fixed-point sweeping mode;
after the guide object is detected, controlling the sweeping robot to move to a fixed-point sweeping area along with the guide object; and
controlling the sweeping robot to execute a fixed-point sweeping task in the fixed-point sweeping area according to the fixed-point sweeping parameters.
Embodiments of the present application also provide a computer-readable storage medium having stored thereon computer instructions that, when executed by one or more processors, cause the one or more processors to perform acts comprising:
detecting a movable guide object based on a vision sensor of the sweeping robot in a fixed-point sweeping mode;
after the guide object is detected, controlling the sweeping robot to move to a fixed-point sweeping area along with the guide object;
and controlling the sweeping robot to execute a fixed-point sweeping task in the fixed-point sweeping area according to the fixed-point sweeping parameters.
In the embodiments of the application, in the fixed-point cleaning mode the sweeping robot can detect a guide object based on its vision sensor, follow the guide object to the fixed-point cleaning area, and execute the fixed-point cleaning task there according to the fixed-point cleaning parameters. Automatic fixed-point cleaning is thus realized: the user no longer needs to move the sweeping robot to the fixed-point cleaning area manually, human effort is saved, and the robot's degree of automation and intelligence is improved.
Drawings
The accompanying drawings, which are included to provide a further understanding of the application and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the application and together with the description serve to explain the application and not to limit the application. In the drawings:
fig. 1 is a schematic flow chart of a fixed point cleaning method according to an exemplary embodiment of the present application;
fig. 2a is a schematic view of a sweeping robot capturing images of a person's legs through a vision sensor according to an exemplary embodiment of the present application;
fig. 2b is an environmental image acquired by the vision sensor when the sweeping robot rotates clockwise by α degrees;
fig. 2c is an environmental image acquired by the vision sensor when the sweeping robot rotates clockwise by β degrees;
fig. 2d is an environmental image acquired by the vision sensor when the sweeping robot rotates by δ degrees in the clockwise direction;
FIG. 3a is an image of an environment containing a person's leg taken at a distance L1 from the vision sensor to the person's leg;
FIG. 3b is an image of an environment containing a person's leg taken at a distance L2 from the vision sensor to the person's leg;
FIG. 3c is an image of an environment containing a person's leg taken at a distance L3 from the vision sensor to the person's leg;
FIG. 4 is a schematic flow chart diagram illustrating a method for spot cleaning according to another exemplary embodiment of the present disclosure;
FIG. 5 is a schematic flow chart diagram illustrating a method for spot cleaning according to another exemplary embodiment of the present application;
fig. 6a is a block diagram of a sweeping robot according to another exemplary embodiment of the present disclosure;
fig. 6b is a line drawing of a circular sweeping robot according to another exemplary embodiment of the present disclosure;
fig. 7 is a schematic structural diagram of a fixed-point sweeping control device according to still another exemplary embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the technical solutions of the present application will be described in detail and completely with reference to the following specific embodiments of the present application and the accompanying drawings. It should be apparent that the described embodiments are only some of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
In the prior art, when a sweeping robot is to perform a fixed-point cleaning task, it must be moved to the fixed-point cleaning area by hand, which consumes human effort and fails to fully reflect the robot's advantages of high automation and high intelligence. In some embodiments of this application, the sweeping robot's vision sensor is combined with automatic guidance so that, in the fixed-point cleaning mode, the robot can automatically follow a guide object to the fixed-point cleaning area and perform the fixed-point cleaning task there. Automatic fixed-point cleaning is thus realized: the user no longer needs to move the robot to the fixed-point cleaning area manually, human effort is saved, and the robot's degree of automation and intelligence is improved.
The technical solutions provided by the embodiments of the present application are described in detail below with reference to the accompanying drawings.
Fig. 1 is a schematic flowchart of a fixed-point cleaning method according to an exemplary embodiment of the present disclosure. As shown in fig. 1, the method includes:
101. in the fixed-point sweeping mode, a movable guide object is detected based on a vision sensor.
102. After the guide object is detected, move to the fixed-point cleaning area following the guide object.
103. And executing the fixed point cleaning task in the fixed point cleaning area according to the fixed point cleaning parameters.
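The three steps above (101-103) can be sketched as a minimal control loop. This is an illustrative sketch only; the names `detect_guide_object`, `follow`, `clean`, and `spot_parameters` are hypothetical placeholders, not APIs defined by the patent.

```python
def spot_clean(robot):
    """Run one fixed-point cleaning cycle: detect, follow, clean.

    `robot` is assumed to expose the hypothetical methods below.
    Returns True if a full cycle ran, False if no guide object was found.
    """
    guide = robot.detect_guide_object()    # step 101: vision-based detection
    if guide is None:
        return False                       # no guide object in the field of view
    robot.follow(guide)                    # step 102: follow guide to the spot area
    robot.clean(robot.spot_parameters)     # step 103: clean using spot parameters
    return True
```

In use, `detect_guide_object` would wrap the vision-sensor detection described below, and `follow` would block until the robot recognizes that the fixed-point cleaning area has been reached.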
The method provided by this embodiment can be applied to a sweeping robot, which here refers generally to any intelligent device with a sweeping function. The shape of the sweeping robot is not limited in this embodiment; it may, for example, be circular, elliptical, triangular, or a convex polygon. The sweeping robot may implement the logic of the fixed-point cleaning method of this embodiment through installed software or an application (App), or through program code written into a corresponding device.
In the present embodiment, the sweeping robot supports a fixed-point cleaning mode and a normal cleaning mode. The normal cleaning mode is a general term for cleaning modes other than the fixed-point cleaning mode; there may be one or more such modes. The fixed-point cleaning mode performs local cleaning of a fixed-point cleaning area, whereas the normal cleaning mode cleans the whole area to be cleaned.
The area to be cleaned is a physically meaningful area that the sweeping robot is responsible for cleaning in its working environment. Accordingly, the fixed-point cleaning area is a local area within the area to be cleaned that requires spot cleaning, typically a small area containing a small amount of garbage or where garbage is concentrated.
The area to be cleaned differs between operating scenarios, and so, accordingly, do the fixed-point cleaning areas within it. For example, the area to be cleaned may be a room in a house, such as a living room, kitchen, or bedroom; the fixed-point cleaning area may then be a corner of the living room, an area in the kitchen, or the area under the bed in the bedroom. As another example, the area to be cleaned may be a corridor, a meeting place, or a sports ground; the fixed-point cleaning area may then be a particular area within any of those.
In practice, it often happens that only part of the area to be cleaned contains garbage, or that the garbage is concentrated in a small area. For example, in a living-room scenario, a child eating snacks tends to drop crumbs on the floor nearby, while the floor farther away stays clean. In a corridor scenario, after the whole corridor has been cleaned, a user carrying things may carelessly drop some garbage in a small area while the rest of the corridor still meets the cleanliness requirement. In such situations, sweeping the entire area to be cleaned (e.g., the whole living room or the whole corridor) in the normal cleaning mode both wastes the robot's resources and yields low effective cleaning efficiency. When only part of the area needs cleaning, the sweeping robot can therefore use the fixed-point cleaning mode to spot-clean just that part, saving robot resources and improving cleaning efficiency.
In a scene that the sweeping robot needs to perform fixed-point sweeping on a fixed-point sweeping area, the sweeping robot needs to be located in the fixed-point sweeping area first. In this embodiment, in order to implement automatic fixed-point cleaning, a vision sensor of the cleaning robot is combined with automatic guidance, so that the cleaning robot can automatically move to a fixed-point cleaning area along with a guidance object and perform a fixed-point cleaning task in a fixed-point cleaning mode.
In the fixed-point cleaning mode, the sweeping robot can activate its vision sensor and, based on it, detect a movable guide object present in the current environment. The vision sensor may be any device or component capable of acquiring environment images, such as a still camera or a video camera. The guide object may be any movable object capable of guiding the sweeping robot to the fixed-point cleaning area, for example a user, a user's legs, or a humanoid robot with a guiding function.
After detecting the guide object, the cleaning robot may move to the fixed-point cleaning area following the guide object, and then perform the fixed-point cleaning task in the fixed-point cleaning area according to the fixed-point cleaning parameters.
In this embodiment, the sweeping robot completes the fixed-point cleaning task automatically; the user does not need to move it to the fixed-point cleaning area by hand, which further frees the user from cleaning work. The fixed-point cleaning mode is also not limited by factors such as the user's age or physical strength, so it can be used by the elderly or the infirm, and fixed-point cleaning remains efficient. In addition, with the cooperation of a guide object, the robot does not need to build an environment map in advance, which saves computing resources, avoids adverse effects such as insufficient map precision, and allows the robot to locate the fixed-point cleaning area more accurately.
In some exemplary embodiments, the sweeping robot is usually in the normal cleaning mode and must enter the fixed-point cleaning mode when a fixed-point cleaning task is required. The way it enters that mode depends on factors such as its functions, capabilities, and implementation form. Several ways are listed below:
mode A: the floor sweeping robot has a voice recognition function and can recognize voice instructions of a user. Based on the above, when the sweeping robot is required to execute the fixed-point sweeping task, the user can send a voice instruction for switching the sweeping mode to the sweeping robot. The cleaning robot monitors a voice instruction of a user, and when the voice instruction which is sent by the user and used for switching the cleaning mode is monitored, the cleaning mode can be switched from the common cleaning mode to the fixed-point cleaning mode.
Mode B: the sweeping robot is provided with a sweeping mode switching button, the sweeping mode switching button belongs to a physical button, and the sweeping mode switching button is mainly used for a user to switch the sweeping mode of the sweeping robot. When the sweeping robot is required to execute a fixed-point sweeping task, a user can press a sweeping mode switching button on the sweeping robot. For the sweeping robot, an event that a sweeping mode switching button is pressed can be monitored; when an event that the sweeping mode switching button is pressed is monitored, the sweeping mode may be switched from the general sweeping mode to the spot sweeping mode in response to the event.
Mode C: the user can install the sweeping robot's App on a smart terminal, such as a smartphone, tablet computer, personal computer, or wearable device, and use it to control the robot in various ways, for example switching its cleaning mode or setting its cleaning parameters for the various modes. When the robot is required to perform a fixed-point cleaning task, the user sends a cleaning-mode switching instruction to the robot through the App, instructing it to switch from the normal cleaning mode to the fixed-point cleaning mode. The robot receives the instruction from the smart terminal and switches its cleaning mode accordingly.
Mode D: the sweeping robot is provided with a touch panel, and the touch panel is a human-computer interface for a user to control, set and the like the sweeping robot. For example, the user can set or change the cleaning mode of the sweeping robot through the touch panel. When the sweeping robot is required to perform a fixed-point sweeping task, a user can send a sweeping mode switching operation through the touch panel, wherein the sweeping mode switching operation is used for instructing the sweeping robot to switch the sweeping mode from a common sweeping mode to a fixed-point sweeping mode. The cleaning robot may detect a user operation on the touch panel, and when a cleaning mode switching operation is detected on the touch panel, switch the cleaning mode from the normal cleaning mode to the fixed-point cleaning mode according to the cleaning mode switching operation.
It should be noted that a sweeping robot may support any one of the above modes, or two or more of them simultaneously. It may, of course, also support cleaning-mode switching methods other than those listed.
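All four entry paths (Modes A-D) converge on the same mode switch, which can be sketched as a small event dispatcher. The `SweepMode` enum and the event names here are illustrative assumptions, not identifiers from the patent:

```python
from enum import Enum

class SweepMode(Enum):
    NORMAL = "normal"   # general term for non-spot cleaning modes
    SPOT = "spot"       # fixed-point cleaning mode

# Hypothetical trigger events corresponding to Modes A-D above.
MODE_SWITCH_EVENTS = {"voice_command", "button_press", "app_instruction", "touch_panel"}

def handle_event(current_mode, event):
    """Switch from normal to fixed-point cleaning when any supported trigger fires."""
    if event in MODE_SWITCH_EVENTS and current_mode is SweepMode.NORMAL:
        return SweepMode.SPOT
    return current_mode  # unknown events, or already in spot mode: no change
```

A robot supporting only some of the modes would simply register fewer event names in `MODE_SWITCH_EVENTS`.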
In any of the above manners, the sweeping robot switches its cleaning mode from the normal cleaning mode to the fixed-point cleaning mode. After switching, it can activate its vision sensor, such as a camera, and use it to detect a movable guide object present in the current environment, in order to follow that object to the fixed-point cleaning area.
In some exemplary embodiments, to detect the guide object more efficiently and rapidly, the sweeping robot may rotate clockwise or counterclockwise, continuously acquire environment images with the vision sensor while rotating, check whether each acquired image matches the preset guide-object parameters, and determine that the guide object has been detected once a matching image is acquired. The guide-object parameters are parameters that uniquely identify the guide object and are stored in the robot in advance; they differ with the guide object. For example, if the guide object is a user, they may be human-body features describing that user, such as leg or facial features. If the guide object is a humanoid robot with a guiding function, they are features describing the humanoid robot's form, such as its arms, legs, and head.
It should be noted that, while detecting the guide object, the sweeping robot can rotate clockwise or counterclockwise through at most 360°. Alternatively, to save the resources consumed by detection, a smaller maximum rotation angle, e.g., 90°, 180°, or 200°, may be preset according to the application scenario. The robot then rotates clockwise or counterclockwise and, if it detects the guide object before reaching the maximum angle, ends the rotation; if it reaches the maximum angle without detecting the guide object, it ends the current detection pass and may report an error or output a prompt tone asking the guide object to enter its field of view as soon as possible.
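The rotate-and-scan procedure above can be sketched as follows. The step size, the frame-capture callable, and the matching predicate are all illustrative assumptions; in a real robot the predicate would compare a frame against the stored guide-object parameters.

```python
def scan_for_guide(capture_frame, matches_guide, step_deg=10, max_deg=360):
    """Rotate step by step up to max_deg, grabbing a frame at each heading.

    Returns the rotation angle (degrees) at which the guide object was first
    detected, or None if it was not seen within max_deg (the robot would then
    report an error or emit a prompt tone).
    """
    angle = 0
    while angle < max_deg:
        frame = capture_frame(angle)   # environment image at the current heading
        if matches_guide(frame):
            return angle               # guide detected: stop rotating
        angle += step_deg
    return None                        # not found within the preset maximum angle
```

With a preset maximum of, say, 180°, the caller simply passes `max_deg=180` and handles the `None` result by prompting the guide object to step into view.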
In embodiments of the present application, after detecting the guide object, the sweeping robot may move to the fixed-point sweeping area following the guide object. After the sweeping robot moves to the fixed-point sweeping area, the moving can be stopped, and the fixed-point sweeping task is started to be executed.
It should be noted that there is a certain randomness in where the guide object is detected, so the sweeping robot's forward direction may or may not coincide with the direction of the guide object. If they coincide, the robot can directly follow the guide object to the fixed-point cleaning area. If they differ, the robot may first adjust its forward direction toward the guide object according to the guide object's position, and then follow the guide object from the adjusted direction. The adjusted direction is only the forward direction at the moment the robot starts following; while following, the forward direction changes as the guide object's direction changes.
In one embodiment, after detecting the guide object, the sweeping robot may determine, from an environment image containing the guide object acquired by the vision sensor, whether its forward direction coincides with the direction of the guide object. If not, the robot rotates clockwise or counterclockwise to adjust its forward direction, continuously acquiring environment images containing the guide object while rotating, until a first target environment image is acquired; at that point its forward direction is determined to coincide with the direction of the guide object. The first target environment image is one in which the positional relationship between the image's vertical centerline and the guide object's vertical centerline conforms to a set centerline positional relationship.
According to the installation position of the visual sensor on the sweeping robot, the position relation between the vertical center line of the guide object in the environment image acquired by the visual sensor and the vertical center line of the environment image can be calculated when the advancing direction of the sweeping robot is the same as the direction of the guide object. Based on this, when the position relationship between the vertical centerline of the guiding object in the environment image acquired by the vision sensor and the vertical centerline of the environment image conforms to the set centerline position relationship, it can be determined that the current advancing direction of the sweeping robot is the same as the direction in which the guiding object is located, and the current environment image is referred to as a first target environment image. That is, the first target environment image is an environment image in which the positional relationship between the vertical center line of the image and the vertical center line of the guidance target in the image conforms to the set center line positional relationship.
In some exemplary scenarios, the vision sensor is mounted directly at the front of the sweeping robot, on its vertical centerline. In these scenarios, when the robot's forward direction coincides with the direction of the guide object, the vertical centerline of the guide object in the acquired environment image should coincide with the vertical centerline of the image. Accordingly, while the robot rotates clockwise or counterclockwise, once the vision sensor acquires an environment image in which these two centerlines coincide, the robot's forward direction can be determined to have been adjusted to the direction of the guide object. Taking a person's legs as the guide object, figs. 2b-2d show how the position of the guide object's vertical centerline in the environment image varies with the robot's rotation angle. Fig. 2a is a schematic view of the robot acquiring an environment image containing the legs through its vision sensor; figs. 2b-2d are environment images acquired by the setup in fig. 2a, taken when the robot has rotated clockwise by α, β, and δ degrees respectively, with α < β < δ. In fig. 2d the vertical centerline of the environment image coincides with that of the guide object in the image, indicating that the robot's forward direction coincides with the direction of the guide object; this image is the first target environment image. In figs. 2b-2d, the dashed lines represent vertical centerlines.
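The centerline test of figs. 2b-2d reduces to a simple pixel comparison. This is a minimal sketch under the assumption that the detector reports the guide object as a bounding box; the tolerance value is an illustrative assumption.

```python
def is_aligned(image_width, bbox_left, bbox_right, tol_px=5):
    """True if the guide object's vertical centerline coincides (within
    tol_px pixels) with the environment image's vertical centerline.

    bbox_left/bbox_right are the x-coordinates of the detected guide
    object's bounding box; the object's centerline is taken as their midpoint.
    """
    image_center = image_width / 2.0
    object_center = (bbox_left + bbox_right) / 2.0
    return abs(object_center - image_center) <= tol_px
```

The robot would keep rotating while `is_aligned` is False, and treat the first frame for which it returns True as the first target environment image.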
In some optional embodiments, after the sweeping robot adjusts its forward direction to the direction in which the guiding object is located, the guiding object may start to move by itself. In other optional embodiments, after the sweeping robot adjusts the forward direction of the sweeping robot to the direction of the guide object, before starting to move along with the guide object, a first voice prompt message may be output to prompt the guide object to start moving. For example, the cleaning robot may output a guidance voice such as "please move" or "please start moving", and when the guidance target hears the guidance voice, the guidance target starts moving in the direction of the fixed point cleaning area.
In addition, while following the guiding object toward the fixed-point sweeping area, the sweeping robot needs to identify whether the fixed-point sweeping area has been reached.
Optionally, when the fixed-point sweeping area is reached, the guiding object may send a prompt message to the sweeping robot indicating that the fixed-point sweeping area has been reached. For example, the guiding object may deliver this prompt by voice, by a body motion, or in a similar manner.
Alternatively, the sweeping robot can automatically recognize whether the fixed-point sweeping area has been reached based on its own vision sensor. For example, while following the guiding object, the sweeping robot can continuously acquire environment images containing the guiding object with the vision sensor and judge whether a second target environment image has been acquired; when the second target environment image is acquired, it determines that the fixed-point sweeping area has been reached. The second target environment image is one in which the distance between the lower edge of the image and the lower edge of the guiding object meets a set edge distance requirement.
According to the imaging principle of the vision sensor, the farther the guiding object is from the sweeping robot (more precisely, from the vision sensor), the farther the lower edge of the guiding object in the acquired environment image is from the lower edge of the image; conversely, the closer the guiding object is to the sweeping robot, the closer the lower edge of the guiding object is to the lower edge of the image.
In this embodiment, the minimum distance or distance range to be maintained between the sweeping robot and the guiding object once the guiding object has led it to the fixed-point sweeping area may be preset; the corresponding distance or distance range between the lower edge of the guiding object and the lower edge of the environment image is then derived and set as the edge distance requirement. Based on this, while following the guiding object, the sweeping robot can calculate, for each environment image the vision sensor acquires, the distance between the lower edge of the guiding object and the lower edge of the image, and judge whether this distance meets the set edge distance requirement. When an environment image meeting the requirement is acquired, the guiding object can be considered to have led the sweeping robot to the fixed-point sweeping area; such an image is referred to as a second target environment image.
This embodiment does not limit the edge distance requirement, which can be set adaptively according to application requirements. For example, the edge distance requirement may be that the distance between the two lower edges is smaller than a set distance threshold greater than 0; the second target environment image is then any environment image in which the distance between the lower edge of the image and the lower edge of the guiding object is below that threshold. As another example, the edge distance requirement may demand that the distance be 0, i.e., that the two lower edges coincide; the second target environment image is then an environment image in which the lower edge of the image coincides with the lower edge of the guiding object.
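Under the usual pixel-coordinate convention (y grows downward, so the object's lower edge is its maximum y), the second-target-image test might be sketched as follows; the bounding-box interface is the same hypothetical one assumed earlier and is not specified by the patent.

```python
def meets_edge_distance(image_height, bbox, max_distance_px=0):
    """Edge distance test for the second target environment image: the gap
    between the guide object's lower edge and the image's lower edge must be
    no larger than max_distance_px (0 requires the two edges to coincide).

    bbox = (x_min, y_min, x_max, y_max), a hypothetical detector output.
    """
    if bbox is None:
        return False
    _, _, _, y_max = bbox                 # lower edge of the guide object
    distance = image_height - y_max       # gap to the image's lower edge
    return 0 <= distance <= max_distance_px
```

With the default threshold of 0 this implements the stricter variant in which the two lower edges must coincide, as in fig. 3c.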
Taking the guiding object as a human leg as an example, figs. 3a-3c show how the distance between the lower edge of the human leg and the lower edge of the image varies with the distance from the vision sensor to the leg. Fig. 3a is the environment image containing the human leg acquired when the distance from the vision sensor to the leg is L1, fig. 3b when the distance is L2, and fig. 3c when the distance is L3, where L1 > L2 > L3. As can be seen from figs. 3a-3c, as the distance from the vision sensor to the human leg decreases, the distance between the lower edge of the leg and the lower edge of the image also gradually decreases. In fig. 3c, the lower edge of the leg coincides with the lower edge of the image; the guiding object can be considered to have led the sweeping robot to the fixed-point sweeping area, and this image is the second target environment image.
In an actual application scenario, while the guiding object is leading the sweeping robot to the fixed-point sweeping area, situations such as a sharp turn or occlusion by a large obstacle may occur. In these cases, the sweeping robot may lose track of the guiding object, i.e., its vision sensor can no longer detect it. When the guiding object is lost during following, the sweeping robot can rotate clockwise or counterclockwise and attempt to re-detect the guiding object with the vision sensor while rotating. If the guiding object is detected again, the robot continues to follow it; if not, the robot outputs second voice prompt information to prompt the guiding object to return to its field of view. If the guiding object returns to the field of view in response to the second voice prompt, the sweeping robot can resume following it until the fixed-point sweeping area is reached.
After reaching the fixed-point sweeping area, the sweeping robot can execute the fixed-point sweeping task in that area according to the fixed-point sweeping parameters. Optionally, the sweeping robot may perform local sweeping according to the fixed-point sweeping parameters with its current position as the center of the fixed-point sweeping area, or with its current position as the starting point of the fixed-point sweeping area.
The fixed-point sweeping parameters are the parameters the sweeping robot needs to complete the fixed-point sweeping task, and they are mainly used to control when the task is completed. For example, the fixed-point sweeping parameters may include at least one of a fixed-point sweeping range, a fixed-point sweeping time, a fixed-point sweeping power consumption, and a fixed-point sweeping total distance. The fixed-point sweeping range describes the extent of the area to be swept at the fixed point; when the whole range has been swept, the task ends. The fixed-point sweeping time describes the execution time of the task; when the actual sweeping time reaches it, the task ends. The fixed-point sweeping power consumption describes the power budget of the task; when the power actually consumed reaches it, the task ends. The fixed-point sweeping total distance describes the total distance the task needs to cover; when the distance actually swept reaches it, the task ends.
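These termination conditions can be sketched as a single check. The key names below are assumptions made for illustration, and the "any one satisfied parameter ends the task" policy is a design choice consistent with, but not mandated by, the text.

```python
def spot_task_finished(params, progress):
    """Return True when any configured fixed-point parameter has been reached.

    params:   configured limits, e.g. {"time_s": 300, "distance_m": 40}
    progress: measured values under the same (assumed) keys.
    Keys absent from params are simply not checked.
    """
    keys = ("range_m2", "time_s", "power_mah", "distance_m")
    return any(
        key in params and progress.get(key, 0) >= params[key]
        for key in keys
    )
```

The robot's sweeping loop would evaluate this after each movement increment and stop local sweeping as soon as it returns True.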
Optionally, while executing the fixed-point sweeping task, the sweeping robot may perform local sweeping according to the fixed-point sweeping parameters in a path-planning sweeping mode. In contrast to the random sweeping mode, the path-planning mode plans the sweeping route precisely so that the planned path repeats itself as little as possible. The sweeping robot may support one or more styles of sweeping route; for example, it may support a bow-shaped (boustrophedon) path, an "L"-shaped path, a square path, a spiral path, and the like. Of course, while executing the fixed-point sweeping task, the sweeping robot may also perform local sweeping according to the fixed-point sweeping parameters in the random sweeping mode.
Further, before performing the spot-cleaning task on the spot-cleaning area according to the spot-cleaning parameters, the cleaning robot may acquire the spot-cleaning parameters in any of, but not limited to, the following manners.
Mode 1: the sweeping robot can preset default sweeping parameters corresponding to the fixed-point sweeping mode. Based on this, the sweeping robot can acquire the default sweeping parameters corresponding to the fixed-point sweeping mode as the fixed-point sweeping parameters.
Mode 2: after guiding the sweeping robot to the fixed-point sweeping area, the guiding object can perform a sweeping-parameter setting action to set the fixed-point sweeping parameters. The sweeping robot can use its vision sensor to collect the setting action issued by the guiding object and, when such an action is collected, set the fixed-point sweeping parameters accordingly. For example, a correspondence between setting actions and fixed-point sweeping parameters may be preset; the sweeping robot matches the collected action against this correspondence and obtains the matching fixed-point sweeping parameters.
Taking the guiding object being a user as an example, the user may set the fixed-point sweeping range by a stomping action. For example, assume one stomp means a fixed-point sweeping diameter of 50 cm, two stomps mean a diameter of 100 cm, and so on. Based on this, when the vision sensor detects that the user stomps once, the fixed-point sweeping diameter can be set directly to 50 cm.
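The stomp-count mapping in this example is linear and could be sketched as below; the 50 cm base comes from the example above, while the function name and error handling are illustrative assumptions.

```python
def spot_diameter_cm(stomp_count, base_cm=50):
    """Map a counted stomping action to a fixed-point sweeping diameter:
    one stomp -> 50 cm, two stomps -> 100 cm, and so on."""
    if stomp_count < 1:
        raise ValueError("at least one stomp is required")
    return base_cm * stomp_count
```

A preset lookup table of action-to-parameter pairs would work just as well when the mapping is not linear.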
Mode 3: in addition to guiding the sweeping robot to the fixed-point sweeping area, the guiding object can set the fixed-point sweeping parameters by voice. For example, the guiding object may issue a voice instruction for setting the sweeping parameters before, during, or after guiding the sweeping robot to the fixed-point sweeping area. The sweeping robot can monitor whether the guiding object issues such a voice instruction and, when one is detected, set the fixed-point sweeping parameters according to it. Optionally, the voice instruction may include fixed-point sweeping parameters such as the fixed-point sweeping range, fixed-point sweeping time, and fixed-point sweeping total distance.
Mode 4: in the application process that a user controls the sweeping robot through the sweeping robot App on the intelligent terminal, the user can send a sweeping parameter setting instruction to the sweeping robot through the intelligent terminal so as to indicate the sweeping robot to set fixed-point sweeping parameters. For the sweeping robot, a sweeping parameter setting instruction sent by the intelligent terminal can be received, and fixed-point sweeping parameters are set according to the sweeping parameter setting instruction. Optionally, the cleaning parameter setting instruction may include fixed-point cleaning parameters such as a fixed-point cleaning range, a fixed-point cleaning time, a fixed-point cleaning total distance, and the like.
In order to facilitate understanding of the technical solution of the embodiment of the present application, the fixed point sweeping method provided by the embodiment of the present application is described in detail below with reference to some application scenarios.
In a household application scenario, a home contains independent areas to be cleaned, such as a living room, a kitchen, or a bedroom. Within these areas, fixed-point sweeping may be required for a local region. For example, some food waste may inadvertently fall onto the kitchen floor, requiring a spot clean of the region where it lies; as another example, the area under a bed or cabinet in a bedroom generally needs to be cleaned at regular intervals. In public places such as malls or parks, frequent spot cleaning may be required where customers and visitors are concentrated, or more frequent spot cleaning may be required for open-air areas. For these application scenarios, fixed-point sweeping can be performed by, but is not limited to, the methods shown in figs. 4 and 5.
Fig. 4 is a flowchart illustrating a fixed-point sweeping method according to another exemplary embodiment of the present disclosure.
As shown in fig. 4, the method includes:
401. the sweeping robot receives a sweeping mode switching instruction sent by an intelligent terminal of the sweeping robot.
402. The sweeping robot switches the cleaning mode from the normal cleaning mode to the fixed-point cleaning mode according to the cleaning mode switching instruction.
403. The cleaning robot receives a cleaning parameter setting instruction sent by an intelligent terminal of the cleaning robot, and sets fixed-point cleaning parameters according to the cleaning parameter setting instruction.
Optionally, the cleaning parameter setting instruction includes a fixed-point cleaning range and a fixed-point cleaning time. The cleaning robot can set the fixed-point cleaning range and the fixed-point cleaning time contained in the cleaning parameter setting instruction as fixed-point cleaning parameters.
404. The sweeping robot starts a camera arranged in front of the sweeping robot to collect an environment image.
405. The sweeping robot rotates clockwise or counterclockwise, continuously collecting environment images with the camera during rotation, until an environment image containing a human leg is collected and the human leg is thereby determined to be detected.
In the process that the sweeping robot rotates clockwise or anticlockwise, the maximum rotation angle can be 90 degrees, 180 degrees, 270 degrees and the like.
406. The sweeping robot rotates slowly to adjust its forward direction according to the position of the human leg in the environment image; when the vertical centerline of the human leg coincides with the vertical centerline of the environment image, it outputs first voice prompt information to prompt the user to whom the leg belongs to start moving, and begins to follow the user.
407. While following the user, the sweeping robot uses the camera to track and locate either one of the user's legs.
Optionally, an environment image captured by the camera may be displayed on a screen of the intelligent terminal, so that the user can know the following situation of the robot in real time.
408. The sweeping robot judges, from the environment images acquired by the camera, whether the user has been lost during following; if yes, it goes to step 409; otherwise, it goes to step 411.
For example, when a large turn occurs, it may happen that the sweeping robot loses track of the user.
409. The sweeping robot rotates clockwise or anticlockwise, and a camera is used for detecting the user or the human leg of the user again in the rotating process; if the user or the user's leg is not re-detected, then step 410 is entered; if the user or the user's leg is re-detected, step 411 is entered.
410. The sweeping robot outputs a second voice prompt to prompt the guiding object to return to the field of view of the sweeping robot, and returns to step 409 to re-detect the user or the user's legs.
411. While following the user, the sweeping robot continuously judges, from the tracked leg's position in the environment image, whether the lower edge of the human leg coincides with the lower edge of the environment image; if not, it returns to step 407 to continue tracking the user; otherwise, it goes to step 412.
412. The sweeping robot takes the current position as the center of the fixed point sweeping area, and local arc sweeping is carried out according to the initialized fixed point sweeping parameters.
413. After the fixed-point cleaning task is finished, the cleaning robot switches the cleaning mode from the fixed-point cleaning mode to the ordinary cleaning mode.
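The numbered steps above can be condensed into a control-flow skeleton. Every method name on the hypothetical `robot` interface below is an assumption made for illustration; the patent specifies only the behavior, not an API.

```python
def spot_sweep_flow(robot, say):
    """Skeleton of the fig. 4 flow (steps 404-413)."""
    robot.start_camera()                       # 404: start the front camera
    robot.rotate_until_detected("human_leg")   # 405: rotate until a leg is seen
    robot.align_forward_direction()            # 406: centerlines coincide
    say("please start moving")                 # 406: first voice prompt
    while not robot.reached_spot_area():       # 411: lower edges coincide?
        if robot.target_lost():                # 408: user lost during following?
            if not robot.rotate_and_redetect():            # 409
                say("please return to my field of view")   # 410
                continue                       # loop back to re-detection (409)
        robot.follow_step()                    # 407: keep tracking one leg
    robot.local_arc_sweep()                    # 412: local arc sweeping
    robot.set_mode("normal")                   # 413: back to the normal mode
```

Each helper corresponds to one numbered step, so the skeleton can be checked against the flowchart line by line.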
In this embodiment, a sweeping robot App can be installed on the intelligent terminal. Through this App, the user controls the sweeping robot, sets its cleaning mode, and sets the fixed-point sweeping parameters, causing the robot to enter the fixed-point cleaning mode. In that mode, combining the vision sensor with automatic guidance, the robot automatically follows the guiding object to the fixed-point sweeping area and executes the fixed-point sweeping task there. Automatic fixed-point sweeping is thus realized, the user no longer needs to carry the sweeping robot to the fixed-point sweeping area manually, human effort is saved, and the degree of automation and intelligence of the sweeping robot is improved.
Fig. 5 is a flowchart illustrating a fixed-point sweeping method according to another exemplary embodiment of the present application.
As shown in fig. 5, the method includes:
501. after the sweeping robot is started, a voice instruction which is sent by a user and used for switching a sweeping mode is monitored.
502. And when a voice command is monitored, switching the cleaning mode from the ordinary cleaning mode to the fixed-point cleaning mode.
503. The sweeping robot starts a camera arranged in front of the sweeping robot to collect an environment image.
504. The sweeping robot rotates clockwise or counterclockwise, continuously collecting environment images with the camera during rotation, until an environment image containing a human leg is collected and the human leg is thereby determined to be detected.
505. The sweeping robot slowly rotates to adjust the advancing direction according to the position of the human leg in the environment image until the vertical central line of the human leg in the environment image is coincident with the vertical central line of the environment image, and then the sweeping robot starts to move along with the user.
506. While following the user, the sweeping robot uses the camera to track and locate either one of the user's legs.
507. The sweeping robot continuously judges, from the tracked leg's position in the environment image, whether the lower edge of the human leg coincides with the lower edge of the environment image; if not, it returns to step 506 to continue tracking the user; otherwise, it goes to step 508.
Optionally, the distance between the lower edge of the human leg in the environment image and the lower edge of the environment image may be calculated, and when the distance is 0, it is determined that the two coincide; otherwise, determining that the two are not coincident.
508. The sweeping robot takes the current position as the center of the fixed point sweeping area, obtains the default fixed point sweeping parameters of the system to perform local arc sweeping, and executes step 509.
509. After the fixed-point cleaning task is finished, the cleaning robot switches the cleaning mode from the fixed-point cleaning mode to the ordinary cleaning mode.
In this embodiment, the sweeping robot has a voice recognition function. The user can set the fixed-point cleaning mode on the sweeping robot by voice instruction, causing the robot to enter the fixed-point cleaning mode. In that mode, combining the vision sensor with automatic guidance, the robot automatically follows the guiding object to the fixed-point sweeping area and executes the fixed-point sweeping task there. Automatic fixed-point sweeping is thus realized, the user no longer needs to carry the sweeping robot to the fixed-point sweeping area manually, human effort is saved, and the degree of automation and intelligence of the sweeping robot is improved.
It should be noted that in some of the flows described in the above embodiments and the drawings, a plurality of operations are included in a specific order, but it should be clearly understood that the operations may be executed out of the order presented herein or in parallel, and the sequence numbers of the operations, such as 401, 402, etc., are merely used for distinguishing different operations, and the sequence numbers do not represent any execution order per se. Additionally, the flows may include more or fewer operations, and the operations may be performed sequentially or in parallel. It should be noted that, the descriptions of "first", "second", etc. in this document are used for distinguishing different messages, devices, modules, etc., and do not represent a sequential order, nor limit the types of "first" and "second" to be different.
In addition to the fixed-point working methods above, an embodiment of the present application provides a sweeping robot. As shown in fig. 6a, the sweeping robot 600 includes: a machine body 601 provided with one or more processors 602, one or more memories 603 storing computer instructions, and a vision sensor 604.
In addition to one or more processors 602 and one or more memories 603, some basic components of the sweeping robot 600, such as a vision sensor 604, a sweeping component 605, a sensor component 606, a power component 607, a driving component 608, and the like, are disposed on the machine body 601. The vision sensor 604 may be a camera, a video camera, or the like. Alternatively, the drive assembly 608 may include drive wheels, drive motors, universal wheels, and the like. Alternatively, the sweeping assembly 605 may include a sweeping motor, a sweeping brush, a dusting brush, a dust suction fan, and the like. The basic components and the configurations of the basic components included in different sweeping robots 600 are different, and the embodiments of the present disclosure are only some examples.
It is noted that one or more processors 602 and one or more memories 603 may be disposed inside the machine body 601, or may be disposed on a surface of the machine body 601.
The machine body 601 is the execution mechanism by which the sweeping robot 600 performs its tasks, and it can execute operations designated by the processor 602 in a given environment. The machine body 601 determines, to a certain extent, the appearance of the sweeping robot 600. The present embodiment does not limit this appearance, which may be, for example, circular, elliptical, triangular, or convex polygonal; fig. 6b shows a line drawing of a circular sweeping robot as an example.
The one or more memories 603 are used primarily to store computer instructions that are executable by the one or more processors 602 to cause the one or more processors 602 to control the sweeping robot 600 to perform a cleaning task. In addition to storing computer instructions, the one or more memories 603 may also be configured to store other various data to support operations on the sweeping robot 600. Examples of such data include instructions for any application or method operating on the sweeping robot 600, map data of the environment/scene in which the sweeping robot 600 is located, information of the area to be swept, sweep patterns, spot sweep parameters, and so forth.
The memory or memories 603 may be implemented by any type or combination of volatile or non-volatile memory devices such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disks.
The one or more processors 602, which may be considered a control system for the cleaning robot 600, may be configured to execute computer instructions stored in the one or more memories 603 to control the cleaning robot 600 to perform a fixed-point cleaning task.
In this embodiment, the process of controlling the cleaning robot 600 to perform the fixed-point cleaning task by the one or more processors 602 is as follows:
in the spot cleaning mode, a movable guide object is detected based on the vision sensor 604;
after the guiding object is detected, the sweeping robot 600 is controlled to move to the fixed point sweeping area along with the guiding object; and
and controlling the sweeping robot 600 to execute the fixed-point sweeping task in the fixed-point sweeping area according to the fixed-point sweeping parameters.
The sweeping robot 600 provided by this embodiment supports a fixed-point cleaning mode and a normal cleaning mode. The normal cleaning mode is a general term for cleaning modes other than the fixed-point cleaning mode, and there may be one or more such modes. The fixed-point cleaning mode is a mode in which local cleaning is performed on a fixed-point sweeping area; the normal cleaning mode is a mode in which the area to be cleaned is swept in its entirety.
The area to be cleaned is a physical area with independent meaning that the sweeping robot 600 is responsible for cleaning in its working environment. Accordingly, the fixed-point sweeping area is a local region within the area to be cleaned that needs spot cleaning, and it may be a small region where a small amount of garbage is present or where garbage is concentrated.
In different operation scenes, the areas to be cleaned are different; accordingly, the spot-sweeping areas in the area to be swept may also be different. For example, the area to be cleaned may be a room in a room, such as a living room, a kitchen, a bedroom, etc.; accordingly, the spot-sweeping area may be a corner of a living room, an area in a kitchen, an area below a bed in a bedroom, etc. For another example, the area to be cleaned may be a corridor, a meeting place, a sports ground, or the like; accordingly, the spot-sweeping area may be a certain area in a corridor, a meeting place, a sports ground.
In some exemplary embodiments, the sweeping robot 600 is normally in a normal sweeping mode, and when the sweeping robot 600 is required to perform a fixed-point sweeping task, the one or more processors 602 need to control the sweeping robot 600 to enter the fixed-point sweeping mode. The manner in which the sweeping robot 600 enters the fixed-point sweeping mode may also be different according to different factors such as the function, capability, and implementation form of the sweeping robot 600. Several ways in which the sweeping robot 600 can enter the fixed point sweeping mode are listed below:
mode A: the one or more processors 602 may monitor a voice command of the user, and switch the cleaning mode of the cleaning robot 600 from the normal cleaning mode to the fixed-point cleaning mode when the voice command for switching the cleaning mode is monitored.
Mode B: the mechanical body 601 is further provided with a cleaning mode switching button, which belongs to a physical button and is mainly used for a user to switch the cleaning mode of the floor sweeping robot 600. The one or more processors 602 may listen for an event that the purge mode switch button is pressed; when an event that the cleaning mode switching button is pressed is monitored, the cleaning mode of the cleaning robot 600 may be switched from the general cleaning mode to the spot cleaning mode in response to the event.
Mode C: the one or more processors 602 may receive a cleaning mode switching instruction sent by an intelligent terminal for controlling the cleaning robot 600, and switch the cleaning mode from the normal cleaning mode to the fixed-point cleaning mode according to the cleaning mode switching instruction.
Mode D: the machine body 601 is further provided with a touch panel, which is a human-computer interface for a user to control, set and the like the sweeping robot 600. The one or more processors 602 may detect a user operation on the touch panel, and switch the cleaning mode from the normal cleaning mode to the spot cleaning mode according to a cleaning mode switching operation when the cleaning mode switching operation is detected on the touch panel.
It should be noted that the sweeping robot 600 may support one of the above modes, or may support two or more modes at the same time. Of course, the sweeping robot 600 may support other cleaning mode switching methods besides the above-mentioned methods.
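The four entry paths (modes A-D) share one effect and could be dispatched uniformly. The event names below are illustrative assumptions, not identifiers from the patent.

```python
NORMAL, SPOT = "normal", "spot"
SWITCH_TRIGGERS = {
    "voice_command",        # mode A: monitored voice instruction
    "mode_button_pressed",  # mode B: physical cleaning-mode switch button
    "app_instruction",      # mode C: instruction from the intelligent terminal
    "touch_panel_switch",   # mode D: operation on the touch panel
}

def next_mode(current_mode, event):
    """Switch from the normal mode to the fixed-point mode on any supported
    trigger; other events leave the mode unchanged."""
    if current_mode == NORMAL and event in SWITCH_TRIGGERS:
        return SPOT
    return current_mode
```

A robot supporting only a subset of the modes would simply populate `SWITCH_TRIGGERS` with the triggers it implements.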
In some exemplary embodiments, when controlling the sweeping robot to detect the guide object, the one or more processors 602 are specifically configured to: control the sweeping robot 600 to rotate clockwise or counterclockwise, continuously collect environment images with the vision sensor 604 during rotation, identify whether each collected environment image contains an object matching preset guide object parameters, and, when such an image is collected, determine that the guide object has been detected. The guide object parameters are parameters that can uniquely identify the guide object and are stored in the sweeping robot 600 in advance; they differ depending on the guide object. For example, if the guide object is a user, the guide object parameters may be human body feature parameters describing the user, such as leg features and facial features. If the guide object is a humanoid robot with a guiding function, the guide object parameters are feature parameters describing the form of the humanoid robot, for example features such as humanoid arms, legs, and a head.
It is worth noting that the one or more processors 602 can control the sweeping robot 600 to rotate clockwise or counterclockwise by at most 360°. Alternatively, to save the resources consumed by detecting the guide object, a maximum rotation angle, for example 90°, 180°, or 200°, may be preset according to the application scenario. Based on this, the one or more processors 602 may rotate the sweeping robot 600 clockwise or counterclockwise; if the guide object is detected before the rotation reaches 90°, 180°, or 200°, the rotation is ended. If the guide object has not been detected when the rotation reaches 90°, 180°, or 200°, the current detection process is ended, and an error may be reported or a prompt tone output to ask the guide object to enter the visual field of the sweeping robot 600 as soon as possible.
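The bounded rotation scan described above can be sketched as follows. This is a minimal illustration, not the patent's implementation; the helpers `rotate_step`, `capture_frame`, and `matches_guide_params` are hypothetical stand-ins for the robot's motion, camera, and recognition interfaces.

```python
def scan_for_guide(rotate_step, capture_frame, matches_guide_params,
                   max_angle=180, step=5):
    """Rotate in place, sampling frames, until the guide object is seen
    or the preset maximum rotation angle is reached.

    rotate_step(step): turns the robot `step` degrees in one direction.
    capture_frame(): returns the current environment image.
    matches_guide_params(img): True if the image contains the preset
    guide object parameters (e.g. leg or face features).
    Returns (detected, degrees_turned).
    """
    turned = 0
    while turned < max_angle:
        frame = capture_frame()
        if matches_guide_params(frame):
            return True, turned      # guide object detected; stop rotating
        rotate_step(step)
        turned += step
    # rotation budget exhausted: caller reports an error or plays a prompt
    return False, turned
```

If the scan returns `False`, the caller ends the current detection attempt, matching the error-report / prompt-tone behavior the text describes.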
In some exemplary embodiments, the one or more processors 602, when controlling the sweeping robot 600 to follow the guide object to the spot-sweeping area, are specifically configured to: adjust the advancing direction of the sweeping robot 600 toward the guide object according to the guide object's position, and control the sweeping robot 600 to start from the adjusted advancing direction and move to the fixed-point sweeping area following the guide object.
Further, when adjusting the advancing direction of the sweeping robot 600 toward the guide object, the one or more processors 602 are specifically configured to: control the sweeping robot 600 to rotate clockwise or counterclockwise, and continuously collect environment images containing the guide object with the visual sensor 604 during the rotation until a first target environment image is collected, at which point the robot's advancing direction is determined to point at the guide object. The first target environment image is an image in which the positional relationship between its vertical centerline and the vertical centerline of the guide object conforms to a set centerline positional relationship.
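The centerline condition can be sketched as a simple pixel-offset test. This is an illustrative assumption about one way to realize the "set centerline positional relationship"; the bounding-box detector and the tolerance value are hypothetical.

```python
def heading_aligned(image_width, guide_bbox, tolerance_px=10):
    """True if the vertical centerline of the guide object's bounding box
    lies within `tolerance_px` of the image's vertical centerline, i.e.
    the robot's advancing direction points at the guide object.

    guide_bbox: (x_min, y_min, x_max, y_max) in pixel coordinates,
    as returned by some detector running on the environment image.
    """
    image_center_x = image_width / 2
    bbox_center_x = (guide_bbox[0] + guide_bbox[2]) / 2
    return abs(bbox_center_x - image_center_x) <= tolerance_px
```

Under this reading, the robot keeps rotating until `heading_aligned` first returns `True`; that frame is the first target environment image.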
In some optional embodiments, the one or more processors 602 may output a first voice prompt to prompt the guiding object to start moving before controlling the sweeping robot 600 to start moving to the fixed point sweeping area following the guiding object from the adjusted forward direction.
In some optional embodiments, the one or more processors 602 further need to identify whether the fixed-point cleaning area has been reached while controlling the sweeping robot 600 to move to the fixed-point cleaning area following the guide object from the adjusted advancing direction. Optionally, the one or more processors 602 are specifically configured to: while controlling the sweeping robot 600 to move following the guide object, continuously collect environment images containing the guide object with the visual sensor 604 and judge whether a second target environment image has been collected; when the second target environment image is collected, determine that the fixed-point cleaning area has been reached. The second target environment image is an image in which the distance between its lower edge and the lower edge of the guide object meets a set edge distance requirement.
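The lower-edge condition admits an equally simple sketch: when the guide object's lower edge (for example the user's feet) approaches the bottom of the frame, the robot is close enough to have arrived. The function name and threshold below are hypothetical, assumed for illustration only.

```python
def reached_spot_area(image_height, guide_bbox, max_gap_px=30):
    """True if the gap between the guide object's lower edge and the
    bottom edge of the frame is within `max_gap_px`, i.e. the guide
    object fills the bottom of the view and the robot has reached the
    fixed-point cleaning area.

    guide_bbox: (x_min, y_min, x_max, y_max) in pixel coordinates,
    with y increasing downward as in typical image conventions.
    """
    gap = image_height - guide_bbox[3]
    return 0 <= gap <= max_gap_px
```

The first frame for which this predicate holds plays the role of the second target environment image in the text above.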
In a practical application scenario, while the guide object leads the sweeping robot 600 to the fixed-point sweeping area, situations such as sharp turns or occlusion by large obstacles may occur. In these cases the sweeping robot 600 may lose track of the guide object, i.e., the visual sensor 604 of the sweeping robot 600 can no longer detect it. For the case where the guide object is lost during following, the one or more processors 602 may be further operable to: control the sweeping robot 600 to rotate clockwise or counterclockwise and use the visual sensor 604 to detect the guide object again during the rotation; if the guide object is detected again, continue moving with it; if the guide object cannot be detected again, output a second voice prompt asking the guide object to return to the visual field of the sweeping robot 600.
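This recovery behavior can be sketched as a rotate-and-rescan routine. All helper names (`rotate_step`, `capture_frame`, `matches_guide_params`, `speak`) are hypothetical interfaces, not drawn from the patent.

```python
def reacquire_guide(rotate_step, capture_frame, matches_guide_params,
                    speak, max_angle=360, step=10):
    """Recovery routine for a lost guide object: rotate in place and
    re-run detection; if a full sweep fails, issue a voice prompt asking
    the guide to step back into the robot's field of view."""
    turned = 0
    while turned < max_angle:
        if matches_guide_params(capture_frame()):
            return True              # reacquired: resume following
        rotate_step(step)
        turned += step
    speak("Please return to my field of view")  # second voice prompt
    return False
```

On `True` the follow loop resumes; on `False` the robot waits for the guide object to reappear before retrying.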
In some optional embodiments, the one or more processors 602, when controlling the sweeping robot 600 to perform the fixed-point sweeping task in the fixed-point sweeping area according to the fixed-point sweeping parameters, are specifically configured to: control the sweeping robot 600 to take its current position as the center of the fixed-point sweeping area and perform local sweeping according to the fixed-point sweeping parameters; or control the sweeping robot 600 to take its current position as the starting point of the fixed-point sweeping area and perform local sweeping according to the fixed-point sweeping parameters.
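One common way to realize "local sweeping centered on the current position" is an outward square spiral. The patent does not prescribe a path shape, so the sketch below, including the ring pitch and function name, is purely illustrative.

```python
def spiral_waypoints(cx, cy, rings=3, pitch=0.3):
    """Waypoints for a square-spiral sweep centered on (cx, cy): the
    robot starts at the spot-area center and works outward, with legs of
    length 1, 1, 2, 2, 3, 3, ... times `pitch` (spacing between rings,
    in metres; a hypothetical value)."""
    points = [(cx, cy)]
    x, y = cx, cy
    directions = [(1, 0), (0, 1), (-1, 0), (0, -1)]  # E, N, W, S
    for leg in range(1, 2 * rings + 1):
        length = pitch * ((leg + 1) // 2)
        dx, dy = directions[(leg - 1) % 4]
        x, y = x + dx * length, y + dy * length
        points.append((round(x, 6), round(y, 6)))
    return points
```

The same generator serves the "current position as starting point" variant by treating `(cx, cy)` as a corner rather than the center and clipping waypoints to the fixed-point sweeping range.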
In some optional embodiments, the one or more processors 602 are further configured to perform at least one of the following operations before controlling the sweeping robot 600 to perform the spot-sweeping task at the spot-sweeping area according to the spot-sweeping parameters:
acquiring fixed-point cleaning parameters from default cleaning parameters corresponding to the fixed-point cleaning mode;
setting fixed-point cleaning parameters according to the cleaning parameter setting action sent by the guide object acquired by the vision sensor 604;
setting fixed-point cleaning parameters according to a voice instruction which is sent by a guide object and used for setting the cleaning parameters;
setting fixed-point cleaning parameters according to a cleaning parameter setting instruction sent by an intelligent terminal for controlling the cleaning robot 600;
the fixed point cleaning parameter comprises at least one of a fixed point cleaning range, fixed point cleaning time, fixed point cleaning power consumption and fixed point cleaning total distance.
In the fixed-point sweeping mode, the sweeping robot provided by this embodiment can detect the guide object based on the visual sensor, follow the guide object to the fixed-point sweeping area, and execute the fixed-point sweeping task in that area according to the fixed-point sweeping parameters. Automatic fixed-point sweeping is thus realized: the user no longer needs to carry the sweeping robot to the fixed-point sweeping area manually, which saves human effort and improves the degree of automation and intelligence of the sweeping robot. In addition, with the cooperation of the guide object, the sweeping robot of this embodiment does not need to build an environment map in advance, which saves computing resources, overcomes adverse effects such as insufficient map precision, and locates the fixed-point sweeping area more accurately.
Fig. 7 is a schematic structural diagram of a fixed-point sweeping control device according to still another exemplary embodiment of the present application. The control device may be built into the sweeping robot as one of its functional modules, or it may be implemented independently of the sweeping robot while maintaining a communication connection with it. As shown in fig. 7, the control device comprises: a detection module 71, a following control module 72, and a fixed-point sweeping control module 73.
The detection module 71 is configured to detect a movable guiding object based on a vision sensor of the sweeping robot in the fixed-point sweeping mode.
And the following control module 72 is used for controlling the sweeping robot to move to the fixed point sweeping area along with the guide object after the guide object is detected by the detection module 71.
And the fixed-point cleaning control module 73 is used for controlling the cleaning robot to execute a fixed-point cleaning task in the fixed-point cleaning area according to the fixed-point cleaning parameters.
In an optional embodiment, the detection module 71 is further configured to perform at least one of the following operations before detecting the movable guiding object based on the vision sensor of the sweeping robot:
responding to a voice instruction sent by a user for switching a cleaning mode, and switching the cleaning mode from a common cleaning mode to a fixed-point cleaning mode;
switching a cleaning mode from a normal cleaning mode to a fixed point cleaning mode in response to an event that a cleaning mode switching button on the cleaning robot is pressed;
according to a cleaning mode switching instruction sent by an intelligent terminal of the cleaning robot, switching a cleaning mode from a common cleaning mode to a fixed-point cleaning mode;
and switching the cleaning mode from the common cleaning mode to the fixed-point cleaning mode according to the cleaning mode switching operation detected on the touch panel of the cleaning robot.
In an alternative embodiment, the detection module 71 is specifically configured to: controlling the sweeping robot to rotate clockwise or anticlockwise, and continuously acquiring an environment image by using a visual sensor in the rotation process of the sweeping robot; when an environment image containing preset guide object parameters is acquired, the guide object is determined to be detected.
In an optional embodiment, the following control module 72 is specifically configured to, when controlling the sweeping robot to move to the fixed point sweeping area following the guiding object: adjusting the advancing direction of the sweeping robot to the direction of the guide object based on the position of the guide object; and controlling the sweeping robot to move to the fixed-point sweeping area along with the guide object from the self-adjusted advancing direction.
Further optionally, when adjusting the advancing direction of the sweeping robot toward the guide object, the following control module 72 is specifically configured to: control the sweeping robot to rotate clockwise or counterclockwise, and continuously collect environment images containing the guide object with the visual sensor during the rotation until a first target environment image is collected, wherein the positional relationship between the vertical centerline of the first target environment image and the vertical centerline of the guide object in that image conforms to the set centerline positional relationship.
Further optionally, the follow-up control module 72 is further configured to: and outputting first voice prompt information to prompt the guide object to start moving before the self-adjusted advancing direction starts to move to the fixed point cleaning area along with the guide object.
In an alternative embodiment, the follow-up control module 72 is specifically configured to: control the sweeping robot to start from the adjusted advancing direction and move following the guide object, and continuously collect environment images containing the guide object with the visual sensor while the sweeping robot moves; when a second target environment image is collected, determine that the fixed-point sweeping area has been reached, wherein the distance between the lower edge of the second target environment image and the lower edge of the guide object in that image meets the set edge distance requirement.
Further optionally, the follow-up control module 72 is further configured to: when the guide object is lost while the sweeping robot is being controlled to follow it, control the sweeping robot to rotate clockwise or counterclockwise and use the visual sensor to detect the guide object again during the rotation; if the guide object cannot be detected again, output second voice prompt information to ask the guide object to return to the visual field of the sweeping robot.
In an alternative embodiment, the fixed-point sweeping control module 73 is specifically configured to: controlling the sweeping robot to take the current position as the center of the fixed-point sweeping area, and locally sweeping according to the fixed-point sweeping parameters; or controlling the sweeping robot to take the current position as the starting point of the fixed point sweeping area, and performing local sweeping according to the fixed point sweeping parameters.
In an alternative embodiment, the fixed-point sweeping control module 73 is further configured to: before executing a fixed-point cleaning task in a fixed-point cleaning area according to the fixed-point cleaning parameters, executing any one of the following operations:
acquiring fixed-point cleaning parameters from default cleaning parameters corresponding to the fixed-point cleaning mode;
setting actions according to cleaning parameters sent by a guide object and acquired by a vision sensor, and setting fixed-point cleaning parameters;
setting fixed-point cleaning parameters according to a voice instruction which is sent by a guide object and used for setting the cleaning parameters;
setting fixed-point cleaning parameters according to a cleaning parameter setting instruction sent by an intelligent terminal of the cleaning robot;
the fixed point cleaning parameter comprises at least one of a fixed point cleaning range, fixed point cleaning time, fixed point cleaning power consumption and fixed point cleaning total distance.
In the fixed-point sweeping mode, the fixed-point sweeping control device provided by this embodiment can detect the guide object based on the visual sensor of the sweeping robot, control the sweeping robot to automatically follow the guide object to the fixed-point sweeping area, and execute the fixed-point sweeping task in that area according to the fixed-point sweeping parameters. Automatic fixed-point sweeping is thus realized: the user no longer needs to carry the sweeping robot to the fixed-point sweeping area manually, which saves human effort and improves the degree of automation and intelligence of the sweeping robot.
Accordingly, embodiments of the present application also provide a computer-readable storage medium storing computer instructions, which when executed by one or more processors, cause the one or more processors to perform actions comprising:
detecting a movable guide object based on a vision sensor of the sweeping robot in a fixed-point sweeping mode;
after the guide object is detected, controlling the sweeping robot to move to the fixed point sweeping area along with the guide object;
and controlling the sweeping robot to execute the fixed-point sweeping task in the fixed-point sweeping area according to the fixed-point sweeping parameters.
In an alternative embodiment, the acts performed by the one or more processors further include at least one of:
responding to a voice instruction sent by a user for switching a cleaning mode, and switching the cleaning mode from a common cleaning mode to a fixed-point cleaning mode;
switching a cleaning mode from a normal cleaning mode to a fixed point cleaning mode in response to an event that a cleaning mode switching button on the cleaning robot is pressed;
according to a cleaning mode switching instruction sent by an intelligent terminal of the cleaning robot, switching a cleaning mode from a common cleaning mode to a fixed-point cleaning mode;
and switching the cleaning mode from the common cleaning mode to the fixed-point cleaning mode according to the cleaning mode switching operation detected on the touch panel of the cleaning robot.
In an optional embodiment, the act of detecting the movable guiding object based on the vision sensor of the sweeping robot further includes: controlling the sweeping robot to rotate clockwise or anticlockwise, and continuously acquiring an environment image by using a visual sensor in the rotation process of the sweeping robot; when an environment image containing preset guide object parameters is acquired, the guide object is determined to be detected.
In an optional embodiment, the act of controlling the sweeping robot to move to the fixed point sweeping area following the guide object further includes: adjusting the advancing direction of the sweeping robot to the direction of the guide object based on the position of the guide object; and controlling the sweeping robot to move to the fixed-point sweeping area along with the guide object from the self-adjusted advancing direction.
Further optionally, the adjusting of the advancing direction of the sweeping robot toward the guide object further includes: controlling the sweeping robot to rotate clockwise or counterclockwise, and continuously acquiring environment images containing the guide object with the visual sensor during the rotation until a first target environment image is acquired, wherein the positional relationship between the vertical centerline of the first target environment image and the vertical centerline of the guide object in that image conforms to the set centerline positional relationship.
Further optionally, the actions performed by the one or more processors further comprise: and outputting first voice prompt information to prompt the guide object to start moving before controlling the self-adjusted advancing direction of the sweeping robot to start moving along with the guide object.
In an optional embodiment, the act of controlling the sweeping robot to start moving to the fixed-point sweeping area following the guide object from the adjusted advancing direction further includes: controlling the sweeping robot to start from the adjusted advancing direction and move following the guide object, and continuously acquiring environment images containing the guide object with the visual sensor during the movement; when a second target environment image is acquired, determining that the fixed-point sweeping area has been reached, wherein the distance between the lower edge of the second target environment image and the lower edge of the guide object in that image meets the set edge distance requirement.
Further optionally, the actions performed by the one or more processors further comprise: when the guide object is lost while the sweeping robot is being controlled to follow it, controlling the sweeping robot to rotate clockwise or counterclockwise and using the visual sensor to detect the guide object again during the rotation; if the guide object cannot be detected again, outputting second voice prompt information to ask the guide object to return to the visual field of the sweeping robot.
In an optional embodiment, the act of controlling the sweeping robot to perform the fixed-point sweeping task in the fixed-point sweeping area according to the fixed-point sweeping parameter further includes: controlling the sweeping robot to take the current position as the center of the fixed-point sweeping area, and locally sweeping according to the fixed-point sweeping parameters; or controlling the sweeping robot to take the current position as the starting point of the fixed point sweeping area, and performing local sweeping according to the fixed point sweeping parameters.
In an alternative embodiment, the acts performed by the one or more processors further include any of:
acquiring fixed-point cleaning parameters from default cleaning parameters corresponding to the fixed-point cleaning mode;
setting actions according to cleaning parameters sent by a guide object and acquired by a vision sensor, and setting fixed-point cleaning parameters;
setting fixed-point cleaning parameters according to a voice instruction which is sent by a guide object and used for setting the cleaning parameters;
setting fixed-point cleaning parameters according to a cleaning parameter setting instruction sent by an intelligent terminal of the cleaning robot;
the fixed point cleaning parameter comprises at least one of a fixed point cleaning range, fixed point cleaning time, fixed point cleaning power consumption and fixed point cleaning total distance.
As will be appreciated by one skilled in the art, embodiments of the present invention may be provided as a method, system, or computer program product. Accordingly, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present invention may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present invention is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
In a typical configuration, a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
The memory may include volatile memory in a computer-readable medium, such as random access memory (RAM), and/or non-volatile memory, such as read-only memory (ROM) or flash memory (flash RAM). Memory is an example of a computer-readable medium.
Computer-readable media, including both permanent and non-permanent, removable and non-removable media, may implement information storage by any method or technology. The information may be computer-readable instructions, data structures, modules of a program, or other data. Examples of computer storage media include, but are not limited to, phase change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technology, compact disc read-only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information accessible by a computing device. As defined herein, a computer-readable medium does not include transitory computer-readable media such as modulated data signals and carrier waves.
It should also be noted that the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other like elements in the process, method, article, or apparatus that comprises the element.
The above description is only an example of the present application and is not intended to limit the present application. Various modifications and changes may occur to those skilled in the art. Any modification, equivalent replacement, improvement, etc. made within the spirit and principle of the present application should be included in the scope of the claims of the present application.

Claims (18)

1. A fixed-point working method, applicable to a self-moving robot, characterized by comprising:
detecting a movable guide object based on a vision sensor in a fixed-point working mode;
following the guide object movement after detecting the guide object;
identifying whether the self-moving robot reaches the fixed point working area or not in the process of moving to the fixed point working area along with the guide object;
and after the fixed point work area is identified, executing the fixed point work task in the fixed point work area according to the fixed point work parameters.
2. The method of claim 1, further comprising, prior to detecting the movable guide object based on the visual sensor, at least one of:
responding to a voice instruction sent by a user for switching the working mode, and switching the working mode from the common working mode to the fixed-point working mode;
responding to an event that a working mode switching button on the self-moving robot is pressed, and switching a working mode from a common working mode to a fixed-point working mode;
switching the working mode from a common working mode to a fixed-point working mode according to a working mode switching instruction sent by an intelligent terminal for controlling the self-moving robot;
and switching the working mode from a common working mode to a fixed-point working mode according to the working mode switching operation detected on the touch panel of the self-moving robot.
3. The method of claim 1, wherein the visual-sensor-based detection of the movable guide object comprises:
the self-moving robot rotates clockwise or anticlockwise, and the vision sensor is used for continuously acquiring environment images in the rotating process;
when an environment image containing preset guide object parameters is acquired, the guide object is determined to be detected.
4. The method of claim 1, wherein the following the guide object movement comprises:
adjusting the advancing direction to the direction of the guide object based on the position of the guide object;
the self-adjusted advancing direction starts and moves following the guide object.
5. The method of claim 4, wherein the adjusting the advancing direction to the direction in which the guide object is located based on the position of the guide object comprises:
the self-moving robot rotates clockwise or anticlockwise, and continuously acquires environment images containing the guide object by using the vision sensor in the rotation process until a first target environment image is acquired, wherein the position relation between the vertical central line of the first target environment image and the vertical central line of the guide object in the first target environment image conforms to the set central line position relation.
6. The method according to claim 4, before following the guide object to move starting from the self-adjusted advancing direction, further comprising:
and outputting first voice prompt information to prompt the guide object to start moving.
7. The method of claim 4, wherein the moving following the guide object starting from the self-adjusted advancing direction comprises:
starting from the self-adjusted advancing direction, moving along with the guide object, and continuously acquiring an environment image containing the guide object by using the vision sensor in the process of moving along with the guide object;
the identifying whether the fixed point work area is reached comprises:
and when a second target environment image is acquired, determining that the fixed point working area is reached, wherein the distance between the lower edge of the second target environment image and the lower edge of the guide object in the second target environment image meets the set edge distance requirement.
8. The method of claim 7, further comprising:
if the guide object is lost in the process of moving along with the guide object, the self-moving robot rotates clockwise or anticlockwise, and the visual sensor is used for detecting the guide object again in the rotating process;
and if the guide object cannot be detected again, outputting second voice prompt information to prompt the guide object to return to the visual field range of the self-moving robot.
9. The method of claim 1, wherein performing a fixed-point work task in the fixed-point work area according to fixed-point work parameters comprises:
taking the current position of the self-moving robot as the center of the fixed point working area, and performing local work according to the fixed point working parameters; or
And taking the current position of the self-moving robot as the starting point of the fixed point working area, and carrying out local work according to the fixed point working parameters.
10. The method according to any one of claims 1-9, further comprising any one of the following operations before performing the fixed point work task in the fixed point work area according to the fixed point work parameters:
acquiring the fixed point working parameters from default working parameters corresponding to the fixed point working mode;
setting the fixed-point working parameters according to the working parameter setting action sent by the guide object and acquired by the vision sensor;
setting the fixed-point working parameters according to a voice instruction which is sent by the guide object and used for setting working parameters;
setting the fixed-point working parameters according to a working parameter setting instruction sent by an intelligent terminal for controlling the self-moving robot;
the fixed-point working parameters comprise at least one of a fixed-point working range, fixed-point working time, fixed-point working power consumption and fixed-point working total distance.
11. The method according to any one of claims 1-9, further comprising:
and after the fixed-point work task is finished, switching the work mode from the fixed-point work mode to a common work mode.
12. The method of any of claims 1-9, wherein the guiding object is a user or a leg of a user.
13. A self-moving robot, comprising: a machine body provided with a vision sensor, one or more processors, and one or more memories for storing computer instructions;
the one or more processors being configured to execute the computer instructions to:
detecting a movable guide object based on the vision sensor in a fixed-point working mode;
after the guide object is detected, controlling the self-moving robot to move along with the guide object;
identifying whether the self-moving robot reaches the fixed-point work area in the process of moving toward the fixed-point work area along with the guide object; and
after the fixed-point work area is identified, controlling the self-moving robot to execute the fixed-point work task in the fixed-point work area according to the fixed-point work parameters.
14. The self-moving robot of claim 13, wherein the one or more processors, when detecting the guide object, are specifically configured to:
control the self-moving robot to rotate clockwise or counterclockwise, and continuously acquire environment images with the vision sensor while the self-moving robot rotates; and
determine that the guide object is detected when an environment image containing preset guide-object parameters is acquired.
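The rotate-and-capture loop of claim 14 can be sketched as follows. All three callables are hypothetical hardware/vision hooks standing in for the robot's drive and camera interfaces, which the patent does not specify:

```python
def detect_guide_object(rotate_step, capture, matches_guide_object, max_steps=36):
    """Rotate in place and capture environment images until one contains
    the preset guide-object parameters (claim 14).

    rotate_step():             turn the robot by a small fixed angle
    capture():                 return the current environment image
    matches_guide_object(img): True if the image matches the preset parameters
    max_steps:                 bound the search to roughly one full revolution
    """
    for _ in range(max_steps):
        img = capture()
        if matches_guide_object(img):
            return img                  # guide object detected
        rotate_step()
    return None                         # no guide object found

# Simulated example: the "guide object" appears in frame 5 of a fake camera.
frames = iter(range(10))
found = detect_guide_object(
    rotate_step=lambda: None,
    capture=lambda: next(frames),
    matches_guide_object=lambda img: img == 5,
)
print(found)  # 5
```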
15. The self-moving robot according to claim 13, wherein the one or more processors, when controlling the self-moving robot to move along with the guide object, are specifically configured to:
adjust the advancing direction of the self-moving robot to the direction in which the guide object is located, based on the position of the guide object; and
control the self-moving robot to start moving along the adjusted advancing direction and follow the guide object.
16. The self-moving robot of claim 15, wherein the one or more processors, when adjusting the advancing direction of the self-moving robot to the direction in which the guide object is located, are specifically configured to:
control the self-moving robot to rotate clockwise or counterclockwise, and continuously acquire environment images containing the guide object with the vision sensor while the self-moving robot rotates, until a first target environment image is acquired in which the positional relation between the vertical centerline of the first target environment image and the vertical centerline of the guide object conforms to a set centerline position relation.
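One plausible reading of claim 16's "set centerline position relation" is a pixel tolerance between the image's vertical centerline and the guide object's. A sketch under that assumption (the bounding-box inputs and tolerance value are illustrative, not from the patent):

```python
def centerline_aligned(image_width, object_left, object_right, tolerance_px=10):
    """Check whether the vertical centerline of the guide object is close
    enough to the vertical centerline of the image, i.e. whether the robot
    is now facing the guide object. `object_left`/`object_right` are the
    x-coordinates of the object's bounding box in the image.
    """
    image_center = image_width / 2.0
    object_center = (object_left + object_right) / 2.0
    return abs(object_center - image_center) <= tolerance_px

# A 640-px-wide frame with the object spanning x = 300..340 is centered;
# an object at x = 100..140 is not, so the robot keeps rotating.
print(centerline_aligned(640, 300, 340))  # True
print(centerline_aligned(640, 100, 140))  # False
```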
17. The self-moving robot according to claim 15, wherein the one or more processors, when controlling the self-moving robot to follow the guide object along the adjusted advancing direction, are specifically configured to:
control the self-moving robot to start moving along the adjusted advancing direction and follow the guide object, and continuously acquire environment images containing the guide object with the vision sensor while the self-moving robot moves; and
determine that the fixed-point work area is reached when a second target environment image is acquired in which the distance between the lower edge of the second target environment image and the lower edge of the guide object meets a set edge-distance requirement.
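Claim 17's arrival test compares the guide object's lower edge against the image's lower edge: when the object (e.g. the user's feet) nearly touches the bottom of the frame, it is directly in front of the robot. A sketch of that check (the pixel threshold and coordinate convention, y growing downward, are illustrative assumptions):

```python
def reached_fixed_point_area(image_height, object_bottom, max_gap_px=20):
    """Decide arrival at the fixed-point work area: True when the guide
    object's lower edge (`object_bottom`, in pixels from the image top)
    is within `max_gap_px` of the lower edge of the image (claim 17's
    "set edge distance requirement").
    """
    gap = image_height - object_bottom   # pixels between object and image bottom
    return 0 <= gap <= max_gap_px

# In a 480-px-tall frame, an object whose bottom is at y = 470 is close
# enough to the lower edge; one at y = 300 is still too far away.
print(reached_fixed_point_area(480, 470))  # True
print(reached_fixed_point_area(480, 300))  # False
```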
18. A computer-readable storage medium having stored thereon computer instructions, which when executed by one or more processors, cause the one or more processors to perform acts comprising:
detecting a movable guide object based on a vision sensor in a fixed-point working mode;
moving along with the guide object after the guide object is detected;
identifying whether the fixed-point work area is reached in the process of moving toward the fixed-point work area along with the guide object; and
after the fixed-point work area is identified, executing the fixed-point work task in the fixed-point work area according to the fixed-point work parameters.
CN202110706064.3A 2018-06-07 2018-06-07 Fixed-point working method, self-moving robot and storage medium Pending CN113467448A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110706064.3A CN113467448A (en) 2018-06-07 2018-06-07 Fixed-point working method, self-moving robot and storage medium

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201810582948.0A CN110575099B (en) 2018-06-07 2018-06-07 Fixed-point cleaning method, floor sweeping robot and storage medium
CN202110706064.3A CN113467448A (en) 2018-06-07 2018-06-07 Fixed-point working method, self-moving robot and storage medium

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
CN201810582948.0A Division CN110575099B (en) 2018-06-07 2018-06-07 Fixed-point cleaning method, floor sweeping robot and storage medium

Publications (1)

Publication Number Publication Date
CN113467448A true CN113467448A (en) 2021-10-01

Family

ID=68808842

Family Applications (2)

Application Number Title Priority Date Filing Date
CN202110706064.3A Pending CN113467448A (en) 2018-06-07 2018-06-07 Fixed-point working method, self-moving robot and storage medium
CN201810582948.0A Active CN110575099B (en) 2018-06-07 2018-06-07 Fixed-point cleaning method, floor sweeping robot and storage medium

Family Applications After (1)

Application Number Title Priority Date Filing Date
CN201810582948.0A Active CN110575099B (en) 2018-06-07 2018-06-07 Fixed-point cleaning method, floor sweeping robot and storage medium

Country Status (1)

Country Link
CN (2) CN113467448A (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114115250A (en) * 2021-11-11 2022-03-01 深圳市中舟智能科技有限公司 Robot motion map construction method, robot motion method and robot
WO2023219564A1 (en) * 2022-05-09 2023-11-16 Lionsbot International Pte. Ltd. A robot and a method of configuring and operating the robot
WO2024007739A1 (en) * 2022-07-06 2024-01-11 珠海格力电器股份有限公司 Self-moving device deployment method and apparatus, electronic device, and storage medium

Families Citing this family (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109330494A (en) * 2018-11-01 2019-02-15 珠海格力电器股份有限公司 Sweeping robot control method based on action recognition, system, sweeping robot
CN113116224B (en) * 2020-01-15 2022-07-05 科沃斯机器人股份有限公司 Robot and control method thereof
CN111358368A (en) * 2020-03-05 2020-07-03 宁波大学 Manual guide type floor sweeping robot
CN111665523B (en) * 2020-06-10 2022-11-18 上海有个机器人有限公司 Obstacle detection method and apparatus
CN111813003A (en) * 2020-07-16 2020-10-23 苏州鼎威新能源有限公司 Equipment control method and device of photovoltaic cleaning equipment and storage medium
CN113243829A (en) * 2021-04-09 2021-08-13 深圳市无限动力发展有限公司 Tracking cleaning method and device of sweeper and computer equipment
CN117148836A (en) * 2021-08-20 2023-12-01 科沃斯机器人股份有限公司 Self-moving robot control method, device, equipment and readable storage medium
CN117835884A (en) * 2021-12-31 2024-04-05 深圳市闪至科技有限公司 Control method of robot, method and device for controlling robot to return to base and robot
WO2023217190A1 (en) * 2022-05-10 2023-11-16 美智纵横科技有限责任公司 Cleaning method, cleaning apparatus, cleaning device, and storage medium

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120316680A1 (en) * 2011-06-13 2012-12-13 Microsoft Corporation Tracking and following of moving objects by a mobile robot
CN207051738U (en) * 2017-06-12 2018-02-27 炬大科技有限公司 A kind of mobile electronic device
CN207115193U (en) * 2017-07-26 2018-03-16 炬大科技有限公司 A kind of mobile electronic device for being used to handle the task of mission area
CN107908195A (en) * 2017-11-06 2018-04-13 深圳市道通智能航空技术有限公司 Target tracking method, device, tracker and computer-readable recording medium

Family Cites Families (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101084817B (en) * 2007-04-26 2012-08-22 复旦大学 Opening intelligent calculation frame household multifunctional small-sized service robot
WO2013027244A1 (en) * 2011-08-24 2013-02-28 三菱電機株式会社 Navigation device
CN103284662B (en) * 2012-03-02 2016-09-21 恩斯迈电子(深圳)有限公司 Cleaning system and control method thereof
JP5635719B2 (en) * 2012-03-15 2014-12-03 アイロボット コーポレイション Flexible solid bumper for robots
ES2613138T3 (en) * 2013-08-23 2017-05-22 Lg Electronics Inc. Robot cleaner and method to control it
CN104460663A (en) * 2013-09-23 2015-03-25 科沃斯机器人科技(苏州)有限公司 Method for controlling cleaning robot through smart phone
CN105573309A (en) * 2014-10-10 2016-05-11 杜勇 Sweeper adopting photo-electromagnetic combination for positioning
US9798328B2 (en) * 2014-10-10 2017-10-24 Irobot Corporation Mobile robot area cleaning
US9519289B2 (en) * 2014-11-26 2016-12-13 Irobot Corporation Systems and methods for performing simultaneous localization and mapping using machine vision systems
CN204431256U (en) * 2014-12-28 2015-07-01 青岛通产软件科技有限公司 A kind of dining room service robot
US20160260142A1 (en) * 2015-03-06 2016-09-08 Wal-Mart Stores, Inc. Shopping facility assistance systems, devices and methods to support requesting in-person assistance
CN106155049A (en) * 2015-04-15 2016-11-23 小米科技有限责任公司 Intelligent cleaning equipment and bootstrap technique, guiding stake, intelligent cleaning system
CN205006815U (en) * 2015-09-23 2016-02-03 广东美的制冷设备有限公司 Automatic clean all -in -one with air purification
KR102565501B1 (en) * 2016-08-01 2023-08-11 삼성전자주식회사 A robotic cleaner, a refrigerator, a system of delivery of a container and a method of delivering and retrieving of a container of the refrigerator using the robotic cleaner

Also Published As

Publication number Publication date
CN110575099A (en) 2019-12-17
CN110575099B (en) 2021-07-27

Similar Documents

Publication Publication Date Title
CN110575099B (en) Fixed-point cleaning method, floor sweeping robot and storage medium
US11709497B2 (en) Method for controlling an autonomous mobile robot
US11412906B2 (en) Cleaning robot traveling using region-based human activity data and method of driving cleaning robot
CN108759844B (en) Robot repositioning and environment map constructing method, robot and storage medium
US11226633B2 (en) Mobile robot and method of controlling the same
RU2624737C2 (en) Method and device for cleaning waste
CN110362099B (en) Robot cleaning method, device, robot and storage medium
CN108733419B (en) Continuous awakening method and device of intelligent equipment, intelligent equipment and storage medium
CN111297282B (en) Water outlet control method and device, robot and storage medium
US20220015598A1 (en) Method and System For Path Sweeping of Cleaning Robot, and Chip
CN116509280A (en) Robot control method, robot, and storage medium
WO2023025023A1 (en) Cleaning method and apparatus of mobile robot, and storage medium and electronic apparatus
CN111714028A (en) Method, device and equipment for escaping from restricted zone of cleaning equipment and readable storage medium
CN111374607A (en) Target identification method and device based on sweeping robot, equipment and medium
CN110967703A (en) Indoor navigation method and indoor navigation device using laser radar and camera
CN111343696A (en) Communication method of self-moving equipment, self-moving equipment and storage medium
CN116711996A (en) Operation method, self-mobile device, and storage medium
CN109947094B (en) Travel method, self-moving device and storage medium
CN115373408A (en) Cleaning robot, control method, device, equipment and storage medium thereof
CN115469648A (en) Operation method, self-moving device and storage medium
CN116300844A (en) Intelligent control method and device for cleaning equipment
CN116687274B (en) Pluggable sweeping robot based on mobile phone and sweeping cleaning control method
WO2023217190A1 (en) Cleaning method, cleaning apparatus, cleaning device, and storage medium
CN115316887B (en) Robot control method, robot, and computer-readable storage medium
CN114680739B (en) Cleaning control method and device, intelligent equipment, mobile terminal and server

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination